# Directory Structure

```
├── .gitignore
├── docs
│   └── README.md
├── LICENSE
├── mcp_client.py
├── mcp_server.py
├── README.md
├── README.zh-CN.md
├── requirements.txt
└── routers
    ├── __init__.py
    ├── base_router.py
    ├── prompt_router.py
    ├── resource_router.py
    ├── sampling_router.py
    └── tool_router.py
```

# Files

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------

```markdown
# MCP Server

[中文文档](README.zh-CN.md)

## Project Overview

Built on FastAPI and MCP (Model Context Protocol), this project enables standardized context interaction between AI models and development environments. By simplifying model deployment, providing efficient API endpoints, and keeping model input and output consistent, it improves the scalability and maintainability of AI applications and makes it easier for developers to integrate and manage AI tasks.

MCP (Model Context Protocol) is a unified protocol for context interaction between AI models and development environments. This project provides a Python-based MCP server implementation that supports basic MCP protocol features, including initialization, sampling, and session management.

## Features

- **JSON-RPC 2.0**: Request-response communication based on the standard JSON-RPC 2.0 protocol
- **SSE Connections**: Support for Server-Sent Events connections for real-time notifications
- **Modular Design**: Modular architecture for easy extension and customization
- **Asynchronous Processing**: High-performance service using FastAPI and asynchronous IO
- **Complete Client**: Includes a full test client implementation

## Project Structure

```
mcp_server/
├── mcp_server.py          # MCP server main program
├── mcp_client.py          # MCP client test program
├── routers/
│   ├── __init__.py        # Router package initialization
│   ├── base_router.py     # Base router implementation
│   ├── prompt_router.py   # Prompt router
│   ├── resource_router.py # Resource router
│   ├── sampling_router.py # Sampling router
│   └── tool_router.py     # Tool router
├── requirements.txt       # Project dependencies
└── README.md              # Project documentation
```

## Installation

1. Clone the repository:

```bash
git clone https://github.com/freedanfan/mcp_server.git
cd mcp_server
```

2. Install dependencies:

```bash
pip install -r requirements.txt
```

## Usage

### Starting the Server

```bash
python mcp_server.py
```

By default, the server will start on `127.0.0.1:12000`. You can customize the host and port using environment variables:

```bash
export MCP_SERVER_HOST=0.0.0.0
export MCP_SERVER_PORT=8000
python mcp_server.py
```

### Running the Client

Run the client in another terminal:

```bash
python mcp_client.py
```

If the server is not running at the default address, you can set an environment variable:

```bash
export MCP_SERVER_URL="http://your-server-address:port"
python mcp_client.py
```

## API Endpoints

The server provides the following API endpoints:

- **Root Path** (`/`): Provides server information
- **API Endpoint** (`/api`): Handles JSON-RPC requests
- **SSE Endpoint** (`/sse`): Handles SSE connections

## MCP Protocol Implementation

### Initialization Flow

1. The client connects to the server via SSE
2. The server returns the API endpoint URI
3. The client sends an initialization request with protocol version and capabilities
4. The server responds to the initialization request, returning server capabilities
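
The initialization request in step 3 can be sketched with a small helper. This is an illustrative sketch only: the parameter names (`protocolVersion`, `capabilities`) are assumptions based on common MCP implementations; check `mcp_server.py` for the fields this server actually expects.

```python
import json

def build_initialize_request(request_id: str,
                             protocol_version: str = "2024-11-05") -> str:
    """Build a JSON-RPC 2.0 initialize request.

    The params schema here is an assumption, not taken from this
    server's code.
    """
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": protocol_version,
            "capabilities": {"sampling": {}},
        },
    }
    return json.dumps(payload)

print(build_initialize_request("init-1"))
```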

### Sampling Request

Clients can send sampling requests with prompts:

```json
{
  "jsonrpc": "2.0",
  "id": "request-id",
  "method": "sample",
  "params": {
    "prompt": "Hello, please introduce yourself."
  }
}
```

The server will return sampling results:

```json
{
  "jsonrpc": "2.0",
  "id": "request-id",
  "result": {
    "content": "This is a response to the prompt...",
    "usage": {
      "prompt_tokens": 10,
      "completion_tokens": 50,
      "total_tokens": 60
    }
  }
}
```
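
On the client side, a response shaped like the example above can be unpacked with a small helper (illustrative only; the real `mcp_client.py` may handle errors differently):

```python
import json

def parse_sample_result(raw: str) -> tuple[str, int]:
    """Return (content, total_tokens) from a JSON-RPC sample response.

    Assumes the response shape documented above; raises KeyError if the
    server returned an error object instead of a result.
    """
    message = json.loads(raw)
    result = message["result"]
    return result["content"], result["usage"]["total_tokens"]

raw = json.dumps({
    "jsonrpc": "2.0",
    "id": "request-id",
    "result": {
        "content": "This is a response to the prompt...",
        "usage": {"prompt_tokens": 10, "completion_tokens": 50,
                  "total_tokens": 60},
    },
})
content, total = parse_sample_result(raw)
print(total)  # 60
```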

### Closing a Session

Clients can send a shutdown request:

```json
{
  "jsonrpc": "2.0",
  "id": "request-id",
  "method": "shutdown",
  "params": {}
}
```

The server will acknowledge the request and then shut down gracefully:

```json
{
  "jsonrpc": "2.0",
  "id": "request-id",
  "result": {
    "status": "shutting_down"
  }
}
```

## Development Extensions

### Adding New Methods

To add a new MCP method, add a handler function to the `MCPServer` class and register it in the `_register_methods` method:

```python
def handle_new_method(self, params: dict) -> dict:
    """Handle the new method"""
    logger.info(f"Received new method request: {params}")
    # Processing logic
    return {"result": "success"}

def _register_methods(self):
    # Register existing methods
    self.router.register_method("initialize", self.handle_initialize)
    self.router.register_method("sample", self.handle_sample)
    self.router.register_method("shutdown", self.handle_shutdown)
    # Register the new method
    self.router.register_method("new_method", self.handle_new_method)
```
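
The register/dispatch cycle behind `register_method` can be illustrated with a plain dictionary. This is a simplified stand-in, not the actual `base_router.py` implementation:

```python
class SimpleRouter:
    """Minimal JSON-RPC 2.0 method router (illustrative stand-in)."""

    def __init__(self):
        self._methods = {}

    def register_method(self, name, handler):
        """Map a JSON-RPC method name to a handler callable."""
        self._methods[name] = handler

    def dispatch(self, request: dict) -> dict:
        """Route a request to its handler, or return a -32601 error."""
        handler = self._methods.get(request.get("method"))
        if handler is None:
            return {
                "jsonrpc": "2.0",
                "id": request.get("id"),
                "error": {"code": -32601, "message": "Method not found"},
            }
        result = handler(request.get("params", {}))
        return {"jsonrpc": "2.0", "id": request.get("id"), "result": result}

router = SimpleRouter()
router.register_method("new_method", lambda params: {"result": "success"})
response = router.dispatch({"jsonrpc": "2.0", "id": "1", "method": "new_method"})
print(response["result"])  # {'result': 'success'}
```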

### Integrating AI Models

To integrate an actual AI model, modify the `handle_sample` method. For example, with the OpenAI Python SDK (the 1.x API, matching the `openai>=1.3.0` pin in `requirements.txt`):

```python
from openai import AsyncOpenAI

client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment

async def handle_sample(self, params: dict) -> dict:
    """Handle sampling request"""
    logger.info(f"Received sampling request: {params}")

    # Get the prompt
    prompt = params.get("prompt", "")

    # Call the model via the async chat completions API
    response = await client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}]
    )

    content = response.choices[0].message.content
    usage = response.usage

    return {
        "content": content,
        "usage": {
            "prompt_tokens": usage.prompt_tokens,
            "completion_tokens": usage.completion_tokens,
            "total_tokens": usage.total_tokens
        }
    }
```

## Troubleshooting

### Common Issues

1. **Connection errors**: Ensure the server is running and the client is using the correct server URL
2. **405 Method Not Allowed**: Ensure the client is sending requests to the correct API endpoint
3. **SSE connection failures**: Check network connectivity and firewall settings

### Logging

Both the server and the client produce detailed logs. To see more detail, raise the log level when starting the server, for example by running it under uvicorn directly (this assumes `mcp_server.py` exposes a FastAPI instance named `app`):

```bash
uvicorn mcp_server:app --log-level debug
```
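
Log verbosity can also be raised in code with the standard `logging` module. This is a generic Python technique, independent of whatever logging setup this project ships with:

```python
import logging

# Configure root logging before the server's loggers emit anything
logging.basicConfig(
    level=logging.DEBUG,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)

logger = logging.getLogger("mcp_server")
logger.debug("debug logging enabled")
```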

## References

- [MCP Protocol Specification](https://www.claudemcp.com/specification)
- [FastAPI Documentation](https://fastapi.tiangolo.com/)
- [JSON-RPC 2.0 Specification](https://www.jsonrpc.org/specification)
- [SSE Specification](https://html.spec.whatwg.org/multipage/server-sent-events.html)

## License

This project is licensed under the MIT License. See the LICENSE file for details.
```

--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------

```
requests>=2.28.0
anthropic>=0.5.0
python-dotenv>=1.0.0
openai>=1.3.0
fastapi>=0.95.0
uvicorn>=0.22.0
pydantic>=1.10.0
sseclient-py>=1.7.2
```