# Directory Structure

```
├── .github
│   └── FUNDING.yml
├── .gitignore
├── LICENSE
├── mcp.json
├── package.json
├── README.md
├── src
│   ├── .DS_Store
│   ├── cli.ts
│   └── index.ts
└── tsconfig.json
```

# Files

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------

```
# Dependencies
node_modules/

# Compiled output
dist/

# Logs
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Environment variables
.env

# IDE
.idea/
.vscode/
*.swp
*.swo

# OS
.DS_Store
Thumbs.db
```

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------

````markdown
# Ollama MCP Server
An MCP (Model Context Protocol) server for Ollama that enables seamless integration between Ollama's local LLM models and MCP-compatible applications like Claude Desktop.

## Features

- List available Ollama models
- Pull new models from Ollama
- Chat with models using Ollama's chat API
- Get detailed model information
- Automatic port management
- Environment variable configuration

## Prerequisites

- Node.js (v16 or higher)
- npm
- Ollama installed and running locally

## Installation

### Manual Installation
Install globally via npm:

```bash
npm install -g @rawveg/ollama-mcp
```

### Installing in Other MCP Applications

To install the Ollama MCP Server in other MCP-compatible applications (like Cline or Claude Desktop), add the following configuration to your application's MCP settings file:

```json
{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ]
    }
  }
}
```

The settings file location varies by application:
- Claude Desktop: `claude_desktop_config.json` in the Claude app data directory
- Cline: `cline_mcp_settings.json` in the VS Code global storage

## Usage

### Starting the Server

Simply run:

```bash
ollama-mcp
```

The server will start on port 3456 by default. You can specify a different port using the PORT environment variable:

```bash
PORT=3457 ollama-mcp
```

### Environment Variables

- `PORT`: Server port (default: 3456)
- `OLLAMA_API`: Ollama API endpoint (default: http://localhost:11434)

### API Endpoints

- `GET /models` - List available models
- `POST /models/pull` - Pull a new model
- `POST /chat` - Chat with a model
- `GET /models/:name` - Get model details

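
You can exercise these endpoints with any HTTP client. As a rough smoke test in TypeScript (assuming Node 18+ for global `fetch`; the model name `llama3` is illustrative, pick one returned by `GET /models`):

```typescript
const BASE = 'http://localhost:3456'; // default port

// List locally available models
const models = await (await fetch(`${BASE}/models`)).json();
console.log(models);

// Chat with a model that has already been pulled
const reply = await fetch(`${BASE}/chat`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'llama3',
    messages: [{ role: 'user', content: 'Hello!' }]
  })
});
console.log(await reply.json());
```
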
## Development

1. Clone the repository:
```bash
git clone https://github.com/rawveg/ollama-mcp.git
cd ollama-mcp
```

2. Install dependencies:
```bash
npm install
```

3. Build the project:
```bash
npm run build
```

4. Start the server:
```bash
npm start
```

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

However, this does **not** grant permission to incorporate this project into third-party services or commercial platforms without prior discussion and agreement. While I previously accepted contributions (such as a Dockerfile and related README updates) to support integration with services like **Smithery**, recent actions by a similar service — **Glama** — have required a reassessment of this policy.

Glama has chosen to include open-source MCP projects in their commercial offering without notice or consent, and subsequently created issue requests asking maintainers to perform unpaid work to ensure compatibility with *their* platform. This behaviour — leveraging community labour for profit without dialogue or compensation — is not only inconsiderate, but **ethically problematic**.

As a result, and to protect the integrity of this project and its contributors, the licence has been updated to the **GNU Affero General Public License v3.0 (AGPL-3.0)**. This change ensures that any use of the software — particularly in **commercial or service-based platforms** — must either remain fully compliant with the AGPL's terms **or** obtain a separate commercial licence. Merely linking to the original source is not sufficient where the project is being **actively monetised**. If you wish to include this project in a commercial offering, please get in touch **first** to discuss licensing terms.

## License

AGPL v3.0

## Star History

[![Star History Chart](https://api.star-history.com/svg?repos=rawveg/ollama-mcp&type=Date)](https://www.star-history.com/#rawveg/ollama-mcp&Date)

## Related

- [Ollama](https://ollama.ai)
- [Model Context Protocol](https://github.com/anthropics/model-context-protocol)

This project was previously MIT-licensed. As of 20th April 2025, it is now licensed under AGPL-3.0 to prevent unauthorised commercial exploitation. If your use of this project predates this change, please refer to the relevant Git tag or commit for the applicable licence.

````

--------------------------------------------------------------------------------
/tsconfig.json:
--------------------------------------------------------------------------------

```json
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
}
```
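
One consequence of `"module": "NodeNext"` here, combined with `"type": "module"` in package.json, is that relative imports in the TypeScript sources must carry the extension of the *emitted* JavaScript file. That is why `src/cli.ts` imports `./index.js` rather than `./index`:

```typescript
// Under NodeNext/ESM resolution, relative specifiers name the compiled output:
import { OllamaMCPServer } from './index.js'; // maps to src/index.ts at build time

// A bare './index' (no extension) would be rejected by tsc under this configuration.
```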

--------------------------------------------------------------------------------
/package.json:
--------------------------------------------------------------------------------

```json
{
  "name": "@rawveg/ollama-mcp",
  "version": "1.0.9",
  "type": "module",
  "description": "MCP Server for Ollama integration",
  "main": "dist/index.js",
  "bin": {
    "ollama-mcp": "./dist/cli.js"
  },
  "files": [
    "dist",
    "mcp.json"
  ],
  "scripts": {
    "build": "tsc && chmod +x dist/cli.js",
    "start": "node dist/cli.js",
    "prepare": "npm run build"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/rawveg/ollama-mcp.git"
  },
  "keywords": [
    "ollama",
    "mcp",
    "ai",
    "claude"
  ],
  "author": "tigreen",
  "license": "AGPL-3.0-only",
  "dependencies": {
    "express": "^4.18.2",
    "node-fetch": "^3.3.2"
  },
  "devDependencies": {
    "@types/express": "^4.17.21",
    "@types/node": "^20.11.19",
    "typescript": "^5.3.3"
  }
}
```

--------------------------------------------------------------------------------
/.github/FUNDING.yml:
--------------------------------------------------------------------------------

```yaml
# These are supported funding model platforms

github: rawveg
patreon: # Replace with a single Patreon username
open_collective: # Replace with a single Open Collective username
ko_fi: # Replace with a single Ko-fi username
tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel
community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
liberapay: # Replace with a single Liberapay username
issuehunt: # Replace with a single IssueHunt username
lfx_crowdfunding: # Replace with a single LFX Crowdfunding project-name e.g., cloud-foundry
polar: # Replace with a single Polar username
buy_me_a_coffee: rawveg
thanks_dev: # Replace with a single thanks.dev username
custom: # Replace with up to 4 custom sponsorship URLs e.g., ['link1', 'link2']
```

--------------------------------------------------------------------------------
/src/cli.ts:
--------------------------------------------------------------------------------

```typescript
#!/usr/bin/env node
/**
 * @license
 * Copyright (c) [Your Name or Organisation] [Year]
 *
 * This file is part of [Project Name].
 *
 * [Project Name] is licensed under the GNU Affero General Public License v3.0.
 * You may obtain a copy of the license at https://www.gnu.org/licenses/agpl-3.0.html
 *
 * Unless required by applicable law or agreed to in writing, software distributed
 * under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
 * CONDITIONS OF ANY KIND, either express or implied.
 *
 * See the License for the specific language governing permissions and limitations.
 */
import { OllamaMCPServer } from './index.js';

const port = parseInt(process.env.PORT || '3456', 10);
const ollamaApi = process.env.OLLAMA_API || 'http://localhost:11434';

const server = new OllamaMCPServer(ollamaApi);

server.start(port).catch((error: Error) => {
  console.error('Failed to start server:', error);
  process.exit(1);
});

// Handle graceful shutdown
process.on('SIGTERM', () => {
  server.stop().then(() => process.exit(0));
});

process.on('SIGINT', () => {
  server.stop().then(() => process.exit(0));
});
```

--------------------------------------------------------------------------------
/mcp.json:
--------------------------------------------------------------------------------

```json
{
  "name": "ollama-mcp",
  "version": "1.0.3",
  "type": "tool",
  "tools": [
    {
      "name": "list_ollama_models",
      "description": "List all available Ollama models",
      "route": {
        "path": "/models",
        "method": "GET"
      },
      "parameters": {
        "type": "object",
        "properties": {}
      }
    },
    {
      "name": "pull_ollama_model",
      "description": "Pull a new Ollama model",
      "route": {
        "path": "/models/pull",
        "method": "POST"
      },
      "parameters": {
        "type": "object",
        "required": ["name"],
        "properties": {
          "name": {
            "type": "string",
            "description": "Name of the model to pull"
          }
        }
      }
    },
    {
      "name": "chat_with_ollama",
      "description": "Chat with an Ollama model",
      "route": {
        "path": "/chat",
        "method": "POST"
      },
      "parameters": {
        "type": "object",
        "required": ["model", "messages"],
        "properties": {
          "model": {
            "type": "string",
            "description": "Name of the model to chat with"
          },
          "messages": {
            "type": "array",
            "description": "Array of chat messages",
            "items": {
              "type": "object",
              "required": ["role", "content"],
              "properties": {
                "role": {
                  "type": "string",
                  "enum": ["system", "user", "assistant"],
                  "description": "Role of the message sender"
                },
                "content": {
                  "type": "string",
                  "description": "Content of the message"
                }
              }
            }
          }
        }
      }
    },
    {
      "name": "get_ollama_model_info",
      "description": "Get information about a specific Ollama model",
      "route": {
        "path": "/models/:name",
        "method": "GET"
      },
      "parameters": {
        "type": "object",
        "required": ["name"],
        "properties": {
          "name": {
            "type": "string",
            "description": "Name of the model to get information about"
          }
        }
      }
    }
  ],
  "env": {
    "PORT": {
      "type": "number",
      "default": 3456,
      "description": "Port to run the server on"
    },
    "OLLAMA_API": {
      "type": "string",
      "default": "http://localhost:11434",
      "description": "Ollama API endpoint"
    }
  }
}
```
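
For reference, a request body satisfying the `chat_with_ollama` parameter schema above would look like the following sketch (the model name is illustrative):

```typescript
// Types mirroring the chat_with_ollama JSON schema.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface ChatRequest {
  model: string;
  messages: ChatMessage[];
}

const body: ChatRequest = {
  model: 'llama3', // illustrative; use any model Ollama has locally
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Why is the sky blue?' }
  ]
};
```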

--------------------------------------------------------------------------------
/src/index.ts:
--------------------------------------------------------------------------------

```typescript
// src/index.ts
/**
 * @license
 * Copyright (c) [Your Name or Organisation] [Year]
 *
 * This file is part of [Project Name].
 *
 * [Project Name] is licensed under the GNU Affero General Public License v3.0.
 * You may obtain a copy of the license at https://www.gnu.org/licenses/agpl-3.0.html
 *
 * Unless required by applicable law or agreed to in writing, software distributed
 * under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
 * CONDITIONS OF ANY KIND, either express or implied.
 *
 * See the License for the specific language governing permissions and limitations.
 */
import express from 'express';
import fetch from 'node-fetch';
import type { Request, Response } from 'express';
import type { Server } from 'node:http';

export class OllamaMCPServer {
  private app = express();
  private server?: Server;
  private ollamaApi: string;

  constructor(ollamaApi: string = 'http://localhost:11434') {
    this.ollamaApi = ollamaApi;
    this.setupRoutes();
  }

  private setupRoutes() {
    this.app.use(express.json());

    this.app.get('/models', this.listModels.bind(this));
    this.app.post('/models/pull', this.pullModel.bind(this));
    this.app.post('/chat', this.chat.bind(this));
    this.app.get('/models/:name', this.getModelInfo.bind(this));
  }

  private async listModels(req: Request, res: Response) {
    try {
      const response = await fetch(`${this.ollamaApi}/api/tags`);
      const data = await response.json();
      res.json(data);
    } catch (error) {
      res.status(500).json({ error: String(error) });
    }
  }

  private async pullModel(req: Request, res: Response) {
    const { name } = req.body;
    if (!name) {
      return res.status(400).json({ error: 'Model name is required' });
    }

    try {
      const response = await fetch(`${this.ollamaApi}/api/pull`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        // stream: false asks Ollama for a single JSON response; by default
        // /api/pull streams newline-delimited progress objects, which
        // response.json() cannot parse.
        body: JSON.stringify({ name, stream: false })
      });
      const data = await response.json();
      res.json(data);
    } catch (error) {
      res.status(500).json({ error: String(error) });
    }
  }

  private async chat(req: Request, res: Response) {
    const { model, messages } = req.body;

    if (!model || !messages) {
      return res.status(400).json({ error: 'Model and messages are required' });
    }

    try {
      const response = await fetch(`${this.ollamaApi}/api/chat`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          model,
          messages,
          stream: false
        })
      });
      const data = await response.json();
      res.json(data);
    } catch (error) {
      res.status(500).json({ error: String(error) });
    }
  }

  private async getModelInfo(req: Request, res: Response) {
    const { name } = req.params;
    try {
      const response = await fetch(`${this.ollamaApi}/api/show`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ name })
      });
      const data = await response.json();
      res.json(data);
    } catch (error) {
      res.status(500).json({ error: String(error) });
    }
  }

  public start(port: number = 3456): Promise<void> {
    return new Promise((resolve, reject) => {
      try {
        this.server = this.app.listen(port, () => {
          console.log(`Ollama MCP Server running on port ${port}`);
          resolve();
        });

        this.server.on('error', (error: Error & { code?: string }) => {
          if (error.code === 'EADDRINUSE') {
            reject(new Error(`Port ${port} is already in use`));
          } else {
            reject(error);
          }
        });
      } catch (error) {
        reject(error);
      }
    });
  }

  public stop(): Promise<void> {
    return new Promise((resolve, reject) => {
      if (this.server) {
        this.server.close((err?: Error) => {
          if (err) reject(err);
          else resolve();
        });
      } else {
        resolve();
      }
    });
  }
}

export default OllamaMCPServer;
```
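
Since `OllamaMCPServer` is the package's default export (and `dist/index.js` is its `main` entry), the server can also be embedded programmatically rather than launched through the CLI. A minimal sketch, assuming the package is installed as a dependency (the port is illustrative):

```typescript
import OllamaMCPServer from '@rawveg/ollama-mcp';

// Target the default local Ollama instance.
const server = new OllamaMCPServer('http://localhost:11434');

await server.start(4000); // rejects if port 4000 is already in use

// ... later, release the port cleanly.
await server.stop();
```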