# Directory Structure

```
├── .github
│   └── FUNDING.yml
├── .gitignore
├── LICENSE
├── mcp.json
├── package.json
├── README.md
├── src
│   ├── .DS_Store
│   ├── cli.ts
│   └── index.ts
└── tsconfig.json
```

# Files

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------

```
# Dependencies
node_modules/

# Compiled output
dist/

# Logs
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Environment variables
.env

# IDE
.idea/
.vscode/
*.swp
*.swo

# OS
.DS_Store
Thumbs.db
```

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------

````markdown
# Ollama MCP Server
An MCP (Model Context Protocol) server for Ollama that enables seamless integration between locally running Ollama models and MCP-compatible applications such as Claude Desktop.

## Features

- List available Ollama models
- Pull new models from Ollama
- Chat with models using Ollama's chat API
- Get detailed model information
- Configurable server port, with a clear error if the chosen port is already in use
- Environment variable configuration

## Prerequisites

- Node.js (v16 or higher)
- npm
- Ollama installed and running locally

## Installation

### Manual Installation
Install globally via npm:

```bash
npm install -g @rawveg/ollama-mcp
```

### Installing in Other MCP Applications

To install the Ollama MCP Server in other MCP-compatible applications (like Cline or Claude Desktop), add the following configuration to your application's MCP settings file:

```json
{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ]
    }
  }
}
```

The settings file location varies by application:
- Claude Desktop: `claude_desktop_config.json` in the Claude app data directory
- Cline: `cline_mcp_settings.json` in the VS Code global storage

## Usage

### Starting the Server

Simply run:

```bash
ollama-mcp
```

The server will start on port 3456 by default. You can specify a different port using the PORT environment variable:

```bash
PORT=3457 ollama-mcp
```

### Environment Variables

- `PORT`: Server port (default: 3456). Set it when launching the CLI directly:
  ```bash
  PORT=3457 ollama-mcp
  ```
- `OLLAMA_API`: Ollama API endpoint (default: http://localhost:11434)

### API Endpoints

- `GET /models` - List available models
- `POST /models/pull` - Pull a new model
- `POST /chat` - Chat with a model
- `GET /models/:name` - Get model details
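
These routes map directly onto the tools declared in `mcp.json`. As a quick illustration, here is a minimal TypeScript sketch that calls the `/chat` endpoint; it assumes Node 18+ (for the built-in `fetch`), the server running on the default port, and that the placeholder model name has already been pulled:

```typescript
// Hedged sketch: POST /chat with a single user message.
// "llama3" is a placeholder; substitute any model you have pulled locally.
async function chatExample(): Promise<void> {
  const response = await fetch('http://localhost:3456/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'llama3',
      messages: [{ role: 'user', content: 'Why is the sky blue?' }],
    }),
  });
  console.log(await response.json());
}

chatExample().catch(console.error);
```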

## Development

1. Clone the repository:
```bash
git clone https://github.com/rawveg/ollama-mcp.git
cd ollama-mcp
```

2. Install dependencies:
```bash
npm install
```

3. Build the project:
```bash
npm run build
```

4. Start the server:
```bash
npm start
```

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

However, this does **not** grant permission to incorporate this project into third-party services or commercial platforms without prior discussion and agreement. While I previously accepted contributions (such as a Dockerfile and related README updates) to support integration with services like **Smithery**, recent actions by a similar service — **Glama** — have required a reassessment of this policy.

Glama has chosen to include open-source MCP projects in their commercial offering without notice or consent, and subsequently created issue requests asking maintainers to perform unpaid work to ensure compatibility with *their* platform. This behaviour — leveraging community labour for profit without dialogue or compensation — is not only inconsiderate, but **ethically problematic**.

As a result, and to protect the integrity of this project and its contributors, the licence has been updated to the **GNU Affero General Public License v3.0 (AGPL-3.0)**. This change ensures that any use of the software — particularly in **commercial or service-based platforms** — must remain fully compliant with the AGPL's terms **and** obtain a separate commercial licence. Merely linking to the original source is not sufficient where the project is being **actively monetised**. If you wish to include this project in a commercial offering, please get in touch **first** to discuss licensing terms.

## License

AGPL v3.0

## Star History

[![Star History Chart](https://api.star-history.com/svg?repos=rawveg/ollama-mcp&type=Date)](https://www.star-history.com/#rawveg/ollama-mcp&Date)

## Related

- [Ollama](https://ollama.ai)
- [Model Context Protocol](https://github.com/anthropics/model-context-protocol)

This project was previously MIT-licensed. As of 20th April 2025, it is now licensed under AGPL-3.0 to prevent unauthorised commercial exploitation. If your use of this project predates this change, please refer to the relevant Git tag or commit for the applicable licence.

````

--------------------------------------------------------------------------------
/tsconfig.json:
--------------------------------------------------------------------------------

```json
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
}

```

--------------------------------------------------------------------------------
/package.json:
--------------------------------------------------------------------------------

```json
{
  "name": "@rawveg/ollama-mcp",
  "version": "1.0.9",
  "type": "module",
  "description": "MCP Server for Ollama integration",
  "main": "dist/index.js",
  "bin": {
    "ollama-mcp": "./dist/cli.js"
  },
  "files": [
    "dist",
    "mcp.json"
  ],
  "scripts": {
    "build": "tsc && chmod +x dist/cli.js",
    "start": "node dist/cli.js",
    "prepare": "npm run build"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/rawveg/ollama-mcp.git"
  },
  "keywords": [
    "ollama",
    "mcp",
    "ai",
    "claude"
  ],
  "author": "tigreen",
  "license": "AGPL-3.0-only",
  "dependencies": {
    "express": "^4.18.2",
    "node-fetch": "^3.3.2"
  },
  "devDependencies": {
    "@types/express": "^4.17.21",
    "@types/node": "^20.11.19",
    "typescript": "^5.3.3"
  }
}

```

--------------------------------------------------------------------------------
/.github/FUNDING.yml:
--------------------------------------------------------------------------------

```yaml
# These are supported funding model platforms

github: rawveg
patreon: # Replace with a single Patreon username
open_collective: # Replace with a single Open Collective username
ko_fi: # Replace with a single Ko-fi username
tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel
community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
liberapay: # Replace with a single Liberapay username
issuehunt: # Replace with a single IssueHunt username
lfx_crowdfunding: # Replace with a single LFX Crowdfunding project-name e.g., cloud-foundry
polar: # Replace with a single Polar username
buy_me_a_coffee: rawveg
thanks_dev: # Replace with a single thanks.dev username
custom: # Replace with up to 4 custom sponsorship URLs e.g., ['link1', 'link2']

```

--------------------------------------------------------------------------------
/src/cli.ts:
--------------------------------------------------------------------------------

```typescript
#!/usr/bin/env node
/**
 * @license
 * Copyright (c) [Your Name or Organisation] [Year]
 *
 * This file is part of [Project Name].
 *
 * [Project Name] is licensed under the GNU Affero General Public License v3.0.
 * You may obtain a copy of the license at https://www.gnu.org/licenses/agpl-3.0.html
 *
 * Unless required by applicable law or agreed to in writing, software distributed
 * under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
 * CONDITIONS OF ANY KIND, either express or implied.
 *
 * See the License for the specific language governing permissions and limitations.
 */
import { OllamaMCPServer } from './index.js';

const port = parseInt(process.env.PORT || '3456', 10);
const ollamaApi = process.env.OLLAMA_API || 'http://localhost:11434';

const server = new OllamaMCPServer(ollamaApi);

server.start(port).catch((error: Error) => {
  console.error('Failed to start server:', error);
  process.exit(1);
});

// Handle graceful shutdown
process.on('SIGTERM', () => {
  server.stop().then(() => process.exit(0));
});

process.on('SIGINT', () => {
  server.stop().then(() => process.exit(0));
});
```

--------------------------------------------------------------------------------
/mcp.json:
--------------------------------------------------------------------------------

```json
{
  "name": "ollama-mcp",
  "version": "1.0.3",
  "type": "tool",
  "tools": [
    {
      "name": "list_ollama_models",
      "description": "List all available Ollama models",
      "route": {
        "path": "/models",
        "method": "GET"
      },
      "parameters": {
        "type": "object",
        "properties": {}
      }
    },
    {
      "name": "pull_ollama_model",
      "description": "Pull a new Ollama model",
      "route": {
        "path": "/models/pull",
        "method": "POST"
      },
      "parameters": {
        "type": "object",
        "required": ["name"],
        "properties": {
          "name": {
            "type": "string",
            "description": "Name of the model to pull"
          }
        }
      }
    },
    {
      "name": "chat_with_ollama",
      "description": "Chat with an Ollama model",
      "route": {
        "path": "/chat",
        "method": "POST"
      },
      "parameters": {
        "type": "object",
        "required": ["model", "messages"],
        "properties": {
          "model": {
            "type": "string",
            "description": "Name of the model to chat with"
          },
          "messages": {
            "type": "array",
            "description": "Array of chat messages",
            "items": {
              "type": "object",
              "required": ["role", "content"],
              "properties": {
                "role": {
                  "type": "string",
                  "enum": ["system", "user", "assistant"],
                  "description": "Role of the message sender"
                },
                "content": {
                  "type": "string",
                  "description": "Content of the message"
                }
              }
            }
          }
        }
      }
    },
    {
      "name": "get_ollama_model_info",
      "description": "Get information about a specific Ollama model",
      "route": {
        "path": "/models/:name",
        "method": "GET"
      },
      "parameters": {
        "type": "object",
        "required": ["name"],
        "properties": {
          "name": {
            "type": "string",
            "description": "Name of the model to get information about"
          }
        }
      }
    }
  ],
  "env": {
    "PORT": {
      "type": "number",
      "default": 3456,
      "description": "Port to run the server on"
    },
    "OLLAMA_API": {
      "type": "string",
      "default": "http://localhost:11434",
      "description": "Ollama API endpoint"
    }
  }
}
```
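
Each tool above corresponds one-to-one with an HTTP route in `src/index.ts`. As a hedged sketch, a `pull_ollama_model` call reduces to a plain POST against `/models/pull` (default port assumed; the model name below is a placeholder):

```typescript
// Hedged sketch: assumes Node 18+ for global fetch, an ES module context,
// and the server running on port 3456. Pulling large models can take a while.
const res = await fetch('http://localhost:3456/models/pull', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ name: 'llama3' }),
});
console.log(await res.json());
```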

--------------------------------------------------------------------------------
/src/index.ts:
--------------------------------------------------------------------------------

```typescript
// src/index.ts
/**
 * @license
 * Copyright (c) [Your Name or Organisation] [Year]
 *
 * This file is part of [Project Name].
 *
 * [Project Name] is licensed under the GNU Affero General Public License v3.0.
 * You may obtain a copy of the license at https://www.gnu.org/licenses/agpl-3.0.html
 *
 * Unless required by applicable law or agreed to in writing, software distributed
 * under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
 * CONDITIONS OF ANY KIND, either express or implied.
 *
 * See the License for the specific language governing permissions and limitations.
 */
import express from 'express';
import fetch from 'node-fetch';
import type { Request, Response } from 'express';

export class OllamaMCPServer {
  private app = express();
  private server: any;
  private ollamaApi: string;

  constructor(ollamaApi: string = 'http://localhost:11434') {
    this.ollamaApi = ollamaApi;
    this.setupRoutes();
  }

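  // Registers JSON body parsing and the four routes exposed as MCP tools (see mcp.json).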
  private setupRoutes() {
    this.app.use(express.json());

    this.app.get('/models', this.listModels.bind(this));
    this.app.post('/models/pull', this.pullModel.bind(this));
    this.app.post('/chat', this.chat.bind(this));
    this.app.get('/models/:name', this.getModelInfo.bind(this));
  }

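  // GET /models: lists locally available models via Ollama's /api/tags.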
  private async listModels(req: Request, res: Response) {
    try {
      const response = await fetch(`${this.ollamaApi}/api/tags`);
      const data = await response.json();
      res.json(data);
    } catch (error) {
      res.status(500).json({ error: String(error) });
    }
  }

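  // POST /models/pull: asks Ollama to download a model via /api/pull.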
  private async pullModel(req: Request, res: Response) {
    const { name } = req.body;
    if (!name) {
      return res.status(400).json({ error: 'Model name is required' });
    }

    try {
      const response = await fetch(`${this.ollamaApi}/api/pull`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ name })
      });
      const data = await response.json();
      res.json(data);
    } catch (error) {
      res.status(500).json({ error: String(error) });
    }
  }

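  // POST /chat: forwards a non-streaming chat request to Ollama's /api/chat.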
  private async chat(req: Request, res: Response) {
    const { model, messages } = req.body;
    
    if (!model || !messages) {
      return res.status(400).json({ error: 'Model and messages are required' });
    }

    try {
      const response = await fetch(`${this.ollamaApi}/api/chat`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          model,
          messages,
          stream: false
        })
      });
      const data = await response.json();
      res.json(data);
    } catch (error) {
      res.status(500).json({ error: String(error) });
    }
  }

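  // GET /models/:name: looks up model metadata via Ollama's /api/show.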
  private async getModelInfo(req: Request, res: Response) {
    const { name } = req.params;
    try {
      const response = await fetch(`${this.ollamaApi}/api/show`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ name })
      });
      const data = await response.json();
      res.json(data);
    } catch (error) {
      res.status(500).json({ error: String(error) });
    }
  }

  public start(port: number = 3456): Promise<void> {
    return new Promise((resolve, reject) => {
      try {
        this.server = this.app.listen(port, () => {
          console.log(`Ollama MCP Server running on port ${port}`);
          resolve();
        });

        this.server.on('error', (error: Error & { code?: string }) => {
          if (error.code === 'EADDRINUSE') {
            reject(new Error(`Port ${port} is already in use`));
          } else {
            reject(error);
          }
        });
      } catch (error) {
        reject(error);
      }
    });
  }

  public stop(): Promise<void> {
    return new Promise((resolve, reject) => {
      if (this.server) {
        this.server.close((err?: Error) => {
          if (err) reject(err);
          else resolve();
        });
      } else {
        resolve();
      }
    });
  }
}

export default OllamaMCPServer;
```
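
For programmatic use, the exported class can be embedded in another Node application. A minimal sketch, assuming the published package exposes `dist/index.js` as its main entry (as `package.json` indicates) and an ES module context:

```typescript
import { OllamaMCPServer } from '@rawveg/ollama-mcp';

// Point the server at a custom Ollama endpoint if needed; defaults shown.
const server = new OllamaMCPServer('http://localhost:11434');

await server.start(3456); // resolves once Express is listening
// ... later, e.g. on shutdown:
await server.stop();
```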