# Directory Structure
```
├── .gitignore
├── .python-version
├── Dockerfile
├── LICENSE
├── llms-install.md
├── pyproject.toml
├── README.md
├── requirements-dev.lock
├── requirements.lock
├── smithery.yaml
└── src
    └── code2prompt_mcp
        ├── __init__.py
        └── main.py
```
# Files
--------------------------------------------------------------------------------
/.python-version:
--------------------------------------------------------------------------------
```
3.13.1
```
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
```
# python generated files
__pycache__/
*.py[oc]
build/
dist/
wheels/
*.egg-info
# venv
.venv
.env
```
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
```markdown
[](https://mseep.ai/app/odancona-code2prompt-mcp)
# code2prompt-mcp
An MCP server that generates contextual prompts from codebases, making it easier for AI assistants to understand and work with your code repositories.
## About
code2prompt-mcp leverages the high-performance [code2prompt-rs](https://github.com/yourusername/code2prompt-rs) Rust library to analyze codebases and produce structured summaries. It helps bridge the gap between your code and language models by extracting relevant context in a format that's optimized for AI consumption.
## Installation
This project uses [Rye](https://rye.astral.sh/) for dependency management; make sure it is installed.
To install the dependencies and build the module in the local environment, run:
```bash
# Clone the repository
git clone https://github.com/odancona/code2prompt-mcp.git
cd code2prompt-mcp
# Install dependencies with Rye
rye build
```
This installs all the dependencies declared in `pyproject.toml` into the `.venv` directory.
## Usage
Run the MCP server:
```bash
rye run python -m code2prompt_mcp.main
```
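The server exposes a single `get_context` tool. Based on `src/code2prompt_mcp/main.py`, its result is a dictionary with the rendered prompt and some metadata; the sketch below shows that shape with made-up values:

```python
# Illustrative shape of the get_context tool's result, per
# src/code2prompt_mcp/main.py. All values below are invented examples.
example_response = {
    "prompt": "# Project\n...rendered codebase context...",
    "directory": "/path/to/project",
    "token_count": 1234,
}

# A client would typically pull out the rendered prompt and its token count:
prompt_text = example_response["prompt"]
tokens = example_response["token_count"]
print(f"{tokens} tokens of context extracted")
```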
## License
MIT License - See LICENSE file for details.
## Development
For testing, you can use the MCP Inspector:
```bash
npx @modelcontextprotocol/inspector python -m code2prompt_mcp.main
```
```
--------------------------------------------------------------------------------
/src/code2prompt_mcp/__init__.py:
--------------------------------------------------------------------------------
```python
def hello() -> str:
    return "Hello from code2prompt-mcp!"
```
--------------------------------------------------------------------------------
/smithery.yaml:
--------------------------------------------------------------------------------
```yaml
# Smithery.ai configuration
startCommand:
  type: stdio
  configSchema: {}
  commandFunction: |
    (config) => ({
      command: "python",
      args: ["src/code2prompt_mcp/main.py"],
      env: {}
    })

# Optional build configurations
build:
  dockerfile: Dockerfile
  dockerBuildPath: .
```
--------------------------------------------------------------------------------
/pyproject.toml:
--------------------------------------------------------------------------------
```toml
[project]
name = "code2prompt-mcp"
version = "0.1.0"
description = "MCP server for Code2Prompt"
authors = [
    { name = "Olivier D'Ancona", email = "[email protected]" },
]
dependencies = [
    "mcp>=1.4.1",
    "httpx>=0.28.1",
    "dotenv>=0.9.9",
    "colorlog>=6.9.0",
    "code2prompt-rs>=3.2.1",
]
readme = "README.md"
requires-python = ">= 3.8"
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.rye]
managed = true
dev-dependencies = []
virtual = true
[tool.hatch.metadata]
allow-direct-references = true
[tool.hatch.build.targets.wheel]
packages = ["src/code2prompt_mcp"]
```
--------------------------------------------------------------------------------
/Dockerfile:
--------------------------------------------------------------------------------
```dockerfile
FROM python:3.13-slim
# Install system dependencies for Rust and UV
RUN apt update && apt install -y \
curl build-essential \
&& rm -rf /var/lib/apt/lists/*
# Install Rust and Cargo
RUN curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
ENV PATH="/root/.cargo/bin:$PATH"
# Install UV
RUN pip install uv
WORKDIR /app
# Copy necessary project metadata files first
COPY pyproject.toml requirements.lock README.md ./
# Install dependencies
RUN uv pip install --no-cache --system -r requirements.lock
# Copy the source code after dependencies
COPY src ./src
CMD ["python", "src/code2prompt_mcp/main.py"]
```
--------------------------------------------------------------------------------
/llms-install.md:
--------------------------------------------------------------------------------
```markdown
# code2prompt-mcp MCP Server Installation Guide
This guide is designed for AI agents like Cline to install and configure the code2prompt-mcp server for use with LLM applications such as Claude Desktop, Cursor, Roo Code, and Cline.
## Overview of code2prompt-mcp
An MCP server that generates contextual prompts from codebases, making it easier for AI assistants to understand and work with your code repositories.
code2prompt-mcp leverages the high-performance [code2prompt-rs](https://github.com/yourusername/code2prompt-rs) Rust library to analyze codebases and produce structured summaries. It helps bridge the gap between your code and language models by extracting relevant context in a format that's optimized for AI consumption.
## Prerequisites
Before installation, you need:
1. Install [Rye](https://rye.astral.sh/) for dependency management: `curl -sSf https://rye.astral.sh/get | bash` on Linux or macOS. When prompted, choose to add Rye to your PATH.
## Installation and Configuration
Clone the repository and install dependencies:
```bash
git clone https://github.com/odancona/code2prompt-mcp.git
cd code2prompt-mcp
```
Install the dependencies declared in the `pyproject.toml` file into the `.venv` directory:
```bash
rye build
```
This will create a virtual environment and install all necessary packages.
Then, configure your MCP client. To run the server, you have several options. The first is to activate the virtual environment and run it directly:
```bash
cd <installation_directory>
source .venv/bin/activate
python -m code2prompt_mcp.main
```
Alternatively, you can run the server directly with Rye:
```bash
rye run python -m code2prompt_mcp.main
```
It's important to run this command from the cloned directory, so that `pyproject.toml` and the virtual environment created by Rye are picked up.
If you want to be able to run the MCP server from anywhere, you can create a configuration file for your LLM application. Here's an example configuration:
```json
{
  "mcpServers": {
    "code2prompt": {
      "command": "bash",
      "args": [
        "-c",
        "cd /path/to/code2prompt-mcp && rye run python /path/to/code2prompt-mcp/src/code2prompt_mcp/main.py"
      ],
      "env": {}
    }
  }
}
```
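Malformed JSON in this file is a common source of "server not found" errors. A quick, hedged sanity check (the config text here is the snippet above, pasted inline for illustration) is to parse it with Python's stdlib `json` module and confirm the expected keys are present:

```python
import json

# The MCP client config from above, inlined for checking.
# json.loads raises json.JSONDecodeError if the text is malformed.
config_text = """
{
  "mcpServers": {
    "code2prompt": {
      "command": "bash",
      "args": [
        "-c",
        "cd /path/to/code2prompt-mcp && rye run python /path/to/code2prompt-mcp/src/code2prompt_mcp/main.py"
      ],
      "env": {}
    }
  }
}
"""

config = json.loads(config_text)
server = config["mcpServers"]["code2prompt"]
print(server["command"], server["args"][0])
```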
## Verify Installation
To verify the installation is working:
1. Restart your LLM application (Cline, Claude Desktop, etc.)
2. Test the connection by running a simple command like:
```
Please get context from /path/to/project for AI analysis using Code2Prompt.
```
## Usage Examples
Here are some examples of how to use Code2Prompt MCP server with AI assistants:
### Local Codebase Analysis
```
Can you analyze the code in my project at /path/to/project? Please use Code2prompt MCP to get the context.
```
### Specific File Types Analysis
```
Please get all python files and remove markdown files and the folder tests, use Code2prompt MCP for context.
```
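Requests like the one above ultimately map to the `include_patterns` and `exclude_patterns` glob lists accepted by the `get_context` tool. As a rough illustration of how such patterns select files (this uses Python's stdlib `fnmatch`, not the actual code2prompt-rs matcher, so the semantics are only approximate):

```python
from fnmatch import fnmatch

files = [
    "src/app.py",
    "src/utils.py",
    "docs/readme.md",
    "tests/test_app.py",
]

include = ["**/*.py", "*.py"]  # keep Python files
exclude = ["*.md", "tests/*"]  # drop markdown files and the tests folder

def selected(path):
    """Keep a file if any include pattern matches and no exclude pattern does."""
    kept = any(fnmatch(path, pat) for pat in include)
    dropped = any(fnmatch(path, pat) for pat in exclude)
    return kept and not dropped

print([f for f in files if selected(f)])
# → ['src/app.py', 'src/utils.py']
```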
```
--------------------------------------------------------------------------------
/src/code2prompt_mcp/main.py:
--------------------------------------------------------------------------------
```python
"""
Code2Prompt MCP Server
An MCP server that allows LLMs to extract context from codebases using the code2prompt_rs SDK.
"""
from typing import Dict, List, Optional, Any

import logging

import colorlog
from mcp.server.fastmcp import FastMCP

from code2prompt_rs import Code2Prompt

# Module-level logger, so tool functions can log even when this module
# is imported rather than executed as a script.
logger = colorlog.getLogger(__name__)

mcp = FastMCP("code2prompt")
@mcp.tool()
async def get_context(
    path: str = ".",
    include_patterns: Optional[List[str]] = None,
    exclude_patterns: Optional[List[str]] = None,
    include_priority: bool = False,
    line_numbers: bool = True,
    absolute_paths: bool = False,
    full_directory_tree: bool = False,
    code_blocks: bool = True,
    follow_symlinks: bool = False,
    include_hidden: bool = False,
    template: Optional[str] = None,
    encoding: Optional[str] = "cl100k",
) -> Dict[str, Any]:
    """
    Retrieve context from a codebase using code2prompt with the specified parameters.

    Args:
        path: Path to the codebase
        include_patterns: List of glob patterns for files to include
        exclude_patterns: List of glob patterns for files to exclude
        include_priority: Give priority to include patterns
        line_numbers: Add line numbers to code
        absolute_paths: Use absolute paths instead of relative paths
        full_directory_tree: List the full directory tree
        code_blocks: Wrap code in markdown code blocks
        follow_symlinks: Follow symbolic links
        include_hidden: Include hidden directories and files
        template: Custom Handlebars template
        encoding: Token encoding (cl100k, gpt2, p50k_base)

    Returns:
        Dictionary with the prompt and metadata
    """
    # Avoid mutable default arguments: normalize None to fresh lists.
    include_patterns = include_patterns or []
    exclude_patterns = exclude_patterns or []

    logger.info(
        f"Getting context from {path} with include patterns: {include_patterns}, "
        f"exclude patterns: {exclude_patterns}"
    )

    # Initialize the Code2Prompt instance with all parameters
    prompt = Code2Prompt(
        path=path,
        include_patterns=include_patterns,
        exclude_patterns=exclude_patterns,
        include_priority=include_priority,
        line_numbers=line_numbers,
        absolute_paths=absolute_paths,
        full_directory_tree=full_directory_tree,
        code_blocks=code_blocks,
        follow_symlinks=follow_symlinks,
        include_hidden=include_hidden,
    )

    # Generate the prompt directly using the instance method
    # Note: sort_by configuration should be added if supported by the SDK
    result = prompt.generate(template=template, encoding=encoding)

    # Return structured result
    return {
        "prompt": result.prompt,
        "directory": str(result.directory),
        "token_count": result.token_count,
    }
if __name__ == "__main__":
    # Configure colorized logging before starting the server
    handler = colorlog.StreamHandler()
    formatter = colorlog.ColoredFormatter(
        "%(log_color)s%(levelname)-8s%(reset)s %(blue)s%(message)s",
        datefmt=None,
        reset=True,
        log_colors={
            "DEBUG": "cyan",
            "INFO": "green",
            "WARNING": "yellow",
            "ERROR": "red",
            "CRITICAL": "red,bg_white",
        },
        secondary_log_colors={},
        style="%",
    )
    handler.setFormatter(formatter)
    logger = colorlog.getLogger(__name__)
    logger.addHandler(handler)
    logger.setLevel(logging.DEBUG)

    mcp.run(transport="stdio")
```