# Directory Structure

```
├── .gitignore
├── .python-version
├── Dockerfile
├── LICENSE
├── pyproject.toml
├── README.md
├── smithery.yaml
├── src
│   └── markitdown_mcp_server
│       ├── __init__.py
│       └── server.py
└── uv.lock
```

# Files

--------------------------------------------------------------------------------
/.python-version:
--------------------------------------------------------------------------------

```
3.13

```

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------

```
# Python-generated files
__pycache__/
*.py[oc]
build/
dist/
wheels/
*.egg-info

# Virtual environments
.venv

```

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------

```markdown
# MarkItDown MCP Server

[![smithery badge](https://smithery.ai/badge/@KorigamiK/markitdown_mcp_server)](https://smithery.ai/server/@KorigamiK/markitdown_mcp_server)

A Model Context Protocol (MCP) server that converts various file formats to Markdown using the MarkItDown utility.

<a href="https://glama.ai/mcp/servers/sbc6bljjg5"><img width="380" height="200" src="https://glama.ai/mcp/servers/sbc6bljjg5/badge" alt="MarkItDown Server MCP server" /></a>

## Supported Formats

- PDF
- PowerPoint
- Word
- Excel
- Images (EXIF metadata and OCR)
- Audio (EXIF metadata and speech transcription)
- HTML
- Text-based formats (CSV, JSON, XML)
- ZIP files (iterates over contents)

## Installation

### Installing via Smithery

To install MarkItDown MCP Server for Claude Desktop automatically via [Smithery](https://smithery.ai/server/@KorigamiK/markitdown_mcp_server):

```bash
npx -y @smithery/cli install @KorigamiK/markitdown_mcp_server --client claude
```

### Manual Installation

1. Clone this repository
2. Install dependencies:
```bash
uv sync
```

## Usage

### As MCP Server

The server can be integrated with any MCP client. Here are some examples:

#### Zed Editor

Add the following to your `settings.json`:

```json
"context_servers": {
  "markitdown_mcp": {
    "settings": {},
    "command": {
      "path": "uv",
      "args": [
        "--directory",
        "/path/to/markitdown_mcp_server",
        "run",
        "markitdown"
      ]
    }
  }
}
```
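
#### Claude Desktop

A similar entry can go under `mcpServers` in `claude_desktop_config.json` (a sketch; adjust the repository path for your machine):

```json
{
  "mcpServers": {
    "markitdown_mcp": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/markitdown_mcp_server",
        "run",
        "markitdown"
      ]
    }
  }
}
```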

### Commands

The server responds to the following MCP commands:

- `/md <file>` - Convert the specified file to Markdown

Example:
```bash
/md document.pdf
```

## Supported MCP Clients

Works with any MCP-compliant client listed at [modelcontextprotocol.io/clients](https://modelcontextprotocol.io/clients), including:

- Zed Editor
- Any other MCP-compatible editors and tools

## License

MIT License. See [LICENSE](LICENSE) for details.

## Acknowledgements

https://github.com/microsoft/markitdown#readme

```

--------------------------------------------------------------------------------
/src/markitdown_mcp_server/__init__.py:
--------------------------------------------------------------------------------

```python
from .server import run
import asyncio


def main() -> None:
    # Entry point for the `markitdown` console script (see pyproject.toml).
    asyncio.run(run())

```

--------------------------------------------------------------------------------
/smithery.yaml:
--------------------------------------------------------------------------------

```yaml
# Smithery configuration file: https://smithery.ai/docs/config#smitheryyaml

startCommand:
  type: stdio
  configSchema:
    # JSON Schema defining the configuration options for the MCP.
    type: object
    properties: {}
  commandFunction:
    # A function that produces the CLI command to start the MCP on stdio.
    |-
    () => ({command:'uv',args:['--directory', '/app', 'run', 'markitdown']})

```

--------------------------------------------------------------------------------
/pyproject.toml:
--------------------------------------------------------------------------------

```toml
[project]
name = "markitdown-mcp-server"
version = "0.1.0"
description = "MCP server that converts documents to Markdown using MarkItDown"
readme = "README.md"
authors = [
    { name = "korigamik-hypr", email = "[email protected]" }
]
requires-python = ">=3.12"
dependencies = [
    "markitdown>=0.0.1a3",
    "mcp>=1.2.1",
]

[project.scripts]
markitdown = "markitdown_mcp_server:main"

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

```

--------------------------------------------------------------------------------
/Dockerfile:
--------------------------------------------------------------------------------

```dockerfile
# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile
# Use a Python image with uv pre-installed
FROM ghcr.io/astral-sh/uv:python3.12-bookworm-slim AS uv

# Install the project into /app
WORKDIR /app

# Enable bytecode compilation
ENV UV_COMPILE_BYTECODE=1

# Copy from the cache instead of linking since it's a mounted volume
ENV UV_LINK_MODE=copy


# Install the project's dependencies using the lockfile and settings
RUN --mount=type=cache,target=/root/.cache/uv \
    --mount=type=bind,source=uv.lock,target=uv.lock \
    --mount=type=bind,source=pyproject.toml,target=pyproject.toml \
    uv sync --frozen --no-install-project --no-dev --no-editable

# Then, add the rest of the project source code and install it
# Installing separately from its dependencies allows optimal layer caching
ADD . /app
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --frozen --no-dev --no-editable

FROM python:3.12-slim-bookworm

WORKDIR /app
 
COPY --from=uv /root/.local /root/.local
COPY --from=uv /app/.venv /app/.venv

# Place executables in the environment at the front of the path
ENV PATH="/app/.venv/bin:$PATH"

# The console script installed by pyproject.toml is named `markitdown`
ENTRYPOINT ["markitdown"]

```

--------------------------------------------------------------------------------
/src/markitdown_mcp_server/server.py:
--------------------------------------------------------------------------------

```python
from mcp.server import Server, stdio, models, NotificationOptions
import mcp.types as types
from markitdown import MarkItDown
from typing import Tuple

PROMPTS = {
    "md": types.Prompt(
        name="md",
        description="Convert document to markdown format using MarkItDown",
        arguments=[
            types.PromptArgument(
                name="file_path",
                description="A URI to any document or file",
                required=True,
            )
        ],
    )
}


def convert_to_markdown(file_path: str) -> Tuple[str | None, str]:
    try:
        md = MarkItDown()
        result = md.convert(file_path)
        return result.title, result.text_content

    except Exception as e:
        return None, f"Error converting document: {str(e)}"


# Initialize server
app = Server("document-conversion-server")


@app.list_prompts()
async def list_prompts() -> list[types.Prompt]:
    return list(PROMPTS.values())


@app.get_prompt()
async def get_prompt(
    name: str, arguments: dict[str, str] | None = None
) -> types.GetPromptResult:
    if name not in PROMPTS:
        raise ValueError(f"Prompt not found: {name}")

    if name == "md":
        if not arguments:
            raise ValueError("Arguments required")

        file_path = arguments.get("file_path")

        if not file_path:
            raise ValueError("file_path is required")

        try:
            markdown_title, markdown_content = convert_to_markdown(file_path)

            return types.GetPromptResult(
                messages=[
                    types.PromptMessage(
                        role="user",
                        content=types.TextContent(
                            type="text",
                            text=f"Here is the converted document in markdown format:\n{'' if not markdown_title else markdown_title}\n{markdown_content}",
                        ),
                    )
                ]
            )

        except Exception as e:
            raise ValueError(f"Error processing document: {str(e)}")

    raise ValueError("Prompt implementation not found")


async def run():
    async with stdio.stdio_server() as (read_stream, write_stream):
        await app.run(
            read_stream,
            write_stream,
            models.InitializationOptions(
server_name="markitdown_mcp_server",
                server_version="0.1.0",
                capabilities=app.get_capabilities(
                    notification_options=NotificationOptions(),
                    experimental_capabilities={},
                ),
            ),
        )

```