This is page 1 of 6. Use http://codebase.md/alexander-zuev/supabase-mcp-server?lines=true&page={x} to view the full context.

# Directory Structure

```
├── .claude
│   └── settings.local.json
├── .dockerignore
├── .env.example
├── .env.test.example
├── .github
│   ├── FUNDING.yml
│   ├── ISSUE_TEMPLATE
│   │   ├── bug_report.md
│   │   ├── feature_request.md
│   │   └── roadmap_item.md
│   ├── PULL_REQUEST_TEMPLATE.md
│   └── workflows
│       ├── ci.yaml
│       ├── docs
│       │   └── release-checklist.md
│       └── publish.yaml
├── .gitignore
├── .pre-commit-config.yaml
├── .python-version
├── CHANGELOG.MD
├── codecov.yml
├── CONTRIBUTING.MD
├── Dockerfile
├── LICENSE
├── llms-full.txt
├── pyproject.toml
├── README.md
├── smithery.yaml
├── supabase_mcp
│   ├── __init__.py
│   ├── clients
│   │   ├── api_client.py
│   │   ├── base_http_client.py
│   │   ├── management_client.py
│   │   └── sdk_client.py
│   ├── core
│   │   ├── __init__.py
│   │   ├── container.py
│   │   └── feature_manager.py
│   ├── exceptions.py
│   ├── logger.py
│   ├── main.py
│   ├── services
│   │   ├── __init__.py
│   │   ├── api
│   │   │   ├── __init__.py
│   │   │   ├── api_manager.py
│   │   │   ├── spec_manager.py
│   │   │   └── specs
│   │   │       └── api_spec.json
│   │   ├── database
│   │   │   ├── __init__.py
│   │   │   ├── migration_manager.py
│   │   │   ├── postgres_client.py
│   │   │   ├── query_manager.py
│   │   │   └── sql
│   │   │       ├── loader.py
│   │   │       ├── models.py
│   │   │       ├── queries
│   │   │       │   ├── create_migration.sql
│   │   │       │   ├── get_migrations.sql
│   │   │       │   ├── get_schemas.sql
│   │   │       │   ├── get_table_schema.sql
│   │   │       │   ├── get_tables.sql
│   │   │       │   ├── init_migrations.sql
│   │   │       │   └── logs
│   │   │       │       ├── auth_logs.sql
│   │   │       │       ├── cron_logs.sql
│   │   │       │       ├── edge_logs.sql
│   │   │       │       ├── function_edge_logs.sql
│   │   │       │       ├── pgbouncer_logs.sql
│   │   │       │       ├── postgres_logs.sql
│   │   │       │       ├── postgrest_logs.sql
│   │   │       │       ├── realtime_logs.sql
│   │   │       │       ├── storage_logs.sql
│   │   │       │       └── supavisor_logs.sql
│   │   │       └── validator.py
│   │   ├── logs
│   │   │   ├── __init__.py
│   │   │   └── log_manager.py
│   │   ├── safety
│   │   │   ├── __init__.py
│   │   │   ├── models.py
│   │   │   ├── safety_configs.py
│   │   │   └── safety_manager.py
│   │   └── sdk
│   │       ├── __init__.py
│   │       ├── auth_admin_models.py
│   │       └── auth_admin_sdk_spec.py
│   ├── settings.py
│   └── tools
│       ├── __init__.py
│       ├── descriptions
│       │   ├── api_tools.yaml
│       │   ├── database_tools.yaml
│       │   ├── logs_and_analytics_tools.yaml
│       │   ├── safety_tools.yaml
│       │   └── sdk_tools.yaml
│       ├── manager.py
│       └── registry.py
├── tests
│   ├── __init__.py
│   ├── conftest.py
│   ├── services
│   │   ├── __init__.py
│   │   ├── api
│   │   │   ├── __init__.py
│   │   │   ├── test_api_client.py
│   │   │   ├── test_api_manager.py
│   │   │   └── test_spec_manager.py
│   │   ├── database
│   │   │   ├── sql
│   │   │   │   ├── __init__.py
│   │   │   │   ├── conftest.py
│   │   │   │   ├── test_loader.py
│   │   │   │   ├── test_sql_validator_integration.py
│   │   │   │   └── test_sql_validator.py
│   │   │   ├── test_migration_manager.py
│   │   │   ├── test_postgres_client.py
│   │   │   └── test_query_manager.py
│   │   ├── logs
│   │   │   └── test_log_manager.py
│   │   ├── safety
│   │   │   ├── test_api_safety_config.py
│   │   │   ├── test_safety_manager.py
│   │   │   └── test_sql_safety_config.py
│   │   └── sdk
│   │       ├── test_auth_admin_models.py
│   │       └── test_sdk_client.py
│   ├── test_container.py
│   ├── test_main.py
│   ├── test_settings.py
│   ├── test_tool_manager.py
│   ├── test_tools_integration.py.bak
│   └── test_tools.py
└── uv.lock
```

# Files

--------------------------------------------------------------------------------
/.python-version:
--------------------------------------------------------------------------------

```
1 | 3.12.9
2 | 
```

--------------------------------------------------------------------------------
/.env.test.example:
--------------------------------------------------------------------------------

```
 1 | # Supabase MCP Server Test Environment
 2 | # Copy this file to .env.test and modify as needed for your tests
 3 | 
 4 | # Connection settings for test database
 5 | SUPABASE_PROJECT_REF=127.0.0.1:54322
 6 | SUPABASE_DB_PASSWORD=postgres
 7 | 
 8 | # Optional: Management API access token (for API tests)
 9 | # SUPABASE_ACCESS_TOKEN=your_access_token
10 | 
11 | # Optional: Service role key (for auth tests)
12 | # SUPABASE_SERVICE_ROLE_KEY=your_service_role_key
13 | 
14 | # TheQuery.dev API URL
15 | QUERY_API_URL=http://127.0.0.1:8080/v1
16 | 
```

--------------------------------------------------------------------------------
/.env.example:
--------------------------------------------------------------------------------

```
 1 | # Supabase MCP Server Environment Configuration
 2 | # Copy this file to .env to configure your server
 3 | 
 4 | # API Key
 5 | QUERY_API_KEY=your-api-key # thequery.dev API key
 6 | 
 7 | # Required for remote Supabase projects (optional for local development)
 8 | SUPABASE_PROJECT_REF=your-project-ref # Your project reference from dashboard URL
 9 | SUPABASE_DB_PASSWORD=your-db-password # Database password for your project
10 | SUPABASE_REGION=us-east-1 # Region where your Supabase project is hosted
11 | 
12 | 
13 | # Optional configuration
14 | SUPABASE_ACCESS_TOKEN=your-personal-access-token # Required for Management API tools
15 | SUPABASE_SERVICE_ROLE_KEY=your-service-role-key # Required for Auth Admin SDK tools
16 | 
17 | 
18 | # ONLY for local development
19 | QUERY_API_URL=http://127.0.0.1:8080/v1 # TheQuery.dev API URL when developing locally
20 | 
```

--------------------------------------------------------------------------------
/.dockerignore:
--------------------------------------------------------------------------------

```
 1 | # Python
 2 | __pycache__/
 3 | *.py[cod]
 4 | *$py.class
 5 | *.so
 6 | .Python
 7 | build/
 8 | develop-eggs/
 9 | dist/
10 | downloads/
11 | eggs/
12 | .eggs/
13 | lib/
14 | lib64/
15 | parts/
16 | sdist/
17 | var/
18 | wheels/
19 | *.egg-info/
20 | .installed.cfg
21 | *.egg
22 | .pytest_cache/
23 | .coverage
24 | htmlcov/
25 | .tox/
26 | .nox/
27 | 
28 | # Virtual Environment
29 | .env
30 | .venv
31 | env/
32 | venv/
33 | ENV/
34 | env.bak/
35 | venv.bak/
36 | 
37 | # macOS
38 | .DS_Store
39 | .AppleDouble
40 | .LSOverride
41 | Icon
42 | ._*
43 | .DocumentRevisions-V100
44 | .fseventsd
45 | .Spotlight-V100
46 | .TemporaryItems
47 | .Trashes
48 | .VolumeIcon.icns
49 | .com.apple.timemachine.donotpresent
50 | 
51 | # IDEs and Editors
52 | .idea/
53 | .vscode/
54 | *.swp
55 | *.swo
56 | *~
57 | .project
58 | .classpath
59 | .settings/
60 | *.sublime-workspace
61 | *.sublime-project
62 | 
63 | # Local development
64 | .env.mcp
65 | .env.mcp2
66 | *.log
67 | logs/
68 | 
69 | # Ignore local assets
70 | assets/
71 | *.gif
72 | *.mp4
73 | 
74 | # Generated version file
75 | supabase_mcp/_version.py
76 | 
77 | # Docs
78 | .llms-full.txt
79 | 
80 | # Docker specific ignores
81 | Dockerfile
82 | .dockerignore
83 | docker-compose.yml
84 | docker-compose.yaml
85 | 
86 | # Git
87 | .git/
88 | .github/
89 | .gitignore
90 | 
```

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------

```
 1 | # Python
 2 | __pycache__/
 3 | *.py[cod]
 4 | *$py.class
 5 | *.so
 6 | .Python
 7 | build/
 8 | develop-eggs/
 9 | dist/
10 | downloads/
11 | eggs/
12 | .eggs/
13 | lib/
14 | lib64/
15 | parts/
16 | sdist/
17 | var/
18 | wheels/
19 | *.egg-info/
20 | .installed.cfg
21 | *.egg
22 | .pytest_cache/
23 | .coverage
24 | htmlcov/
25 | .tox/
26 | .nox/
27 | 
28 | # Virtual Environment
29 | .env
30 | .env.test
31 | .venv
32 | env/
33 | venv/
34 | ENV/
35 | env.bak/
36 | venv.bak/
37 | 
38 | # macOS
39 | .DS_Store
40 | .AppleDouble
41 | .LSOverride
42 | Icon
43 | ._*
44 | .DocumentRevisions-V100
45 | .fseventsd
46 | .Spotlight-V100
47 | .TemporaryItems
48 | .Trashes
49 | .VolumeIcon.icns
50 | .com.apple.timemachine.donotpresent
51 | 
52 | # IDEs and Editors
53 | .idea/
54 | .vscode/
55 | *.swp
56 | *.swo
57 | *~
58 | .project
59 | .classpath
60 | .settings/
61 | *.sublime-workspace
62 | *.sublime-project
63 | 
64 | # Local development
65 | 
66 | *.log
67 | # Only ignore logs directory in the root, not in the package
68 | /logs/
69 | 
70 | 
71 | # Ignore local assets
72 | assets/
73 | *.gif
74 | *.mp4
75 | 
76 | # Generated version file
77 | supabase_mcp/_version.py
78 | 
79 | # Docs
80 | .llms-full.txt
81 | COMMIT_CONVENTION.md
82 | feature-spec/
83 | 
84 | # Claude code
85 | CLAUDE.md
86 | 
87 | #
88 | future-evolution.md
89 | 
```

--------------------------------------------------------------------------------
/.pre-commit-config.yaml:
--------------------------------------------------------------------------------

```yaml
  1 | repos:
  2 |   # === Syntax & Basic Checks ===
  3 |   - repo: https://github.com/pre-commit/pre-commit-hooks
  4 |     rev: v5.0.0
  5 |     hooks:
  6 |       - id: check-ast
  7 |         name: Validate Python syntax
  8 |       - id: check-toml
  9 |         name: Validate TOML files
 10 |       - id: mixed-line-ending
 11 |         name: Normalize line endings
 12 |         args: ['--fix=lf']
 13 |       - id: trailing-whitespace
 14 |         name: Remove trailing whitespace
 15 |       - id: end-of-file-fixer
 16 |         name: Ensure file ends with newline
 17 | 
 18 |   # === Security ===
 19 |   - repo: https://github.com/pre-commit/pre-commit-hooks
 20 |     rev: v5.0.0
 21 |     hooks:
 22 |       - id: detect-private-key
 23 |         name: Check for private keys
 24 |         stages: [pre-commit, pre-push, manual]
 25 |       - id: check-merge-conflict
 26 |         name: Check for merge conflicts
 27 |         stages: [pre-commit, manual]
 28 |       - id: debug-statements
 29 |         name: Check for debugger imports
 30 |         stages: [pre-commit, manual]
 31 | 
 32 |   # === SQL Linting ===
 33 |   - repo: https://github.com/sqlfluff/sqlfluff
 34 |     rev: 3.3.1
 35 |     hooks:
 36 |       - id: sqlfluff-lint
 37 |         name: Run SQLFluff linter
 38 |         description: Lint SQL files with SQLFluff
 39 |         types: [sql]
 40 |         args: [
 41 |           "--dialect", "postgres",
 42 |           "--exclude-rules", "L016,L031,LT02",  # Exclude some opinionated rules
 43 |         ]
 44 |         files: ^(supabase_mcp/sql|tests/sql)/
 45 |       - id: sqlfluff-fix
 46 |         name: Run SQLFluff fixer
 47 |         description: Auto-fix SQL files with SQLFluff
 48 |         types: [sql]
 49 |         args: [
 50 |           "--dialect", "postgres",
 51 |           "--exclude-rules", "L016,L031,LT02",  # Exclude some opinionated rules
 52 |         ]
 53 |         files: ^(supabase_mcp/sql|tests/sql)/
 54 | 
 55 |   # === Type Checking ===
 56 | 
 57 |   - repo: https://github.com/pre-commit/mirrors-mypy
 58 |     rev: "v1.15.0"
 59 |     hooks:
 60 |       - id: mypy
 61 |         name: Run mypy type checker
 62 |         args: [
 63 |           "--config-file=pyproject.toml",
 64 |           "--show-error-codes",
 65 |           "--pretty",
 66 |         ]
 67 |         additional_dependencies: [
 68 |             "types-requests",
 69 |             "types-aiofiles",
 70 |             "types-pytz",
 71 |             "pydantic",
 72 |             "chainlit",
 73 |             "anthropic",
 74 |             "fastapi",
 75 |             "httpx",
 76 |             "tiktoken",
 77 |             "weave",
 78 |             "chromadb",
 79 |             "cohere",
 80 |             "langchain"
 81 |         ]
 82 |         entry: bash -c 'mypy "$@" || true' --
 83 | 
 84 |   # === Code Quality & Style ===
 85 |   - repo: https://github.com/astral-sh/ruff-pre-commit
 86 |     rev: v0.9.9
 87 |     hooks:
 88 |       - id: ruff
 89 |         name: Run Ruff linter
 90 |         args: [
 91 |           --fix,
 92 |           --exit-zero,
 93 |           --quiet,
 94 |         ]
 95 |         types_or: [python, pyi, jupyter]
 96 |         files: ^(src|tests)/
 97 |         exclude: ^src/experimental/
 98 |         verbose: false
 99 |       - id: ruff-format
100 |         name: Run Ruff formatter
101 |         types_or: [python, pyi, jupyter]
102 | 
103 |   # === Documentation Checks ===
104 |   - repo: https://github.com/tcort/markdown-link-check
105 |     rev: v3.13.6
106 |     hooks:
107 |       - id: markdown-link-check
108 |         name: Check Markdown links
109 |         description: Extracts links from markdown texts and checks they're all alive
110 |         stages: [pre-commit, pre-push, manual]
111 | 
112 |   # === Testing ===
113 |   - repo: local
114 |     hooks:
115 |       - id: pytest
116 |         name: Run tests
117 |         entry: pytest
118 |         language: system
119 |         types: [python]
120 |         pass_filenames: false
121 |         args: [
122 |           "--no-header",
123 |           "--quiet",
124 |           "--no-summary",
125 |           "--show-capture=no",
126 |           "--tb=line"  # Show only one line per failure
127 |         ]
128 |         stages: [pre-commit, pre-push]
129 | 
130 |   # === Build Check ===
131 |   - repo: local
132 |     hooks:
133 |       - id: build-check
134 |         name: Check build
135 |         entry: uv build
136 |         language: system
137 |         pass_filenames: false
138 |         stages: [pre-commit, pre-push]
139 |       - id: version-check
140 |         name: Check package version
141 |         # Print version from the built package
142 |         entry: python -c "from supabase_mcp import __version__; print('📦 Package version:', __version__)"
143 |         language: system
144 |         verbose: true
145 |         pass_filenames: false
146 |         stages: [pre-commit, pre-push]
147 | 
```

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------

```markdown
  1 | # Query | MCP server for Supabase
  2 | 
  3 | > 🌅 More than 17k installs via PyPI and close to 30k downloads on Smithery.ai — in short, this was fun! 🥳
  4 | > Thanks to everyone who has been using this server for the past few months, and I hope it was useful for you.
  5 | > Since Supabase has released their own [official MCP server](https://github.com/supabase-community/supabase-mcp),
  6 | > I've decided to no longer actively maintain this one. The official MCP server is just as feature-rich, and many more
  7 | > features will be added in the future. Check it out!
  8 | 
  9 | 
 10 | <p class="center-text">
 11 |   <strong>Query MCP is an open-source MCP server that lets your IDE safely run SQL, manage schema changes, call the Supabase Management API, and use Auth Admin SDK — all with built-in safety controls.</strong>
 12 | </p>
 13 | 
 14 | 
 15 | <p class="center-text">
 16 |   <a href="https://pypi.org/project/supabase-mcp-server/"><img src="https://img.shields.io/pypi/v/supabase-mcp-server.svg" alt="PyPI version" /></a>
 17 |   <a href="https://github.com/alexander-zuev/supabase-mcp-server/actions"><img src="https://github.com/alexander-zuev/supabase-mcp-server/workflows/CI/badge.svg" alt="CI Status" /></a>
 18 |   <a href="https://codecov.io/gh/alexander-zuev/supabase-mcp-server"><img src="https://codecov.io/gh/alexander-zuev/supabase-mcp-server/branch/main/graph/badge.svg" alt="Code Coverage" /></a>
 19 |   <a href="https://www.python.org/downloads/"><img src="https://img.shields.io/badge/python-3.12%2B-blue.svg" alt="Python 3.12+" /></a>
 20 |   <a href="https://github.com/astral-sh/uv"><img src="https://img.shields.io/badge/uv-package%20manager-blueviolet" alt="uv package manager" /></a>
 21 |   <a href="https://pepy.tech/project/supabase-mcp-server"><img src="https://static.pepy.tech/badge/supabase-mcp-server" alt="PyPI Downloads" /></a>
 22 |   <a href="https://smithery.ai/server/@alexander-zuev/supabase-mcp-server"><img src="https://smithery.ai/badge/@alexander-zuev/supabase-mcp-server" alt="Smithery.ai Downloads" /></a>
 23 |   <a href="https://modelcontextprotocol.io/introduction"><img src="https://img.shields.io/badge/MCP-Server-orange" alt="MCP Server" /></a>
 24 |   <a href="LICENSE"><img src="https://img.shields.io/badge/license-Apache%202.0-blue.svg" alt="License" /></a>
 25 | </p>    
 26 | 
 27 | ## Table of contents
 28 | 
 29 | <p class="center-text">
 30 |   <a href="#getting-started">Getting started</a> •
 31 |   <a href="#feature-overview">Feature overview</a> •
 32 |   <a href="#troubleshooting">Troubleshooting</a> •
 33 |   <a href="#changelog">Changelog</a>
 34 | </p>
 35 | 
 36 | ## ✨ Key features
 37 | - 💻 Compatible with Cursor, Windsurf, Cline and other MCP clients supporting `stdio` protocol
 38 | - 🔐 Control read-only and read-write modes of SQL query execution
 39 | - 🔍 Runtime SQL query validation with risk level assessment
 40 | - 🛡️ Three-tier safety system for SQL operations: safe, write, and destructive
 41 | - 🔄 Robust transaction handling for both direct and pooled database connections
 42 | - 📝 Automatic versioning of database schema changes
 43 | - 💻 Manage your Supabase projects with Supabase Management API
 44 | - 🧑‍💻 Manage users with Supabase Auth Admin methods via Python SDK
 45 | - 🔨 Pre-built tools to help Cursor & Windsurf work with MCP more effectively
 46 | - 📦 Dead-simple install & setup via package manager (uv, pipx, etc.)
 47 | 
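The three-tier safety system above can be sketched as follows. This is a naive, illustrative classifier only; the server's real validator lives in `supabase_mcp/services/safety/` and is considerably more thorough, and `classify` is a hypothetical name, not part of the server's API:

```python
from enum import Enum

# Illustrative sketch only; the real validator in
# supabase_mcp/services/safety/ is considerably more thorough.
class RiskLevel(Enum):
    SAFE = "safe"                # read-only statements
    WRITE = "write"              # data modification
    DESTRUCTIVE = "destructive"  # statements that can drop data or schema

def classify(sql: str) -> RiskLevel:
    """Naive keyword check on the leading SQL verb (hypothetical helper)."""
    head = sql.lstrip().split(None, 1)[0].upper() if sql.strip() else ""
    if head in {"DROP", "TRUNCATE"}:
        return RiskLevel.DESTRUCTIVE
    if head in {"INSERT", "UPDATE", "DELETE", "CREATE", "ALTER"}:
        return RiskLevel.WRITE
    return RiskLevel.SAFE
```

In read-only mode, only statements classified as safe would be allowed through; write and destructive statements require progressively more explicit confirmation.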
 48 | 
 49 | ## Getting Started
 50 | 
 51 | ### Prerequisites
 52 | Installing the server requires the following on your system:
 53 | - Python 3.12+
 54 | 
 55 | If you plan to install via `uv`, ensure it's [installed](https://docs.astral.sh/uv/getting-started/installation/#__tabbed_1_1).
 56 | 
 57 | ### PostgreSQL Installation
 58 | PostgreSQL installation is no longer required for the MCP server itself, as it now uses asyncpg, which doesn't depend on PostgreSQL development libraries.
 59 | 
 60 | However, you'll still need PostgreSQL if you're running a local Supabase instance:
 61 | 
 62 | **macOS**
 63 | ```bash
 64 | brew install postgresql@16
 65 | ```
 66 | 
 67 | **Windows**
 68 |   - Download and install PostgreSQL 16+ from https://www.postgresql.org/download/windows/
 69 |   - Ensure "PostgreSQL Server" and "Command Line Tools" are selected during installation
 70 | 
 71 | ### Step 1. Installation
 72 | 
 73 | Since v0.2.0, I've introduced support for package installation. You can use your favorite Python package manager to install the server via:
 74 | 
 75 | ```bash
 76 | # if pipx is installed (recommended)
 77 | pipx install supabase-mcp-server
 78 | 
 79 | # if uv is installed
 80 | uv pip install supabase-mcp-server
 81 | ```
 82 | 
 83 | `pipx` is recommended because it creates isolated environments for each package.
 84 | 
 85 | You can also install the server manually by cloning the repository and running `pipx install -e .` from the root directory.
 86 | 
 87 | #### Installing from source
 88 | If you would like to install from source, for example for local development:
 89 | ```bash
 90 | uv venv
 91 | # On Mac
 92 | source .venv/bin/activate
 93 | # On Windows
 94 | .venv\Scripts\activate
 95 | # Install package in editable mode
 96 | uv pip install -e .
 97 | ```
 98 | 
 99 | #### Installing via Smithery.ai
100 | 
101 | You can find the full instructions on how to use Smithery.ai to connect to this MCP server [here](https://smithery.ai/server/@alexander-zuev/supabase-mcp-server).
102 | 
103 | 
104 | ### Step 2. Configuration
105 | 
106 | The Supabase MCP server requires configuration to connect to your Supabase database, access the Management API, and use the Auth Admin SDK. This section explains all available configuration options and how to set them up.
107 | 
108 | > 🔑 **Important**: Since v0.4, the MCP server requires an API key, which you can get for free at [thequery.dev](https://thequery.dev).
109 | 
110 | #### Environment Variables
111 | 
112 | The server uses the following environment variables:
113 | 
114 | | Variable | Required | Default | Description |
115 | |----------|----------|---------|-------------|
116 | | `SUPABASE_PROJECT_REF` | Yes | `127.0.0.1:54322` | Your Supabase project reference ID (or local host:port) |
117 | | `SUPABASE_DB_PASSWORD` | Yes | `postgres` | Your database password |
118 | | `SUPABASE_REGION` | Yes* | `us-east-1` | AWS region where your Supabase project is hosted |
119 | | `SUPABASE_ACCESS_TOKEN` | No | None | Personal access token for Supabase Management API |
120 | | `SUPABASE_SERVICE_ROLE_KEY` | No | None | Service role key for Auth Admin SDK |
121 | | `QUERY_API_KEY` | Yes | None | API key from thequery.dev (required for all operations) |
122 | 
123 | > **Note**: The default values are configured for local Supabase development. For remote Supabase projects, you must provide your own values for `SUPABASE_PROJECT_REF` and `SUPABASE_DB_PASSWORD`.
124 | 
125 | > 🚨 **CRITICAL CONFIGURATION NOTE**: For remote Supabase projects, you MUST specify the correct region where your project is hosted using `SUPABASE_REGION`. If you encounter a "Tenant or user not found" error, this is almost certainly because your region setting doesn't match your project's actual region. You can find your project's region in the Supabase dashboard under Project Settings.
126 | 
127 | #### Connection Types
128 | 
129 | ##### Database Connection
130 | - The server connects to your Supabase PostgreSQL database using the transaction pooler endpoint
131 | - Local development uses a direct connection to `127.0.0.1:54322`
132 | - Remote projects use the format: `postgresql://postgres.[project_ref]:[password]@aws-0-[region].pooler.supabase.com:6543/postgres`
133 | 
134 | > ⚠️ **Important**: Session pooling connections are not supported. The server exclusively uses transaction pooling for better compatibility with the MCP server architecture.
135 | 
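For reference, the two documented connection formats can be sketched in Python. `build_dsn` is a hypothetical helper for illustration, not part of the server's API:

```python
# Illustrative sketch of the documented connection formats;
# build_dsn is a hypothetical helper, not part of the server's API.
def build_dsn(project_ref: str, password: str, region: str = "us-east-1") -> str:
    if ":" in project_ref:  # local development, e.g. "127.0.0.1:54322"
        return f"postgresql://postgres:{password}@{project_ref}/postgres"
    # Remote projects always go through the transaction pooler on port 6543.
    return (
        f"postgresql://postgres.{project_ref}:{password}"
        f"@aws-0-{region}.pooler.supabase.com:6543/postgres"
    )
```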
136 | ##### Management API Connection
137 | - Requires `SUPABASE_ACCESS_TOKEN` to be set
138 | - Connects to the Supabase Management API at `https://api.supabase.com`
139 | - Only works with remote Supabase projects (not local development)
140 | 
141 | ##### Auth Admin SDK Connection
142 | - Requires `SUPABASE_SERVICE_ROLE_KEY` to be set
143 | - For local development, connects to `http://127.0.0.1:54321`
144 | - For remote projects, connects to `https://[project_ref].supabase.co`
145 | 
146 | #### Configuration Methods
147 | 
148 | The server looks for configuration in this order (highest to lowest priority):
149 | 
150 | 1. **Environment Variables**: Values set directly in your environment
151 | 2. **Local `.env` File**: A `.env` file in your current working directory (only works when running from source)
152 | 3. **Global Config File**:
153 |    - Windows: `%APPDATA%\supabase-mcp\.env`
154 |    - macOS/Linux: `~/.config/supabase-mcp/.env`
155 | 4. **Default Settings**: Local development defaults (if no other config is found)
156 | 
157 | > ⚠️ **Important**: When using the package installed via pipx or uv, local `.env` files in your project directory are **not** detected. You must use either environment variables or the global config file.
158 | 
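The lookup order above can be sketched like this. The paths follow the docs; both functions are hypothetical illustrations, not the server's actual resolution code:

```python
import os
from pathlib import Path

def global_config_path() -> Path:
    """Platform-specific global config location, per the docs above."""
    if os.name == "nt":  # Windows
        return Path(os.environ["APPDATA"]) / "supabase-mcp" / ".env"
    return Path.home() / ".config" / "supabase-mcp" / ".env"

def pick_config_source(env_is_set: bool, running_from_source: bool) -> str:
    """Return which source wins, checked from highest to lowest priority."""
    if env_is_set:
        return "environment variables"
    if running_from_source and Path(".env").is_file():
        return "local .env"
    if global_config_path().is_file():
        return "global config"
    return "defaults"
```

Note how the local `.env` branch is only reachable when running from source, matching the warning above about pipx/uv installs.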
159 | #### Setting Up Configuration
160 | 
161 | ##### Option 1: Client-Specific Configuration (Recommended)
162 | 
163 | Set environment variables directly in your MCP client configuration (see client-specific setup instructions in Step 3). Most MCP clients support this approach, which keeps your configuration with your client settings.
164 | 
165 | ##### Option 2: Global Configuration
166 | 
167 | Create a global `.env` configuration file that will be used for all MCP server instances:
168 | 
169 | ```bash
170 | # Create config directory
171 | # On macOS/Linux
172 | mkdir -p ~/.config/supabase-mcp
173 | # On Windows (PowerShell)
174 | mkdir -Force "$env:APPDATA\supabase-mcp"
175 | 
176 | # Create and edit .env file
177 | # On macOS/Linux
178 | nano ~/.config/supabase-mcp/.env
179 | # On Windows (PowerShell)
180 | notepad "$env:APPDATA\supabase-mcp\.env"
181 | ```
182 | 
183 | Add your configuration values to the file:
184 | 
185 | ```
186 | QUERY_API_KEY=your-api-key
187 | SUPABASE_PROJECT_REF=your-project-ref
188 | SUPABASE_DB_PASSWORD=your-db-password
189 | SUPABASE_REGION=us-east-1
190 | SUPABASE_ACCESS_TOKEN=your-access-token
191 | SUPABASE_SERVICE_ROLE_KEY=your-service-role-key
192 | ```
193 | 
194 | ##### Option 3: Project-Specific Configuration (Source Installation Only)
195 | 
196 | If you're running the server from source (not via package), you can create a `.env` file in your project directory with the same format as above.
197 | 
198 | #### Finding Your Supabase Project Information
199 | 
200 | - **Project Reference**: Found in your Supabase project URL: `https://supabase.com/dashboard/project/<project-ref>`
201 | - **Database Password**: Set during project creation or found in Project Settings → Database
202 | - **Access Token**: Generate at https://supabase.com/dashboard/account/tokens
203 | - **Service Role Key**: Found in Project Settings → API → Project API keys
204 | 
205 | #### Supported Regions
206 | 
207 | The server supports all Supabase regions:
208 | 
209 | - `us-west-1` - West US (North California)
210 | - `us-east-1` - East US (North Virginia) - default
211 | - `us-east-2` - East US (Ohio)
212 | - `ca-central-1` - Canada (Central)
213 | - `eu-west-1` - West EU (Ireland)
214 | - `eu-west-2` - West Europe (London)
215 | - `eu-west-3` - West EU (Paris)
216 | - `eu-central-1` - Central EU (Frankfurt)
217 | - `eu-central-2` - Central Europe (Zurich)
218 | - `eu-north-1` - North EU (Stockholm)
219 | - `ap-south-1` - South Asia (Mumbai)
220 | - `ap-southeast-1` - Southeast Asia (Singapore)
221 | - `ap-northeast-1` - Northeast Asia (Tokyo)
222 | - `ap-northeast-2` - Northeast Asia (Seoul)
223 | - `ap-southeast-2` - Oceania (Sydney)
224 | - `sa-east-1` - South America (São Paulo)
225 | 
226 | #### Limitations
227 | 
228 | - **No Self-Hosted Support**: The server only supports official Supabase.com hosted projects and local development
229 | - **No Connection String Support**: Custom connection strings are not supported
230 | - **No Session Pooling**: Only transaction pooling is supported for database connections
231 | - **API and SDK Features**: Management API and Auth Admin SDK features only work with remote Supabase projects, not local development
232 | 
233 | ### Step 3. Usage
234 | 
235 | In general, any MCP client that supports `stdio` protocol should work with this MCP server. This server was explicitly tested to work with:
236 | - Cursor
237 | - Windsurf
238 | - Cline
239 | - Claude Desktop
240 | 
241 | Additionally, you can also use smithery.ai to install this server for a number of clients, including the ones above.
242 | 
243 | Follow the guides below to install this MCP server in your client.
244 | 
245 | #### Cursor
246 | Go to Settings -> Features -> MCP Servers and add a new server with this configuration:
247 | ```bash
248 | # can be set to any name
249 | name: supabase
250 | type: command
251 | # if you installed with pipx
252 | command: supabase-mcp-server
253 | # if you installed with uv
254 | command: uv run supabase-mcp-server
255 | # if the above doesn't work, use the full path (recommended)
256 | command: /full/path/to/supabase-mcp-server  # Find with 'which supabase-mcp-server' (macOS/Linux) or 'where supabase-mcp-server' (Windows)
257 | ```
258 | 
259 | If configuration is correct, you should see a green dot indicator and the number of tools exposed by the server.
260 | ![How successful Cursor config looks like](https://github.com/user-attachments/assets/45df080a-8199-4aca-b59c-a84dc7fe2c09)
261 | 
262 | #### Windsurf
263 | Go to Cascade -> Click on the hammer icon -> Configure -> Fill in the configuration:
264 | ```json
265 | {
266 |     "mcpServers": {
267 |       "supabase": {
268 |         "command": "/Users/username/.local/bin/supabase-mcp-server",  // update path
269 |         "env": {
270 |           "QUERY_API_KEY": "your-api-key",  // Required - get your API key at thequery.dev
271 |           "SUPABASE_PROJECT_REF": "your-project-ref",
272 |           "SUPABASE_DB_PASSWORD": "your-db-password",
273 |           "SUPABASE_REGION": "us-east-1",  // optional, defaults to us-east-1
274 |           "SUPABASE_ACCESS_TOKEN": "your-access-token",  // optional, for management API
275 |           "SUPABASE_SERVICE_ROLE_KEY": "your-service-role-key"  // optional, for Auth Admin SDK
276 |         }
277 |       }
278 |     }
279 | }
280 | ```
281 | If configuration is correct, you should see a green dot indicator and a clickable supabase server in the list of available servers.
282 | 
283 | ![How successful Windsurf config looks like](https://github.com/user-attachments/assets/322b7423-8c71-410b-bcab-aff1b143faa4)
284 | 
285 | #### Claude Desktop
286 | Claude Desktop also supports MCP servers through a JSON configuration. Follow these steps to set up the Supabase MCP server:
287 | 
288 | 1. **Find the full path to the executable** (this step is critical):
289 |    ```bash
290 |    # On macOS/Linux
291 |    which supabase-mcp-server
292 | 
293 |    # On Windows
294 |    where supabase-mcp-server
295 |    ```
296 |    Copy the full path that is returned (e.g., `/Users/username/.local/bin/supabase-mcp-server`).
297 | 
298 | 2. **Configure the MCP server** in Claude Desktop:
299 |    - Open Claude Desktop
300 |    - Go to Settings → Developer → Edit Config MCP Servers
301 |    - Add a new configuration with the following JSON:
302 | 
303 |    ```json
304 |    {
305 |      "mcpServers": {
306 |        "supabase": {
307 |          "command": "/full/path/to/supabase-mcp-server",  // Replace with the actual path from step 1
308 |          "env": {
309 |            "QUERY_API_KEY": "your-api-key",  // Required - get your API key at thequery.dev
310 |            "SUPABASE_PROJECT_REF": "your-project-ref",
311 |            "SUPABASE_DB_PASSWORD": "your-db-password",
312 |            "SUPABASE_REGION": "us-east-1",  // optional, defaults to us-east-1
313 |            "SUPABASE_ACCESS_TOKEN": "your-access-token",  // optional, for management API
314 |            "SUPABASE_SERVICE_ROLE_KEY": "your-service-role-key"  // optional, for Auth Admin SDK
315 |          }
316 |        }
317 |      }
318 |    }
319 |    ```
320 | 
321 | > ⚠️ **Important**: Unlike Windsurf and Cursor, Claude Desktop requires the **full absolute path** to the executable. Using just the command name (`supabase-mcp-server`) will result in a "spawn ENOENT" error.
322 | 
323 | If configuration is correct, you should see the Supabase MCP server listed as available in Claude Desktop.
324 | 
325 | ![How successful Claude Desktop config looks like](https://github.com/user-attachments/assets/500bcd40-6245-40a7-b23b-189827ed2923)
326 | 
327 | #### Cline
328 | Cline also supports MCP servers through a similar JSON configuration. Follow these steps to set up the Supabase MCP server:
329 | 
330 | 1. **Find the full path to the executable** (this step is critical):
331 |    ```bash
332 |    # On macOS/Linux
333 |    which supabase-mcp-server
334 | 
335 |    # On Windows
336 |    where supabase-mcp-server
337 |    ```
338 |    Copy the full path that is returned (e.g., `/Users/username/.local/bin/supabase-mcp-server`).
339 | 
340 | 2. **Configure the MCP server** in Cline:
341 |    - Open Cline in VS Code
342 |    - Click on the "MCP Servers" tab in the Cline sidebar
343 |    - Click "Configure MCP Servers"
344 |    - This will open the `cline_mcp_settings.json` file
345 |    - Add the following configuration:
346 | 
347 |    ```json
348 |    {
349 |      "mcpServers": {
350 |        "supabase": {
351 |          "command": "/full/path/to/supabase-mcp-server",  // Replace with the actual path from step 1
352 |          "env": {
353 |            "QUERY_API_KEY": "your-api-key",  // Required - get your API key at thequery.dev
354 |            "SUPABASE_PROJECT_REF": "your-project-ref",
355 |            "SUPABASE_DB_PASSWORD": "your-db-password",
356 |            "SUPABASE_REGION": "us-east-1",  // optional, defaults to us-east-1
357 |            "SUPABASE_ACCESS_TOKEN": "your-access-token",  // optional, for management API
358 |            "SUPABASE_SERVICE_ROLE_KEY": "your-service-role-key"  // optional, for Auth Admin SDK
359 |          }
360 |        }
361 |      }
362 |    }
363 |    ```
364 | 
365 | If configuration is correct, you should see a green indicator next to the Supabase MCP server in the Cline MCP Servers list, and a message confirming "supabase MCP server connected" at the bottom of the panel.
366 | 
367 | ![How a successful Cline configuration looks](https://github.com/user-attachments/assets/6c4446ad-7a58-44c6-bf12-6c82222bbe59)
368 | 
369 | ### Troubleshooting
370 | 
371 | Here are some tips & tricks that might help you:
372 | - **Debug installation** - run `supabase-mcp-server` directly from the terminal to see if it works. If it doesn't, there might be an issue with the installation.
373 | - **MCP Server configuration** - if the above step works, it means the server is installed and configured correctly. As long as you provide the right command, your IDE should be able to connect. Make sure to provide the correct path to the server executable.
374 | - **"No tools found" error** - If you see "Client closed - no tools available" in Cursor despite the package being installed:
375 |   - Find the full path to the executable by running `which supabase-mcp-server` (macOS/Linux) or `where supabase-mcp-server` (Windows)
376 |   - Use the full path in your MCP server configuration instead of just `supabase-mcp-server`
377 |   - For example: `/Users/username/.local/bin/supabase-mcp-server` or `C:\Users\username\.local\bin\supabase-mcp-server.exe`
378 | - **Environment variables** - to connect to the right database, make sure you either set env variables in `mcp_config.json` or in a `.env` file placed in a global config directory (`~/.config/supabase-mcp/.env` on macOS/Linux or `%APPDATA%\supabase-mcp\.env` on Windows).
379 | - **Accessing logs** - The MCP server writes detailed logs to a file:
380 |   - Log file location:
381 |     - macOS/Linux: `~/.local/share/supabase-mcp/mcp_server.log`
382 |     - Windows: `%USERPROFILE%\.local\share\supabase-mcp\mcp_server.log`
383 |   - Logs include connection status, configuration details, and operation results
384 |   - View logs using any text editor or terminal commands:
385 |     ```bash
386 |     # On macOS/Linux
387 |     cat ~/.local/share/supabase-mcp/mcp_server.log
388 | 
389 |     # On Windows (PowerShell)
390 |     Get-Content "$env:USERPROFILE\.local\share\supabase-mcp\mcp_server.log"
391 |     ```
392 | 
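For reference, a global `.env` file for the server can look like the following sketch. The variable names match those used in the JSON configurations earlier in this guide; the values are placeholders you must replace with your own:

```bash
# ~/.config/supabase-mcp/.env (macOS/Linux) or %APPDATA%\supabase-mcp\.env (Windows)
QUERY_API_KEY=your-api-key                      # required - get your API key at thequery.dev
SUPABASE_PROJECT_REF=your-project-ref           # required for remote projects
SUPABASE_DB_PASSWORD=your-db-password           # required for remote projects
SUPABASE_REGION=us-east-1                       # optional, defaults to us-east-1
SUPABASE_ACCESS_TOKEN=your-access-token         # optional, for Management API
SUPABASE_SERVICE_ROLE_KEY=your-service-role-key # optional, for Auth Admin SDK
```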
393 | If you are stuck or any of the instructions above are incorrect, please raise an issue.
394 | 
395 | ### MCP Inspector
396 | A super useful tool to help debug MCP server issues is MCP Inspector. If you installed from source, you can run `supabase-mcp-inspector` from the project repo and it will launch an inspector instance. Coupled with the logs, this gives you a complete overview of what's happening in the server.
397 | > 📝 Running `supabase-mcp-inspector`, if installed from package, doesn't work properly - I will validate and fix in the coming release.
398 | 
399 | ## Feature Overview
400 | 
401 | ### Database query tools
402 | 
403 | Since v0.3, the server provides comprehensive database management capabilities with built-in safety controls:
404 | 
405 | - **SQL Query Execution**: Execute PostgreSQL queries with risk assessment
406 |   - **Three-tier safety system**:
407 |     - `safe`: Read-only operations (SELECT) - always allowed
408 |     - `write`: Data modifications (INSERT, UPDATE, DELETE) - require unsafe mode
409 |     - `destructive`: Schema changes (DROP, CREATE) - require unsafe mode + confirmation
410 | 
411 | - **SQL Parsing and Validation**:
412 |   - Uses PostgreSQL's parser (pglast) for accurate analysis and provides clear feedback on safety requirements
413 | 
414 | - **Automatic Migration Versioning**:
415 |   - Database-altering operations are automatically versioned
416 |   - Generates descriptive names based on operation type and target
417 | 
418 | 
419 | - **Safety Controls**:
420 |   - Default SAFE mode allows only read-only operations
421 |   - All statements run in transaction mode via `asyncpg`
422 |   - 2-step confirmation for high-risk operations
423 | 
424 | - **Available Tools**:
425 |   - `get_schemas`: Lists schemas with sizes and table counts
426 |   - `get_tables`: Lists tables, foreign tables, and views with metadata
427 |   - `get_table_schema`: Gets detailed table structure (columns, keys, relationships)
428 |   - `execute_postgresql`: Executes SQL statements against your database
429 |   - `confirm_destructive_operation`: Executes high-risk operations after confirmation
430 |   - `retrieve_migrations`: Gets migrations with filtering and pagination options
431 |   - `live_dangerously`: Toggles between safe and unsafe modes
432 | 
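To make the three-tier model concrete, here is a deliberately simplified sketch of statement classification. The real server uses pglast (PostgreSQL's parser) for accurate analysis; this keyword-based version only illustrates the idea:

```python
# Simplified illustration of the three-tier safety model.
# The actual server parses SQL with pglast; this sketch just
# inspects the leading keyword of a single statement.

def classify_statement(sql: str) -> str:
    """Return 'safe', 'write', or 'destructive' for one SQL statement."""
    first_word = sql.lstrip().split(None, 1)[0].upper()
    if first_word in ("DROP", "CREATE", "ALTER", "TRUNCATE"):
        return "destructive"  # schema changes: unsafe mode + confirmation
    if first_word in ("INSERT", "UPDATE", "DELETE"):
        return "write"        # data modifications: unsafe mode required
    return "safe"             # SELECT and other read-only operations

print(classify_statement("SELECT * FROM users"))             # safe
print(classify_statement("DELETE FROM users WHERE id = 1"))  # write
print(classify_statement("DROP TABLE users"))                # destructive
```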
433 | ### Management API tools
434 | 
435 | Since v0.3.0, the server provides secure access to the Supabase Management API with built-in safety controls:
436 | 
437 | - **Available Tools**:
438 |   - `send_management_api_request`: Sends arbitrary requests to Supabase Management API with auto-injection of project ref
439 |   - `get_management_api_spec`: Gets the enriched API specification with safety information
440 |     - Supports multiple query modes: by domain, by specific path/method, or all paths
441 |     - Includes risk assessment information for each endpoint
442 |     - Provides detailed parameter requirements and response formats
443 |     - Helps LLMs understand the full capabilities of the Supabase Management API
444 |   - `get_management_api_safety_rules`: Gets all safety rules with human-readable explanations
445 |   - `live_dangerously`: Toggles between safe and unsafe operation modes
446 | 
447 | - **Safety Controls**:
448 |   - Uses the same safety manager as database operations for consistent risk management
449 |   - Operations categorized by risk level:
450 |     - `safe`: Read-only operations (GET) - always allowed
451 |     - `unsafe`: State-changing operations (POST, PUT, PATCH, DELETE) - require unsafe mode
452 |     - `blocked`: Destructive operations (delete project, etc.) - never allowed
453 |   - Default safe mode prevents accidental state changes
454 |   - Path-based pattern matching for precise safety rules
455 | 
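The method- and path-based categorization above can be sketched as follows. The rule set here is purely illustrative (the server's actual safety rules differ); it only shows how HTTP method plus path patterns map to a risk category:

```python
import fnmatch

# Illustrative sketch of method- and path-based safety rules.
# The pattern below is hypothetical, not the server's real rule set.
BLOCKED_DELETE_PATTERNS = ["/v1/projects/*"]  # e.g. deleting a project is never allowed

def classify_request(method: str, path: str) -> str:
    """Return 'safe', 'unsafe', or 'blocked' for a Management API request."""
    if method == "DELETE" and any(
        fnmatch.fnmatch(path, pattern) for pattern in BLOCKED_DELETE_PATTERNS
    ):
        return "blocked"  # destructive endpoints: never allowed
    if method == "GET":
        return "safe"     # read-only: always allowed
    return "unsafe"       # POST/PUT/PATCH/DELETE: require unsafe mode

print(classify_request("GET", "/v1/projects/abc/functions"))  # safe
print(classify_request("DELETE", "/v1/projects/abc"))         # blocked
```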
456 | **Note**: Management API tools only work with remote Supabase instances and are not compatible with local Supabase development setups.
457 | 
458 | ### Auth Admin tools
459 | 
460 | I was originally planning to add support for all Python SDK methods to the MCP server. Upon consideration, I decided to add support only for the Auth Admin methods, as I often found myself manually creating test users, which was error-prone and time-consuming. Now I can just ask Cursor to create a test user and it is done seamlessly. Check out the full Auth Admin SDK method docs to see what it can do.
461 | 
462 | Since v0.3.6, the server supports direct access to Supabase Auth Admin methods via the Python SDK:
463 |   - Includes the following tools:
464 |     - `get_auth_admin_methods_spec` to retrieve documentation for all available Auth Admin methods
465 |     - `call_auth_admin_method` to directly invoke Auth Admin methods with proper parameter handling
466 |   - Supported methods:
467 |     - `get_user_by_id`: Retrieve a user by their ID
468 |     - `list_users`: List all users with pagination
469 |     - `create_user`: Create a new user
470 |     - `delete_user`: Delete a user by their ID
471 |     - `invite_user_by_email`: Send an invite link to a user's email
472 |     - `generate_link`: Generate an email link for various authentication purposes
473 |     - `update_user_by_id`: Update user attributes by ID
474 |     - `delete_factor`: Delete a factor on a user (currently not implemented in SDK)
475 | 
476 | #### Why use Auth Admin SDK instead of raw SQL queries?
477 | 
478 | The Auth Admin SDK provides several key advantages over direct SQL manipulation:
479 | - **Functionality**: Enables operations not possible with SQL alone (invites, magic links, MFA)
480 | - **Accuracy**: More reliable than creating and executing raw SQL queries on auth schemas
481 | - **Simplicity**: Offers clear methods with proper validation and error handling
482 | 
483 | - Response format:
484 |   - All methods return structured Python objects instead of raw dictionaries
485 |   - Object attributes can be accessed using dot notation (e.g., `user.id` instead of `user["id"]`)
486 | - Edge cases and limitations:
487 |   - UUID validation: Many methods require valid UUID format for user IDs and will return specific validation errors
488 |   - Email configuration: Methods like `invite_user_by_email` and `generate_link` require email sending to be configured in your Supabase project
489 |   - Link types: When generating links, different link types have different requirements:
490 |     - `signup` links don't require the user to exist
491 |     - `magiclink` and `recovery` links require the user to already exist in the system
492 |   - Error handling: The server provides detailed error messages from the Supabase API, which may differ from the dashboard interface
493 |   - Method availability: Some methods like `delete_factor` are exposed in the API but not fully implemented in the SDK
494 | 
495 | ### Logs & Analytics
496 | 
497 | The server provides access to Supabase logs and analytics data, making it easier to monitor and troubleshoot your applications:
498 | 
499 | - **Available Tool**: `retrieve_logs` - Access logs from any Supabase service
500 | 
501 | - **Log Collections**:
502 |   - `postgres`: Database server logs
503 |   - `api_gateway`: API gateway requests
504 |   - `auth`: Authentication events
505 |   - `postgrest`: RESTful API service logs
506 |   - `pooler`: Connection pooling logs
507 |   - `storage`: Object storage operations
508 |   - `realtime`: WebSocket subscription logs
509 |   - `edge_functions`: Serverless function executions
510 |   - `cron`: Scheduled job logs
511 |   - `pgbouncer`: Connection pooler logs
512 | 
513 | - **Features**: Filter by time, search text, apply field filters, or use custom SQL queries
514 | 
515 | Simplifies debugging across your Supabase stack without switching between interfaces or writing complex queries.
516 | 
517 | ### Automatic Versioning of Database Changes
518 | 
519 | "With great power comes great responsibility." While the `execute_postgresql` tool, coupled with the aptly named `live_dangerously` tool, provides a powerful and simple way to manage your Supabase database, it also means that dropping or modifying a table is one chat message away. To reduce the risk of irreversible changes, since v0.3.8 the server supports:
520 | - automatic creation of migration scripts for all write & destructive SQL operations executed on the database
521 | - an improved safety mode of query execution, in which all queries are categorized as:
522 |   - `safe` type: always allowed. Includes all read-only ops.
523 |   - `write` type: requires `write` mode to be enabled by the user.
524 |   - `destructive` type: requires `write` mode to be enabled by the user AND a 2-step confirmation of query execution for clients that do not execute tools automatically.
525 | 
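A migration version of this kind is typically a UTC timestamp paired with a descriptive name (the `get_migrations` tool distinguishes purely numeric "numbered" versions from "named" ones). A minimal sketch of such a naming scheme, noting that the server's actual name-generation logic may differ:

```python
from datetime import datetime, timezone

def migration_version(description: str) -> tuple[str, str]:
    """Build a timestamp version and a snake_case name for a migration.

    Sketch of timestamp-based versioning; the server's actual
    descriptive-name generation may differ.
    """
    version = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
    # Normalize the human description into a snake_case migration name
    name = "_".join(description.lower().split())
    return version, name

version, name = migration_version("create users table")
print(version, name)  # e.g. 20250101120000 create_users_table
```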
526 | ### Universal Safety Mode
527 | Since v0.3.8, Safety Mode has been standardized across all services (database, API, SDK) using a universal safety manager. This provides consistent risk management and a unified interface for controlling safety settings across the entire MCP server.
528 | 
529 | All operations (SQL queries, API requests, SDK methods) are categorized into risk levels:
530 | - `Low` risk: Read-only operations that don't modify data or structure (SELECT queries, GET API requests)
531 | - `Medium` risk: Write operations that modify data but not structure (INSERT/UPDATE/DELETE, most POST/PUT API requests)
532 | - `High` risk: Destructive operations that modify database structure or could cause data loss (DROP/TRUNCATE, DELETE API endpoints)
533 | - `Extreme` risk: Operations with severe consequences that are blocked entirely (deleting projects)
534 | 
535 | Safety controls are applied based on risk level:
536 | - Low risk operations are always allowed
537 | - Medium risk operations require unsafe mode to be enabled
538 | - High risk operations require unsafe mode AND explicit confirmation
539 | - Extreme risk operations are never allowed
540 | 
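The policy above boils down to a small decision function. This is a sketch of the policy, not the server's actual safety-manager code, though the risk levels mirror the `OperationRiskLevel` enum used by the server:

```python
from enum import IntEnum

class RiskLevel(IntEnum):
    """Universal risk levels; higher number means higher risk."""
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    EXTREME = 4

def is_allowed(risk: RiskLevel, unsafe_mode: bool, confirmed: bool) -> bool:
    """Apply the universal safety policy to one operation."""
    if risk == RiskLevel.LOW:
        return True                       # read-only: always allowed
    if risk == RiskLevel.MEDIUM:
        return unsafe_mode                # requires unsafe mode
    if risk == RiskLevel.HIGH:
        return unsafe_mode and confirmed  # unsafe mode + explicit confirmation
    return False                          # EXTREME: never allowed

print(is_allowed(RiskLevel.HIGH, unsafe_mode=True, confirmed=False))  # False
```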
541 | #### How confirmation flow works
542 | 
543 | Any high-risk operation (be it a PostgreSQL query or an API request) will be blocked even in `unsafe` mode.
544 | ![Every high-risk operation is blocked](https://github.com/user-attachments/assets/c0df79c2-a879-4b1f-a39d-250f9965c36a)
545 | You will have to confirm and approve every high-risk operation explicitly in order for it to be executed.
546 | ![Explicit approval is always required](https://github.com/user-attachments/assets/5cd7a308-ec2a-414e-abe2-ff2f3836dd8b)
547 | 
548 | 
549 | ## Changelog
550 | 
551 | - 📦 Simplified installation via package manager - ✅ (v0.2.0)
552 | - 🌎 Support for different Supabase regions - ✅ (v0.2.2)
553 | - 🎮 Programmatic access to Supabase management API with safety controls - ✅ (v0.3.0)
554 | - 👷‍♂️ Read and read-write database SQL queries with safety controls - ✅ (v0.3.0)
555 | - 🔄 Robust transaction handling for both direct and pooled connections - ✅ (v0.3.2)
556 | - 🐍 Support methods and objects available in native Python SDK - ✅ (v0.3.6)
557 | - 🔍 Stronger SQL query validation ✅ (v0.3.8)
558 | - 📝 Automatic versioning of database changes ✅ (v0.3.8)
559 | - 📖 Radically improved knowledge and tools for the API spec ✅ (v0.3.8)
560 | - ✍️ Improved consistency of migration-related tools for a more organized database VCS ✅ (v0.3.10)
561 | - 🥳 Query MCP is released (v0.4.0)
562 | 
563 | 
564 | For a more detailed roadmap, please see this [discussion](https://github.com/alexander-zuev/supabase-mcp-server/discussions/46) on GitHub.
565 | 
566 | 
567 | ## Star History
568 | 
569 | [![Star History Chart](https://api.star-history.com/svg?repos=alexander-zuev/supabase-mcp-server&type=Date)](https://star-history.com/#alexander-zuev/supabase-mcp-server&Date)
570 | 
571 | ---
572 | 
573 | Enjoy! ☺️
574 | 
```

--------------------------------------------------------------------------------
/CONTRIBUTING.MD:
--------------------------------------------------------------------------------

```markdown
 1 | # Contributing to Supabase MCP Server
 2 | 
 3 | Thank you for your interest in Supabase MCP Server. This project aims to maintain the high quality standard I've set for it. I welcome all contributions and review them carefully, so please read the following guidelines closely.
 4 | 
 5 | ## 🤓 Important: Pre-Contribution Requirements
 6 | 
 7 | 1. **Required: Open a Discussion First**
 8 |    - **All contributions** must start with a GitHub Discussion before any code is written
 9 |    - Explain your proposed changes, why they're needed, and how they align with the project's vision
10 |    - Wait for explicit approval from the maintainer before proceeding
11 |    - PRs without a prior approved discussion will be closed immediately without review
12 | 
13 | 2. **Project Vision**
14 |    - This project follows a specific development vision maintained by the owner
15 |    - Not all feature ideas will be accepted, even if well-implemented
16 |    - The maintainer reserves the right to decline contributions that don't align with the project's direction
17 | 
18 | ## 🛠️ Contribution Process (Only After Discussion Approval)
19 | 
20 | 1. **Fork the repository:** Click the "Fork" button in the top right corner of the GitHub page.
21 | 
22 | 2. **Create a new branch:** Create a branch with a descriptive name related to your contribution.
23 |    ```bash
24 |    git checkout -b feature/your-approved-feature
25 |    ```
26 | 
27 | 3. **Quality Requirements:**
28 |    - **Test Coverage:** All code changes must include appropriate tests
29 |    - **Documentation:** Update all relevant documentation
30 |    - **Code Style:** Follow the existing code style and patterns
31 |    - **Commit Messages:** Use clear, descriptive commit messages
32 | 
33 | 4. **Make your changes:** Implement the changes that were approved in the discussion.
34 | 
35 | 5. **Test thoroughly:** Ensure all tests pass and add new tests for your changes.
36 |    ```bash
37 |    # Run tests
38 |    pytest
39 |    ```
40 | 
41 | 6. **Commit your changes:** Use clear, descriptive commit messages that explain what you've done.
42 |    ```bash
43 |    git commit -m "feat: implement approved feature X"
44 |    ```
45 | 
46 | 7. **Push your branch:** Push your changes to your forked repository.
47 |    ```bash
48 |    git push origin feature/your-approved-feature
49 |    ```
50 | 
51 | 8. **Create a pull request:**
52 |    - Go to the original repository on GitHub
53 |    - Click "New Pull Request"
54 |    - Select "compare across forks"
55 |    - Select your fork and branch as the source
56 |    - Add a detailed description that references the approved discussion
57 |    - Include information about how you've tested the changes
58 |    - Submit the pull request
59 | 
60 | 9. **Review Process:**
61 |    - PRs will be reviewed when time permits
62 |    - Be prepared to make requested changes
63 |    - The maintainer may request significant revisions
64 |    - PRs may be rejected even after review if they don't meet quality standards
65 | 
66 | ## ⚠️ Grounds for Immediate Rejection
67 | 
68 | Your PR will be closed without review if:
69 | - No prior discussion was opened and approved
70 | - Tests are missing or failing
71 | - Documentation is not updated
72 | - Code quality doesn't meet project standards
73 | - PR description is inadequate
74 | - Changes don't align with the approved discussion
75 | 
76 | ## 🤔 Why These Requirements?
77 | 
78 | - This project is maintained by a single developer (me) with limited review time
79 | - Quality and consistency are prioritized over quantity of contributions
80 | - The project follows a specific vision that I want to maintain
81 | 
82 | ## 🌟 Acceptable Contributions
83 | 
84 | The following types of contributions are most welcome:
85 | - Bug fixes with clear reproduction steps
86 | - Performance improvements with benchmarks
87 | - Documentation improvements
88 | - New features that have been pre-approved via discussion
89 | 
90 | ## 💡 Alternative Ways to Contribute
91 | 
92 | If you have ideas but don't want to go through this process:
93 | - Fork the project and build your own version
94 | - Share your use case in Discussions
95 | - Report bugs with detailed reproduction steps
96 | 
97 | Thank you for understanding and respecting these guidelines. They help maintain the quality and direction of the project.
98 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/__init__.py:
--------------------------------------------------------------------------------

```python
1 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/api/__init__.py:
--------------------------------------------------------------------------------

```python
1 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/__init__.py:
--------------------------------------------------------------------------------

```python
1 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/logs/__init__.py:
--------------------------------------------------------------------------------

```python
1 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/safety/__init__.py:
--------------------------------------------------------------------------------

```python
1 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/sdk/__init__.py:
--------------------------------------------------------------------------------

```python
1 | 
```

--------------------------------------------------------------------------------
/tests/__init__.py:
--------------------------------------------------------------------------------

```python
1 | 
```

--------------------------------------------------------------------------------
/tests/services/__init__.py:
--------------------------------------------------------------------------------

```python
1 | 
```

--------------------------------------------------------------------------------
/tests/services/api/__init__.py:
--------------------------------------------------------------------------------

```python
1 | 
```

--------------------------------------------------------------------------------
/tests/services/database/sql/__init__.py:
--------------------------------------------------------------------------------

```python
1 | 
```

--------------------------------------------------------------------------------
/tests/services/database/sql/test_sql_validator_integration.py:
--------------------------------------------------------------------------------

```python
1 | 
```

--------------------------------------------------------------------------------
/.github/FUNDING.yml:
--------------------------------------------------------------------------------

```yaml
1 | github: alexander-zuev
2 | 
```

--------------------------------------------------------------------------------
/.claude/settings.local.json:
--------------------------------------------------------------------------------

```json
1 | {
2 |   "permissions": {
3 |     "allow": [
4 |       "Bash(mv:*)"
5 |     ],
6 |     "deny": []
7 |   }
8 | }
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/queries/create_migration.sql:
--------------------------------------------------------------------------------

```sql
1 | -- Create a migration
2 | INSERT INTO supabase_migrations.schema_migrations
3 | (version, name, statements)
4 | VALUES ('{version}', '{name}', ARRAY['{statements}']);
5 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/__init__.py:
--------------------------------------------------------------------------------

```python
1 | """Supabase MCP Server package."""
2 | 
3 | from supabase_mcp._version import __version__, version, version_tuple
4 | 
5 | __all__ = ["__version__", "version", "version_tuple"]
6 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/core/__init__.py:
--------------------------------------------------------------------------------

```python
1 | """Supabase MCP Server package."""
2 | 
3 | from supabase_mcp._version import __version__, version, version_tuple
4 | 
5 | __all__ = ["__version__", "version", "version_tuple"]
6 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/queries/logs/postgrest_logs.sql:
--------------------------------------------------------------------------------

```sql
 1 | SELECT
 2 |   id,
 3 |   postgrest_logs.timestamp,
 4 |   event_message,
 5 |   identifier,
 6 |   metadata.host
 7 | FROM postgrest_logs
 8 | CROSS JOIN unnest(metadata) AS metadata
 9 | {where_clause}
10 | ORDER BY timestamp DESC
11 | LIMIT {limit};
12 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/queries/logs/pgbouncer_logs.sql:
--------------------------------------------------------------------------------

```sql
 1 | SELECT
 2 |   id,
 3 |   pgbouncer_logs.timestamp,
 4 |   event_message,
 5 |   metadata.host,
 6 |   metadata.project
 7 | FROM pgbouncer_logs
 8 | CROSS JOIN unnest(metadata) AS metadata
 9 | {where_clause}
10 | ORDER BY timestamp DESC
11 | LIMIT {limit};
12 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/tools/__init__.py:
--------------------------------------------------------------------------------

```python
1 | """Tool manager module for Supabase MCP server.
2 | 
3 | This module provides a centralized way to manage tool descriptions and registration.
4 | """
5 | 
6 | from .manager import ToolManager, ToolName
7 | 
8 | __all__ = ["ToolManager", "ToolName"]
9 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/queries/logs/supavisor_logs.sql:
--------------------------------------------------------------------------------

```sql
 1 | SELECT
 2 |   id,
 3 |   supavisor_logs.timestamp,
 4 |   event_message,
 5 |   metadata.level,
 6 |   metadata.project,
 7 |   metadata.region
 8 | FROM supavisor_logs
 9 | CROSS JOIN unnest(metadata) AS metadata
10 | {where_clause}
11 | ORDER BY timestamp DESC
12 | LIMIT {limit};
13 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/queries/logs/auth_logs.sql:
--------------------------------------------------------------------------------

```sql
 1 | SELECT
 2 |   id,
 3 |   auth_logs.timestamp,
 4 |   event_message,
 5 |   metadata.level,
 6 |   metadata.status,
 7 |   metadata.path,
 8 |   metadata.msg
 9 | FROM auth_logs
10 | CROSS JOIN unnest(metadata) AS metadata
11 | {where_clause}
12 | ORDER BY timestamp DESC
13 | LIMIT {limit};
14 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/queries/logs/storage_logs.sql:
--------------------------------------------------------------------------------

```sql
 1 | SELECT
 2 |   id,
 3 |   storage_logs.timestamp,
 4 |   event_message,
 5 |   metadata.level,
 6 |   metadata.project,
 7 |   metadata.responseTime,
 8 |   metadata.rawError
 9 | FROM storage_logs
10 | CROSS JOIN unnest(metadata) AS metadata
11 | {where_clause}
12 | ORDER BY timestamp DESC
13 | LIMIT {limit};
14 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/queries/logs/postgres_logs.sql:
--------------------------------------------------------------------------------

```sql
 1 | SELECT
 2 |   id,
 3 |   postgres_logs.timestamp,
 4 |   event_message,
 5 |   identifier,
 6 |   parsed.error_severity,
 7 |   parsed.query,
 8 |   parsed.application_name
 9 | FROM postgres_logs
10 | CROSS JOIN unnest(metadata) AS m
11 | CROSS JOIN unnest(m.parsed) AS parsed
12 | {where_clause}
13 | ORDER BY timestamp DESC
14 | LIMIT {limit};
15 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/queries/logs/edge_logs.sql:
--------------------------------------------------------------------------------

```sql
 1 | SELECT
 2 |   id,
 3 |   edge_logs.timestamp,
 4 |   event_message,
 5 |   identifier,
 6 |   request.method,
 7 |   request.path,
 8 |   response.status_code,
 9 |   request.url,
10 |   response.origin_time
11 | FROM edge_logs
12 | CROSS JOIN unnest(metadata) AS m
13 | CROSS JOIN unnest(m.request) AS request
14 | CROSS JOIN unnest(m.response) AS response
15 | {where_clause}
16 | ORDER BY timestamp DESC
17 | LIMIT {limit};
18 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/queries/init_migrations.sql:
--------------------------------------------------------------------------------

```sql
 1 | -- Initialize migrations infrastructure
 2 | -- Create the migrations schema if it doesn't exist
 3 | CREATE SCHEMA IF NOT EXISTS supabase_migrations;
 4 | 
 5 | -- Create the migrations table if it doesn't exist
 6 | CREATE TABLE IF NOT EXISTS supabase_migrations.schema_migrations (
 7 |     version TEXT PRIMARY KEY,
 8 |     statements TEXT[] NOT NULL,
 9 |     name TEXT NOT NULL
10 | );
11 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/queries/logs/cron_logs.sql:
--------------------------------------------------------------------------------

```sql
 1 | SELECT
 2 |   id,
 3 |   postgres_logs.timestamp,
 4 |   event_message,
 5 |   identifier,
 6 |   parsed.error_severity,
 7 |   parsed.query,
 8 |   parsed.application_name
 9 | FROM postgres_logs
10 | CROSS JOIN unnest(metadata) AS m
11 | CROSS JOIN unnest(m.parsed) AS parsed
12 | WHERE (parsed.application_name = 'pg_cron' OR event_message LIKE '%cron job%')
13 | {and_where_clause}
14 | ORDER BY timestamp DESC
15 | LIMIT {limit};
16 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/queries/logs/realtime_logs.sql:
--------------------------------------------------------------------------------

```sql
 1 | SELECT
 2 |   id,
 3 |   realtime_logs.timestamp,
 4 |   event_message,
 5 |   metadata.level,
 6 |   measurements.connected,
 7 |   measurements.connected_cluster,
 8 |   measurements.limit,
 9 |   measurements.sum,
10 |   metadata.external_id
11 | FROM realtime_logs
12 | CROSS JOIN unnest(metadata) AS metadata
13 | CROSS JOIN unnest(metadata.measurements) AS measurements
14 | {where_clause}
15 | ORDER BY timestamp DESC
16 | LIMIT {limit};
17 | 
```

--------------------------------------------------------------------------------
/Dockerfile:
--------------------------------------------------------------------------------

```dockerfile
 1 | FROM python:3.12-slim-bookworm
 2 | 
 3 | WORKDIR /app
 4 | 
 5 | # Prepare the basic dependencies
 6 | RUN apt-get update && apt-get install -y \
 7 |     && rm -rf /var/lib/apt/lists/*
 8 | 
 9 | # Install pipx
10 | RUN pip install --no-cache-dir pipx && \
11 |     pipx ensurepath && \
12 |     pipx install supabase-mcp-server
13 | 
14 | # Add pipx bin directory to PATH
15 | ENV PATH="/root/.local/bin:$PATH"
16 | 
17 | CMD ["supabase-mcp-server"]
18 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/queries/logs/function_edge_logs.sql:
--------------------------------------------------------------------------------

```sql
 1 | SELECT
 2 |   id,
 3 |   function_edge_logs.timestamp,
 4 |   event_message,
 5 |   m.deployment_id,
 6 |   m.execution_time_ms,
 7 |   m.function_id,
 8 |   m.project_ref,
 9 |   request.method,
10 |   request.pathname,
11 |   request.url,
12 |   request.host,
13 |   response.status_code,
14 |   m.version
15 | FROM function_edge_logs
16 | CROSS JOIN unnest(metadata) AS m
17 | CROSS JOIN unnest(m.request) AS request
18 | CROSS JOIN unnest(m.response) AS response
19 | {where_clause}
20 | ORDER BY timestamp DESC
21 | LIMIT {limit};
22 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/queries/get_migrations.sql:
--------------------------------------------------------------------------------

```sql
 1 | SELECT
 2 |     version,
 3 |     name,
 4 |     CASE
 5 |         WHEN '{include_full_queries}' = 'true' THEN statements
 6 |         ELSE NULL
 7 |     END AS statements,
 8 |     array_length(statements, 1) AS statement_count,
 9 |     CASE
10 |         WHEN version ~ '^[0-9]+$' THEN 'numbered'
11 |         ELSE 'named'
12 |     END AS version_type
13 | FROM supabase_migrations.schema_migrations
14 | WHERE
15 |     -- Filter by name if provided
16 |     ('{name_pattern}' = '' OR name ILIKE '%' || '{name_pattern}' || '%')
17 | ORDER BY
18 |     -- Order by version (timestamp) descending
19 |     version DESC
20 | LIMIT {limit} OFFSET {offset};
21 | 
```

--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE/roadmap_item.md:
--------------------------------------------------------------------------------

```markdown
 1 | ---
 2 | name: Roadmap Item (only for maintainers)
 3 | about: Reserved for maintainers, used to track roadmap items
 4 | title: "[target version] Brief description of the feature"
 5 | labels: roadmap
 6 | assignees: alexander-zuev
 7 | 
 8 | ---
 9 | 
10 | ## Context
11 | What is the context of this item?
12 | 
13 | ## Motivation
14 | Why should this feature be added? What problems does it solve? For whom?
15 | 
16 | ## Expected Impact
17 | - **Users**: How will this benefit users?
18 | - **Development**: How does this improve the codebase or development process?
19 | 
20 | ## Implementation Ideas
21 | Any initial thoughts on how this could be implemented.
22 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/safety/models.py:
--------------------------------------------------------------------------------

```python
 1 | from enum import Enum, IntEnum
 2 | 
 3 | 
 4 | class OperationRiskLevel(IntEnum):
 5 |     """Universal operation risk level mapping.
 6 | 
 7 |     Higher number reflects higher risk levels with 4 being the highest."""
 8 | 
 9 |     LOW = 1
10 |     MEDIUM = 2
11 |     HIGH = 3
12 |     EXTREME = 4
13 | 
14 | 
15 | class SafetyMode(str, Enum):
16 |     """Universal safety mode of a client (database, api, etc).
17 |     Clients should always default to safe mode."""
18 | 
19 |     SAFE = "safe"
20 |     UNSAFE = "unsafe"
21 | 
22 | 
23 | class ClientType(str, Enum):
24 |     """Types of clients that can be managed by the safety system."""
25 | 
26 |     DATABASE = "database"
27 |     API = "api"
28 | 
```
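
Because `OperationRiskLevel` subclasses `IntEnum`, risk levels compare numerically, which is what lets a safety layer gate an operation against the current `SafetyMode`. A minimal standalone sketch; the `is_allowed` policy below is illustrative only, not the project's actual safety rule:

```python
from enum import Enum, IntEnum


class OperationRiskLevel(IntEnum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    EXTREME = 4


class SafetyMode(str, Enum):
    SAFE = "safe"
    UNSAFE = "unsafe"


def is_allowed(risk: OperationRiskLevel, mode: SafetyMode) -> bool:
    """Illustrative policy: safe mode only permits low-risk operations."""
    if mode is SafetyMode.SAFE:
        return risk <= OperationRiskLevel.LOW
    # Even unsafe mode refuses the most destructive tier outright here.
    return risk < OperationRiskLevel.EXTREME


print(is_allowed(OperationRiskLevel.MEDIUM, SafetyMode.SAFE))  # False
print(is_allowed(OperationRiskLevel.HIGH, SafetyMode.UNSAFE))  # True
```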

--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE/feature_request.md:
--------------------------------------------------------------------------------

```markdown
 1 | ---
 2 | name: Feature Request
 3 | about: Suggest an idea to improve the Supabase MCP server
 4 | title: "I want X so that I can do Y and gain Z"
 5 | labels: ''
 6 | assignees: ''
 7 | 
 8 | ---
 9 | 
10 | ## Is your feature request related to a problem?
11 | A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
12 | 
13 | ## Describe the solution you'd like
14 | A clear and concise description of what you want to happen.
15 | 
16 | ## Describe alternatives you've considered
17 | A clear and concise description of any alternative solutions or features you've considered.
18 | 
19 | ## Additional context
20 | Add any other context or screenshots about the feature request here.
21 | 
```

--------------------------------------------------------------------------------
/.github/PULL_REQUEST_TEMPLATE.md:
--------------------------------------------------------------------------------

```markdown
 1 |  # Description
 2 | 
 3 | <!-- Provide a clear and concise description of what this PR accomplishes -->
 4 | 
 5 | ## Type of Change
 6 | 
 7 | - [ ] Bug fix (non-breaking change which fixes an issue)
 8 | - [ ] New feature (non-breaking change which adds functionality)
 9 | - [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
10 | - [ ] Documentation update
11 | - [ ] Performance improvement
12 | - [ ] Code refactoring (no functional changes)
13 | - [ ] Test updates
14 | - [ ] CI/CD or build process changes
15 | - [ ] Other (please describe):
16 | 
17 | ## Checklist
18 | - [ ] I have performed a self-review of my own code
19 | - [ ] I have made corresponding changes to the documentation
20 | - [ ] New and existing unit tests pass locally with my changes
21 | 
```

--------------------------------------------------------------------------------
/codecov.yml:
--------------------------------------------------------------------------------

```yaml
 1 | codecov:
 2 |   require_ci_to_pass: true
 3 |   notify:
 4 |     wait_for_ci: true
 5 | 
 6 | coverage:
 7 |   precision: 1
 8 |   round: down
 9 |   range: "60...90"
10 | 
11 |   status:
12 |     # For overall project coverage
13 |     project:
14 |       default:
15 |         target: 80%       # Target coverage of 80%
16 |         threshold: 5%     # Allow a 5% decrease without failing
17 |         informational: true  # Start as informational until coverage improves
18 | 
19 |     # For coverage of new/changed code in PRs
20 |     patch:
21 |       default:
22 |         target: 80%
23 |         threshold: 5%     # Be a bit more lenient on PR patches
24 |         informational: true  # Start as informational until coverage improves
25 | 
26 |     # Don't fail if there are no changes
27 |     changes: no
28 | 
29 | # Configure PR comment
30 | comment:
31 |   layout: "reach, diff, flags, files"
32 |   behavior: default
33 |   require_changes: false  # Comment even if coverage doesn't change
34 |   require_base: false
35 |   require_head: true
36 |   hide_project_coverage: false  # Show both project and diff coverage
37 | 
38 | # Ignore certain paths
39 | ignore:
40 |   - "tests/**/*"          # Don't count test files in coverage
41 |   - "supabase_mcp/_version.py"  # Auto-generated version file
42 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/queries/get_schemas.sql:
--------------------------------------------------------------------------------

```sql
 1 | WITH t AS (
 2 |     -- Regular tables
 3 |     SELECT
 4 |         schemaname AS schema_name,
 5 |         tablename AS table_name,
 6 |         'regular' AS table_type
 7 |     FROM pg_tables
 8 | 
 9 |     UNION ALL
10 | 
11 |     -- Foreign tables
12 |     SELECT
13 |         foreign_table_schema AS schema_name,
14 |         foreign_table_name AS table_name,
15 |         'foreign' AS table_type
16 |     FROM information_schema.foreign_tables
17 | )
18 | 
19 | SELECT
20 |     s.schema_name,
21 |     COALESCE(PG_SIZE_PRETTY(SUM(
22 |         COALESCE(
23 |             CASE
24 |                 WHEN t.table_type = 'regular'
25 |                     THEN PG_TOTAL_RELATION_SIZE(
26 |                         QUOTE_IDENT(t.schema_name) || '.' || QUOTE_IDENT(t.table_name)
27 |                     )
28 |                 ELSE 0
29 |             END, 0
30 |         )
31 |     )), '0 B') AS total_size,
32 |     COUNT(t.table_name) AS table_count
33 | FROM information_schema.schemata AS s
34 | LEFT JOIN t ON s.schema_name = t.schema_name
35 | WHERE
36 |     s.schema_name NOT IN ('pg_catalog', 'information_schema')
37 |     AND s.schema_name NOT LIKE 'pg_%'
38 |     AND s.schema_name NOT LIKE 'pg_toast%'
39 | GROUP BY s.schema_name
40 | ORDER BY
41 |     COUNT(t.table_name) DESC,           -- Schemas with most tables first
42 |     SUM(COALESCE(CASE WHEN t.table_type = 'regular' THEN PG_TOTAL_RELATION_SIZE(QUOTE_IDENT(t.schema_name) || '.' || QUOTE_IDENT(t.table_name)) ELSE 0 END, 0)) DESC,  -- Then by raw size in bytes (the pretty-printed alias would sort as text)
43 |     s.schema_name ASC;                      -- Then alphabetically
44 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/logger.py:
--------------------------------------------------------------------------------

```python
 1 | import logging
 2 | import logging.handlers
 3 | from pathlib import Path
 4 | 
 5 | 
 6 | def setup_logger() -> logging.Logger:
 7 |     """Configure logging for the MCP server with log rotation."""
 8 |     logger = logging.getLogger("supabase-mcp")
 9 | 
10 |     # Remove existing handlers to avoid duplicate logs
11 |     if logger.hasHandlers():
12 |         logger.handlers.clear()
13 | 
14 |     # Define a consistent log directory in the user's home folder
15 |     log_dir = Path.home() / ".local" / "share" / "supabase-mcp"
16 |     log_dir.mkdir(parents=True, exist_ok=True)  # Ensure the directory exists
17 | 
18 |     # Define the log file path
19 |     log_file = log_dir / "mcp_server.log"
20 | 
21 |     # Create a rotating file handler
22 |     # - Rotate when log reaches 5MB
23 |     # - Keep 3 backup files
24 |     file_handler = logging.handlers.RotatingFileHandler(
25 |         log_file,
26 |         maxBytes=5 * 1024 * 1024,  # 5MB
27 |         backupCount=3,
28 |         encoding="utf-8",
29 |     )
30 | 
31 |     # Create formatter
32 |     formatter = logging.Formatter("[%(asctime)s] %(levelname)-8s %(message)s", datefmt="%y/%m/%d %H:%M:%S")
33 | 
34 |     # Add formatter to file handler
35 |     file_handler.setFormatter(formatter)
36 | 
37 |     # Add handler to logger
38 |     logger.addHandler(file_handler)
39 | 
40 |     # Set level
41 |     logger.setLevel(logging.DEBUG)
42 | 
43 |     return logger
44 | 
45 | 
46 | logger = setup_logger()
47 | 
```
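
The rotation settings above (5 MB threshold, three backups) cap log usage at the current file plus at most three rotated copies. The same behavior can be exercised standalone with a tiny threshold; this sketch writes to a throwaway temp directory rather than the server's real log path:

```python
import logging
import logging.handlers
import tempfile
from pathlib import Path

# Throwaway directory so the demo never touches ~/.local/share/supabase-mcp
log_dir = Path(tempfile.mkdtemp())
log_file = log_dir / "mcp_server.log"

demo_logger = logging.getLogger("rotation-demo")
demo_logger.setLevel(logging.DEBUG)
handler = logging.handlers.RotatingFileHandler(
    log_file,
    maxBytes=200,  # tiny threshold so rollover is easy to trigger
    backupCount=3,
    encoding="utf-8",
)
demo_logger.addHandler(handler)

for i in range(50):
    demo_logger.info("message %d padded to force rollover", i)
handler.close()

# Current file plus at most backupCount rotated copies
files = sorted(p.name for p in log_dir.glob("mcp_server.log*"))
print(files)  # ['mcp_server.log', 'mcp_server.log.1', 'mcp_server.log.2', 'mcp_server.log.3']
```

Once `backupCount` is reached, the oldest backup is discarded on each rollover, so disk usage stays bounded no matter how long the server runs.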

--------------------------------------------------------------------------------
/.github/workflows/docs/release-checklist.md:
--------------------------------------------------------------------------------

```markdown
 1 | # Release Checklist
 2 | 
 3 | Pre-release
 4 | 1. Tests pass
 5 | 2. CI passes
 6 | 3. Build succeeds
 7 | 4. Clean install succeeds
 8 | 5. Documentation is up to date
 9 | 6. Changelog is up to date
10 | 7. Tag and release on GitHub
11 | 8. Release is published to PyPI
12 | 9. Update dockerfile
13 | 10. Update .env.example (if necessary)
14 | 
15 | Post-release
16 | - Clean install from PyPI works
17 | 
18 | 
19 | ## v0.3.12 - 2025-03-12
20 | 
21 | Pre-release
22 | 1. Tests pass - [X]
23 | 2. CI passes - [X]
24 | 3. Build succeeds - [X]
25 | 4. Documentation is up to date - [X]
26 | 5. Changelog is up to date - [X]
27 | 6. Tag and release on GitHub - []
28 | 7. Release is published to PyPI - []
29 | 8. Update dockerfile - [X]
30 | 9. Update .env.example (if necessary) - [X]
31 | 
32 | Post-release
33 | 10. Clean install from PyPI works - []
34 | 
35 | 
36 | ## v0.3.8 - 2025-03-07
37 | 
38 | Pre-release
39 | 1. Tests pass - [X]
40 | 2. CI passes - [X]
41 | 3. Build succeeds - [X]
42 | 4. Documentation is up to date - [X]
43 | 5. Changelog is up to date - [X]
44 | 6. Tag and release on GitHub
45 | 7. Release is published to PyPI
46 | 8. Update dockerfile - [X]
47 | 9. Update .env.example (if necessary) - [X]
48 | 
49 | Post-release
50 | 10. Clean install from PyPI works - [X]
51 | 
52 | 
53 | 
54 | ## v0.3.0 - 2025-02-22
55 | 
56 | 1. Tests pass - [X]
57 | 2. CI passes - [X]
58 | 3. Build succeeds - [X]
59 | 4. Clean install succeeds - [X]
60 | 5. Documentation is up to date - [X]
61 | 6. Changelog is up to date - [X]
62 | 7. Tag and release on GitHub - [X]
63 | 8. Release is published to PyPI - [X]
64 | 9. Clean install from PyPI works - [X]
65 | 
```

--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE/bug_report.md:
--------------------------------------------------------------------------------

```markdown
 1 | ---
 2 | name: Bug Report
 3 | about: Report an issue with the Supabase MCP server
 4 | title: "An issue with doing X when Y under conditions Z"
 5 | labels: bug
 6 | assignees: alexander-zuev
 7 | 
 8 | ---
 9 | 
10 | ## ⚠️ IMPORTANT NOTE
11 | The following types of reports will be closed immediately without investigation:
12 | - Vague reports like "Something is not working" without clear explanations
13 | - Issues missing reproduction steps or necessary context
14 | - Reports without essential information (logs, environment details)
15 | - Issues already covered in the README
16 | 
17 | Please provide complete information as outlined below.
18 | 
19 | ## Describe the bug
20 | A clear and concise description of what the bug is.
21 | 
22 | ## Steps to Reproduce
23 | 1.
24 | 2.
25 | 3.
26 | 
27 | ## Connection Details
28 | - Connection type: (Local or Remote)
29 | - Using password with special characters? (Yes/No)
30 | 
31 | ## Screenshots
32 | If applicable, add screenshots to help explain your problem.
33 | 
34 | ## Logs
35 | HIGHLY USEFUL: Attach server logs from:
36 | - macOS/Linux: ~/.local/share/supabase-mcp/mcp_server.log
37 | - Windows: %USERPROFILE%\.local\share\supabase-mcp\mcp_server.log
38 | 
39 | You can get the last 50 lines with:
40 | ```
41 | tail -n 50 ~/.local/share/supabase-mcp/mcp_server.log
42 | ```
43 | 
44 | ## Additional context
45 | Add any other context about the problem here.
46 | 
47 | ## Checklist
48 | Mark completed items with an [x]:
49 | - [ ] I've included the server logs
50 | - [ ] I've checked the README troubleshooting section
51 | - [ ] I've verified my connection settings are correct
52 | 
```

--------------------------------------------------------------------------------
/smithery.yaml:
--------------------------------------------------------------------------------

```yaml
 1 | # Smithery configuration file: https://smithery.ai/docs/config#smitheryyaml
 2 | 
 3 | build:
 4 |   dockerBuildPath: .
 5 | 
 6 | startCommand:
 7 |   type: stdio
 8 |   configSchema:
 9 |     type: object
10 |     required:
11 |       - supabaseProjectRef
12 |       - supabaseDbPassword
13 |       - supabaseRegion
14 |       - queryApiKey
15 |     properties:
16 |       queryApiKey:
17 |         type: string
18 |         description: "(required) - Your Query API key"
19 |       supabaseProjectRef:
20 |         type: string
21 |         description: "(required) - Supabase project reference ID. For local development, use 127.0.0.1:54322"
22 |       supabaseDbPassword:
23 |         type: string
24 |         description: "(required) - Database password"
25 |       supabaseRegion:
26 |         type: string
27 |         description: "(required) - AWS region where your Supabase project is hosted - Default: us-east-1"
28 |       supabaseAccessToken:
29 |         type: string
30 |         description: "(optional) - Personal access token for Supabase Management API - Default: none"
31 |       supabaseServiceRoleKey:
32 |         type: string
33 |         description: "(optional) - Project Service Role Key for Auth Admin SDK - Default: none"
34 |   commandFunction:
35 |     |-
36 |     (config) => ({
37 |       command: 'supabase-mcp-server',
38 |       args: [],
39 |       env: {
40 |         QUERY_API_KEY: config.queryApiKey,
41 |         SUPABASE_PROJECT_REF: config.supabaseProjectRef,
42 |         SUPABASE_DB_PASSWORD: config.supabaseDbPassword,
43 |         SUPABASE_REGION: config.supabaseRegion,
44 |         SUPABASE_ACCESS_TOKEN: config.supabaseAccessToken,
45 |         SUPABASE_SERVICE_ROLE_KEY: config.supabaseServiceRoleKey
46 |       }
47 |     })
48 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/main.py:
--------------------------------------------------------------------------------

```python
 1 | from collections.abc import AsyncGenerator
 2 | from contextlib import asynccontextmanager
 3 | 
 4 | from mcp.server.fastmcp import FastMCP
 5 | 
 6 | from supabase_mcp.core.container import ServicesContainer
 7 | from supabase_mcp.logger import logger
 8 | from supabase_mcp.settings import settings
 9 | from supabase_mcp.tools.registry import ToolRegistry
10 | 
11 | 
12 | # Create lifespan for the MCP server
13 | @asynccontextmanager
14 | async def lifespan(app: FastMCP) -> AsyncGenerator[FastMCP, None]:
15 |     try:
16 |         logger.info("Initializing services")
17 | 
18 |         # Initialize services
19 |         services_container = ServicesContainer.get_instance()
20 |         services_container.initialize_services(settings)
21 | 
22 |         # Register tools
23 |         mcp = ToolRegistry(mcp=app, services_container=services_container).register_tools()
24 |         yield mcp
25 |     finally:
26 |         logger.info("Shutting down services")
27 |         services_container = ServicesContainer.get_instance()
28 |         await services_container.shutdown_services()
29 |         # Force kill the entire process - doesn't care about async contexts
30 |         import os
31 | 
32 |         os._exit(0)  # Use 0 for successful termination
33 | 
34 | 
35 | # Create mcp instance
36 | mcp = FastMCP("supabase", lifespan=lifespan)
37 | 
38 | 
39 | def run_server() -> None:
40 |     logger.info("Starting Supabase MCP server")
41 |     mcp.run()
42 |     logger.info("This code runs only if I don't exit in lifespan")
43 | 
44 | 
45 | def run_inspector() -> None:
46 |     """Inspector mode - same as mcp dev"""
47 |     logger.info("Starting Supabase MCP server inspector")
48 | 
49 |     from mcp.cli.cli import dev
50 | 
51 |     return dev(__file__)
52 | 
53 | 
54 | if __name__ == "__main__":
55 |     logger.info("Starting Supabase MCP server")
56 |     run_server()
57 | 
```
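
The `lifespan` above follows the standard `asynccontextmanager` pattern: setup runs before `yield`, teardown runs in `finally` (here capped by a hard `os._exit`). A self-contained sketch of the same startup/shutdown ordering, independent of FastMCP:

```python
import asyncio
from collections.abc import AsyncGenerator
from contextlib import asynccontextmanager

events: list[str] = []


@asynccontextmanager
async def lifespan(app: str) -> AsyncGenerator[str, None]:
    events.append("init")  # startup: runs before the `async with` body
    try:
        yield app
    finally:
        events.append("shutdown")  # teardown: runs even if the body raises


async def main() -> None:
    async with lifespan("demo-app") as app:
        events.append(f"serving {app}")


asyncio.run(main())
print(events)  # ['init', 'serving demo-app', 'shutdown']
```

The `try/finally` around `yield` is what guarantees the shutdown branch runs on both clean exit and error paths, mirroring how the server tears down its services container.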

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/queries/get_table_schema.sql:
--------------------------------------------------------------------------------

```sql
 1 | WITH pk AS (
 2 |     SELECT ccu.column_name
 3 |     FROM information_schema.table_constraints AS tc
 4 |     INNER JOIN information_schema.constraint_column_usage AS ccu
 5 |         ON tc.constraint_name = ccu.constraint_name
 6 |     WHERE
 7 |         tc.table_schema = '{schema_name}'
 8 |         AND tc.table_name = '{table}'
 9 |         AND tc.constraint_type = 'PRIMARY KEY'
10 | ),
11 | 
12 | fk AS (
13 |     SELECT
14 |         kcu.column_name,
15 |         ccu.table_name AS foreign_table_name,
16 |         ccu.column_name AS foreign_column_name
17 |     FROM information_schema.table_constraints AS tc
18 |     INNER JOIN information_schema.key_column_usage AS kcu
19 |         ON tc.constraint_name = kcu.constraint_name
20 |     INNER JOIN information_schema.constraint_column_usage AS ccu
21 |         ON tc.constraint_name = ccu.constraint_name
22 |     WHERE
23 |         tc.table_schema = '{schema_name}'
24 |         AND tc.table_name = '{table}'
25 |         AND tc.constraint_type = 'FOREIGN KEY'
26 | )
27 | 
28 | SELECT DISTINCT
29 |     c.column_name,
30 |     c.data_type,
31 |     c.is_nullable,
32 |     c.column_default,
33 |     c.ordinal_position,
34 |     fk.foreign_table_name,
35 |     fk.foreign_column_name,
36 |     col_description(pc.oid, c.ordinal_position) AS column_description,
37 |     coalesce(pk.column_name IS NOT NULL, FALSE) AS is_primary_key
38 | FROM information_schema.columns AS c
39 | INNER JOIN pg_class AS pc
40 |     ON
41 |         pc.relname = '{table}'
42 |         AND pc.relnamespace = (
43 |             SELECT oid FROM pg_namespace
44 |             WHERE nspname = '{schema_name}'
45 |         )
46 | LEFT JOIN pk ON c.column_name = pk.column_name
47 | LEFT JOIN fk ON c.column_name = fk.column_name
48 | WHERE
49 |     c.table_schema = '{schema_name}'
50 |     AND c.table_name = '{table}'
51 | ORDER BY c.ordinal_position;
52 | 
```
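
These queries use `{schema_name}`-style placeholders. The project's loader and substitution code is not shown on this page, so purely as a hedged illustration, here is how such a template could be filled with `str.format` plus a naive guard; plain string substitution into SQL is injection-prone for untrusted input, and the template and `render` helper below are hypothetical, not the project's actual loader:

```python
# Hypothetical template and helper for illustration only.
SQL_TEMPLATE = (
    "SELECT column_name FROM information_schema.columns "
    "WHERE table_schema = '{schema_name}' AND table_name = '{table}';"
)


def render(template: str, **params: str) -> str:
    """Fill {placeholder}-style SQL templates, refusing obviously unsafe values."""
    for name, value in params.items():
        if any(token in value for token in ("'", ";", "--")):
            raise ValueError(f"unsafe value for {name!r}: {value!r}")
    return template.format(**params)


print(render(SQL_TEMPLATE, schema_name="public", table="users"))
```

A denylist like this is only a sketch; production code should prefer parameterized queries or strict identifier validation instead.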

--------------------------------------------------------------------------------
/.github/workflows/publish.yaml:
--------------------------------------------------------------------------------

```yaml
 1 | name: Publish to PyPI
 2 | 
 3 | on:
 4 |   release:
 5 |     types: [published]
 6 |     branches: [main]  # Only trigger for releases from main
 7 | 
 8 | env:
 9 |   UV_VERSION: "0.6.0" # Pin uv version to avoid breaking changes
10 | 
11 | jobs:
12 |   build-and-publish:
13 |     runs-on: ubuntu-latest
14 |     environment:
15 |       name: pypi
16 |       url: https://pypi.org/project/supabase-mcp-server/
17 |     permissions:
18 |       id-token: write  # Required for trusted publishing
19 |       contents: read
20 | 
21 |     steps:
22 |       - uses: actions/checkout@v4
23 |         with:
24 |           fetch-depth: 0  # Required for proper version detection
25 | 
26 |       - name: Install uv
27 |         uses: astral-sh/setup-uv@v5
28 |         with:
29 |           version: ${{ env.UV_VERSION }}
30 | 
31 |       - name: Set up Python
32 |         uses: actions/setup-python@v5
33 |         with:
34 |           python-version: "3.12"
35 | 
36 |       - name: Build package
37 |         run: uv build --no-sources
38 | 
39 |       - name: Verify package installation and entry points
40 |         env:
41 |           SUPABASE_PROJECT_REF: ${{ secrets.SUPABASE_PROJECT_REF }}
42 |           SUPABASE_DB_PASSWORD: ${{ secrets.SUPABASE_DB_PASSWORD }}
43 |         run: |
44 |           # Create a new venv for testing
45 |           uv venv
46 |           source .venv/bin/activate
47 | 
48 |           # Install the built wheel
49 |           uv pip install dist/*.whl
50 | 
51 |           echo "Testing supabase-mcp-server entry point..."
52 |           # Run with --help to test basic functionality without needing actual connection
53 |           if ! uv run supabase-mcp-server --help; then
54 |             echo "❌ supabase-mcp-server --help failed"
55 |             exit 1
56 |           fi
57 |           echo "✅ supabase-mcp-server --help succeeded"
58 | 
59 |       - name: Publish to PyPI
60 |         uses: pypa/gh-action-pypi-publish@release/v1
61 | 
```

--------------------------------------------------------------------------------
/.github/workflows/ci.yaml:
--------------------------------------------------------------------------------

```yaml
 1 | name: CI
 2 | 
 3 | on:
 4 |   push:
 5 |     branches: [ main ]
 6 |   pull_request:
 7 |     branches: [ main ]
 8 | 
 9 | 
10 | env:
11 |   UV_VERSION: "0.6.1" # Pin uv version to avoid breaking changes
12 | 
13 | jobs:
14 |   test:
15 |     runs-on: ubuntu-latest
16 |     env:
17 |       # Test environment variables - no real secrets needed anymore
18 |       SUPABASE_PROJECT_REF: "abcdefghij1234567890"
19 |       SUPABASE_DB_PASSWORD: "test-password-123"
20 |       SUPABASE_ACCESS_TOKEN: "test-access-token"
21 |       SUPABASE_SERVICE_ROLE_KEY: "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c"
22 |     steps:
23 |       - uses: actions/checkout@v4
24 | 
25 |       - name: Set up Python 3.12
26 |         uses: actions/setup-python@v5
27 |         with:
28 |           python-version: "3.12"
29 | 
30 |       - name: Install uv
31 |         uses: astral-sh/setup-uv@v5
32 |         with:
33 |           version: ${{ env.UV_VERSION }}
34 | 
35 |       - name: Create venv and install dependencies
36 |         run: |
37 |           # Create venv and install dependencies
38 |           uv venv
39 |           source .venv/bin/activate
40 |           uv sync --group dev --frozen
41 | 
42 |       - name: Run tests
43 |         run: |
44 |           source .venv/bin/activate # necessary for pytest
45 |           pytest --cov=supabase_mcp --cov-report=xml --cov-report=term
46 | 
47 |       - name: Upload coverage to Codecov
48 |         uses: codecov/codecov-action@v3
49 |         with:
50 |           token: ${{ secrets.CODECOV_TOKEN }}
51 |           files: ./coverage.xml
52 |           fail_ci_if_error: false
53 | 
54 |       - name: Build distribution packages
55 |         run: |
56 |           uv build --no-sources
57 |           # Verify dist contains both wheel and tar.gz
58 |           test -f dist/*.whl
59 |           test -f dist/*.tar.gz
60 | 
```

--------------------------------------------------------------------------------
/tests/test_container.py:
--------------------------------------------------------------------------------

```python
 1 | from typing import Any, cast
 2 | 
 3 | from mcp.server.fastmcp import FastMCP
 4 | 
 5 | from supabase_mcp.core.container import ServicesContainer
 6 | from supabase_mcp.settings import Settings
 7 | 
 8 | 
 9 | class TestContainer:
10 |     """Tests for the Container class functionality."""
11 | 
12 |     def test_container_initialization(self, container_integration: ServicesContainer):
13 |         """Test that the container is properly initialized with all services."""
14 |         # Verify all services are properly initialized
15 |         assert container_integration.postgres_client is not None
16 |         assert container_integration.api_client is not None
17 |         assert container_integration.sdk_client is not None
18 |         assert container_integration.api_manager is not None
19 |         assert container_integration.safety_manager is not None
20 |         assert container_integration.query_manager is not None
21 |         assert container_integration.tool_manager is not None
22 |         assert container_integration.mcp_server is not None
23 | 
24 |     def test_container_initialize_method(self, settings_integration: Settings, mock_mcp_server: Any):
25 |         """Test the initialize method creates all services properly."""
26 |         # Create empty container
27 |         container = ServicesContainer(mcp_server=cast(FastMCP, mock_mcp_server))
28 | 
29 |         # Initialize with settings
30 |         container.initialize_services(settings_integration)
31 | 
32 |         # Verify all services were created
33 |         assert container.postgres_client is not None
34 |         assert container.api_client is not None
35 |         assert container.sdk_client is not None
36 |         assert container.api_manager is not None
37 |         assert container.safety_manager is not None
38 |         assert container.query_manager is not None
39 |         assert container.tool_manager is not None
40 |         assert container.mcp_server is not None
41 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/queries/get_tables.sql:
--------------------------------------------------------------------------------

```sql
 1 | (
 2 | -- Regular tables & views: full metadata available
 3 |     SELECT
 4 |         t.table_name,
 5 |         t.table_type,
 6 |         obj_description(pc.oid) AS description,
 7 |         pg_total_relation_size(format('%I.%I', t.table_schema, t.table_name))::bigint AS size_bytes,
 8 |         pg_stat_get_live_tuples(pc.oid)::bigint AS row_count,
 9 |         (
10 |             SELECT count(*) FROM information_schema.columns AS c
11 |             WHERE
12 |                 c.table_schema = t.table_schema
13 |                 AND c.table_name = t.table_name
14 |         ) AS column_count,
15 |         (
16 |             SELECT count(*) FROM pg_indexes AS i
17 |             WHERE
18 |                 i.schemaname = t.table_schema
19 |                 AND i.tablename = t.table_name
20 |         ) AS index_count
21 |     FROM information_schema.tables AS t
22 |     INNER JOIN pg_class AS pc
23 |         ON
24 |             t.table_name = pc.relname
25 |             AND pc.relnamespace = (
26 |                 SELECT oid FROM pg_namespace
27 |                 WHERE nspname = '{schema_name}'
28 |             )
29 |     WHERE
30 |         t.table_schema = '{schema_name}'
31 |         AND t.table_type IN ('BASE TABLE', 'VIEW')
32 | )
33 | UNION ALL
34 | (
35 | -- Foreign tables: limited metadata (size & row count functions don't apply)
36 |     SELECT
37 |         ft.foreign_table_name AS table_name,
38 |         'FOREIGN TABLE' AS table_type,
39 |         (
40 |             SELECT obj_description(
41 |                 (quote_ident(ft.foreign_table_schema) || '.' || quote_ident(ft.foreign_table_name))::regclass
42 |             )
43 |         ) AS description,
44 |         0::bigint AS size_bytes,
45 |         NULL::bigint AS row_count,
46 |         (
47 |             SELECT count(*) FROM information_schema.columns AS c
48 |             WHERE
49 |                 c.table_schema = ft.foreign_table_schema
50 |                 AND c.table_name = ft.foreign_table_name
51 |         ) AS column_count,
52 |         0 AS index_count
53 |     FROM information_schema.foreign_tables AS ft
54 |     WHERE ft.foreign_table_schema = '{schema_name}'
55 | )
56 | ORDER BY size_bytes DESC;
57 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/tools/descriptions/sdk_tools.yaml:
--------------------------------------------------------------------------------

```yaml
 1 | # Python SDK tools descriptions
 2 | 
 3 | get_auth_admin_methods_spec: |
 4 |   Get Python SDK methods specification for Auth Admin.
 5 | 
 6 |   Returns a comprehensive dictionary of all Auth Admin methods available in the Supabase Python SDK, including:
 7 |   - Method names and descriptions
 8 |   - Required and optional parameters for each method
 9 |   - Parameter types and constraints
10 |   - Return value information
11 | 
12 |   This tool is useful for exploring the capabilities of the Auth Admin SDK and understanding
13 |   how to properly format parameters for the call_auth_admin_method tool.
14 | 
15 |   No parameters required.
16 | 
17 | call_auth_admin_method: |
18 |   Call an Auth Admin method from Supabase Python SDK.
19 | 
20 |   This tool provides a safe, validated interface to the Supabase Auth Admin SDK, allowing you to:
21 |   - Manage users (create, update, delete)
22 |   - List and search users
23 |   - Generate authentication links
24 |   - Manage multi-factor authentication
25 |   - And more
26 | 
27 |   IMPORTANT NOTES:
28 |   - Request bodies must adhere to the Python SDK specification
29 |   - Some methods may have nested parameter structures
30 |   - The tool validates all parameters against Pydantic models
31 |   - Extra fields not defined in the models will be rejected
32 | 
33 |   AVAILABLE METHODS:
34 |   - get_user_by_id: Retrieve a user by their ID
35 |   - list_users: List all users with pagination
36 |   - create_user: Create a new user
37 |   - delete_user: Delete a user by their ID
38 |   - invite_user_by_email: Send an invite link to a user's email
39 |   - generate_link: Generate an email link for various authentication purposes
40 |   - update_user_by_id: Update user attributes by ID
41 |   - delete_factor: Delete a factor on a user
42 | 
43 |   EXAMPLES:
44 |   1. Get user by ID:
45 |      method: "get_user_by_id"
46 |      params: {"uid": "user-uuid-here"}
47 | 
48 |   2. Create user:
49 |      method: "create_user"
50 |      params: {
51 |        "email": "user@example.com",
52 |        "password": "secure-password"
53 |      }
54 | 
55 |   3. Update user by ID:
56 |      method: "update_user_by_id"
57 |      params: {
58 |        "uid": "user-uuid-here",
59 |        "attributes": {
60 |          "email": "new.email@example.com"
61 |        }
62 |      }
63 | 
64 |   For complete documentation of all methods and their parameters, use the get_auth_admin_methods_spec tool.
65 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/sdk/auth_admin_models.py:
--------------------------------------------------------------------------------

```python
 1 | from typing import Any, Literal
 2 | 
 3 | from pydantic import BaseModel, model_validator
 4 | 
 5 | 
 6 | class GetUserByIdParams(BaseModel):
 7 |     uid: str
 8 | 
 9 | 
10 | class ListUsersParams(BaseModel):
11 |     page: int | None = 1
12 |     per_page: int | None = 50
13 | 
14 | 
15 | class UserMetadata(BaseModel):
16 |     email: str | None = None
17 |     email_verified: bool | None = None
18 |     phone_verified: bool | None = None
19 |     sub: str | None = None
20 | 
21 | 
22 | class AdminUserAttributes(BaseModel):
23 |     email: str | None = None
24 |     password: str | None = None
25 |     email_confirm: bool | None = False
26 |     phone: str | None = None
27 |     phone_confirm: bool | None = False
28 |     user_metadata: UserMetadata | None = None
29 |     app_metadata: dict[str, Any] | None = None
30 |     role: str | None = None
31 |     ban_duration: str | None = None
32 |     nonce: str | None = None
33 | 
34 | 
35 | class CreateUserParams(AdminUserAttributes):
36 |     """Parameters for creating a user; requires either email or phone (validated below)."""
37 | 
38 |     @model_validator(mode="after")
39 |     def check_email_or_phone(self) -> "CreateUserParams":
40 |         if not self.email and not self.phone:
41 |             raise ValueError("Either email or phone must be provided")
42 |         return self
43 | 
44 | 
45 | class DeleteUserParams(BaseModel):
46 |     id: str
47 |     should_soft_delete: bool | None = False
48 | 
49 | 
50 | class InviteUserByEmailParams(BaseModel):
51 |     email: str
52 |     options: dict[str, Any] | None = None
53 | 
54 | 
55 | class GenerateLinkParams(BaseModel):
56 |     type: Literal[
57 |         "signup", "invite", "magiclink", "recovery", "email_change_current", "email_change_new", "phone_change"
58 |     ]
59 |     email: str
60 |     password: str | None = None
61 |     new_email: str | None = None
62 |     options: dict[str, Any] | None = None
63 | 
64 |     @model_validator(mode="after")
65 |     def validate_required_fields(self) -> "GenerateLinkParams":
66 |         # Check password for signup
67 |         if self.type == "signup" and not self.password:
68 |             raise ValueError("Password is required for signup links")
69 | 
70 |         # Check new_email for email change
71 |         if self.type in ["email_change_current", "email_change_new"] and not self.new_email:
72 |             raise ValueError("new_email is required for email change links")
73 | 
74 |         return self
75 | 
76 | 
77 | class UpdateUserByIdParams(BaseModel):
78 |     uid: str
79 |     attributes: AdminUserAttributes
80 | 
81 | 
82 | class DeleteFactorParams(BaseModel):
83 |     id: str
84 |     user_id: str
85 | 
86 | 
87 | # Map method names to their parameter models
88 | PARAM_MODELS = {
89 |     "get_user_by_id": GetUserByIdParams,
90 |     "list_users": ListUsersParams,
91 |     "create_user": CreateUserParams,
92 |     "delete_user": DeleteUserParams,
93 |     "invite_user_by_email": InviteUserByEmailParams,
94 |     "generate_link": GenerateLinkParams,
95 |     "update_user_by_id": UpdateUserByIdParams,
96 |     "delete_factor": DeleteFactorParams,
97 | }
98 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/clients/api_client.py:
--------------------------------------------------------------------------------

```python
 1 | import httpx
 2 | from pydantic import BaseModel
 3 | 
 4 | from supabase_mcp.clients.base_http_client import AsyncHTTPClient
 5 | from supabase_mcp.logger import logger
 6 | from supabase_mcp.settings import settings
 7 | 
 8 | 
 9 | class ApiRoutes:
10 |     """Routes for the Query API"""
11 | 
12 |     FEATURES_ACCESS = "/features/{feature_name}/access"
13 | 
14 | 
15 | class FeatureAccessRequest(BaseModel):
16 |     """Request for feature access. Later can be extended with additional metadata."""
17 | 
18 |     feature_name: str
19 | 
20 | 
21 | class FeatureAccessResponse(BaseModel):
22 |     """Response for feature access. Later can be extended with additional metadata."""
23 | 
24 |     access_granted: bool
25 | 
26 | 
27 | class ApiClient(AsyncHTTPClient):
28 |     """Client for communicating with the Query API server for premium features.
29 | 
30 |     To preserve backwards compatibility and ensure a smooth UX for existing users,
31 |     an API key is not currently required.
32 |     """
33 | 
34 |     def __init__(
35 |         self,
36 |         query_api_key: str | None = None,
37 |         query_api_url: str | None = None,
38 |     ):
39 |         """Initialize the Query API client"""
40 |         self.query_api_key = query_api_key or settings.query_api_key
41 |         self.query_api_url = query_api_url or settings.query_api_url
42 |         self._check_api_key_set()
43 |         self.client: httpx.AsyncClient | None = None
44 |         logger.info(
45 |             f"✔️ Query API client initialized successfully with URL: {self.query_api_url}, with key: {bool(self.query_api_key)}"
46 |         )
47 | 
48 |     async def _ensure_client(self) -> httpx.AsyncClient:
49 |         """Ensure client exists and is ready for use.
50 | 
51 |         Creates the client if it doesn't exist yet.
52 |         Returns the client instance.
53 |         """
54 |         if self.client is None:
55 |             logger.info("Creating new Query API client")
56 |             self.client = httpx.AsyncClient(
57 |                 base_url=self.query_api_url,
58 |                 headers={"X-API-Key": f"{self.query_api_key}"},
59 |                 timeout=30.0,
60 |             )
61 |         logger.debug("Reusing existing Query API client")
62 |         return self.client
63 | 
64 |     async def close(self) -> None:
65 |         """Close the client and release resources."""
66 |         if self.client:
67 |             await self.client.aclose()
68 |             logger.info("Query API client closed")
69 | 
70 |     def _check_api_key_set(self) -> None:
71 |         """Check if the API key is set"""
72 |         if not self.query_api_key:
73 |             logger.warning("Query API key is not set. Only free features will be available.")
74 |             return
75 | 
76 |     async def check_feature_access(self, feature_name: str) -> FeatureAccessResponse:
77 |         """Check if the feature is available for the user"""
78 | 
79 |         try:
80 |             result = await self.execute_request(
81 |                 method="GET",
82 |                 path=ApiRoutes.FEATURES_ACCESS.format(feature_name=feature_name),
83 |             )
84 |             logger.debug(f"Feature access response: {result}")
85 |             return FeatureAccessResponse.model_validate(result)
86 |         except Exception as e:
87 |             logger.error(f"Error checking feature access: {e}")
88 |             raise
89 | 
```
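The `FeatureAccessResponse` contract keeps gating logic trivial on the caller's side: validate the payload, then branch on one boolean. A hypothetical consumer sketch (the real gating lives in `FeatureManager`, not shown here; `ensure_feature_access` is an illustrative name):

```python
from pydantic import BaseModel


class FeatureAccessResponse(BaseModel):
    """Mirrors the response model from api_client.py."""

    access_granted: bool


class FeatureAccessError(Exception):
    """Raised when the user lacks access to a premium feature."""


def ensure_feature_access(feature_name: str, raw_response: dict) -> None:
    # Validate the API payload, then gate on the boolean flag
    response = FeatureAccessResponse.model_validate(raw_response)
    if not response.access_granted:
        raise FeatureAccessError(f"Feature '{feature_name}' requires an upgraded plan")
```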

--------------------------------------------------------------------------------
/supabase_mcp/tools/manager.py:
--------------------------------------------------------------------------------

```python
 1 | from __future__ import annotations
 2 | 
 3 | from enum import Enum
 4 | from pathlib import Path
 5 | 
 6 | import yaml
 7 | 
 8 | from supabase_mcp.logger import logger
 9 | 
10 | 
11 | class ToolName(str, Enum):
12 |     """Enum of all available tools in the Supabase MCP server."""
13 | 
14 |     # Database tools
15 |     GET_SCHEMAS = "get_schemas"
16 |     GET_TABLES = "get_tables"
17 |     GET_TABLE_SCHEMA = "get_table_schema"
18 |     EXECUTE_POSTGRESQL = "execute_postgresql"
19 |     RETRIEVE_MIGRATIONS = "retrieve_migrations"
20 | 
21 |     # Safety tools
22 |     LIVE_DANGEROUSLY = "live_dangerously"
23 |     CONFIRM_DESTRUCTIVE_OPERATION = "confirm_destructive_operation"
24 | 
25 |     # Management API tools
26 |     SEND_MANAGEMENT_API_REQUEST = "send_management_api_request"
27 |     GET_MANAGEMENT_API_SPEC = "get_management_api_spec"
28 | 
29 |     # Auth Admin tools
30 |     GET_AUTH_ADMIN_METHODS_SPEC = "get_auth_admin_methods_spec"
31 |     CALL_AUTH_ADMIN_METHOD = "call_auth_admin_method"
32 | 
33 |     # Logs & Analytics tools
34 |     RETRIEVE_LOGS = "retrieve_logs"
35 | 
36 | 
37 | class ToolManager:
38 |     """Manager for tool descriptions and registration.
39 | 
40 |     This class is responsible for loading tool descriptions from YAML files
41 |     and providing them to the main application.
42 |     """
43 | 
44 |     _instance: ToolManager | None = None  # Singleton instance
45 | 
46 |     def __init__(self) -> None:
47 |         """Initialize the tool manager."""
48 |         self.descriptions: dict[str, str] = {}
49 |         self._load_descriptions()
50 | 
51 |     @classmethod
52 |     def get_instance(cls) -> ToolManager:
53 |         """Get or create the singleton instance of ToolManager."""
54 |         if cls._instance is None:
55 |             cls._instance = cls()
56 |         return cls._instance
57 | 
58 |     @classmethod
59 |     def reset(cls) -> None:
60 |         """Reset the singleton instance of ToolManager."""
61 |         if cls._instance is not None:
62 |             cls._instance = None
63 |             logger.info("ToolManager instance reset complete")
64 | 
65 |     def _load_descriptions(self) -> None:
66 |         """Load tool descriptions from YAML files."""
67 |         # Path to the descriptions directory
68 |         descriptions_dir = Path(__file__).parent / "descriptions"
69 | 
70 |         # Check if the directory exists
71 |         if not descriptions_dir.exists():
72 |             raise FileNotFoundError(f"Tool descriptions directory not found: {descriptions_dir}")
73 | 
74 |         # Load all YAML files in the directory
75 |         for yaml_file in descriptions_dir.glob("*.yaml"):
76 |             try:
77 |                 with open(yaml_file) as f:
78 |                     tool_descriptions = yaml.safe_load(f)
79 |                     if tool_descriptions:
80 |                         self.descriptions.update(tool_descriptions)
81 |             except Exception as e:
82 |                 logger.error(f"Error loading tool descriptions from {yaml_file}: {e}")
83 | 
84 |     def get_description(self, tool_name: str) -> str:
85 |         """Get the description for a specific tool.
86 | 
87 |         Args:
88 |             tool_name: The name of the tool to get the description for.
89 | 
90 |         Returns:
91 |             The description of the tool, or an empty string if not found.
92 |         """
93 |         return self.descriptions.get(tool_name, "")
94 | 
```
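The description files that `_load_descriptions` consumes follow a simple shape: a top-level YAML mapping of tool name to a multi-line description. A standalone sketch of the load-and-merge step, using an inline document instead of files on disk (descriptions are illustrative; assumes PyYAML):

```python
import yaml

doc = """
get_schemas: |
  List all database schemas with their sizes and table counts.
get_tables: |
  List all tables, foreign tables, and views in a schema.
"""

descriptions: dict[str, str] = {}
loaded = yaml.safe_load(doc)
if loaded:
    # Mirrors ToolManager._load_descriptions: merge each document into one dict
    descriptions.update(loaded)
```

Because every file merges into the same dict, a tool name defined in two YAML files silently takes the last-loaded value, which is worth keeping in mind when adding new description files.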

--------------------------------------------------------------------------------
/supabase_mcp/exceptions.py:
--------------------------------------------------------------------------------

```python
  1 | from typing import Any
  2 | 
  3 | 
  4 | class DatabaseError(Exception):
  5 |     """Base class for database-related errors."""
  6 | 
  7 |     pass
  8 | 
  9 | 
 10 | class ConnectionError(DatabaseError):
 11 |     """Raised when a database connection fails."""
 12 | 
 13 |     pass
 14 | 
 15 | 
 16 | class PermissionError(DatabaseError):
 17 |     """Raised when a database operation is not permitted."""
 18 | 
 19 |     pass
 20 | 
 21 | 
 22 | class QueryError(DatabaseError):
 23 |     """Raised when a database query fails."""
 24 | 
 25 |     pass
 26 | 
 27 | 
 28 | class TimeoutError(DatabaseError):
 29 |     """Raised when a database operation times out."""
 30 | 
 31 |     pass
 32 | 
 33 | 
 34 | class ValidationError(Exception):
 35 |     """Raised when validation fails."""
 36 | 
 37 |     pass
 38 | 
 39 | 
 40 | class SafetyError(Exception):
 41 |     """Raised when a safety check fails."""
 42 | 
 43 |     pass
 44 | 
 45 | 
 46 | class OperationNotAllowedError(SafetyError):
 47 |     """Raised when an operation is not allowed in the current safety mode."""
 48 | 
 49 |     pass
 50 | 
 51 | 
 52 | class ConfirmationRequiredError(SafetyError):
 53 |     """Raised when a user needs to confirm destructive SQL operation"""
 54 | 
 55 |     pass
 56 | 
 57 | 
 58 | class APIError(Exception):
 59 |     """Base class for API-related errors."""
 60 | 
 61 |     def __init__(
 62 |         self,
 63 |         message: str,
 64 |         status_code: int | None = None,
 65 |         response_body: dict[str, Any] | None = None,
 66 |     ):
 67 |         self.status_code = status_code
 68 |         self.response_body = response_body
 69 |         super().__init__(message)
 70 | 
 71 | 
 72 | class APIConnectionError(APIError):
 73 |     """Raised when an API connection fails."""
 74 | 
 75 |     pass
 76 | 
 77 | 
 78 | class PythonSDKError(Exception):
 79 |     """Raised when a Python SDK operation fails."""
 80 | 
 81 |     pass
 82 | 
 83 | 
 84 | class APIResponseError(APIError):
 85 |     """Raised when an API response is invalid."""
 86 | 
 87 |     pass
 88 | 
 89 | 
 90 | class APIClientError(APIError):
 91 |     """Raised when an API client error occurs."""
 92 | 
 93 |     pass
 94 | 
 95 | 
 96 | class APIServerError(APIError):
 97 |     """Raised when an API server error occurs."""
 98 | 
 99 |     pass
100 | 
101 | 
102 | class UnexpectedError(APIError):
103 |     """Raised when an unexpected error occurs."""
104 | 
105 |     pass
106 | 
107 | 
108 | class FeatureAccessError(APIError):
109 |     """Raised when a user does not have access to a premium feature."""
110 | 
111 |     def __init__(
112 |         self,
113 |         feature_name: str,
114 |         status_code: int | None = None,
115 |         response_body: dict[str, Any] | None = None,
116 |     ):
117 |         message = (
118 |             f"The feature '{feature_name}' is available in our Pro plan. "
119 |             f"Upgrade at https://thequery.dev to unlock advanced capabilities "
120 |             f"and take your database experience to the next level!"
121 |         )
122 |         super().__init__(message, status_code, response_body)
123 | 
124 | 
125 | class FeatureTemporaryError(APIError):
126 |     """Raised when a feature check encounters a temporary error."""
127 | 
128 |     def __init__(
129 |         self,
130 |         feature_name: str,
131 |         status_code: int | None = None,
132 |         response_body: dict[str, Any] | None = None,
133 |     ):
134 |         message = (
135 |             f"We couldn't verify access to '{feature_name}' at the moment. "
136 |             f"Please try again in a few moments. If this persists, "
137 |             f"check your connection or visit https://thequery.dev/status for updates."
138 |         )
139 |         super().__init__(message, status_code, response_body)
140 | 
```

--------------------------------------------------------------------------------
/tests/services/safety/test_sql_safety_config.py:
--------------------------------------------------------------------------------

```python
 1 | """
 2 | Unit tests for the SQLSafetyConfig class.
 3 | 
 4 | This file contains unit test cases for the SQLSafetyConfig class, which is responsible for
 5 | determining the risk level of SQL operations and whether they are allowed or require confirmation.
 6 | """
 7 | 
 8 | from unittest.mock import MagicMock
 9 | 
10 | import pytest
11 | 
12 | from supabase_mcp.services.safety.models import OperationRiskLevel, SafetyMode
13 | from supabase_mcp.services.safety.safety_configs import SQLSafetyConfig
14 | 
15 | 
16 | @pytest.mark.unit
17 | class TestSQLSafetyConfig:
18 |     """Unit tests for the SQLSafetyConfig class."""
19 | 
20 |     def test_get_risk_level(self):
21 |         """Test that get_risk_level returns the highest_risk_level from the operation."""
22 |         config = SQLSafetyConfig()
23 | 
24 |         # Create mock QueryValidationResults objects with different risk levels
25 |         low_risk_op = MagicMock()
26 |         low_risk_op.highest_risk_level = OperationRiskLevel.LOW
27 | 
28 |         medium_risk_op = MagicMock()
29 |         medium_risk_op.highest_risk_level = OperationRiskLevel.MEDIUM
30 | 
31 |         high_risk_op = MagicMock()
32 |         high_risk_op.highest_risk_level = OperationRiskLevel.HIGH
33 | 
34 |         extreme_risk_op = MagicMock()
35 |         extreme_risk_op.highest_risk_level = OperationRiskLevel.EXTREME
36 | 
37 |         # Test that the risk level is correctly returned
38 |         assert config.get_risk_level(low_risk_op) == OperationRiskLevel.LOW
39 |         assert config.get_risk_level(medium_risk_op) == OperationRiskLevel.MEDIUM
40 |         assert config.get_risk_level(high_risk_op) == OperationRiskLevel.HIGH
41 |         assert config.get_risk_level(extreme_risk_op) == OperationRiskLevel.EXTREME
42 | 
43 |     def test_is_operation_allowed(self):
44 |         """Test if operations are allowed based on risk level and safety mode.
45 | 
46 |         This tests the behavior inherited from SafetyConfigBase.
47 |         """
48 |         config = SQLSafetyConfig()
49 | 
50 |         # Low risk operations should be allowed in both safe and unsafe modes
51 |         assert config.is_operation_allowed(OperationRiskLevel.LOW, SafetyMode.SAFE) is True
52 |         assert config.is_operation_allowed(OperationRiskLevel.LOW, SafetyMode.UNSAFE) is True
53 | 
54 |         # Medium/high risk operations should only be allowed in unsafe mode
55 |         assert config.is_operation_allowed(OperationRiskLevel.MEDIUM, SafetyMode.SAFE) is False
56 |         assert config.is_operation_allowed(OperationRiskLevel.MEDIUM, SafetyMode.UNSAFE) is True
57 |         assert config.is_operation_allowed(OperationRiskLevel.HIGH, SafetyMode.SAFE) is False
58 |         assert config.is_operation_allowed(OperationRiskLevel.HIGH, SafetyMode.UNSAFE) is True
59 | 
60 |         # Extreme risk operations are never allowed
61 |         assert config.is_operation_allowed(OperationRiskLevel.EXTREME, SafetyMode.SAFE) is False
62 |         assert config.is_operation_allowed(OperationRiskLevel.EXTREME, SafetyMode.UNSAFE) is False
63 | 
64 |     def test_needs_confirmation(self):
65 |         """Test if operations need confirmation based on risk level.
66 | 
67 |         This tests the behavior inherited from SafetyConfigBase.
68 |         """
69 |         config = SQLSafetyConfig()
70 | 
71 |         # Low and medium risk operations should not need confirmation
72 |         assert config.needs_confirmation(OperationRiskLevel.LOW) is False
73 |         assert config.needs_confirmation(OperationRiskLevel.MEDIUM) is False
74 | 
75 |         # High risk operations should need confirmation
76 |         assert config.needs_confirmation(OperationRiskLevel.HIGH) is True
77 | 
78 |         # Extreme risk operations should need confirmation
79 |         assert config.needs_confirmation(OperationRiskLevel.EXTREME) is True
80 | 
```
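The assertions above pin down a small decision table inherited from `SafetyConfigBase`: low risk is always allowed, extreme risk never is, and everything in between requires unsafe mode; confirmation kicks in at high risk. A minimal sketch consistent with those tests (enum values are assumptions; the real models live in `supabase_mcp.services.safety.models`):

```python
from enum import Enum, IntEnum


class OperationRiskLevel(IntEnum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    EXTREME = 4


class SafetyMode(str, Enum):
    SAFE = "safe"
    UNSAFE = "unsafe"


def is_operation_allowed(risk: OperationRiskLevel, mode: SafetyMode) -> bool:
    if risk == OperationRiskLevel.EXTREME:
        return False  # never allowed, even in unsafe mode
    if risk == OperationRiskLevel.LOW:
        return True  # always allowed
    return mode == SafetyMode.UNSAFE  # medium/high require unsafe mode


def needs_confirmation(risk: OperationRiskLevel) -> bool:
    return risk >= OperationRiskLevel.HIGH
```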

--------------------------------------------------------------------------------
/supabase_mcp/tools/descriptions/logs_and_analytics_tools.yaml:
--------------------------------------------------------------------------------

```yaml
 1 | # Tools related to logs and Advisor analytics
 2 | 
 3 | retrieve_logs: |
 4 |   Retrieve logs from your Supabase project's services for debugging and monitoring.
 5 | 
 6 |   Returns log entries from various Supabase services with timestamps, messages, and metadata.
 7 |   This tool provides access to the same logs available in the Supabase dashboard's Logs & Analytics section.
 8 | 
 9 |   AVAILABLE LOG COLLECTIONS:
10 |   - postgres: Database server logs including queries, errors, warnings, and system messages
11 |   - api_gateway: API requests, responses, and errors processed by the Kong API gateway
12 |   - auth: Authentication and authorization logs for sign-ups, logins, and token operations
13 |   - postgrest: Logs from the RESTful API service that exposes your PostgreSQL database
14 |   - pooler: Connection pooling logs from pgbouncer and supavisor services
15 |   - storage: Object storage service logs for file uploads, downloads, and permissions
16 |   - realtime: Logs from the real-time subscription service for WebSocket connections
17 |   - edge_functions: Serverless function execution logs including invocations and errors
18 |   - cron: Scheduled job logs (can be queried through postgres logs with specific filters)
19 |   - pgbouncer: Connection pooler logs
20 | 
21 |   PARAMETERS:
22 |   - collection: The log collection to query (required, one of the values listed above)
23 |   - limit: Maximum number of log entries to return (default: 20)
24 |   - hours_ago: Retrieve logs from the last N hours (default: 1)
25 |   - filters: List of filter objects with field, operator, and value (default: [])
26 |     Format: [{"field": "field_name", "operator": "=", "value": "value"}]
27 |   - search: Text to search for in event messages (default: "")
28 |   - custom_query: Complete custom SQL query to execute instead of the pre-built queries (default: "")
29 | 
30 |   HOW IT WORKS:
31 |   This tool makes a request to the Supabase Management API endpoint for logs, sending
32 |   either a pre-built optimized query for the selected collection or your custom query.
33 |   Each log collection has a specific table structure and metadata format that requires
34 |   appropriate CROSS JOIN UNNEST operations to access nested fields.
35 | 
36 |   EXAMPLES:
37 |   1. Using pre-built parameters:
38 |      collection: "postgres"
39 |      limit: 20
40 |      hours_ago: 24
41 |      filters: [{"field": "parsed.error_severity", "operator": "=", "value": "ERROR"}]
42 |      search: "connection"
43 | 
44 |   2. Using a custom query:
45 |      collection: "edge_functions"
46 |      custom_query: "SELECT id, timestamp, event_message, m.function_id, m.execution_time_ms
47 |                    FROM function_edge_logs
48 |                    CROSS JOIN unnest(metadata) AS m
49 |                    WHERE m.execution_time_ms > 1000
50 |                    ORDER BY timestamp DESC LIMIT 10"
51 | 
52 |   METADATA STRUCTURE:
53 |   The metadata structure is important because it determines how to access nested fields in filters:
54 |   - postgres_logs: Use "parsed.field_name" for fields like error_severity, query, application_name
55 |   - edge_logs: Use "request.field_name" or "response.field_name" for HTTP details
56 |   - function_edge_logs: Use "function_id", "execution_time_ms" for function metrics
57 | 
58 |   NOTE FOR LLM CLIENTS:
59 |   When encountering errors with field access, examine the error message to see what fields are
60 |   actually available in the structure. Start with basic fields before accessing nested metadata.
61 | 
62 |   SAFETY CONSIDERATIONS:
63 |   - This is a low-risk read operation that can be executed in SAFE mode
64 |   - Requires a valid Supabase Personal Access Token to be configured
65 |   - Not available for local Supabase instances (requires cloud deployment)
66 | 
67 | 
68 | retrieve_advisor_analytics: |
69 |   Get advisor analytics from the database.
70 | 
```
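The HOW IT WORKS section notes that filters and search terms are folded into a SQL query against the log tables. A hypothetical illustration of that translation step only (the real query builder lives in the server's log service, and a real implementation must parameterize values rather than interpolate them):

```python
def build_where_clause(filters: list[dict], search: str = "") -> str:
    """Illustrative only: turn retrieve_logs filters/search into SQL conditions."""
    conditions = [f"{f['field']} {f['operator']} '{f['value']}'" for f in filters]
    if search:
        conditions.append(f"event_message LIKE '%{search}%'")
    return " AND ".join(conditions) if conditions else "TRUE"
```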

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/models.py:
--------------------------------------------------------------------------------

```python
  1 | from enum import Enum
  2 | 
  3 | from pydantic import BaseModel, Field
  4 | 
  5 | from supabase_mcp.services.safety.models import OperationRiskLevel
  6 | 
  7 | 
  8 | class SQLQueryCategory(str, Enum):
  9 |     """Category of the SQL query tracked by the SQL validator"""
 10 | 
 11 |     DQL = "DQL"  # Data Query Language (SELECT)
 12 |     DML = "DML"  # Data Manipulation Language (INSERT, UPDATE, DELETE)
 13 |     DDL = "DDL"  # Data Definition Language (CREATE, ALTER, DROP)
 14 |     TCL = "TCL"  # Transaction Control Language (BEGIN, COMMIT, ROLLBACK)
 15 |     DCL = "DCL"  # Data Control Language (GRANT, REVOKE)
 16 |     POSTGRES_SPECIFIC = "POSTGRES_SPECIFIC"  # PostgreSQL-specific commands
 17 |     OTHER = "OTHER"  # Other commands not fitting into the categories above
 18 | 
 19 | 
 20 | class SQLQueryCommand(str, Enum):
 21 |     """Command of the SQL query tracked by the SQL validator"""
 22 | 
 23 |     # DQL Commands
 24 |     SELECT = "SELECT"
 25 | 
 26 |     # DML Commands
 27 |     INSERT = "INSERT"
 28 |     UPDATE = "UPDATE"
 29 |     DELETE = "DELETE"
 30 |     MERGE = "MERGE"
 31 | 
 32 |     # DDL Commands
 33 |     CREATE = "CREATE"
 34 |     ALTER = "ALTER"
 35 |     DROP = "DROP"
 36 |     TRUNCATE = "TRUNCATE"
 37 |     COMMENT = "COMMENT"
 38 |     RENAME = "RENAME"
 39 | 
 40 |     # DCL Commands
 41 |     GRANT = "GRANT"
 42 |     REVOKE = "REVOKE"
 43 | 
 44 |     # TCL Commands (for tracking, not query types)
 45 |     BEGIN = "BEGIN"
 46 |     COMMIT = "COMMIT"
 47 |     ROLLBACK = "ROLLBACK"
 48 |     SAVEPOINT = "SAVEPOINT"
 49 | 
 50 |     # PostgreSQL-specific Commands
 51 |     VACUUM = "VACUUM"
 52 |     ANALYZE = "ANALYZE"
 53 |     EXPLAIN = "EXPLAIN"
 54 |     COPY = "COPY"
 55 |     LISTEN = "LISTEN"
 56 |     NOTIFY = "NOTIFY"
 57 |     PREPARE = "PREPARE"
 58 |     EXECUTE = "EXECUTE"
 59 |     DEALLOCATE = "DEALLOCATE"
 60 | 
 61 |     # Other/Unknown
 62 |     UNKNOWN = "UNKNOWN"
 63 | 
 64 | 
 65 | class ValidatedStatement(BaseModel):
 66 |     """Result of the query validation for a single SQL statement."""
 67 | 
 68 |     category: SQLQueryCategory = Field(
 69 |         ..., description="The category of SQL statement (DQL, DML, DDL, etc.) derived from pglast parse tree"
 70 |     )
 71 |     risk_level: OperationRiskLevel = Field(
 72 |         ..., description="The risk level associated with this statement based on category and command"
 73 |     )
 74 |     command: SQLQueryCommand = Field(
 75 |         ..., description="The specific SQL command (SELECT, INSERT, CREATE, etc.) extracted from parse tree"
 76 |     )
 77 |     object_type: str | None = Field(
 78 |         None, description="The type of object being operated on (TABLE, INDEX, etc.) when available"
 79 |     )
 80 |     schema_name: str | None = Field(None, description="The schema name for the objects in the statement when available")
 81 |     needs_migration: bool = Field(
 82 |         ..., description="Whether this statement requires a migration based on statement type and safety rules"
 83 |     )
 84 |     query: str | None = Field(None, description="The actual SQL text for this statement extracted from original query")
 85 | 
 86 | 
 87 | class QueryValidationResults(BaseModel):
 88 |     """Result of the batch validation for one or more SQL statements."""
 89 | 
 90 |     statements: list[ValidatedStatement] = Field(
 91 |         default_factory=list, description="List of validated statements from the query built during validation"
 92 |     )
 93 |     highest_risk_level: OperationRiskLevel = Field(
 94 |         default=OperationRiskLevel.LOW, description="The highest risk level among all statements in the batch"
 95 |     )
 96 |     has_transaction_control: bool = Field(
 97 |         default=False, description="Whether the query contains transaction control statements (BEGIN, COMMIT, etc.)"
 98 |     )
 99 |     original_query: str = Field(..., description="The original SQL query text as provided by the user")
100 | 
101 |     def needs_migration(self) -> bool:
102 |         """Check if any statement in the batch needs migration."""
103 |         return any(stmt.needs_migration for stmt in self.statements)
104 | 
```
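When the validator assembles a `QueryValidationResults` from individual statements, the batch-level fields are simple aggregations over the per-statement results. A standalone sketch of that roll-up, with plain dataclasses standing in for the Pydantic models:

```python
from dataclasses import dataclass
from enum import IntEnum


class OperationRiskLevel(IntEnum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    EXTREME = 4


@dataclass
class ValidatedStatement:
    risk_level: OperationRiskLevel
    needs_migration: bool
    is_transaction_control: bool = False


def summarize(statements: list[ValidatedStatement]) -> dict:
    """Roll per-statement results up into the batch-level fields."""
    return {
        "highest_risk_level": max(
            (s.risk_level for s in statements), default=OperationRiskLevel.LOW
        ),
        "has_transaction_control": any(s.is_transaction_control for s in statements),
        "needs_migration": any(s.needs_migration for s in statements),
    }
```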

--------------------------------------------------------------------------------
/supabase_mcp/tools/descriptions/safety_tools.yaml:
--------------------------------------------------------------------------------

```yaml
 1 | # Safety tools descriptions
 2 | 
 3 | live_dangerously: |
 4 |   Toggle unsafe mode for either Management API or Database operations.
 5 | 
 6 |   WHAT THIS TOOL DOES:
 7 |   This tool switches between safe (default) and unsafe operation modes for either the Management API or Database operations.
 8 | 
 9 |   SAFETY MODES EXPLAINED:
10 |   1. Database Safety Modes:
11 |      - SAFE mode (default): Only low-risk operations like SELECT queries are allowed
12 |      - UNSAFE mode: Higher-risk operations including INSERT, UPDATE, DELETE, and schema changes are permitted
13 | 
14 |   2. API Safety Modes:
15 |      - SAFE mode (default): Only low-risk operations that don't modify state are allowed
16 |      - UNSAFE mode: Higher-risk state-changing operations are permitted (except those explicitly blocked for safety)
17 | 
18 |   OPERATION RISK LEVELS:
19 |   The system categorizes operations by risk level:
20 |   - LOW: Safe read operations with minimal impact
21 |   - MEDIUM: Write operations that modify data but don't change structure
22 |   - HIGH: Operations that modify database structure or important system settings
23 |   - EXTREME: Destructive operations that could cause data loss or service disruption
24 | 
25 |   WHEN TO USE THIS TOOL:
26 |   - Use this tool BEFORE attempting write operations or schema changes
27 |   - Enable unsafe mode only when you need to perform data modifications
28 |   - Always return to safe mode after completing write operations
29 | 
30 |   USAGE GUIDELINES:
31 |   - Start in safe mode by default for exploration and analysis
32 |   - Switch to unsafe mode only when you need to make changes
33 |   - Be specific about which service you're enabling unsafe mode for
34 |   - Consider the risks before enabling unsafe mode, especially for database operations
35 |   - For database operations requiring schema changes, you'll need to enable unsafe mode first
36 | 
37 |   Parameters:
38 |   - service: Which service to toggle ("api" or "database")
39 |   - enable_unsafe_mode: True to enable unsafe mode, False for safe mode (default: False)
40 | 
41 |   Examples:
42 |   1. Enable database unsafe mode:
43 |      live_dangerously(service="database", enable_unsafe_mode=True)
44 | 
45 |   2. Return to safe mode after operations:
46 |      live_dangerously(service="database", enable_unsafe_mode=False)
47 | 
48 |   3. Enable API unsafe mode:
49 |      live_dangerously(service="api", enable_unsafe_mode=True)
50 | 
51 |   Note: This tool affects ALL subsequent operations for the specified service until changed again.
52 | 
53 | confirm_destructive_operation: |
54 |   Execute a destructive database or API operation after confirmation. Use this only after reviewing the risks with the user.
55 | 
56 |   HOW IT WORKS:
57 |   - This tool executes a previously rejected high-risk operation using its confirmation ID
58 |   - The operation will be exactly the same as the one that generated the ID
59 |   - No need to retype the query or API request params - the system remembers them
60 | 
61 |   STEPS:
62 |   1. Explain the risks to the user and get their approval
63 |   2. Use this tool with the confirmation ID from the error message
64 |   3. The original query will be executed as-is
65 | 
66 |   PARAMETERS:
67 |   - operation_type: Type of operation ("api" or "database")
68 |   - confirmation_id: The ID provided in the error message (required)
69 |   - user_confirmation: Set to true to confirm execution (default: false)
70 | 
71 |   NOTE: Confirmation IDs expire after 5 minutes for security
72 | 
73 | get_management_api_safety_rules: |
74 |   Get all safety rules for the Supabase Management API.
75 | 
76 |   Returns a comprehensive overview of all safety rules applied to the Management API, including:
77 |   - Blocked operations (never allowed)
78 |   - Unsafe operations (allowed only in unsafe mode)
79 |   - Safe operations (always allowed)
80 | 
81 |   Each rule includes:
82 |   - The HTTP method and path pattern
83 |   - A human-readable explanation of why the operation has its safety designation
84 |   - The safety level assigned to the operation
85 | 
86 |   This information helps you understand which operations require unsafe mode and why certain operations might be completely blocked for safety reasons.
87 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/core/container.py:
--------------------------------------------------------------------------------

```python
  1 | from __future__ import annotations
  2 | 
  3 | from mcp.server.fastmcp import FastMCP
  4 | 
  5 | from supabase_mcp.clients.api_client import ApiClient
  6 | from supabase_mcp.clients.management_client import ManagementAPIClient
  7 | from supabase_mcp.clients.sdk_client import SupabaseSDKClient
  8 | from supabase_mcp.core.feature_manager import FeatureManager
  9 | from supabase_mcp.logger import logger
 10 | from supabase_mcp.services.api.api_manager import SupabaseApiManager
 11 | from supabase_mcp.services.database.postgres_client import PostgresClient
 12 | from supabase_mcp.services.database.query_manager import QueryManager
 13 | from supabase_mcp.services.logs.log_manager import LogManager
 14 | from supabase_mcp.services.safety.safety_manager import SafetyManager
 15 | from supabase_mcp.settings import Settings
 16 | from supabase_mcp.tools import ToolManager
 17 | 
 18 | 
 19 | class ServicesContainer:
 20 |     """Container for all services"""
 21 | 
 22 |     _instance: ServicesContainer | None = None
 23 | 
 24 |     def __init__(
 25 |         self,
 26 |         mcp_server: FastMCP | None = None,
 27 |         postgres_client: PostgresClient | None = None,
 28 |         api_client: ManagementAPIClient | None = None,
 29 |         sdk_client: SupabaseSDKClient | None = None,
 30 |         api_manager: SupabaseApiManager | None = None,
 31 |         safety_manager: SafetyManager | None = None,
 32 |         query_manager: QueryManager | None = None,
 33 |         tool_manager: ToolManager | None = None,
 34 |         log_manager: LogManager | None = None,
 35 |         query_api_client: ApiClient | None = None,
 36 |         feature_manager: FeatureManager | None = None,
 37 |     ) -> None:
 38 |         """Create a new services container instance"""
 39 |         self.postgres_client = postgres_client
 40 |         self.api_client = api_client
 41 |         self.api_manager = api_manager
 42 |         self.sdk_client = sdk_client
 43 |         self.safety_manager = safety_manager
 44 |         self.query_manager = query_manager
 45 |         self.tool_manager = tool_manager
 46 |         self.log_manager = log_manager
 47 |         self.query_api_client = query_api_client
 48 |         self.feature_manager = feature_manager
 49 |         self.mcp_server = mcp_server
 50 | 
 51 |     @classmethod
 52 |     def get_instance(cls) -> ServicesContainer:
 53 |         """Get the singleton instance of the container"""
 54 |         if cls._instance is None:
 55 |             cls._instance = cls()
 56 |         return cls._instance
 57 | 
 58 |     def initialize_services(self, settings: Settings) -> None:
 59 |         """Initializes all services in a synchronous manner to satisfy MCP runtime requirements"""
 60 |         # Create clients
 61 |         self.postgres_client = PostgresClient.get_instance(settings=settings)
 62 |         self.api_client = ManagementAPIClient(settings=settings)  # intentionally not a singleton
 63 |         self.sdk_client = SupabaseSDKClient.get_instance(settings=settings)
 64 | 
 65 |         # Create managers
 66 |         self.safety_manager = SafetyManager.get_instance()
 67 |         self.api_manager = SupabaseApiManager.get_instance(
 68 |             api_client=self.api_client,
 69 |             safety_manager=self.safety_manager,
 70 |         )
 71 |         self.query_manager = QueryManager(
 72 |             postgres_client=self.postgres_client,
 73 |             safety_manager=self.safety_manager,
 74 |         )
 75 |         self.tool_manager = ToolManager.get_instance()
 76 | 
 77 |         # Register safety configs
 78 |         self.safety_manager.register_safety_configs()
 79 | 
 80 |         # Create query api client
 81 |         self.query_api_client = ApiClient()
 82 |         self.feature_manager = FeatureManager(self.query_api_client)
 83 | 
 84 |         logger.info("✓ All services initialized successfully.")
 85 | 
 86 |     async def shutdown_services(self) -> None:
 87 |         """Properly close all relevant clients and connections"""
 88 |         # Postgres client
 89 |         if self.postgres_client:
 90 |             await self.postgres_client.close()
 91 | 
 92 |         # API clients
 93 |         if self.query_api_client:
 94 |             await self.query_api_client.close()
 95 | 
 96 |         if self.api_client:
 97 |             await self.api_client.close()
 98 | 
 99 |         # SDK client
100 |         if self.sdk_client:
101 |             await self.sdk_client.close()
102 | 
```
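
The `get_instance` pattern used by `ServicesContainer` (and by several of the clients it wires up) boils down to a lazily created class-level singleton. A minimal, self-contained sketch — `Container` is a hypothetical stand-in; the real class wires up Postgres, API, and SDK clients rather than a plain dict:

```python
from __future__ import annotations


class Container:
    """Minimal sketch of the singleton pattern ServicesContainer uses."""

    _instance: Container | None = None

    def __init__(self) -> None:
        self.services: dict[str, object] = {}

    @classmethod
    def get_instance(cls) -> Container:
        # Lazily create the single shared instance on first access
        if cls._instance is None:
            cls._instance = cls()
        return cls._instance


first = Container.get_instance()
second = Container.get_instance()
assert first is second  # every caller sees the same container
```

Keeping construction (`__init__` with all-optional dependencies) separate from wiring (`initialize_services`) is what lets tests build a container with hand-picked fakes while production code goes through `get_instance`.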

--------------------------------------------------------------------------------
/tests/services/database/sql/test_loader.py:
--------------------------------------------------------------------------------

```python
  1 | import os
  2 | from pathlib import Path
  3 | from unittest.mock import mock_open, patch
  4 | 
  5 | import pytest
  6 | 
  7 | from supabase_mcp.services.database.sql.loader import SQLLoader
  8 | 
  9 | 
 10 | @pytest.mark.unit
 11 | class TestSQLLoader:
 12 |     """Unit tests for the SQLLoader class."""
 13 | 
 14 |     def test_load_sql_with_extension(self):
 15 |         """Test loading SQL with file extension provided."""
 16 |         mock_sql = "SELECT * FROM test;"
 17 | 
 18 |         with patch("builtins.open", mock_open(read_data=mock_sql)):
 19 |             with patch.object(Path, "exists", return_value=True):
 20 |                 result = SQLLoader.load_sql("test.sql")
 21 | 
 22 |         assert result == mock_sql
 23 | 
 24 |     def test_load_sql_without_extension(self):
 25 |         """Test loading SQL without file extension provided."""
 26 |         mock_sql = "SELECT * FROM test;"
 27 | 
 28 |         with patch("builtins.open", mock_open(read_data=mock_sql)):
 29 |             with patch.object(Path, "exists", return_value=True):
 30 |                 result = SQLLoader.load_sql("test")
 31 | 
 32 |         assert result == mock_sql
 33 | 
 34 |     def test_load_sql_file_not_found(self):
 35 |         """Test loading SQL when file doesn't exist."""
 36 |         with patch.object(Path, "exists", return_value=False):
 37 |             with pytest.raises(FileNotFoundError):
 38 |                 SQLLoader.load_sql("nonexistent")
 39 | 
 40 |     def test_get_schemas_query(self):
 41 |         """Test getting schemas query."""
 42 |         mock_sql = "SELECT * FROM schemas;"
 43 | 
 44 |         with patch.object(SQLLoader, "load_sql", return_value=mock_sql):
 45 |             result = SQLLoader.get_schemas_query()
 46 | 
 47 |         assert result == mock_sql
 48 | 
 49 |     def test_get_tables_query(self):
 50 |         """Test getting tables query with schema replacement."""
 51 |         mock_sql = "SELECT * FROM {schema_name}.tables;"
 52 |         expected = "SELECT * FROM test_schema.tables;"
 53 | 
 54 |         with patch.object(SQLLoader, "load_sql", return_value=mock_sql):
 55 |             result = SQLLoader.get_tables_query("test_schema")
 56 | 
 57 |         assert result == expected
 58 | 
 59 |     def test_get_table_schema_query(self):
 60 |         """Test getting table schema query with replacements."""
 61 |         mock_sql = "SELECT * FROM {schema_name}.{table};"
 62 |         expected = "SELECT * FROM test_schema.test_table;"
 63 | 
 64 |         with patch.object(SQLLoader, "load_sql", return_value=mock_sql):
 65 |             result = SQLLoader.get_table_schema_query("test_schema", "test_table")
 66 | 
 67 |         assert result == expected
 68 | 
 69 |     def test_get_migrations_query(self):
 70 |         """Test getting migrations query with all parameters."""
 71 |         mock_sql = "SELECT * FROM migrations WHERE name LIKE '%{name_pattern}%' LIMIT {limit} OFFSET {offset} AND include_queries = {include_full_queries};"
 72 |         expected = "SELECT * FROM migrations WHERE name LIKE '%test%' LIMIT 10 OFFSET 5 AND include_queries = true;"
 73 | 
 74 |         with patch.object(SQLLoader, "load_sql", return_value=mock_sql):
 75 |             result = SQLLoader.get_migrations_query(limit=10, offset=5, name_pattern="test", include_full_queries=True)
 76 | 
 77 |         assert result == expected
 78 | 
 79 |     def test_get_init_migrations_query(self):
 80 |         """Test getting init migrations query."""
 81 |         mock_sql = "CREATE SCHEMA IF NOT EXISTS migrations;"
 82 | 
 83 |         with patch.object(SQLLoader, "load_sql", return_value=mock_sql):
 84 |             result = SQLLoader.get_init_migrations_query()
 85 | 
 86 |         assert result == mock_sql
 87 | 
 88 |     def test_get_create_migration_query(self):
 89 |         """Test getting create migration query with replacements."""
 90 |         mock_sql = "INSERT INTO migrations VALUES ('{version}', '{name}', ARRAY['{statements}']);"
 91 |         expected = "INSERT INTO migrations VALUES ('20230101', 'test_migration', ARRAY['SELECT 1;']);"
 92 | 
 93 |         with patch.object(SQLLoader, "load_sql", return_value=mock_sql):
 94 |             result = SQLLoader.get_create_migration_query(
 95 |                 version="20230101", name="test_migration", statements="SELECT 1;"
 96 |             )
 97 | 
 98 |         assert result == expected
 99 | 
100 |     def test_sql_dir_path(self):
101 |         """Test that SQL_DIR points to the correct location."""
102 |         expected_path = Path(SQLLoader.__module__.replace(".", os.sep)).parent / "queries"
103 |         assert str(SQLLoader.SQL_DIR).endswith(str(expected_path))
104 | 
```

--------------------------------------------------------------------------------
/pyproject.toml:
--------------------------------------------------------------------------------

```toml
  1 | [build-system]
  2 | requires = ["hatchling", "hatch-vcs"]
  3 | build-backend = "hatchling.build"
  4 | 
  5 | [project]
  6 | name = "supabase-mcp-server"
  7 | dynamic = ["version"]
  8 | description = "Community Supabase MCP server that enables Cursor and Windsurf to manage your Supabase project end to end, execute SQL queries, and more."
  9 | readme = "README.md"
 10 | requires-python = ">=3.12"
 11 | dependencies = [
 12 |     "asyncpg>=0.30.0",
 13 |     "logfire[system-metrics]>=3.12.0",
 14 |     "mcp[cli]>=1.4.1",
 15 |     "pglast>=7.3",
 16 |     "pyyaml>=6.0.2",
 17 |     "supabase>=2.13.0",
 18 |     "tenacity>=9.0.0",
 19 | ]
 20 | authors = [
 21 |     {name = "Alexander Zuev", email = "[email protected]"}
 22 | ]
 23 | keywords = ["supabase", "mcp", "cursor", "windsurf", "model-context-protocol", "claude", "cline"]
 24 | license = "Apache-2.0"
 25 | classifiers = [
 26 |     "Development Status :: 4 - Beta",
 27 |     "Intended Audience :: Developers",
 28 |     "License :: OSI Approved :: Apache Software License",
 29 |     "Programming Language :: Python :: 3.12",
 30 |     "Topic :: Software Development :: Libraries :: Python Modules",
 31 |     "Topic :: Database :: Database Engines/Servers",
 32 | ]
 33 | 
 34 | [project.urls]
 35 | Homepage = "https://github.com/alexander-zuev/supabase-mcp-server"
 36 | Repository = "https://github.com/alexander-zuev/supabase-mcp-server.git"
 37 | Changelog = "https://github.com/alexander-zuev/supabase-mcp-server/blob/main/CHANGELOG.MD"
 38 | Documentation = "https://github.com/alexander-zuev/supabase-mcp-server#readme"
 39 | 
 40 | 
 41 | 
 42 | [tool.hatch.build.targets.wheel]
 43 | packages = ["supabase_mcp"]
 44 | 
 45 | [tool.uv]
 46 | package = true
 47 | 
 48 | [tool.hatch.version]
 49 | source = "vcs"
 50 | raw-options = { version_scheme = "no-guess-dev" }
 51 | 
 52 | [tool.hatch.build.hooks.vcs]
 53 | version-file = "supabase_mcp/_version.py"
 54 | 
 55 | [project.scripts]
 56 | supabase-mcp-server = "supabase_mcp.main:run_server"
 57 | supabase-mcp-inspector = "supabase_mcp.main:run_inspector"
 58 | 
 59 | 
 60 | # Configure PyPI publishing
 61 | [[tool.uv.index]]
 62 | name = "pypi"
 63 | url = "https://pypi.org/simple/"
 64 | publish-url = "https://upload.pypi.org/legacy/"
 65 | 
 66 | [tool.ruff]
 67 | target-version = "py312"
 68 | line-length = 120
 69 | select = [
 70 |     "E",   # pycodestyle errors
 71 |     "W",   # pycodestyle warnings
 72 |     "F",   # pyflakes
 73 |     "I",   # isort
 74 |     "B",   # flake8-bugbear
 75 |     "C4",  # flake8-comprehensions
 76 |     "UP",  # pyupgrade
 77 | ]
 78 | ignore = []
 79 | 
 80 | [tool.ruff.format]
 81 | quote-style = "double"
 82 | indent-style = "space"
 83 | skip-magic-trailing-comma = false
 84 | line-ending = "auto"
 85 | 
 86 | 
 87 | [tool.mypy]
 88 | python_version = "3.12"
 89 | strict = true
 90 | ignore_missing_imports = true
 91 | disallow_untyped_defs = true
 92 | disallow_incomplete_defs = true
 93 | check_untyped_defs = true
 94 | disallow_untyped_decorators = true
 95 | no_implicit_optional = true
 96 | warn_redundant_casts = true
 97 | warn_unused_ignores = true
 98 | warn_return_any = true
 99 | warn_unreachable = true
100 | 
101 | # Relaxed rules for tests
102 | [[tool.mypy.overrides]]
103 | module = "tests.*"
104 | disallow_untyped_defs = false
105 | disallow_incomplete_defs = false
106 | disallow_untyped_decorators = false
107 | check_untyped_defs = false
108 | warn_return_any = false
109 | warn_unreachable = false
110 | 
111 | [tool.pytest]
112 | testpaths = ["tests"]
113 | python_files = ["test_*.py"]
114 | python_classes = ["Test*"]
115 | python_functions = ["test_*"]
116 | addopts = "-v --no-header --tb=short"
117 | 
118 | 
119 | [tool.pytest.ini_options]
120 | asyncio_mode = "auto"
121 | asyncio_default_fixture_loop_scope = "module"
122 | 
123 | markers = [
124 |     "unit: marks a test as a unit test",
125 |     "integration: marks a test as an integration test that requires database access"
126 | ]
127 | 
128 | [dependency-groups]
129 | dev = [
130 |     "asyncpg-stubs>=0.30.0",
131 |     "mypy>=1.15.0",
132 |     "pytest>=8.3.4",
133 |     "pytest-asyncio>=0.25.3",
134 |     "pytest-cov>=6.0.0",
135 |     "pytest-mock>=3.14.0",
136 |     "ruff>=0.9.9",
137 |     "sqlfluff>=3.3.1",
138 | ]
139 | 
140 | 
141 | [tool.sqlfluff.core]
142 | dialect = "postgres"
143 | templater = "jinja"
144 | max_line_length = 120
145 | 
146 | [tool.sqlfluff.indentation]
147 | tab_space_size = 4
148 | 
149 | [tool.sqlfluff.rules]
150 | exclude_rules = [
151 |     "L016",  # Line length rules
152 |     "L031",  # Table aliasing
153 |     "L034",  # Column order in GROUP BY
154 |     "L036",  # Select targets should be on a new line
155 |     "L037",  # Ambiguous ordering directions
156 |     "L042",  # Join condition required
157 |     "L047",  # DISTINCT used with parentheses
158 |     "LT02",  # Layout indent
159 |     "LT12",  # Files must end with a single trailing newline
160 |     "LT14",  # Keyword newline
161 |     "AL01",  # Aliasing of table
162 |     "AM05",  # Join clauses should be fully qualified
163 |     "ST09",  # Join order
164 |     "CP03"   # Function name capitalization
165 | ]
166 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/logs/log_manager.py:
--------------------------------------------------------------------------------

```python
  1 | from typing import Any
  2 | 
  3 | from supabase_mcp.logger import logger
  4 | from supabase_mcp.services.database.sql.loader import SQLLoader
  5 | 
  6 | 
  7 | class LogManager:
  8 |     """Manager for retrieving logs from Supabase services."""
  9 | 
 10 |     # Map collection names to table names
 11 |     COLLECTION_TO_TABLE = {
 12 |         "postgres": "postgres_logs",
 13 |         "api_gateway": "edge_logs",
 14 |         "auth": "auth_logs",
 15 |         "postgrest": "postgrest_logs",
 16 |         "pooler": "supavisor_logs",
 17 |         "storage": "storage_logs",
 18 |         "realtime": "realtime_logs",
 19 |         "edge_functions": "function_edge_logs",
 20 |         "cron": "postgres_logs",
 21 |         "pgbouncer": "pgbouncer_logs",
 22 |     }
 23 | 
 24 |     def __init__(self) -> None:
 25 |         """Initialize the LogManager."""
 26 |         self.sql_loader = SQLLoader()
 27 | 
 28 |     def _build_where_clause(
 29 |         self,
 30 |         collection: str,
 31 |         hours_ago: int | None = None,
 32 |         filters: list[dict[str, Any]] | None = None,
 33 |         search: str | None = None,
 34 |     ) -> str:
 35 |         """Build the WHERE clause for a log query.
 36 | 
 37 |         Args:
 38 |             collection: The log collection name
 39 |             hours_ago: Number of hours to look back
 40 |             filters: List of filter objects with field, operator, and value
 41 |             search: Text to search for in event messages
 42 | 
 43 |         Returns:
 44 |             The WHERE clause as a string
 45 |         """
 46 |         logger.debug(
 47 |             f"Building WHERE clause for collection={collection}, hours_ago={hours_ago}, filters={filters}, search={search}"
 48 |         )
 49 | 
 50 |         clauses = []
 51 | 
 52 |         # Get the table name for this collection
 53 |         table_name = self.COLLECTION_TO_TABLE.get(collection, collection)
 54 | 
 55 |         # Add time filter using BigQuery's TIMESTAMP_SUB function
 56 |         if hours_ago:
 57 |             # Qualify the timestamp column with the table name to avoid ambiguity
 58 |             clauses.append(f"{table_name}.timestamp > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL {hours_ago} HOUR)")
 59 | 
 60 |         # Add search filter
 61 |         if search:
 62 |             # Escape single quotes in search text
 63 |             search_escaped = search.replace("'", "''")
 64 |             clauses.append(f"event_message LIKE '%{search_escaped}%'")
 65 | 
 66 |         # Add custom filters
 67 |         if filters:
 68 |             for filter_obj in filters:
 69 |                 field = filter_obj["field"]
 70 |                 operator = filter_obj["operator"]
 71 |                 value = filter_obj["value"]
 72 | 
 73 |                 # Handle string values
 74 |                 if isinstance(value, str) and not value.isdigit():
 75 |                     value = "'{}'".format(value.replace("'", "''"))
 76 | 
 77 |                 clauses.append(f"{field} {operator} {value}")
 78 | 
 79 |         # For cron logs, we already have a WHERE clause in the template
 80 |         if collection == "cron":
 81 |             if clauses:
 82 |                 where_clause = f"AND {' AND '.join(clauses)}"
 83 |             else:
 84 |                 where_clause = ""
 85 |         else:
 86 |             if clauses:
 87 |                 where_clause = f"WHERE {' AND '.join(clauses)}"
 88 |             else:
 89 |                 where_clause = ""
 90 | 
 91 |         logger.debug(f"Built WHERE clause: {where_clause}")
 92 |         return where_clause
 93 | 
 94 |     def build_logs_query(
 95 |         self,
 96 |         collection: str,
 97 |         limit: int = 20,
 98 |         hours_ago: int | None = 1,
 99 |         filters: list[dict[str, Any]] | None = None,
100 |         search: str | None = None,
101 |         custom_query: str | None = None,
102 |     ) -> str:
103 |         """Build a query for retrieving logs from a Supabase service.
104 | 
105 |         Args:
106 |             collection: The log collection to query
107 |             limit: Maximum number of log entries to return
108 |             hours_ago: Retrieve logs from the last N hours
109 |             filters: List of filter objects with field, operator, and value
110 |             search: Text to search for in event messages
111 |             custom_query: Complete custom SQL query to execute
112 | 
113 |         Returns:
114 |             The SQL query string
115 | 
116 |         Raises:
117 |             ValueError: If the collection is unknown
118 |         """
119 |         if custom_query:
120 |             return custom_query
121 | 
122 |         # Build the WHERE clause
123 |         where_clause = self._build_where_clause(
124 |             collection=collection, hours_ago=hours_ago, filters=filters, search=search
125 |         )
126 | 
127 |         # Get the SQL query
128 |         return self.sql_loader.get_logs_query(collection=collection, where_clause=where_clause, limit=limit)
129 | 
```

--------------------------------------------------------------------------------
/tests/services/database/sql/conftest.py:
--------------------------------------------------------------------------------

```python
  1 | import pytest
  2 | 
  3 | 
  4 | @pytest.fixture
  5 | def sample_dql_queries() -> dict[str, str]:
  6 |     """Sample DQL (SELECT) queries for testing."""
  7 |     return {
  8 |         "simple_select": "SELECT * FROM users",
  9 |         "select_with_where": "SELECT id, name FROM users WHERE age > 18",
 10 |         "select_with_join": "SELECT u.id, p.title FROM users u JOIN posts p ON u.id = p.user_id",
 11 |         "select_with_subquery": "SELECT * FROM users WHERE id IN (SELECT user_id FROM posts)",
 12 |         "select_with_cte": "WITH active_users AS (SELECT * FROM users WHERE active = true) SELECT * FROM active_users",
 13 |     }
 14 | 
 15 | 
 16 | @pytest.fixture
 17 | def sample_dml_queries() -> dict[str, str]:
 18 |     """Sample DML (INSERT, UPDATE, DELETE) queries for testing."""
 19 |     return {
 20 |         "simple_insert": "INSERT INTO users (name, email) VALUES ('John', 'john@example.com')",
 21 |         "insert_with_select": "INSERT INTO user_backup SELECT * FROM users",
 22 |         "simple_update": "UPDATE users SET active = true WHERE id = 1",
 23 |         "simple_delete": "DELETE FROM users WHERE id = 1",
 24 |         "merge_statement": "MERGE INTO users u USING temp_users t ON (u.id = t.id) WHEN MATCHED THEN UPDATE SET name = t.name",
 25 |     }
 26 | 
 27 | 
 28 | @pytest.fixture
 29 | def sample_ddl_queries() -> dict[str, str]:
 30 |     """Sample DDL (CREATE, ALTER, DROP) queries for testing."""
 31 |     return {
 32 |         "create_table": "CREATE TABLE users (id SERIAL PRIMARY KEY, name TEXT, email TEXT UNIQUE)",
 33 |         "alter_table": "ALTER TABLE users ADD COLUMN active BOOLEAN DEFAULT false",
 34 |         "drop_table": "DROP TABLE users",
 35 |         "truncate_table": "TRUNCATE TABLE users",
 36 |         "create_index": "CREATE INDEX idx_user_email ON users (email)",
 37 |     }
 38 | 
 39 | 
 40 | @pytest.fixture
 41 | def sample_dcl_queries() -> dict[str, str]:
 42 |     """Sample DCL (GRANT, REVOKE) queries for testing."""
 43 |     return {
 44 |         "grant_select": "GRANT SELECT ON users TO read_role",
 45 |         "grant_all": "GRANT ALL PRIVILEGES ON users TO admin_role",
 46 |         "revoke_select": "REVOKE SELECT ON users FROM read_role",
 47 |         "create_role": "CREATE ROLE read_role",
 48 |         "drop_role": "DROP ROLE read_role",
 49 |     }
 50 | 
 51 | 
 52 | @pytest.fixture
 53 | def sample_tcl_queries() -> dict[str, str]:
 54 |     """Sample TCL (BEGIN, COMMIT, ROLLBACK) queries for testing."""
 55 |     return {
 56 |         "begin_transaction": "BEGIN",
 57 |         "commit_transaction": "COMMIT",
 58 |         "rollback_transaction": "ROLLBACK",
 59 |         "savepoint": "SAVEPOINT my_savepoint",
 60 |         "mixed_case_transaction": "Begin Transaction",
 61 |     }
 62 | 
 63 | 
 64 | @pytest.fixture
 65 | def sample_postgres_specific_queries() -> dict[str, str]:
 66 |     """Sample PostgreSQL-specific queries for testing."""
 67 |     return {
 68 |         "vacuum": "VACUUM users",
 69 |         "analyze": "ANALYZE users",
 70 |         "copy_to": "COPY users TO '/tmp/users.csv' WITH CSV",
 71 |         "copy_from": "COPY users FROM '/tmp/users.csv' WITH CSV",
 72 |         "explain": "EXPLAIN ANALYZE SELECT * FROM users",
 73 |     }
 74 | 
 75 | 
 76 | @pytest.fixture
 77 | def sample_invalid_queries() -> dict[str, str]:
 78 |     """Sample invalid SQL queries for testing error handling."""
 79 |     return {
 80 |         "syntax_error": "SELECT * FORM users",
 81 |         "missing_parenthesis": "SELECT * FROM users WHERE id IN (1, 2, 3",
 82 |         "invalid_column": "SELECT nonexistent_column FROM users",
 83 |         "incomplete_statement": "SELECT * FROM",
 84 |         "invalid_table": "SELECT * FROM nonexistent_table",
 85 |     }
 86 | 
 87 | 
 88 | @pytest.fixture
 89 | def sample_multiple_statements() -> dict[str, str]:
 90 |     """Sample SQL with multiple statements for testing batch processing."""
 91 |     return {
 92 |         "multiple_safe": "SELECT * FROM users; SELECT * FROM posts;",
 93 |         "safe_and_write": "SELECT * FROM users; INSERT INTO logs (message) VALUES ('queried users');",
 94 |         "write_and_destructive": "INSERT INTO logs (message) VALUES ('dropping users'); DROP TABLE users;",
 95 |         "with_transaction": "BEGIN; INSERT INTO users (name) VALUES ('John'); COMMIT;",
 96 |         "mixed_categories": "SELECT * FROM users; UPDATE users SET active = true; DROP TABLE old_users;",
 97 |     }
 98 | 
 99 | 
100 | @pytest.fixture
101 | def sample_edge_cases() -> dict[str, str]:
102 |     """Sample edge cases for testing."""
103 |     return {
104 |         "with_comments": "SELECT * FROM users; -- This is a comment\n/* Multi-line\ncomment */",
105 |         "quoted_identifiers": 'SELECT * FROM "user table" WHERE "first name" = \'John\'',
106 |         "special_characters": "SELECT * FROM users WHERE name LIKE 'O''Brien%'",
107 |         "schema_qualified": "SELECT * FROM public.users",
108 |         "with_dollar_quotes": "SELECT $$This is a dollar-quoted string with 'quotes'$$ AS message",
109 |     }
110 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/services/database/sql/loader.py:
--------------------------------------------------------------------------------

```python
  1 | from pathlib import Path
  2 | 
  3 | from supabase_mcp.logger import logger
  4 | 
  5 | 
  6 | class SQLLoader:
  7 |     """Responsible for loading SQL queries from files."""
  8 | 
  9 |     # Path to SQL files directory
 10 |     SQL_DIR = Path(__file__).parent / "queries"
 11 | 
 12 |     @classmethod
 13 |     def load_sql(cls, filename: str) -> str:
 14 |         """
 15 |         Load SQL from a file in the sql directory.
 16 | 
 17 |         Args:
 18 |             filename: Name of the SQL file (with or without .sql extension)
 19 | 
 20 |         Returns:
 21 |             str: The SQL query from the file
 22 | 
 23 |         Raises:
 24 |             FileNotFoundError: If the SQL file doesn't exist
 25 |         """
 26 |         # Ensure the filename has .sql extension
 27 |         if not filename.endswith(".sql"):
 28 |             filename = f"{filename}.sql"
 29 | 
 30 |         file_path = cls.SQL_DIR / filename
 31 | 
 32 |         if not file_path.exists():
 33 |             logger.error(f"SQL file not found: {file_path}")
 34 |             raise FileNotFoundError(f"SQL file not found: {file_path}")
 35 | 
 36 |         with open(file_path) as f:
 37 |             sql = f.read().strip()
 38 |             logger.debug(f"Loaded SQL file: {filename} ({len(sql)} chars)")
 39 |             return sql
 40 | 
 41 |     @classmethod
 42 |     def get_schemas_query(cls) -> str:
 43 |         """Get a query to list all schemas."""
 44 |         return cls.load_sql("get_schemas")
 45 | 
 46 |     @classmethod
 47 |     def get_tables_query(cls, schema_name: str) -> str:
 48 |         """Get a query to list all tables in a schema."""
 49 |         query = cls.load_sql("get_tables")
 50 |         return query.replace("{schema_name}", schema_name)
 51 | 
 52 |     @classmethod
 53 |     def get_table_schema_query(cls, schema_name: str, table: str) -> str:
 54 |         """Get a query to get the schema of a table."""
 55 |         query = cls.load_sql("get_table_schema")
 56 |         return query.replace("{schema_name}", schema_name).replace("{table}", table)
 57 | 
 58 |     @classmethod
 59 |     def get_migrations_query(
 60 |         cls, limit: int = 50, offset: int = 0, name_pattern: str = "", include_full_queries: bool = False
 61 |     ) -> str:
 62 |         """Get a query to list migrations."""
 63 |         query = cls.load_sql("get_migrations")
 64 |         return (
 65 |             query.replace("{limit}", str(limit))
 66 |             .replace("{offset}", str(offset))
 67 |             .replace("{name_pattern}", name_pattern)
 68 |             .replace("{include_full_queries}", str(include_full_queries).lower())
 69 |         )
 70 | 
 71 |     @classmethod
 72 |     def get_init_migrations_query(cls) -> str:
 73 |         """Get a query to initialize the migrations schema and table."""
 74 |         return cls.load_sql("init_migrations")
 75 | 
 76 |     @classmethod
 77 |     def get_create_migration_query(cls, version: str, name: str, statements: str) -> str:
 78 |         """Get a query to create a migration.
 79 | 
 80 |         Args:
 81 |             version: The migration version (timestamp)
 82 |             name: The migration name
 83 |             statements: The SQL statements (escaped)
 84 | 
 85 |         Returns:
 86 |             str: The SQL query to create a migration
 87 |         """
 88 |         query = cls.load_sql("create_migration")
 89 |         return query.replace("{version}", version).replace("{name}", name).replace("{statements}", statements)
 90 | 
 91 |     @classmethod
 92 |     def get_logs_query(cls, collection: str, where_clause: str = "", limit: int = 20) -> str:
 93 |         """Get a query to retrieve logs from a specific collection.
 94 | 
 95 |         Args:
 96 |             collection: The log collection name (e.g., postgres, api_gateway, auth)
 97 |             where_clause: The WHERE clause to filter logs
 98 |             limit: Maximum number of log entries to return
 99 | 
100 |         Returns:
101 |             str: The SQL query to retrieve logs
102 | 
103 |         Raises:
104 |             FileNotFoundError: If the log collection SQL file doesn't exist
105 |         """
106 |         # Map collection names to SQL files
107 |         collection_map = {
108 |             "postgres": "logs/postgres_logs",
109 |             "api_gateway": "logs/edge_logs",
110 |             "auth": "logs/auth_logs",
111 |             "postgrest": "logs/postgrest_logs",
112 |             "pooler": "logs/supavisor_logs",
113 |             "storage": "logs/storage_logs",
114 |             "realtime": "logs/realtime_logs",
115 |             "edge_functions": "logs/function_edge_logs",
116 |             "cron": "logs/cron_logs",
117 |             "pgbouncer": "logs/pgbouncer_logs",
118 |         }
119 | 
120 |         # Get the SQL file path
121 |         sql_file = collection_map.get(collection)
122 |         if not sql_file:
123 |             raise ValueError(f"Unknown log collection: {collection}")
124 | 
125 |         # Load the SQL template
126 |         query = cls.load_sql(sql_file)
127 | 
128 |         # Handle special case for cron logs
129 |         if collection == "cron":
130 |             return query.replace("{and_where_clause}", where_clause).replace("{limit}", str(limit))
131 |         else:
132 |             return query.replace("{where_clause}", where_clause).replace("{limit}", str(limit))
133 | 
```
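
`SQLLoader` fills templates with chained `str.replace` calls rather than `str.format`, so stray braces elsewhere in the SQL are left untouched. The same technique, generalized into a tiny helper — `TEMPLATE` and `render` are hypothetical illustrations, not part of the loader:

```python
# Hypothetical template; the real ones live in supabase_mcp/services/database/sql/queries/
TEMPLATE = "SELECT * FROM {schema_name}.{table} LIMIT {limit};"


def render(template: str, **params: object) -> str:
    """Fill each {name} placeholder with its value, one replace at a time."""
    for name, value in params.items():
        template = template.replace("{" + name + "}", str(value))
    return template


query = render(TEMPLATE, schema_name="public", table="users", limit=20)
assert query == "SELECT * FROM public.users LIMIT 20;"
```

Note that, as in `SQLLoader` itself, values are substituted verbatim — callers are responsible for escaping, which is why the safety layer vets these queries before execution.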

--------------------------------------------------------------------------------
/supabase_mcp/tools/descriptions/api_tools.yaml:
--------------------------------------------------------------------------------

```yaml
  1 | # API tools descriptions
  2 | 
  3 | send_management_api_request: |
  4 |   Execute a Supabase Management API request.
  5 | 
  6 |   This tool allows you to make direct calls to the Supabase Management API, which provides
  7 |   programmatic access to manage your Supabase project settings, resources, and configurations.
  8 | 
  9 |   REQUEST FORMATTING:
 10 |   - Use paths exactly as defined in the API specification
 11 |   - The {ref} parameter will be automatically injected from settings
 12 |   - Format request bodies according to the API specification
 13 | 
 14 |   PARAMETERS:
 15 |   - method: HTTP method (GET, POST, PUT, PATCH, DELETE)
 16 |   - path: API path (e.g. /v1/projects/{ref}/functions)
 17 |   - path_params: Path parameters as dict (e.g. {"function_slug": "my-function"}) - use empty dict {} if not needed
 18 |   - request_params: Query parameters as dict (e.g. {"key": "value"}) - use empty dict {} if not needed
 19 |   - request_body: Request body as dict (e.g. {"name": "test"}) - use empty dict {} if not needed
 20 | 
 21 |   PATH PARAMETERS HANDLING:
 22 |   - The {ref} placeholder (project reference) is automatically injected - you don't need to provide it
 23 |   - All other path placeholders must be provided in the path_params dictionary
 24 |   - Common placeholders include:
 25 |     * {function_slug}: For Edge Functions operations
 26 |     * {id}: For operations on specific resources (API keys, auth providers, etc.)
 27 |     * {slug}: For organization operations
 28 |     * {branch_id}: For database branch operations
 29 |     * {provider_id}: For SSO provider operations
 30 |     * {tpa_id}: For third-party auth operations
 31 | 
 32 |   EXAMPLES:
 33 |   1. GET request with path and query parameters:
 34 |      method: "GET"
 35 |      path: "/v1/projects/{ref}/functions/{function_slug}"
 36 |      path_params: {"function_slug": "my-function"}
 37 |      request_params: {"version": "1"}
 38 |      request_body: {}
 39 | 
 40 |   2. POST request with body:
 41 |      method: "POST"
 42 |      path: "/v1/projects/{ref}/functions"
 43 |      path_params: {}
 44 |      request_params: {}
 45 |      request_body: {"name": "test-function", "slug": "test-function"}
 46 | 
 47 |   SAFETY SYSTEM:
 48 |   API operations are categorized by risk level:
 49 |   - LOW RISK: Read operations (GET) - allowed in SAFE mode
 50 |   - MEDIUM/HIGH RISK: Write operations (POST, PUT, PATCH, DELETE) - require UNSAFE mode
 51 |   - EXTREME RISK: Destructive operations - require UNSAFE mode and confirmation
 52 |   - BLOCKED: Some operations are completely blocked for safety reasons
 53 | 
 54 |   SAFETY CONSIDERATIONS:
 55 |   - By default, the API client starts in SAFE mode, allowing only read operations
 56 |   - To perform write operations, first use live_dangerously(service="api", enable=True)
 57 |   - High-risk operations will be rejected with a confirmation ID
 58 |   - Use confirm_destructive_operation with the provided ID after reviewing risks
 59 |   - Some operations may be completely blocked for safety reasons
 60 | 
 61 |   For a complete list of available API endpoints and their parameters, use the get_management_api_spec tool.
 62 |   For details on safety rules, use the get_management_api_safety_rules tool.
 63 | 
 64 | get_management_api_spec: |
 65 |   Get the complete Supabase Management API specification.
 66 | 
 67 |   Returns the full OpenAPI specification for the Supabase Management API, including:
 68 |   - All available endpoints and operations
 69 |   - Required and optional parameters for each operation
 70 |   - Request and response schemas
 71 |   - Authentication requirements
 72 |   - Safety information for each operation
 73 | 
 74 |   This tool can be used in four different ways:
 75 |   1. Without parameters: Returns all domains (default)
 76 |   2. With path and method: Returns the full specification for a specific API endpoint
 77 |   3. With domain only: Returns all paths and methods within that domain
 78 |   4. With all_paths=True: Returns all paths and methods
 79 | 
 80 |   Parameters:
 81 |   - params: Dictionary containing optional parameters:
 82 |       - path: Optional API path (e.g., "/v1/projects/{ref}/functions")
 83 |       - method: Optional HTTP method (e.g., "GET", "POST")
 84 |       - domain: Optional domain/tag name (e.g., "Auth", "Storage")
 85 |       - all_paths: Optional boolean, if True returns all paths and methods
 86 | 
 87 |   Available domains:
 88 |   - Analytics: Analytics-related endpoints
 89 |   - Auth: Authentication and authorization endpoints
 90 |   - Database: Database management endpoints
 91 |   - Domains: Custom domain configuration endpoints
 92 |   - Edge Functions: Serverless function management endpoints
 93 |   - Environments: Environment configuration endpoints
 94 |   - OAuth: OAuth integration endpoints
 95 |   - Organizations: Organization management endpoints
 96 |   - Projects: Project management endpoints
 97 |   - Rest: RESTful API endpoints
 98 |   - Secrets: Secret management endpoints
 99 |   - Storage: Storage management endpoints
100 | 
101 |   This specification is useful for understanding:
102 |   - What operations are available through the Management API
103 |   - How to properly format requests for each endpoint
104 |   - Which operations require unsafe mode
105 |   - What data structures to expect in responses
106 | 
107 |   SAFETY: This is a low-risk read operation that can be executed in SAFE mode.
108 | 
```
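The path-parameter handling described above — `{ref}` injected automatically, all other placeholders supplied via `path_params` — can be sketched as follows. The function name and signature are illustrative, not the server's actual API.

```python
# Hedged sketch of Management API path assembly, assuming the behavior
# documented above: {ref} comes from settings, the rest from path_params.
def build_path(path: str, project_ref: str, path_params: dict[str, str]) -> str:
    resolved = path.replace("{ref}", project_ref)
    for name, value in path_params.items():
        resolved = resolved.replace("{" + name + "}", value)
    return resolved


print(build_path(
    "/v1/projects/{ref}/functions/{function_slug}",
    project_ref="abcdefghij1234567890",
    path_params={"function_slug": "my-function"},
))
# /v1/projects/abcdefghij1234567890/functions/my-function
```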

--------------------------------------------------------------------------------
/tests/test_settings.py:
--------------------------------------------------------------------------------

```python
  1 | from unittest.mock import patch
  2 | 
  3 | import pytest
  4 | from pydantic import ValidationError
  5 | 
  6 | from supabase_mcp.settings import SUPPORTED_REGIONS, Settings
  7 | 
  8 | 
  9 | @pytest.fixture(autouse=True)
 10 | def reset_settings_singleton() -> None:
 11 |     """Reset the Settings singleton before each test"""
 12 |     # Clear singleton instance if it exists
 13 |     if hasattr(Settings, "_instance"):
 14 |         delattr(Settings, "_instance")
 15 |     yield
 16 |     # Clean up after test
 17 |     if hasattr(Settings, "_instance"):
 18 |         delattr(Settings, "_instance")
 19 | 
 20 | 
 21 | class TestSettings:
 22 |     """Integration tests for Settings."""
 23 | 
 24 |     @pytest.mark.integration
 25 |     def test_settings_default_values(self, clean_environment: None) -> None:
 26 |         """Test default values (no config file, no env vars)"""
 27 |         settings = Settings.with_config()  # No config file
 28 |         assert settings.supabase_project_ref == "127.0.0.1:54322"
 29 |         assert settings.supabase_db_password == "postgres"
 30 |         assert settings.supabase_region == "us-east-1"
 31 |         assert settings.supabase_access_token is None
 32 |         assert settings.supabase_service_role_key is None
 33 | 
 34 |     @pytest.mark.integration
 35 |     def test_settings_from_env_test(self, clean_environment: None) -> None:
 36 |         """Test loading from .env.test"""
 37 |         import os
 38 | 
 39 |         settings = Settings.with_config(".env.test")
 40 | 
 41 |         # In CI, we expect default values since .env.test might not be properly set up
 42 |         if os.environ.get("CI") == "true":
 43 |             assert settings.supabase_project_ref == "127.0.0.1:54322"  # Default value in CI
 44 |             assert settings.supabase_db_password == "postgres"  # Default value in CI
 45 |         else:
 46 |             # In local dev, we expect .env.test to override defaults
 47 |             assert settings.supabase_project_ref != "127.0.0.1:54322"  # Should be overridden by .env.test
 48 |             assert settings.supabase_db_password != "postgres"  # Should be overridden by .env.test
 49 | 
 50 |         # Check that the values are not empty
 51 |         assert settings.supabase_project_ref, "Project ref should not be empty"
 52 |         assert settings.supabase_db_password, "DB password should not be empty"
 53 | 
 54 |     @pytest.mark.integration
 55 |     def test_settings_from_env_vars(self, clean_environment: None) -> None:
 56 |         """Test env vars take precedence over config file"""
 57 |         env_values = {
 58 |             "SUPABASE_PROJECT_REF": "abcdefghij1234567890",  # Valid 20-char project ref
 59 |             "SUPABASE_DB_PASSWORD": "env-password",
 60 |         }
 61 |         with patch.dict("os.environ", env_values, clear=False):
 62 |             settings = Settings.with_config(".env.test")  # Even with config file
 63 |             assert settings.supabase_project_ref == "abcdefghij1234567890"
 64 |             assert settings.supabase_db_password == "env-password"
 65 | 
 66 |     @pytest.mark.integration
 67 |     def test_settings_integration_fixture(self, settings_integration: Settings) -> None:
 68 |         """Test the settings_integration fixture provides valid settings."""
 69 |         # The settings_integration fixture should load from .env.test or environment variables
 70 |         assert settings_integration.supabase_project_ref, "Project ref should not be empty"
 71 |         assert settings_integration.supabase_db_password, "DB password should not be empty"
 72 |         assert settings_integration.supabase_region, "Region should not be empty"
 73 | 
 74 |     @pytest.mark.integration
 75 |     def test_settings_region_validation(self) -> None:
 76 |         """Test region validation."""
 77 |         # Test default region
 78 |         settings = Settings()
 79 |         assert settings.supabase_region == "us-east-1"
 80 | 
 81 |         # Test valid region from environment
 82 |         env_values = {"SUPABASE_REGION": "ap-southeast-1"}
 83 |         with patch.dict("os.environ", env_values, clear=True):
 84 |             settings = Settings()
 85 |             assert settings.supabase_region == "ap-southeast-1"
 86 | 
 87 |         # Test invalid region
 88 |         with pytest.raises(ValidationError) as exc_info:
 89 |             env_values = {"SUPABASE_REGION": "invalid-region"}
 90 |             with patch.dict("os.environ", env_values, clear=True):
 91 |                 Settings()
 92 |         assert "Region 'invalid-region' is not supported" in str(exc_info.value)
 93 | 
 94 |     @pytest.mark.integration
 95 |     def test_supported_regions(self) -> None:
 96 |         """Test that all supported regions are valid."""
 97 |         for region in SUPPORTED_REGIONS.__args__:
 98 |             env_values = {"SUPABASE_REGION": region}
 99 |             with patch.dict("os.environ", env_values, clear=True):
100 |                 settings = Settings()
101 |                 assert settings.supabase_region == region
102 | 
103 |     @pytest.mark.integration
104 |     def test_settings_access_token_and_service_role(self) -> None:
105 |         """Test access token and service role key settings."""
106 |         # Test with environment variables
107 |         env_values = {
108 |             "SUPABASE_ACCESS_TOKEN": "test-access-token",
109 |             "SUPABASE_SERVICE_ROLE_KEY": "test-service-role-key",
110 |         }
111 |         with patch.dict("os.environ", env_values, clear=True):
112 |             settings = Settings()
113 |             assert settings.supabase_access_token == "test-access-token"
114 |             assert settings.supabase_service_role_key == "test-service-role-key"
115 | 
116 |         # Test defaults (should be None)
117 |         with patch.dict("os.environ", {}, clear=True):
118 |             settings = Settings()
119 |             assert settings.supabase_access_token is None
120 |             assert settings.supabase_service_role_key is None
121 | 
```

--------------------------------------------------------------------------------
/supabase_mcp/tools/descriptions/database_tools.yaml:
--------------------------------------------------------------------------------

```yaml
  1 | # Database tool descriptions
  2 | 
  3 | get_db_schemas: |
  4 |   List all database schemas with their sizes and table counts.
  5 | 
  6 |   Returns a comprehensive overview of all schemas in the database, including:
  7 |   - Schema names
  8 |   - Total size of each schema
  9 |   - Number of tables in each schema
 10 |   - Owner information
 11 | 
 12 |   This is useful for getting a high-level understanding of the database structure.
 13 | 
 14 |   SAFETY: This is a low-risk read operation that can be executed in SAFE mode.
 15 | 
 16 | get_tables: |
 17 |   List all tables, foreign tables, and views in a schema with their sizes, row counts, and metadata.
 18 | 
 19 |   Provides detailed information about all database objects in the specified schema:
 20 |   - Table/view names
 21 |   - Object types (table, view, foreign table)
 22 |   - Row counts
 23 |   - Size on disk
 24 |   - Column counts
 25 |   - Index information
 26 |   - Last vacuum/analyze times
 27 | 
 28 |   Parameters:
 29 |   - schema_name: Name of the schema to inspect (e.g., 'public', 'auth', etc.)
 30 | 
 31 |   SAFETY: This is a low-risk read operation that can be executed in SAFE mode.
 32 | 
 33 | get_table_schema: |
 34 |   Get detailed table structure including columns, keys, and relationships.
 35 | 
 36 |   Returns comprehensive information about a specific table's structure:
 37 |   - Column definitions (names, types, constraints)
 38 |   - Primary key information
 39 |   - Foreign key relationships
 40 |   - Indexes
 41 |   - Constraints
 42 |   - Triggers
 43 | 
 44 |   Parameters:
 45 |   - schema_name: Name of the schema (e.g., 'public', 'auth')
 46 |   - table: Name of the table to inspect
 47 | 
 48 |   SAFETY: This is a low-risk read operation that can be executed in SAFE mode.
 49 | 
 50 | execute_postgresql: |
 51 |   Execute PostgreSQL statements against your Supabase database.
 52 | 
 53 |   IMPORTANT: All SQL statements must end with a semicolon (;).
 54 | 
 55 |   OPERATION TYPES AND REQUIREMENTS:
 56 |   1. READ Operations (SELECT, EXPLAIN, etc.):
 57 |      - Can be executed directly without special requirements
 58 |      - Example: SELECT * FROM public.users LIMIT 10;
 59 | 
 60 |   2. WRITE Operations (INSERT, UPDATE, DELETE):
 61 |      - Require UNSAFE mode (use live_dangerously('database', True) first)
 62 |      - Example:
 63 |        INSERT INTO public.users (email) VALUES ('user@example.com');
 64 | 
 65 |   3. SCHEMA Operations (CREATE, ALTER, DROP):
 66 |      - Require UNSAFE mode (use live_dangerously('database', True) first)
 67 |      - Destructive operations (DROP, TRUNCATE) require additional confirmation
 68 |      - Example:
 69 |        CREATE TABLE public.test_table (id SERIAL PRIMARY KEY, name TEXT);
 70 | 
 71 |   MIGRATION HANDLING:
 72 |   All queries that modify the database are automatically version controlled by the server. You can provide an optional migration name if you want to name the migration.
 73 |    - Follow the format verb_noun_detail. Be descriptive and concise.
 74 |    - Examples:
 75 |      - create_users_table
 76 |      - add_email_to_profiles
 77 |      - enable_rls_on_users
 78 |    - If you don't provide a migration name, the server will generate one based on the SQL statement
 79 |    - The system will sanitize your provided name to ensure compatibility with database systems
 80 |    - Migration names are prefixed with a timestamp in the format YYYYMMDDHHMMSS
 81 | 
 82 |   SAFETY SYSTEM:
 83 |   Operations are categorized by risk level:
 84 |   - LOW RISK: Read operations (SELECT, EXPLAIN) - allowed in SAFE mode
 85 |   - MEDIUM RISK: Write operations (INSERT, UPDATE, DELETE) - require UNSAFE mode
 86 |   - HIGH RISK: Schema operations (CREATE, ALTER) - require UNSAFE mode
 87 |   - EXTREME RISK: Destructive operations (DROP, TRUNCATE) - require UNSAFE mode and confirmation
 88 | 
 89 |   TRANSACTION HANDLING:
 90 |   - DO NOT use transaction control statements (BEGIN, COMMIT, ROLLBACK)
 91 |   - The database client automatically wraps queries in transactions
 92 |   - The SQL validator will reject queries containing transaction control statements
 93 |   - This ensures atomicity and provides rollback capability for data modifications
 94 | 
 95 |   MULTIPLE STATEMENTS:
 96 |   - You can send multiple SQL statements in a single query
 97 |   - Each statement will be executed in order within the same transaction
 98 |   - Example:
 99 |     CREATE TABLE public.test_table (id SERIAL PRIMARY KEY, name TEXT);
100 |     INSERT INTO public.test_table (name) VALUES ('test');
101 | 
102 |   CONFIRMATION FLOW FOR HIGH-RISK OPERATIONS:
103 |   - High-risk operations (DROP TABLE, TRUNCATE, etc.) will be rejected with a confirmation ID
104 |   - The error message will explain what happened and provide a confirmation ID
105 |   - Review the risks with the user before proceeding
106 |   - Use the confirm_destructive_operation tool with the provided ID to execute the operation
107 | 
108 |   IMPORTANT GUIDELINES:
109 |   - The database client starts in SAFE mode by default for safety
110 |   - Only enable UNSAFE mode when you need to modify data or schema
111 |   - Never mix READ and WRITE operations in the same transaction
112 |   - For destructive operations, be prepared to confirm with the confirm_destructive_operation tool
113 | 
114 |   WHEN TO USE OTHER TOOLS INSTEAD:
115 |   - For Auth operations (users, authentication, etc.): Use call_auth_admin_method instead of direct SQL
116 |     The Auth Admin SDK provides safer, validated methods for user management
117 |   - For project configuration, functions, storage, etc.: Use send_management_api_request
118 |     The Management API handles Supabase platform features that aren't directly in the database
119 | 
120 |   Note: This tool operates on the PostgreSQL database only. API operations use separate safety controls.
121 | 
122 | retrieve_migrations: |
123 |   Retrieve a list of all migrations from your Supabase database.
124 | 
125 |   Returns a list of migrations with the following information:
126 |   - Version (timestamp)
127 |   - Name
128 |   - SQL statements (if requested)
129 |   - Statement count
130 |   - Version type (named or numbered)
131 | 
132 |   Parameters:
133 |   - limit: Maximum number of migrations to return (default: 50, max: 100)
134 |   - offset: Number of migrations to skip for pagination (default: 0)
135 |   - name_pattern: Optional pattern to filter migrations by name. Uses SQL ILIKE pattern matching (case-insensitive).
136 |     The pattern is automatically wrapped with '%' wildcards, so "users" will match "create_users_table",
137 |     "add_email_to_users", etc. To search for an exact match, use the complete name.
138 |   - include_full_queries: Whether to include the full SQL statements in the result (default: false)
139 | 
140 |   SAFETY: This is a low-risk read operation that can be executed in SAFE mode.
141 | 
```

--------------------------------------------------------------------------------
/tests/test_main.py:
--------------------------------------------------------------------------------

```python
  1 | import asyncio
  2 | from unittest.mock import patch
  3 | 
  4 | import pytest
  5 | 
  6 | from supabase_mcp.core.container import ServicesContainer
  7 | from supabase_mcp.logger import logger
  8 | from supabase_mcp.main import run_inspector, run_server
  9 | from supabase_mcp.services.safety.models import ClientType
 10 | from supabase_mcp.tools.manager import ToolName
 11 | 
 12 | # === UNIT TESTS ===
 13 | 
 14 | 
 15 | class TestMain:
 16 |     """Tests for the main application functionality."""
 17 | 
 18 |     @pytest.mark.unit
 19 |     def test_mcp_server_initializes(self, container_integration: ServicesContainer):
 20 |         """Test that the MCP server initializes correctly."""
 21 |         # Verify server name
 22 |         mcp = container_integration.mcp_server
 23 |         assert mcp.name == "supabase"
 24 | 
 25 |         # Verify MCP server is created but not yet initialized with tools
 26 |         tools = asyncio.run(mcp.list_tools())
 27 |         logger.info(f"Found {len(tools)} MCP tools registered in basic container")
 28 | 
 29 |         # At this point, no tools should be registered yet
 30 |         assert len(tools) == 0, f"Expected 0 tools in basic container, but got {len(tools)}"
 31 | 
 32 |     @pytest.mark.unit
 33 |     def test_services_container_initialization(
 34 |         self,
 35 |         initialized_container_integration: ServicesContainer,
 36 |     ):
 37 |         """Test that the services container is correctly initialized."""
 38 |         # Verify container has all required services
 39 |         container = initialized_container_integration
 40 |         assert container.postgres_client is not None
 41 |         assert container.api_client is not None
 42 |         assert container.sdk_client is not None
 43 |         assert container.api_manager is not None
 44 |         assert container.safety_manager is not None
 45 |         assert container.query_manager is not None
 46 |         assert container.tool_manager is not None
 47 | 
 48 |         # Verify the container is fully initialized
 49 |         # Check that safety manager has been initialized by verifying it has configs registered
 50 |         safety_manager = container.safety_manager
 51 |         assert safety_manager.get_safety_mode(ClientType.DATABASE) is not None
 52 |         assert safety_manager.get_safety_mode(ClientType.API) is not None
 53 | 
 54 |     @pytest.mark.unit
 55 |     def test_tool_registration(self, tools_registry_integration: ServicesContainer):
 56 |         """Test that tools are registered correctly using ToolManager's tool names."""
 57 | 
 58 |         # Get the tool manager from the container
 59 |         tool_manager = tools_registry_integration.tool_manager
 60 |         assert tool_manager is not None, "Tool manager should be initialized"
 61 | 
 62 |         # Get the MCP server from the container
 63 |         mcp = tools_registry_integration.mcp_server
 64 | 
 65 |         # Get expected tools from ToolName enum
 66 |         expected_tools = [
 67 |             ToolName.GET_SCHEMAS,
 68 |             ToolName.GET_TABLES,
 69 |             ToolName.GET_TABLE_SCHEMA,
 70 |             ToolName.EXECUTE_POSTGRESQL,
 71 |             ToolName.CONFIRM_DESTRUCTIVE_OPERATION,
 72 |             ToolName.RETRIEVE_MIGRATIONS,
 73 |             ToolName.LIVE_DANGEROUSLY,
 74 |             ToolName.SEND_MANAGEMENT_API_REQUEST,
 75 |             ToolName.GET_MANAGEMENT_API_SPEC,
 76 |             ToolName.GET_AUTH_ADMIN_METHODS_SPEC,
 77 |             ToolName.CALL_AUTH_ADMIN_METHOD,
 78 |             ToolName.RETRIEVE_LOGS,
 79 |         ]
 80 | 
 81 |         # Verify tools are registered in MCP
 82 |         registered_tools = asyncio.run(mcp.list_tools())
 83 |         registered_tool_names = {tool.name for tool in registered_tools}
 84 | 
 85 |         # We should have exactly 12 tools (all the tools defined in ToolName enum)
 86 |         assert len(registered_tools) == 12, f"Expected 12 tools, but got {len(registered_tools)}"
 87 | 
 88 |         # Log the actual number of tools for reference
 89 |         logger.info(f"Found {len(registered_tools)} MCP tools registered")
 90 | 
 91 |         # Verify each tool has proper MCP protocol structure
 92 |         for tool in registered_tools:
 93 |             assert tool.name, "Tool must have a name"
 94 |             assert tool.description, "Tool must have a description"
 95 |             assert tool.inputSchema, "Tool must have an input schema"
 96 | 
 97 |         # Check that each expected tool is registered by its string value
 98 |         for tool_name in expected_tools:
 99 |             # Convert enum to string value (e.g., 'get_schemas' instead of ToolName.GET_SCHEMAS)
100 |             tool_str_value = str(tool_name.value)
101 |             assert tool_str_value in registered_tool_names, f"Tool {tool_name} not registered"
102 | 
103 |         # Verify we have tools for core functionality categories
104 |         # Instead of checking specific names, check for categories of functionality
105 |         tool_names = {tool.name for tool in registered_tools}
106 | 
107 |         # Log all available tools for debugging
108 |         tool_list = ", ".join(sorted(tool_names))
109 |         logger.info(f"Available MCP tools: {tool_list}")
110 | 
111 |     @pytest.mark.unit
112 |     def test_run_server_starts(self):
113 |         """Test that run_server starts the server."""
114 |         # Patch the global mcp instance and its run method
115 |         with patch("supabase_mcp.main.mcp") as mock_mcp:
116 |             # Call run_server which should use the global mcp instance
117 |             run_server()
118 |             mock_mcp.run.assert_called_once()
119 | 
120 |     @pytest.mark.unit
121 |     def test_inspector_mode(self):
122 |         """Test that inspector mode initializes correctly"""
123 |         # This test is fine as is since it's testing the global function
124 |         # and mocking an external dependency
125 |         with patch("mcp.cli.cli.dev") as mock_dev:
126 |             # Patch __file__ in the main module to match what we expect
127 |             with patch("supabase_mcp.main.__file__", __file__):
128 |                 run_inspector()
129 |                 mock_dev.assert_called_once_with(__file__)
130 | 
131 |     @pytest.mark.unit
132 |     def test_server_command_exists(self):
133 |         """Test that the server command exists and is executable"""
134 |         import os
135 |         import shutil
136 | 
137 |         # Skip this test in CI environments
138 |         if os.environ.get("CI") == "true":
139 |             pytest.skip("Skipping server command test in CI environment")
140 | 
141 |         # Check if the command exists in PATH
142 |         server_path = shutil.which("supabase-mcp-server")
143 |         assert server_path is not None, "supabase-mcp-server command not found in PATH"
144 | 
145 |         # Check if the file is executable
146 |         assert os.access(server_path, os.X_OK), "supabase-mcp-server is not executable"
147 | 
```

--------------------------------------------------------------------------------
/CHANGELOG.MD:
--------------------------------------------------------------------------------

```markdown
  1 | # Changelog
  2 | 
  3 | All notable changes to this project will be documented in this file.
  4 | 
  5 | The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
  6 | 
  7 | 
  8 | ## [0.3.12] - 2025-03-12
  9 | ### Added
 10 | - Implemented a new `retrieve_logs` tool that allows retrieval of logs from any Supabase log collection - Postgres, PostgREST, Auth, Edge Functions and others. Provides a way to query and filter logs.
 11 | - Implemented log rotation to prevent unbounded log file growth (5MB limit with 3 backup files)
 12 | 
 13 | ### Changed
 14 | - Improved region configuration with clearer error messages for region mismatches
 15 | - Updated smithery.yaml to reduce configuration error rate (Tenant not found)
 16 | - Improved PostgreSQL client connection error handling with specific guidance for "Tenant or user not found" errors
 17 | 
 18 | 
 19 | ## [0.3.11] - 2025-03-10
 20 | ### Fixed
 21 | - Fixed an error with creating a migration file when a user doesn't have `supabase_migrations` schema
 22 | 
 23 | 
 24 | ## [0.3.10] - 2025-03-09
 25 | ### Added
 26 | - Enhanced migration naming system with improved object type detection for procedures, functions, and views.
 27 | - Expanded `retrieve_migrations` tool with pagination, name pattern filtering, and option to include full SQL queries.
 28 | - Added a check to validate personal access token and service role key are set before calling API or SDK methods
 29 | 
 30 | ### Changed
 31 | - Updated setup instructions for Claude Desktop
 32 | - Updated setup instructions for Cline
 33 | - Updated and fixed Smithery.ai setup
 34 | 
 35 | ### Removed
 36 | - Removed redundant `get_api_safety_rules` tool since exceptions already provide enough information to the client
 37 | 
 38 | 
 39 | ## [0.3.9] - 2025-03-08
 40 | ### Fixed
 41 | - Fixed an issue with api spec tool that prevented spec retrieval
 42 | 
 43 | 
 44 | ## [0.3.8] - 2025-03-07
 45 | ### Added
 46 | - SQL query validation using PostgreSQL's parser (pglast v7.3+)
 47 | - Automatic migration script generation for schema changes
 48 | - Universal safety system with standardized risk levels (Low/Medium/High/Extreme)
 49 | - Switched to asyncpg v0.30.0+ from psycopg2
 50 | - Enhanced API spec tool with multiple query modes and risk assessment
 51 | - Connection retry logic for database and API operations
 52 | - Code coverage with pytest-cov
 53 | - SQL linting with SQLFluff
 54 | - Added pyyaml v6.0.2+ for configuration
 55 | 
 56 | ### Changed
 57 | - Refactored to use dependency injection pattern
 58 | - Standardized service initialization to synchronous pattern
 59 | - Improved SQL safety categorization:
 60 |   - `safe`: Read-only operations (always allowed)
 61 |   - `write`: Data modification (requires unsafe mode)
 62 |   - `destructive`: Schema changes (requires unsafe mode + confirmation)
 63 | - Updated Ruff to v0.9.9
 64 | - Added asyncpg-stubs and pytest-mock for testing
 65 | 
 66 | ## [0.3.7] - 2025-03-02
 67 | ### Fixed
 68 | - Documentation inaccuracies
 69 | 
 70 | ### Added
 71 | - Auth admin SDK support for local Supabase instances
 72 | 
 73 | 
 74 | ## [0.3.6] - 2025-02-26
 75 | ### Added
 76 | - Added `call_auth_admin_method` which enables MCP server to manage users in your database (create, update, delete, confirm). All Auth SDK methods are supported
 77 | - Added `get_auth_admin_methods_spec` to retrieve documentation for all available Auth Admin methods. Response objects now use attribute access (dot notation) instead of dictionary access.
 78 | 
 79 | ### Fixed
 80 | - Fixed an issue with improper encoding of database passwords. Previously passwords containing "%" symbol led to connection failures
 81 | 
 82 | 
 83 | ## [0.3.5] - 2025-02-26
 84 | ### Fixed
 85 | - Fixed an issue with `get_tables` so that it reliably returns foreign tables and views
 86 | - Updated docs to describe how to setup mcp.json with project-specific MCPs
 87 | - Expanded and improved test suite to cover each MCP tool
 88 | 
 89 | 
 90 | ## [0.3.4] - 2025-02-25
 91 | ### Fixed
 92 | - Improved `get_tables` to return foreign data tables
 93 | 
 94 | 
 95 | ## [0.3.3] - 2025-02-25
 96 | ### Fixed
 97 | - Fixed a bug with `readonly` scope being incorrectly managed in db client
 98 | 
 99 | ## [0.3.2] - 2025-02-25
100 | ### Fixed
101 | - Fixed a bug preventing execution of DDL commands (create, alter tables, etc.)
102 | 
103 | ## [0.3.1] - 2025-02-23
104 | ### Changed
105 | - Significantly improved docs to make install, configuration, usage instructions super clear
106 | 
107 | 
108 | ## [0.3.0] - 2025-02-23
109 | ### Added
110 | - Full support for read-write SQL operations:
111 |   - Implemented safety mode system with read-only (default) and read-write modes
112 |   - Added mode switching with automatic reset to read-only
113 |   - Enhanced transaction support for testing write operations
114 |   - Improved error handling for read-only violations
115 | - Support for Supabase Management API
116 |   - Introduces Supabase Management API integration with safe (enabled by default) and yolo modes
117 |   - Includes the following tools:
118 |     - `send_management_api_request` to send arbitrary requests to Supabase Management API, with auto-injection of project ref and safety mode control.
119 |     - `get_management_api_spec` to get the enriched API specification with safety information
120 |     - `get_management_api_safety_rules` to get all safety rules including blocked and unsafe operations with human-readable explanations
121 |     - `live_dangerously` to switch to yolo mode
122 |   - Safety features:
123 |     - Divides API methods into `safe`, `unsafe` and `blocked` categories based on the risk of the operation
124 |     - Allows switching between safe and yolo modes dynamically
125 |     - Blocked operations (delete project, delete database) are not allowed regardless of the mode
126 | 
127 | 
128 | ## [0.2.2] - 2025-02-20
129 | ### Added
130 | - Support for different Supabase regions:
131 |   - Configuration via `SUPABASE_REGION` environment variable
132 |   - Validation for all 16 supported AWS regions
133 |   - Default to `us-east-1` for backward compatibility
134 |   - Enhanced logging for region information
135 |   - Comprehensive documentation and examples
136 | 
137 | ## [0.2.1] - 2025-02-19
138 | ### Added
139 | - Package distribution support:
140 |   - PyPI package publishing setup
141 |   - Installation via `pipx` and `uv`
142 |   - Entry point scripts for direct execution
143 | - Smithery.ai deployment configuration
144 | 
145 | ### Changed
146 | - BREAKING: Installation and execution methods:
147 |   - Switched from direct script execution to proper module structure
148 |   - Updated Cursor/Windsurf configuration for package-based execution
149 | - Improved setup instructions in README
150 | 
151 | ## [0.2.0] - 2025-02-18
152 | Intermediary release for package distribution support
153 | 
154 | ## [0.1.0] - 2025-02-16
155 | ### Added
156 | - Initial release
157 | - Basic MCP server functionality
158 | - Supabase database connection support
159 | - Integration with Cursor and Windsurf IDEs
160 | 
161 | [0.3.0]: https://github.com/alexander-zuev/supabase-mcp-server/releases/tag/v0.3.0
162 | [0.2.2]: https://github.com/alexander-zuev/supabase-mcp-server/releases/tag/v0.2.2
163 | [0.2.1]: https://github.com/alexander-zuev/supabase-mcp-server/releases/tag/v0.2.1
164 | [0.2.0]: https://github.com/alexander-zuev/supabase-mcp-server/releases/tag/v0.2.0-dev0
165 | [0.1.0]: https://github.com/alexander-zuev/supabase-mcp-server/releases/tag/v0.1.0
166 | 
```