This is page 19 of 36. Use http://codebase.md/googleapis/genai-toolbox?page={x} to view the full context.

# Directory Structure

```
├── .ci
│   ├── continuous.release.cloudbuild.yaml
│   ├── generate_release_table.sh
│   ├── integration.cloudbuild.yaml
│   ├── quickstart_test
│   │   ├── go.integration.cloudbuild.yaml
│   │   ├── js.integration.cloudbuild.yaml
│   │   ├── py.integration.cloudbuild.yaml
│   │   ├── run_go_tests.sh
│   │   ├── run_js_tests.sh
│   │   ├── run_py_tests.sh
│   │   └── setup_hotels_sample.sql
│   ├── test_with_coverage.sh
│   └── versioned.release.cloudbuild.yaml
├── .github
│   ├── auto-label.yaml
│   ├── blunderbuss.yml
│   ├── CODEOWNERS
│   ├── header-checker-lint.yml
│   ├── ISSUE_TEMPLATE
│   │   ├── bug_report.yml
│   │   ├── config.yml
│   │   ├── feature_request.yml
│   │   └── question.yml
│   ├── label-sync.yml
│   ├── labels.yaml
│   ├── PULL_REQUEST_TEMPLATE.md
│   ├── release-please.yml
│   ├── renovate.json5
│   ├── sync-repo-settings.yaml
│   └── workflows
│       ├── cloud_build_failure_reporter.yml
│       ├── deploy_dev_docs.yaml
│       ├── deploy_previous_version_docs.yaml
│       ├── deploy_versioned_docs.yaml
│       ├── docs_deploy.yaml
│       ├── docs_preview_clean.yaml
│       ├── docs_preview_deploy.yaml
│       ├── lint.yaml
│       ├── schedule_reporter.yml
│       ├── sync-labels.yaml
│       └── tests.yaml
├── .gitignore
├── .gitmodules
├── .golangci.yaml
├── .hugo
│   ├── archetypes
│   │   └── default.md
│   ├── assets
│   │   ├── icons
│   │   │   └── logo.svg
│   │   └── scss
│   │       ├── _styles_project.scss
│   │       └── _variables_project.scss
│   ├── go.mod
│   ├── go.sum
│   ├── hugo.toml
│   ├── layouts
│   │   ├── _default
│   │   │   └── home.releases.releases
│   │   ├── index.llms-full.txt
│   │   ├── index.llms.txt
│   │   ├── partials
│   │   │   ├── hooks
│   │   │   │   └── head-end.html
│   │   │   ├── navbar-version-selector.html
│   │   │   ├── page-meta-links.html
│   │   │   └── td
│   │   │       └── render-heading.html
│   │   ├── robot.txt
│   │   └── shortcodes
│   │       ├── include.html
│   │       ├── ipynb.html
│   │       └── regionInclude.html
│   ├── package-lock.json
│   ├── package.json
│   └── static
│       ├── favicons
│       │   ├── android-chrome-192x192.png
│       │   ├── android-chrome-512x512.png
│       │   ├── apple-touch-icon.png
│       │   ├── favicon-16x16.png
│       │   ├── favicon-32x32.png
│       │   └── favicon.ico
│       └── js
│           └── w3.js
├── CHANGELOG.md
├── cmd
│   ├── options_test.go
│   ├── options.go
│   ├── root_test.go
│   ├── root.go
│   └── version.txt
├── CODE_OF_CONDUCT.md
├── CONTRIBUTING.md
├── DEVELOPER.md
├── Dockerfile
├── docs
│   └── en
│       ├── _index.md
│       ├── about
│       │   ├── _index.md
│       │   └── faq.md
│       ├── concepts
│       │   ├── _index.md
│       │   └── telemetry
│       │       ├── index.md
│       │       ├── telemetry_flow.png
│       │       └── telemetry_traces.png
│       ├── getting-started
│       │   ├── _index.md
│       │   ├── colab_quickstart.ipynb
│       │   ├── configure.md
│       │   ├── introduction
│       │   │   ├── _index.md
│       │   │   └── architecture.png
│       │   ├── local_quickstart_go.md
│       │   ├── local_quickstart_js.md
│       │   ├── local_quickstart.md
│       │   ├── mcp_quickstart
│       │   │   ├── _index.md
│       │   │   ├── inspector_tools.png
│       │   │   └── inspector.png
│       │   └── quickstart
│       │       ├── go
│       │       │   ├── genAI
│       │       │   │   ├── go.mod
│       │       │   │   ├── go.sum
│       │       │   │   └── quickstart.go
│       │       │   ├── genkit
│       │       │   │   ├── go.mod
│       │       │   │   ├── go.sum
│       │       │   │   └── quickstart.go
│       │       │   ├── langchain
│       │       │   │   ├── go.mod
│       │       │   │   ├── go.sum
│       │       │   │   └── quickstart.go
│       │       │   ├── openAI
│       │       │   │   ├── go.mod
│       │       │   │   ├── go.sum
│       │       │   │   └── quickstart.go
│       │       │   └── quickstart_test.go
│       │       ├── golden.txt
│       │       ├── js
│       │       │   ├── genAI
│       │       │   │   ├── package-lock.json
│       │       │   │   ├── package.json
│       │       │   │   └── quickstart.js
│       │       │   ├── genkit
│       │       │   │   ├── package-lock.json
│       │       │   │   ├── package.json
│       │       │   │   └── quickstart.js
│       │       │   ├── langchain
│       │       │   │   ├── package-lock.json
│       │       │   │   ├── package.json
│       │       │   │   └── quickstart.js
│       │       │   ├── llamaindex
│       │       │   │   ├── package-lock.json
│       │       │   │   ├── package.json
│       │       │   │   └── quickstart.js
│       │       │   └── quickstart.test.js
│       │       ├── python
│       │       │   ├── __init__.py
│       │       │   ├── adk
│       │       │   │   ├── quickstart.py
│       │       │   │   └── requirements.txt
│       │       │   ├── core
│       │       │   │   ├── quickstart.py
│       │       │   │   └── requirements.txt
│       │       │   ├── langchain
│       │       │   │   ├── quickstart.py
│       │       │   │   └── requirements.txt
│       │       │   ├── llamaindex
│       │       │   │   ├── quickstart.py
│       │       │   │   └── requirements.txt
│       │       │   └── quickstart_test.py
│       │       └── shared
│       │           ├── cloud_setup.md
│       │           ├── configure_toolbox.md
│       │           └── database_setup.md
│       ├── how-to
│       │   ├── _index.md
│       │   ├── connect_via_geminicli.md
│       │   ├── connect_via_mcp.md
│       │   ├── connect-ide
│       │   │   ├── _index.md
│       │   │   ├── alloydb_pg_admin_mcp.md
│       │   │   ├── alloydb_pg_mcp.md
│       │   │   ├── bigquery_mcp.md
│       │   │   ├── cloud_sql_mssql_admin_mcp.md
│       │   │   ├── cloud_sql_mssql_mcp.md
│       │   │   ├── cloud_sql_mysql_admin_mcp.md
│       │   │   ├── cloud_sql_mysql_mcp.md
│       │   │   ├── cloud_sql_pg_admin_mcp.md
│       │   │   ├── cloud_sql_pg_mcp.md
│       │   │   ├── firestore_mcp.md
│       │   │   ├── looker_mcp.md
│       │   │   ├── mssql_mcp.md
│       │   │   ├── mysql_mcp.md
│       │   │   ├── neo4j_mcp.md
│       │   │   ├── postgres_mcp.md
│       │   │   ├── spanner_mcp.md
│       │   │   └── sqlite_mcp.md
│       │   ├── deploy_docker.md
│       │   ├── deploy_gke.md
│       │   ├── deploy_toolbox.md
│       │   ├── export_telemetry.md
│       │   └── toolbox-ui
│       │       ├── edit-headers.gif
│       │       ├── edit-headers.png
│       │       ├── index.md
│       │       ├── optional-param-checked.png
│       │       ├── optional-param-unchecked.png
│       │       ├── run-tool.gif
│       │       ├── tools.png
│       │       └── toolsets.png
│       ├── reference
│       │   ├── _index.md
│       │   ├── cli.md
│       │   └── prebuilt-tools.md
│       ├── resources
│       │   ├── _index.md
│       │   ├── authServices
│       │   │   ├── _index.md
│       │   │   └── google.md
│       │   ├── sources
│       │   │   ├── _index.md
│       │   │   ├── alloydb-admin.md
│       │   │   ├── alloydb-pg.md
│       │   │   ├── bigquery.md
│       │   │   ├── bigtable.md
│       │   │   ├── cassandra.md
│       │   │   ├── clickhouse.md
│       │   │   ├── cloud-monitoring.md
│       │   │   ├── cloud-sql-admin.md
│       │   │   ├── cloud-sql-mssql.md
│       │   │   ├── cloud-sql-mysql.md
│       │   │   ├── cloud-sql-pg.md
│       │   │   ├── couchbase.md
│       │   │   ├── dataplex.md
│       │   │   ├── dgraph.md
│       │   │   ├── firebird.md
│       │   │   ├── firestore.md
│       │   │   ├── http.md
│       │   │   ├── looker.md
│       │   │   ├── mongodb.md
│       │   │   ├── mssql.md
│       │   │   ├── mysql.md
│       │   │   ├── neo4j.md
│       │   │   ├── oceanbase.md
│       │   │   ├── oracle.md
│       │   │   ├── postgres.md
│       │   │   ├── redis.md
│       │   │   ├── serverless-spark.md
│       │   │   ├── spanner.md
│       │   │   ├── sqlite.md
│       │   │   ├── tidb.md
│       │   │   ├── trino.md
│       │   │   ├── valkey.md
│       │   │   └── yugabytedb.md
│       │   └── tools
│       │       ├── _index.md
│       │       ├── alloydb
│       │       │   ├── _index.md
│       │       │   ├── alloydb-create-cluster.md
│       │       │   ├── alloydb-create-instance.md
│       │       │   ├── alloydb-create-user.md
│       │       │   ├── alloydb-get-cluster.md
│       │       │   ├── alloydb-get-instance.md
│       │       │   ├── alloydb-get-user.md
│       │       │   ├── alloydb-list-clusters.md
│       │       │   ├── alloydb-list-instances.md
│       │       │   ├── alloydb-list-users.md
│       │       │   └── alloydb-wait-for-operation.md
│       │       ├── alloydbainl
│       │       │   ├── _index.md
│       │       │   └── alloydb-ai-nl.md
│       │       ├── bigquery
│       │       │   ├── _index.md
│       │       │   ├── bigquery-analyze-contribution.md
│       │       │   ├── bigquery-conversational-analytics.md
│       │       │   ├── bigquery-execute-sql.md
│       │       │   ├── bigquery-forecast.md
│       │       │   ├── bigquery-get-dataset-info.md
│       │       │   ├── bigquery-get-table-info.md
│       │       │   ├── bigquery-list-dataset-ids.md
│       │       │   ├── bigquery-list-table-ids.md
│       │       │   ├── bigquery-search-catalog.md
│       │       │   └── bigquery-sql.md
│       │       ├── bigtable
│       │       │   ├── _index.md
│       │       │   └── bigtable-sql.md
│       │       ├── cassandra
│       │       │   ├── _index.md
│       │       │   └── cassandra-cql.md
│       │       ├── clickhouse
│       │       │   ├── _index.md
│       │       │   ├── clickhouse-execute-sql.md
│       │       │   ├── clickhouse-list-databases.md
│       │       │   ├── clickhouse-list-tables.md
│       │       │   └── clickhouse-sql.md
│       │       ├── cloudmonitoring
│       │       │   ├── _index.md
│       │       │   └── cloud-monitoring-query-prometheus.md
│       │       ├── cloudsql
│       │       │   ├── _index.md
│       │       │   ├── cloudsqlcreatedatabase.md
│       │       │   ├── cloudsqlcreateusers.md
│       │       │   ├── cloudsqlgetinstances.md
│       │       │   ├── cloudsqllistdatabases.md
│       │       │   ├── cloudsqllistinstances.md
│       │       │   ├── cloudsqlmssqlcreateinstance.md
│       │       │   ├── cloudsqlmysqlcreateinstance.md
│       │       │   ├── cloudsqlpgcreateinstances.md
│       │       │   └── cloudsqlwaitforoperation.md
│       │       ├── couchbase
│       │       │   ├── _index.md
│       │       │   └── couchbase-sql.md
│       │       ├── dataform
│       │       │   ├── _index.md
│       │       │   └── dataform-compile-local.md
│       │       ├── dataplex
│       │       │   ├── _index.md
│       │       │   ├── dataplex-lookup-entry.md
│       │       │   ├── dataplex-search-aspect-types.md
│       │       │   └── dataplex-search-entries.md
│       │       ├── dgraph
│       │       │   ├── _index.md
│       │       │   └── dgraph-dql.md
│       │       ├── firebird
│       │       │   ├── _index.md
│       │       │   ├── firebird-execute-sql.md
│       │       │   └── firebird-sql.md
│       │       ├── firestore
│       │       │   ├── _index.md
│       │       │   ├── firestore-add-documents.md
│       │       │   ├── firestore-delete-documents.md
│       │       │   ├── firestore-get-documents.md
│       │       │   ├── firestore-get-rules.md
│       │       │   ├── firestore-list-collections.md
│       │       │   ├── firestore-query-collection.md
│       │       │   ├── firestore-query.md
│       │       │   ├── firestore-update-document.md
│       │       │   └── firestore-validate-rules.md
│       │       ├── http
│       │       │   ├── _index.md
│       │       │   └── http.md
│       │       ├── looker
│       │       │   ├── _index.md
│       │       │   ├── looker-add-dashboard-element.md
│       │       │   ├── looker-conversational-analytics.md
│       │       │   ├── looker-create-project-file.md
│       │       │   ├── looker-delete-project-file.md
│       │       │   ├── looker-dev-mode.md
│       │       │   ├── looker-get-dashboards.md
│       │       │   ├── looker-get-dimensions.md
│       │       │   ├── looker-get-explores.md
│       │       │   ├── looker-get-filters.md
│       │       │   ├── looker-get-looks.md
│       │       │   ├── looker-get-measures.md
│       │       │   ├── looker-get-models.md
│       │       │   ├── looker-get-parameters.md
│       │       │   ├── looker-get-project-file.md
│       │       │   ├── looker-get-project-files.md
│       │       │   ├── looker-get-projects.md
│       │       │   ├── looker-health-analyze.md
│       │       │   ├── looker-health-pulse.md
│       │       │   ├── looker-health-vacuum.md
│       │       │   ├── looker-make-dashboard.md
│       │       │   ├── looker-make-look.md
│       │       │   ├── looker-query-sql.md
│       │       │   ├── looker-query-url.md
│       │       │   ├── looker-query.md
│       │       │   ├── looker-run-look.md
│       │       │   └── looker-update-project-file.md
│       │       ├── mongodb
│       │       │   ├── _index.md
│       │       │   ├── mongodb-aggregate.md
│       │       │   ├── mongodb-delete-many.md
│       │       │   ├── mongodb-delete-one.md
│       │       │   ├── mongodb-find-one.md
│       │       │   ├── mongodb-find.md
│       │       │   ├── mongodb-insert-many.md
│       │       │   ├── mongodb-insert-one.md
│       │       │   ├── mongodb-update-many.md
│       │       │   └── mongodb-update-one.md
│       │       ├── mssql
│       │       │   ├── _index.md
│       │       │   ├── mssql-execute-sql.md
│       │       │   ├── mssql-list-tables.md
│       │       │   └── mssql-sql.md
│       │       ├── mysql
│       │       │   ├── _index.md
│       │       │   ├── mysql-execute-sql.md
│       │       │   ├── mysql-list-active-queries.md
│       │       │   ├── mysql-list-table-fragmentation.md
│       │       │   ├── mysql-list-tables-missing-unique-indexes.md
│       │       │   ├── mysql-list-tables.md
│       │       │   └── mysql-sql.md
│       │       ├── neo4j
│       │       │   ├── _index.md
│       │       │   ├── neo4j-cypher.md
│       │       │   ├── neo4j-execute-cypher.md
│       │       │   └── neo4j-schema.md
│       │       ├── oceanbase
│       │       │   ├── _index.md
│       │       │   ├── oceanbase-execute-sql.md
│       │       │   └── oceanbase-sql.md
│       │       ├── oracle
│       │       │   ├── _index.md
│       │       │   ├── oracle-execute-sql.md
│       │       │   └── oracle-sql.md
│       │       ├── postgres
│       │       │   ├── _index.md
│       │       │   ├── postgres-execute-sql.md
│       │       │   ├── postgres-list-active-queries.md
│       │       │   ├── postgres-list-available-extensions.md
│       │       │   ├── postgres-list-installed-extensions.md
│       │       │   ├── postgres-list-tables.md
│       │       │   ├── postgres-list-views.md
│       │       │   └── postgres-sql.md
│       │       ├── redis
│       │       │   ├── _index.md
│       │       │   └── redis.md
│       │       ├── serverless-spark
│       │       │   ├── _index.md
│       │       │   ├── serverless-spark-get-batch.md
│       │       │   └── serverless-spark-list-batches.md
│       │       ├── spanner
│       │       │   ├── _index.md
│       │       │   ├── spanner-execute-sql.md
│       │       │   ├── spanner-list-tables.md
│       │       │   └── spanner-sql.md
│       │       ├── sqlite
│       │       │   ├── _index.md
│       │       │   ├── sqlite-execute-sql.md
│       │       │   └── sqlite-sql.md
│       │       ├── tidb
│       │       │   ├── _index.md
│       │       │   ├── tidb-execute-sql.md
│       │       │   └── tidb-sql.md
│       │       ├── trino
│       │       │   ├── _index.md
│       │       │   ├── trino-execute-sql.md
│       │       │   └── trino-sql.md
│       │       ├── utility
│       │       │   ├── _index.md
│       │       │   └── wait.md
│       │       ├── valkey
│       │       │   ├── _index.md
│       │       │   └── valkey.md
│       │       └── yuagbytedb
│       │           ├── _index.md
│       │           └── yugabytedb-sql.md
│       ├── samples
│       │   ├── _index.md
│       │   ├── alloydb
│       │   │   ├── _index.md
│       │   │   ├── ai-nl
│       │   │   │   ├── alloydb_ai_nl.ipynb
│       │   │   │   └── index.md
│       │   │   └── mcp_quickstart.md
│       │   ├── bigquery
│       │   │   ├── _index.md
│       │   │   ├── colab_quickstart_bigquery.ipynb
│       │   │   ├── local_quickstart.md
│       │   │   └── mcp_quickstart
│       │   │       ├── _index.md
│       │   │       ├── inspector_tools.png
│       │   │       └── inspector.png
│       │   └── looker
│       │       ├── _index.md
│       │       ├── looker_gemini_oauth
│       │       │   ├── _index.md
│       │       │   ├── authenticated.png
│       │       │   ├── authorize.png
│       │       │   └── registration.png
│       │       ├── looker_gemini.md
│       │       └── looker_mcp_inspector
│       │           ├── _index.md
│       │           ├── inspector_tools.png
│       │           └── inspector.png
│       └── sdks
│           ├── _index.md
│           ├── go-sdk.md
│           ├── js-sdk.md
│           └── python-sdk.md
├── gemini-extension.json
├── go.mod
├── go.sum
├── internal
│   ├── auth
│   │   ├── auth.go
│   │   └── google
│   │       └── google.go
│   ├── log
│   │   ├── handler.go
│   │   ├── log_test.go
│   │   ├── log.go
│   │   └── logger.go
│   ├── prebuiltconfigs
│   │   ├── prebuiltconfigs_test.go
│   │   ├── prebuiltconfigs.go
│   │   └── tools
│   │       ├── alloydb-postgres-admin.yaml
│   │       ├── alloydb-postgres-observability.yaml
│   │       ├── alloydb-postgres.yaml
│   │       ├── bigquery.yaml
│   │       ├── clickhouse.yaml
│   │       ├── cloud-sql-mssql-admin.yaml
│   │       ├── cloud-sql-mssql-observability.yaml
│   │       ├── cloud-sql-mssql.yaml
│   │       ├── cloud-sql-mysql-admin.yaml
│   │       ├── cloud-sql-mysql-observability.yaml
│   │       ├── cloud-sql-mysql.yaml
│   │       ├── cloud-sql-postgres-admin.yaml
│   │       ├── cloud-sql-postgres-observability.yaml
│   │       ├── cloud-sql-postgres.yaml
│   │       ├── dataplex.yaml
│   │       ├── firestore.yaml
│   │       ├── looker-conversational-analytics.yaml
│   │       ├── looker.yaml
│   │       ├── mssql.yaml
│   │       ├── mysql.yaml
│   │       ├── neo4j.yaml
│   │       ├── oceanbase.yaml
│   │       ├── postgres.yaml
│   │       ├── serverless-spark.yaml
│   │       ├── spanner-postgres.yaml
│   │       ├── spanner.yaml
│   │       └── sqlite.yaml
│   ├── server
│   │   ├── api_test.go
│   │   ├── api.go
│   │   ├── common_test.go
│   │   ├── config.go
│   │   ├── mcp
│   │   │   ├── jsonrpc
│   │   │   │   ├── jsonrpc_test.go
│   │   │   │   └── jsonrpc.go
│   │   │   ├── mcp.go
│   │   │   ├── util
│   │   │   │   └── lifecycle.go
│   │   │   ├── v20241105
│   │   │   │   ├── method.go
│   │   │   │   └── types.go
│   │   │   ├── v20250326
│   │   │   │   ├── method.go
│   │   │   │   └── types.go
│   │   │   └── v20250618
│   │   │       ├── method.go
│   │   │       └── types.go
│   │   ├── mcp_test.go
│   │   ├── mcp.go
│   │   ├── server_test.go
│   │   ├── server.go
│   │   ├── static
│   │   │   ├── assets
│   │   │   │   └── mcptoolboxlogo.png
│   │   │   ├── css
│   │   │   │   └── style.css
│   │   │   ├── index.html
│   │   │   ├── js
│   │   │   │   ├── auth.js
│   │   │   │   ├── loadTools.js
│   │   │   │   ├── mainContent.js
│   │   │   │   ├── navbar.js
│   │   │   │   ├── runTool.js
│   │   │   │   ├── toolDisplay.js
│   │   │   │   ├── tools.js
│   │   │   │   └── toolsets.js
│   │   │   ├── tools.html
│   │   │   └── toolsets.html
│   │   ├── web_test.go
│   │   └── web.go
│   ├── sources
│   │   ├── alloydbadmin
│   │   │   ├── alloydbadmin_test.go
│   │   │   └── alloydbadmin.go
│   │   ├── alloydbpg
│   │   │   ├── alloydb_pg_test.go
│   │   │   └── alloydb_pg.go
│   │   ├── bigquery
│   │   │   ├── bigquery_test.go
│   │   │   └── bigquery.go
│   │   ├── bigtable
│   │   │   ├── bigtable_test.go
│   │   │   └── bigtable.go
│   │   ├── cassandra
│   │   │   ├── cassandra_test.go
│   │   │   └── cassandra.go
│   │   ├── clickhouse
│   │   │   ├── clickhouse_test.go
│   │   │   └── clickhouse.go
│   │   ├── cloudmonitoring
│   │   │   ├── cloud_monitoring_test.go
│   │   │   └── cloud_monitoring.go
│   │   ├── cloudsqladmin
│   │   │   ├── cloud_sql_admin_test.go
│   │   │   └── cloud_sql_admin.go
│   │   ├── cloudsqlmssql
│   │   │   ├── cloud_sql_mssql_test.go
│   │   │   └── cloud_sql_mssql.go
│   │   ├── cloudsqlmysql
│   │   │   ├── cloud_sql_mysql_test.go
│   │   │   └── cloud_sql_mysql.go
│   │   ├── cloudsqlpg
│   │   │   ├── cloud_sql_pg_test.go
│   │   │   └── cloud_sql_pg.go
│   │   ├── couchbase
│   │   │   ├── couchbase_test.go
│   │   │   └── couchbase.go
│   │   ├── dataplex
│   │   │   ├── dataplex_test.go
│   │   │   └── dataplex.go
│   │   ├── dgraph
│   │   │   ├── dgraph_test.go
│   │   │   └── dgraph.go
│   │   ├── dialect.go
│   │   ├── firebird
│   │   │   ├── firebird_test.go
│   │   │   └── firebird.go
│   │   ├── firestore
│   │   │   ├── firestore_test.go
│   │   │   └── firestore.go
│   │   ├── http
│   │   │   ├── http_test.go
│   │   │   └── http.go
│   │   ├── ip_type.go
│   │   ├── looker
│   │   │   ├── looker_test.go
│   │   │   └── looker.go
│   │   ├── mongodb
│   │   │   ├── mongodb_test.go
│   │   │   └── mongodb.go
│   │   ├── mssql
│   │   │   ├── mssql_test.go
│   │   │   └── mssql.go
│   │   ├── mysql
│   │   │   ├── mysql_test.go
│   │   │   └── mysql.go
│   │   ├── neo4j
│   │   │   ├── neo4j_test.go
│   │   │   └── neo4j.go
│   │   ├── oceanbase
│   │   │   ├── oceanbase_test.go
│   │   │   └── oceanbase.go
│   │   ├── oracle
│   │   │   └── oracle.go
│   │   ├── postgres
│   │   │   ├── postgres_test.go
│   │   │   └── postgres.go
│   │   ├── redis
│   │   │   ├── redis_test.go
│   │   │   └── redis.go
│   │   ├── serverlessspark
│   │   │   ├── serverlessspark_test.go
│   │   │   └── serverlessspark.go
│   │   ├── sources.go
│   │   ├── spanner
│   │   │   ├── spanner_test.go
│   │   │   └── spanner.go
│   │   ├── sqlite
│   │   │   ├── sqlite_test.go
│   │   │   └── sqlite.go
│   │   ├── tidb
│   │   │   ├── tidb_test.go
│   │   │   └── tidb.go
│   │   ├── trino
│   │   │   ├── trino_test.go
│   │   │   └── trino.go
│   │   ├── util.go
│   │   ├── valkey
│   │   │   ├── valkey_test.go
│   │   │   └── valkey.go
│   │   └── yugabytedb
│   │       ├── yugabytedb_test.go
│   │       └── yugabytedb.go
│   ├── telemetry
│   │   ├── instrumentation.go
│   │   └── telemetry.go
│   ├── testutils
│   │   └── testutils.go
│   ├── tools
│   │   ├── alloydb
│   │   │   ├── alloydbcreatecluster
│   │   │   │   ├── alloydbcreatecluster_test.go
│   │   │   │   └── alloydbcreatecluster.go
│   │   │   ├── alloydbcreateinstance
│   │   │   │   ├── alloydbcreateinstance_test.go
│   │   │   │   └── alloydbcreateinstance.go
│   │   │   ├── alloydbcreateuser
│   │   │   │   ├── alloydbcreateuser_test.go
│   │   │   │   └── alloydbcreateuser.go
│   │   │   ├── alloydbgetcluster
│   │   │   │   ├── alloydbgetcluster_test.go
│   │   │   │   └── alloydbgetcluster.go
│   │   │   ├── alloydbgetinstance
│   │   │   │   ├── alloydbgetinstance_test.go
│   │   │   │   └── alloydbgetinstance.go
│   │   │   ├── alloydbgetuser
│   │   │   │   ├── alloydbgetuser_test.go
│   │   │   │   └── alloydbgetuser.go
│   │   │   ├── alloydblistclusters
│   │   │   │   ├── alloydblistclusters_test.go
│   │   │   │   └── alloydblistclusters.go
│   │   │   ├── alloydblistinstances
│   │   │   │   ├── alloydblistinstances_test.go
│   │   │   │   └── alloydblistinstances.go
│   │   │   ├── alloydblistusers
│   │   │   │   ├── alloydblistusers_test.go
│   │   │   │   └── alloydblistusers.go
│   │   │   └── alloydbwaitforoperation
│   │   │       ├── alloydbwaitforoperation_test.go
│   │   │       └── alloydbwaitforoperation.go
│   │   ├── alloydbainl
│   │   │   ├── alloydbainl_test.go
│   │   │   └── alloydbainl.go
│   │   ├── bigquery
│   │   │   ├── bigqueryanalyzecontribution
│   │   │   │   ├── bigqueryanalyzecontribution_test.go
│   │   │   │   └── bigqueryanalyzecontribution.go
│   │   │   ├── bigquerycommon
│   │   │   │   ├── table_name_parser_test.go
│   │   │   │   ├── table_name_parser.go
│   │   │   │   └── util.go
│   │   │   ├── bigqueryconversationalanalytics
│   │   │   │   ├── bigqueryconversationalanalytics_test.go
│   │   │   │   └── bigqueryconversationalanalytics.go
│   │   │   ├── bigqueryexecutesql
│   │   │   │   ├── bigqueryexecutesql_test.go
│   │   │   │   └── bigqueryexecutesql.go
│   │   │   ├── bigqueryforecast
│   │   │   │   ├── bigqueryforecast_test.go
│   │   │   │   └── bigqueryforecast.go
│   │   │   ├── bigquerygetdatasetinfo
│   │   │   │   ├── bigquerygetdatasetinfo_test.go
│   │   │   │   └── bigquerygetdatasetinfo.go
│   │   │   ├── bigquerygettableinfo
│   │   │   │   ├── bigquerygettableinfo_test.go
│   │   │   │   └── bigquerygettableinfo.go
│   │   │   ├── bigquerylistdatasetids
│   │   │   │   ├── bigquerylistdatasetids_test.go
│   │   │   │   └── bigquerylistdatasetids.go
│   │   │   ├── bigquerylisttableids
│   │   │   │   ├── bigquerylisttableids_test.go
│   │   │   │   └── bigquerylisttableids.go
│   │   │   ├── bigquerysearchcatalog
│   │   │   │   ├── bigquerysearchcatalog_test.go
│   │   │   │   └── bigquerysearchcatalog.go
│   │   │   └── bigquerysql
│   │   │       ├── bigquerysql_test.go
│   │   │       └── bigquerysql.go
│   │   ├── bigtable
│   │   │   ├── bigtable_test.go
│   │   │   └── bigtable.go
│   │   ├── cassandra
│   │   │   └── cassandracql
│   │   │       ├── cassandracql_test.go
│   │   │       └── cassandracql.go
│   │   ├── clickhouse
│   │   │   ├── clickhouseexecutesql
│   │   │   │   ├── clickhouseexecutesql_test.go
│   │   │   │   └── clickhouseexecutesql.go
│   │   │   ├── clickhouselistdatabases
│   │   │   │   ├── clickhouselistdatabases_test.go
│   │   │   │   └── clickhouselistdatabases.go
│   │   │   ├── clickhouselisttables
│   │   │   │   ├── clickhouselisttables_test.go
│   │   │   │   └── clickhouselisttables.go
│   │   │   └── clickhousesql
│   │   │       ├── clickhousesql_test.go
│   │   │       └── clickhousesql.go
│   │   ├── cloudmonitoring
│   │   │   ├── cloudmonitoring_test.go
│   │   │   └── cloudmonitoring.go
│   │   ├── cloudsql
│   │   │   ├── cloudsqlcreatedatabase
│   │   │   │   ├── cloudsqlcreatedatabase_test.go
│   │   │   │   └── cloudsqlcreatedatabase.go
│   │   │   ├── cloudsqlcreateusers
│   │   │   │   ├── cloudsqlcreateusers_test.go
│   │   │   │   └── cloudsqlcreateusers.go
│   │   │   ├── cloudsqlgetinstances
│   │   │   │   ├── cloudsqlgetinstances_test.go
│   │   │   │   └── cloudsqlgetinstances.go
│   │   │   ├── cloudsqllistdatabases
│   │   │   │   ├── cloudsqllistdatabases_test.go
│   │   │   │   └── cloudsqllistdatabases.go
│   │   │   ├── cloudsqllistinstances
│   │   │   │   ├── cloudsqllistinstances_test.go
│   │   │   │   └── cloudsqllistinstances.go
│   │   │   └── cloudsqlwaitforoperation
│   │   │       ├── cloudsqlwaitforoperation_test.go
│   │   │       └── cloudsqlwaitforoperation.go
│   │   ├── cloudsqlmssql
│   │   │   └── cloudsqlmssqlcreateinstance
│   │   │       ├── cloudsqlmssqlcreateinstance_test.go
│   │   │       └── cloudsqlmssqlcreateinstance.go
│   │   ├── cloudsqlmysql
│   │   │   └── cloudsqlmysqlcreateinstance
│   │   │       ├── cloudsqlmysqlcreateinstance_test.go
│   │   │       └── cloudsqlmysqlcreateinstance.go
│   │   ├── cloudsqlpg
│   │   │   └── cloudsqlpgcreateinstances
│   │   │       ├── cloudsqlpgcreateinstances_test.go
│   │   │       └── cloudsqlpgcreateinstances.go
│   │   ├── common_test.go
│   │   ├── common.go
│   │   ├── couchbase
│   │   │   ├── couchbase_test.go
│   │   │   └── couchbase.go
│   │   ├── dataform
│   │   │   └── dataformcompilelocal
│   │   │       ├── dataformcompilelocal_test.go
│   │   │       └── dataformcompilelocal.go
│   │   ├── dataplex
│   │   │   ├── dataplexlookupentry
│   │   │   │   ├── dataplexlookupentry_test.go
│   │   │   │   └── dataplexlookupentry.go
│   │   │   ├── dataplexsearchaspecttypes
│   │   │   │   ├── dataplexsearchaspecttypes_test.go
│   │   │   │   └── dataplexsearchaspecttypes.go
│   │   │   └── dataplexsearchentries
│   │   │       ├── dataplexsearchentries_test.go
│   │   │       └── dataplexsearchentries.go
│   │   ├── dgraph
│   │   │   ├── dgraph_test.go
│   │   │   └── dgraph.go
│   │   ├── firebird
│   │   │   ├── firebirdexecutesql
│   │   │   │   ├── firebirdexecutesql_test.go
│   │   │   │   └── firebirdexecutesql.go
│   │   │   └── firebirdsql
│   │   │       ├── firebirdsql_test.go
│   │   │       └── firebirdsql.go
│   │   ├── firestore
│   │   │   ├── firestoreadddocuments
│   │   │   │   ├── firestoreadddocuments_test.go
│   │   │   │   └── firestoreadddocuments.go
│   │   │   ├── firestoredeletedocuments
│   │   │   │   ├── firestoredeletedocuments_test.go
│   │   │   │   └── firestoredeletedocuments.go
│   │   │   ├── firestoregetdocuments
│   │   │   │   ├── firestoregetdocuments_test.go
│   │   │   │   └── firestoregetdocuments.go
│   │   │   ├── firestoregetrules
│   │   │   │   ├── firestoregetrules_test.go
│   │   │   │   └── firestoregetrules.go
│   │   │   ├── firestorelistcollections
│   │   │   │   ├── firestorelistcollections_test.go
│   │   │   │   └── firestorelistcollections.go
│   │   │   ├── firestorequery
│   │   │   │   ├── firestorequery_test.go
│   │   │   │   └── firestorequery.go
│   │   │   ├── firestorequerycollection
│   │   │   │   ├── firestorequerycollection_test.go
│   │   │   │   └── firestorequerycollection.go
│   │   │   ├── firestoreupdatedocument
│   │   │   │   ├── firestoreupdatedocument_test.go
│   │   │   │   └── firestoreupdatedocument.go
│   │   │   ├── firestorevalidaterules
│   │   │   │   ├── firestorevalidaterules_test.go
│   │   │   │   └── firestorevalidaterules.go
│   │   │   └── util
│   │   │       ├── converter_test.go
│   │   │       ├── converter.go
│   │   │       ├── validator_test.go
│   │   │       └── validator.go
│   │   ├── http
│   │   │   ├── http_test.go
│   │   │   └── http.go
│   │   ├── http_method.go
│   │   ├── looker
│   │   │   ├── lookeradddashboardelement
│   │   │   │   ├── lookeradddashboardelement_test.go
│   │   │   │   └── lookeradddashboardelement.go
│   │   │   ├── lookercommon
│   │   │   │   ├── lookercommon_test.go
│   │   │   │   └── lookercommon.go
│   │   │   ├── lookerconversationalanalytics
│   │   │   │   ├── lookerconversationalanalytics_test.go
│   │   │   │   └── lookerconversationalanalytics.go
│   │   │   ├── lookercreateprojectfile
│   │   │   │   ├── lookercreateprojectfile_test.go
│   │   │   │   └── lookercreateprojectfile.go
│   │   │   ├── lookerdeleteprojectfile
│   │   │   │   ├── lookerdeleteprojectfile_test.go
│   │   │   │   └── lookerdeleteprojectfile.go
│   │   │   ├── lookerdevmode
│   │   │   │   ├── lookerdevmode_test.go
│   │   │   │   └── lookerdevmode.go
│   │   │   ├── lookergetdashboards
│   │   │   │   ├── lookergetdashboards_test.go
│   │   │   │   └── lookergetdashboards.go
│   │   │   ├── lookergetdimensions
│   │   │   │   ├── lookergetdimensions_test.go
│   │   │   │   └── lookergetdimensions.go
│   │   │   ├── lookergetexplores
│   │   │   │   ├── lookergetexplores_test.go
│   │   │   │   └── lookergetexplores.go
│   │   │   ├── lookergetfilters
│   │   │   │   ├── lookergetfilters_test.go
│   │   │   │   └── lookergetfilters.go
│   │   │   ├── lookergetlooks
│   │   │   │   ├── lookergetlooks_test.go
│   │   │   │   └── lookergetlooks.go
│   │   │   ├── lookergetmeasures
│   │   │   │   ├── lookergetmeasures_test.go
│   │   │   │   └── lookergetmeasures.go
│   │   │   ├── lookergetmodels
│   │   │   │   ├── lookergetmodels_test.go
│   │   │   │   └── lookergetmodels.go
│   │   │   ├── lookergetparameters
│   │   │   │   ├── lookergetparameters_test.go
│   │   │   │   └── lookergetparameters.go
│   │   │   ├── lookergetprojectfile
│   │   │   │   ├── lookergetprojectfile_test.go
│   │   │   │   └── lookergetprojectfile.go
│   │   │   ├── lookergetprojectfiles
│   │   │   │   ├── lookergetprojectfiles_test.go
│   │   │   │   └── lookergetprojectfiles.go
│   │   │   ├── lookergetprojects
│   │   │   │   ├── lookergetprojects_test.go
│   │   │   │   └── lookergetprojects.go
│   │   │   ├── lookerhealthanalyze
│   │   │   │   ├── lookerhealthanalyze_test.go
│   │   │   │   └── lookerhealthanalyze.go
│   │   │   ├── lookerhealthpulse
│   │   │   │   ├── lookerhealthpulse_test.go
│   │   │   │   └── lookerhealthpulse.go
│   │   │   ├── lookerhealthvacuum
│   │   │   │   ├── lookerhealthvacuum_test.go
│   │   │   │   └── lookerhealthvacuum.go
│   │   │   ├── lookermakedashboard
│   │   │   │   ├── lookermakedashboard_test.go
│   │   │   │   └── lookermakedashboard.go
│   │   │   ├── lookermakelook
│   │   │   │   ├── lookermakelook_test.go
│   │   │   │   └── lookermakelook.go
│   │   │   ├── lookerquery
│   │   │   │   ├── lookerquery_test.go
│   │   │   │   └── lookerquery.go
│   │   │   ├── lookerquerysql
│   │   │   │   ├── lookerquerysql_test.go
│   │   │   │   └── lookerquerysql.go
│   │   │   ├── lookerqueryurl
│   │   │   │   ├── lookerqueryurl_test.go
│   │   │   │   └── lookerqueryurl.go
│   │   │   ├── lookerrunlook
│   │   │   │   ├── lookerrunlook_test.go
│   │   │   │   └── lookerrunlook.go
│   │   │   └── lookerupdateprojectfile
│   │   │       ├── lookerupdateprojectfile_test.go
│   │   │       └── lookerupdateprojectfile.go
│   │   ├── mongodb
│   │   │   ├── mongodbaggregate
│   │   │   │   ├── mongodbaggregate_test.go
│   │   │   │   └── mongodbaggregate.go
│   │   │   ├── mongodbdeletemany
│   │   │   │   ├── mongodbdeletemany_test.go
│   │   │   │   └── mongodbdeletemany.go
│   │   │   ├── mongodbdeleteone
│   │   │   │   ├── mongodbdeleteone_test.go
│   │   │   │   └── mongodbdeleteone.go
│   │   │   ├── mongodbfind
│   │   │   │   ├── mongodbfind_test.go
│   │   │   │   └── mongodbfind.go
│   │   │   ├── mongodbfindone
│   │   │   │   ├── mongodbfindone_test.go
│   │   │   │   └── mongodbfindone.go
│   │   │   ├── mongodbinsertmany
│   │   │   │   ├── mongodbinsertmany_test.go
│   │   │   │   └── mongodbinsertmany.go
│   │   │   ├── mongodbinsertone
│   │   │   │   ├── mongodbinsertone_test.go
│   │   │   │   └── mongodbinsertone.go
│   │   │   ├── mongodbupdatemany
│   │   │   │   ├── mongodbupdatemany_test.go
│   │   │   │   └── mongodbupdatemany.go
│   │   │   └── mongodbupdateone
│   │   │       ├── mongodbupdateone_test.go
│   │   │       └── mongodbupdateone.go
│   │   ├── mssql
│   │   │   ├── mssqlexecutesql
│   │   │   │   ├── mssqlexecutesql_test.go
│   │   │   │   └── mssqlexecutesql.go
│   │   │   ├── mssqllisttables
│   │   │   │   ├── mssqllisttables_test.go
│   │   │   │   └── mssqllisttables.go
│   │   │   └── mssqlsql
│   │   │       ├── mssqlsql_test.go
│   │   │       └── mssqlsql.go
│   │   ├── mysql
│   │   │   ├── mysqlcommon
│   │   │   │   └── mysqlcommon.go
│   │   │   ├── mysqlexecutesql
│   │   │   │   ├── mysqlexecutesql_test.go
│   │   │   │   └── mysqlexecutesql.go
│   │   │   ├── mysqllistactivequeries
│   │   │   │   ├── mysqllistactivequeries_test.go
│   │   │   │   └── mysqllistactivequeries.go
│   │   │   ├── mysqllisttablefragmentation
│   │   │   │   ├── mysqllisttablefragmentation_test.go
│   │   │   │   └── mysqllisttablefragmentation.go
│   │   │   ├── mysqllisttables
│   │   │   │   ├── mysqllisttables_test.go
│   │   │   │   └── mysqllisttables.go
│   │   │   ├── mysqllisttablesmissinguniqueindexes
│   │   │   │   ├── mysqllisttablesmissinguniqueindexes_test.go
│   │   │   │   └── mysqllisttablesmissinguniqueindexes.go
│   │   │   └── mysqlsql
│   │   │       ├── mysqlsql_test.go
│   │   │       └── mysqlsql.go
│   │   ├── neo4j
│   │   │   ├── neo4jcypher
│   │   │   │   ├── neo4jcypher_test.go
│   │   │   │   └── neo4jcypher.go
│   │   │   ├── neo4jexecutecypher
│   │   │   │   ├── classifier
│   │   │   │   │   ├── classifier_test.go
│   │   │   │   │   └── classifier.go
│   │   │   │   ├── neo4jexecutecypher_test.go
│   │   │   │   └── neo4jexecutecypher.go
│   │   │   └── neo4jschema
│   │   │       ├── cache
│   │   │       │   ├── cache_test.go
│   │   │       │   └── cache.go
│   │   │       ├── helpers
│   │   │       │   ├── helpers_test.go
│   │   │       │   └── helpers.go
│   │   │       ├── neo4jschema_test.go
│   │   │       ├── neo4jschema.go
│   │   │       └── types
│   │   │           └── types.go
│   │   ├── oceanbase
│   │   │   ├── oceanbaseexecutesql
│   │   │   │   ├── oceanbaseexecutesql_test.go
│   │   │   │   └── oceanbaseexecutesql.go
│   │   │   └── oceanbasesql
│   │   │       ├── oceanbasesql_test.go
│   │   │       └── oceanbasesql.go
│   │   ├── oracle
│   │   │   ├── oracleexecutesql
│   │   │   │   └── oracleexecutesql.go
│   │   │   └── oraclesql
│   │   │       └── oraclesql.go
│   │   ├── parameters_test.go
│   │   ├── parameters.go
│   │   ├── postgres
│   │   │   ├── postgresexecutesql
│   │   │   │   ├── postgresexecutesql_test.go
│   │   │   │   └── postgresexecutesql.go
│   │   │   ├── postgreslistactivequeries
│   │   │   │   ├── postgreslistactivequeries_test.go
│   │   │   │   └── postgreslistactivequeries.go
│   │   │   ├── postgreslistavailableextensions
│   │   │   │   ├── postgreslistavailableextensions_test.go
│   │   │   │   └── postgreslistavailableextensions.go
│   │   │   ├── postgreslistinstalledextensions
│   │   │   │   ├── postgreslistinstalledextensions_test.go
│   │   │   │   └── postgreslistinstalledextensions.go
│   │   │   ├── postgreslisttables
│   │   │   │   ├── postgreslisttables_test.go
│   │   │   │   └── postgreslisttables.go
│   │   │   ├── postgreslistviews
│   │   │   │   ├── postgreslistviews_test.go
│   │   │   │   └── postgreslistviews.go
│   │   │   └── postgressql
│   │   │       ├── postgressql_test.go
│   │   │       └── postgressql.go
│   │   ├── redis
│   │   │   ├── redis_test.go
│   │   │   └── redis.go
│   │   ├── serverlessspark
│   │   │   ├── serverlesssparkgetbatch
│   │   │   │   ├── serverlesssparkgetbatch_test.go
│   │   │   │   └── serverlesssparkgetbatch.go
│   │   │   └── serverlesssparklistbatches
│   │   │       ├── serverlesssparklistbatches_test.go
│   │   │       └── serverlesssparklistbatches.go
│   │   ├── spanner
│   │   │   ├── spannerexecutesql
│   │   │   │   ├── spannerexecutesql_test.go
│   │   │   │   └── spannerexecutesql.go
│   │   │   ├── spannerlisttables
│   │   │   │   ├── spannerlisttables_test.go
│   │   │   │   └── spannerlisttables.go
│   │   │   └── spannersql
│   │   │       ├── spanner_test.go
│   │   │       └── spannersql.go
│   │   ├── sqlite
│   │   │   ├── sqliteexecutesql
│   │   │   │   ├── sqliteexecutesql_test.go
│   │   │   │   └── sqliteexecutesql.go
│   │   │   └── sqlitesql
│   │   │       ├── sqlitesql_test.go
│   │   │       └── sqlitesql.go
│   │   ├── tidb
│   │   │   ├── tidbexecutesql
│   │   │   │   ├── tidbexecutesql_test.go
│   │   │   │   └── tidbexecutesql.go
│   │   │   └── tidbsql
│   │   │       ├── tidbsql_test.go
│   │   │       └── tidbsql.go
│   │   ├── tools_test.go
│   │   ├── tools.go
│   │   ├── toolsets.go
│   │   ├── trino
│   │   │   ├── trinoexecutesql
│   │   │   │   ├── trinoexecutesql_test.go
│   │   │   │   └── trinoexecutesql.go
│   │   │   └── trinosql
│   │   │       ├── trinosql_test.go
│   │   │       └── trinosql.go
│   │   ├── utility
│   │   │   └── wait
│   │   │       ├── wait_test.go
│   │   │       └── wait.go
│   │   ├── valkey
│   │   │   ├── valkey_test.go
│   │   │   └── valkey.go
│   │   └── yugabytedbsql
│   │       ├── yugabytedbsql_test.go
│   │       └── yugabytedbsql.go
│   └── util
│       └── util.go
├── LICENSE
├── logo.png
├── main.go
├── MCP-TOOLBOX-EXTENSION.md
├── README.md
└── tests
    ├── alloydb
    │   ├── alloydb_integration_test.go
    │   └── alloydb_wait_for_operation_test.go
    ├── alloydbainl
    │   └── alloydb_ai_nl_integration_test.go
    ├── alloydbpg
    │   └── alloydb_pg_integration_test.go
    ├── auth.go
    ├── bigquery
    │   └── bigquery_integration_test.go
    ├── bigtable
    │   └── bigtable_integration_test.go
    ├── cassandra
    │   └── cassandra_integration_test.go
    ├── clickhouse
    │   └── clickhouse_integration_test.go
    ├── cloudmonitoring
    │   └── cloud_monitoring_integration_test.go
    ├── cloudsql
    │   ├── cloud_sql_create_database_test.go
    │   ├── cloud_sql_create_users_test.go
    │   ├── cloud_sql_get_instances_test.go
    │   ├── cloud_sql_list_databases_test.go
    │   ├── cloudsql_list_instances_test.go
    │   └── cloudsql_wait_for_operation_test.go
    ├── cloudsqlmssql
    │   ├── cloud_sql_mssql_create_instance_integration_test.go
    │   └── cloud_sql_mssql_integration_test.go
    ├── cloudsqlmysql
    │   ├── cloud_sql_mysql_create_instance_integration_test.go
    │   └── cloud_sql_mysql_integration_test.go
    ├── cloudsqlpg
    │   ├── cloud_sql_pg_create_instances_test.go
    │   └── cloud_sql_pg_integration_test.go
    ├── common.go
    ├── couchbase
    │   └── couchbase_integration_test.go
    ├── dataform
    │   └── dataform_integration_test.go
    ├── dataplex
    │   └── dataplex_integration_test.go
    ├── dgraph
    │   └── dgraph_integration_test.go
    ├── firebird
    │   └── firebird_integration_test.go
    ├── firestore
    │   └── firestore_integration_test.go
    ├── http
    │   └── http_integration_test.go
    ├── looker
    │   └── looker_integration_test.go
    ├── mongodb
    │   └── mongodb_integration_test.go
    ├── mssql
    │   └── mssql_integration_test.go
    ├── mysql
    │   └── mysql_integration_test.go
    ├── neo4j
    │   └── neo4j_integration_test.go
    ├── oceanbase
    │   └── oceanbase_integration_test.go
    ├── option.go
    ├── oracle
    │   └── oracle_integration_test.go
    ├── postgres
    │   └── postgres_integration_test.go
    ├── redis
    │   └── redis_test.go
    ├── server.go
    ├── serverlessspark
    │   └── serverless_spark_integration_test.go
    ├── source.go
    ├── spanner
    │   └── spanner_integration_test.go
    ├── sqlite
    │   └── sqlite_integration_test.go
    ├── tidb
    │   └── tidb_integration_test.go
    ├── tool.go
    ├── trino
    │   └── trino_integration_test.go
    ├── utility
    │   └── wait_integration_test.go
    ├── valkey
    │   └── valkey_test.go
    └── yugabytedb
        └── yugabytedb_integration_test.go
```

# Files

--------------------------------------------------------------------------------
/internal/tools/firestore/firestorevalidaterules/firestorevalidaterules.go:
--------------------------------------------------------------------------------

```go
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package firestorevalidaterules

import (
	"context"
	"fmt"
	"strings"

	yaml "github.com/goccy/go-yaml"
	"github.com/googleapis/genai-toolbox/internal/sources"
	firestoreds "github.com/googleapis/genai-toolbox/internal/sources/firestore"
	"github.com/googleapis/genai-toolbox/internal/tools"
	"google.golang.org/api/firebaserules/v1"
)

const kind string = "firestore-validate-rules"

// Parameter keys
const (
	sourceKey = "source"
)

func init() {
	if !tools.Register(kind, newConfig) {
		panic(fmt.Sprintf("tool kind %q already registered", kind))
	}
}

func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
	actual := Config{Name: name}
	if err := decoder.DecodeContext(ctx, &actual); err != nil {
		return nil, err
	}
	return actual, nil
}

type compatibleSource interface {
	FirebaseRulesClient() *firebaserules.Service
	GetProjectId() string
}

// validate compatible sources are still compatible
var _ compatibleSource = &firestoreds.Source{}

var compatibleSources = [...]string{firestoreds.SourceKind}

type Config struct {
	Name         string   `yaml:"name" validate:"required"`
	Kind         string   `yaml:"kind" validate:"required"`
	Source       string   `yaml:"source" validate:"required"`
	Description  string   `yaml:"description" validate:"required"`
	AuthRequired []string `yaml:"authRequired"`
}

// validate interface
var _ tools.ToolConfig = Config{}

func (cfg Config) ToolConfigKind() string {
	return kind
}

func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
	// verify source exists
	rawS, ok := srcs[cfg.Source]
	if !ok {
		return nil, fmt.Errorf("no source named %q configured", cfg.Source)
	}

	// verify the source is compatible
	s, ok := rawS.(compatibleSource)
	if !ok {
		return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", kind, compatibleSources)
	}

	// Create parameters
	parameters := createParameters()
	mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, parameters)

	// finish tool setup
	t := Tool{
		Name:         cfg.Name,
		Kind:         kind,
		Parameters:   parameters,
		AuthRequired: cfg.AuthRequired,
		RulesClient:  s.FirebaseRulesClient(),
		ProjectId:    s.GetProjectId(),
		manifest:     tools.Manifest{Description: cfg.Description, Parameters: parameters.Manifest(), AuthRequired: cfg.AuthRequired},
		mcpManifest:  mcpManifest,
	}
	return t, nil
}

// createParameters creates the parameter definitions for the tool
func createParameters() tools.Parameters {
	sourceParameter := tools.NewStringParameter(
		sourceKey,
		"The Firestore Rules source code to validate",
	)

	return tools.Parameters{sourceParameter}
}

// validate interface
var _ tools.Tool = Tool{}

type Tool struct {
	Name         string           `yaml:"name"`
	Kind         string           `yaml:"kind"`
	AuthRequired []string         `yaml:"authRequired"`
	Parameters   tools.Parameters `yaml:"parameters"`

	RulesClient *firebaserules.Service
	ProjectId   string
	manifest    tools.Manifest
	mcpManifest tools.McpManifest
}

// Issue represents a validation issue in the rules
type Issue struct {
	SourcePosition SourcePosition `json:"sourcePosition"`
	Description    string         `json:"description"`
	Severity       string         `json:"severity"`
}

// SourcePosition represents the location of an issue in the source
type SourcePosition struct {
	FileName      string `json:"fileName,omitempty"`
	Line          int64  `json:"line"`          // 1-based
	Column        int64  `json:"column"`        // 1-based
	CurrentOffset int64  `json:"currentOffset"` // 0-based, inclusive start
	EndOffset     int64  `json:"endOffset"`     // 0-based, exclusive end
}

// ValidationResult represents the result of rules validation
type ValidationResult struct {
	Valid           bool    `json:"valid"`
	IssueCount      int     `json:"issueCount"`
	FormattedIssues string  `json:"formattedIssues,omitempty"`
	RawIssues       []Issue `json:"rawIssues,omitempty"`
}

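// Invoke submits the rules source from the tool parameters to the Firebase
// Rules API for validation and returns a structured ValidationResult.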
func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken tools.AccessToken) (any, error) {
	mapParams := params.AsMap()

	// Get source parameter
	source, ok := mapParams[sourceKey].(string)
	if !ok || source == "" {
		return nil, fmt.Errorf("invalid or missing '%s' parameter", sourceKey)
	}

	// Create test request
	testRequest := &firebaserules.TestRulesetRequest{
		Source: &firebaserules.Source{
			Files: []*firebaserules.File{
				{
					Name:    "firestore.rules",
					Content: source,
				},
			},
		},
		// We don't need test cases for validation only
		TestSuite: &firebaserules.TestSuite{
			TestCases: []*firebaserules.TestCase{},
		},
	}

	// Call the test API
	projectName := fmt.Sprintf("projects/%s", t.ProjectId)
	response, err := t.RulesClient.Projects.Test(projectName, testRequest).Context(ctx).Do()
	if err != nil {
		return nil, fmt.Errorf("failed to validate rules: %w", err)
	}

	// Process the response
	result := t.processValidationResponse(response, source)

	return result, nil
}

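// processValidationResponse converts the Firebase Rules API response into a
// ValidationResult, formatting any reported issues for display.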
func (t Tool) processValidationResponse(response *firebaserules.TestRulesetResponse, source string) ValidationResult {
	if len(response.Issues) == 0 {
		return ValidationResult{
			Valid:           true,
			IssueCount:      0,
			FormattedIssues: "✓ No errors detected. Rules are valid.",
		}
	}

	// Convert issues to our format
	issues := make([]Issue, len(response.Issues))
	for i, issue := range response.Issues {
		issues[i] = Issue{
			Description: issue.Description,
			Severity:    issue.Severity,
			SourcePosition: SourcePosition{
				FileName:      issue.SourcePosition.FileName,
				Line:          issue.SourcePosition.Line,
				Column:        issue.SourcePosition.Column,
				CurrentOffset: issue.SourcePosition.CurrentOffset,
				EndOffset:     issue.SourcePosition.EndOffset,
			},
		}
	}

	// Format issues
	formattedIssues := t.formatRulesetIssues(issues, source)

	return ValidationResult{
		Valid:           false,
		IssueCount:      len(issues),
		FormattedIssues: formattedIssues,
		RawIssues:       issues,
	}
}

// formatRulesetIssues formats validation issues into a human-readable string with code snippets
func (t Tool) formatRulesetIssues(issues []Issue, rulesSource string) string {
	sourceLines := strings.Split(rulesSource, "\n")
	var formattedOutput []string

	formattedOutput = append(formattedOutput, fmt.Sprintf("Found %d issue(s) in rules source:\n", len(issues)))

	for _, issue := range issues {
		issueString := fmt.Sprintf("%s: %s [Ln %d, Col %d]",
			issue.Severity,
			issue.Description,
			issue.SourcePosition.Line,
			issue.SourcePosition.Column)

		if issue.SourcePosition.Line > 0 {
			lineIndex := int(issue.SourcePosition.Line - 1) // 0-based index
			if lineIndex >= 0 && lineIndex < len(sourceLines) {
				errorLine := sourceLines[lineIndex]
				issueString += fmt.Sprintf("\n```\n%s", errorLine)

				// Add carets if we have column and offset information
				if issue.SourcePosition.Column > 0 &&
					issue.SourcePosition.CurrentOffset >= 0 &&
					issue.SourcePosition.EndOffset > issue.SourcePosition.CurrentOffset {

					startColumn := int(issue.SourcePosition.Column - 1) // 0-based
					errorTokenLength := int(issue.SourcePosition.EndOffset - issue.SourcePosition.CurrentOffset)

					if startColumn >= 0 && errorTokenLength > 0 && startColumn <= len(errorLine) {
						padding := strings.Repeat(" ", startColumn)
						carets := strings.Repeat("^", errorTokenLength)
						issueString += fmt.Sprintf("\n%s%s", padding, carets)
					}
				}
				issueString += "\n```"
			}
		}

		formattedOutput = append(formattedOutput, issueString)
	}

	return strings.Join(formattedOutput, "\n\n")
}

func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
	return tools.ParseParams(t.Parameters, data, claims)
}

func (t Tool) Manifest() tools.Manifest {
	return t.manifest
}

func (t Tool) McpManifest() tools.McpManifest {
	return t.mcpManifest
}

func (t Tool) Authorized(verifiedAuthServices []string) bool {
	return tools.IsAuthorized(t.AuthRequired, verifiedAuthServices)
}

func (t Tool) RequiresClientAuthorization() bool {
	return false
}

```

--------------------------------------------------------------------------------
/docs/en/how-to/connect-ide/cloud_sql_mysql_admin_mcp.md:
--------------------------------------------------------------------------------

```markdown
---
title: "Cloud SQL for MySQL Admin using MCP"
type: docs
weight: 4
description: >
  Create and manage Cloud SQL for MySQL (Admin) using Toolbox.
---

This guide covers how to use [MCP Toolbox for Databases][toolbox] to give your
developer assistant tools for creating and managing Cloud SQL for MySQL
instances, databases, and users:

* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
* [Gemini CLI][geminicli]
* [Gemini Code Assist][geminicodeassist]

[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client

## Before you begin

1. In the Google Cloud console, on the [project selector
   page](https://console.cloud.google.com/projectselector2/home/dashboard),
   select or create a Google Cloud project.

1. [Make sure that billing is enabled for your Google Cloud
   project](https://cloud.google.com/billing/docs/how-to/verify-billing-enabled#confirm_billing_is_enabled_on_a_project).

1. Grant the necessary IAM roles to the user who will run the MCP server. The
   tools available depend on the roles granted; an example `gcloud` command
   for granting a role follows this list:
    * `roles/cloudsql.viewer`: Provides read-only access to resources.
        * `get_instance`
        * `list_instances`
        * `list_databases`
        * `wait_for_operation`
    * `roles/cloudsql.editor`: Provides permissions to manage existing resources.
        * All `viewer` tools
        * `create_database`
    * `roles/cloudsql.admin`: Provides full control over all resources.
        * All `editor` and `viewer` tools
        * `create_instance`
        * `create_user`
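
    For example, you could grant the read-only role with `gcloud` (a sketch;
    the project ID and email below are placeholders to replace with your own
    values):

    ```bash
    gcloud projects add-iam-policy-binding my-project-id \
        --member="user:you@example.com" \
        --role="roles/cloudsql.viewer"
    ```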

## Install MCP Toolbox

1. Download the latest version of Toolbox as a binary. Select the [correct
   binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
   to your OS and CPU architecture. You must use Toolbox version v0.15.0 or
   later:

   <!-- {x-release-please-start-version} -->
   {{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/linux/amd64/toolbox
{{< /tab >}}

{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/arm64/toolbox
{{< /tab >}}

{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/amd64/toolbox
{{< /tab >}}

{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
    <!-- {x-release-please-end} -->

1. Make the binary executable:

    ```bash
    chmod +x toolbox
    ```

1. Verify the installation:

    ```bash
    ./toolbox --version
    ```

## Configure your MCP Client

{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}

1. Install [Claude
   Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-mysql-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-mysql-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

1. Restart Claude code to apply the new configuration.
{{% /tab %}}

{{% tab header="Claude desktop" lang="en" %}}

1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-mysql-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-mysql-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
   new MCP server available.
{{% /tab %}}

{{% tab header="Cline" lang="en" %}}

1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap
   the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-mysql-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-mysql-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

1. You should see a green active status after the server is successfully
   connected.
{{% /tab %}}

{{% tab header="Cursor" lang="en" %}}

1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-mysql-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-mysql-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
   Settings > MCP**. You should see a green active status after the server is
   successfully connected.
{{% /tab %}}

{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}

1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
   create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration and save:

    ```json
    {
      "servers": {
        "cloud-sql-mysql-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-mysql-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

{{% /tab %}}

{{% tab header="Windsurf" lang="en" %}}

1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
   Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-mysql-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-mysql-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

{{% /tab %}}

{{% tab header="Gemini CLI" lang="en" %}}

1.  Install the [Gemini
    CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1.  In your working directory, create a folder named `.gemini`. Within it,
    create a `settings.json` file.
1.  Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-mysql-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-mysql-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```
{{% /tab %}}

{{% tab header="Gemini Code Assist" lang="en" %}}

1.  Install the [Gemini Code
    Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist)
    extension in Visual Studio Code.
1.  Enable Agent Mode in Gemini Code Assist chat.
1.  In your working directory, create a folder named `.gemini`. Within it,
    create a `settings.json` file.
1.  Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-mysql-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-mysql-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```
{{% /tab %}}
{{< /tabpane >}}

## Use Tools

Your AI tool is now connected to Cloud SQL for MySQL using MCP.

The `cloud-sql-mysql-admin` server provides tools for managing your Cloud SQL
instances and interacting with your database:
* **create_instance**: Creates a new Cloud SQL for MySQL instance.
* **get_instance**: Gets information about a Cloud SQL instance.
* **list_instances**: Lists Cloud SQL instances in a project.
* **create_database**: Creates a new database in a Cloud SQL instance.
* **list_databases**: Lists all databases for a Cloud SQL instance.
* **create_user**: Creates a new user in a Cloud SQL instance.
* **wait_for_operation**: Waits for a Cloud SQL operation to complete.

{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}

```

--------------------------------------------------------------------------------
/docs/en/how-to/connect-ide/cloud_sql_mssql_admin_mcp.md:
--------------------------------------------------------------------------------

```markdown
---
title: "Cloud SQL for SQL Server Admin using MCP"
type: docs
weight: 5
description: >
  Create and manage Cloud SQL for SQL Server (Admin) using Toolbox.
---

This guide covers how to use [MCP Toolbox for Databases][toolbox] to expose your
developer assistant tools to create and manage Cloud SQL for SQL Server
instances, databases, and users:

* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline]  (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
* [Gemini CLI][geminicli]
* [Gemini Code Assist][geminicodeassist]

[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client

## Before you begin

1. In the Google Cloud console, on the [project selector
   page](https://console.cloud.google.com/projectselector2/home/dashboard),
   select or create a Google Cloud project.

1. [Make sure that billing is enabled for your Google Cloud
   project](https://cloud.google.com/billing/docs/how-to/verify-billing-enabled#confirm_billing_is_enabled_on_a_project).

1. Grant the necessary IAM roles to the user who will be running the MCP
   server; a sample `gcloud` command for this follows the list below. The tools
   available will depend on the roles granted:
    * `roles/cloudsql.viewer`: Provides read-only access to resources.
        * `get_instance`
        * `list_instances`
        * `list_databases`
        * `wait_for_operation`
    * `roles/cloudsql.editor`: Provides permissions to manage existing resources.
        * All `viewer` tools
        * `create_database`
    * `roles/cloudsql.admin`: Provides full control over all resources.
        * All `editor` and `viewer` tools
        * `create_instance`
        * `create_user`

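   As a minimal illustration (assuming the `gcloud` CLI is installed and
   authenticated, and substituting your own project ID and user email), one of
   these roles can be granted like so:

    ```bash
    # Grants the broadest role listed above; use roles/cloudsql.viewer or
    # roles/cloudsql.editor instead for more restricted access.
    gcloud projects add-iam-policy-binding PROJECT_ID \
      --member="user:USER_EMAIL" \
      --role="roles/cloudsql.admin"
    ```
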
## Install MCP Toolbox

1. Download the latest version of Toolbox as a binary. Select the [correct
   binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
   to your OS and CPU architecture. You must use Toolbox version v0.15.0 or
   later:

   <!-- {x-release-please-start-version} -->
   {{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/linux/amd64/toolbox
{{< /tab >}}

{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/arm64/toolbox
{{< /tab >}}

{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/amd64/toolbox
{{< /tab >}}

{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
    <!-- {x-release-please-end} -->

1. Make the binary executable:

    ```bash
    chmod +x toolbox
    ```

1. Verify the installation:

    ```bash
    ./toolbox --version
    ```

## Configure your MCP Client

{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}

1. Install [Claude
   Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-mssql-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-mssql-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

1. Restart Claude code to apply the new configuration.
{{% /tab %}}

{{% tab header="Claude desktop" lang="en" %}}

1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-mssql-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-mssql-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
   new MCP server available.
{{% /tab %}}

{{% tab header="Cline" lang="en" %}}

1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap
   the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-mssql-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-mssql-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

1. You should see a green active status after the server is successfully
   connected.
{{% /tab %}}

{{% tab header="Cursor" lang="en" %}}

1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-mssql-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-mssql-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
   Settings > MCP**. You should see a green active status after the server is
   successfully connected.
{{% /tab %}}

{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}

1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
   create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration and save:

    ```json
    {
      "servers": {
        "cloud-sql-mssql-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-mssql-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

{{% /tab %}}

{{% tab header="Windsurf" lang="en" %}}

1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
   Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-mssql-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-mssql-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

{{% /tab %}}

{{% tab header="Gemini CLI" lang="en" %}}

1.  Install the [Gemini
    CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1.  In your working directory, create a folder named `.gemini`. Within it,
    create a `settings.json` file.
1.  Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-mssql-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-mssql-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```
{{% /tab %}}

{{% tab header="Gemini Code Assist" lang="en" %}}

1.  Install the [Gemini Code
    Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist)
    extension in Visual Studio Code.
1.  Enable Agent Mode in Gemini Code Assist chat.
1.  In your working directory, create a folder named `.gemini`. Within it,
    create a `settings.json` file.
1.  Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-mssql-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-mssql-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```
{{% /tab %}}
{{< /tabpane >}}

## Use Tools

Your AI tool is now connected to Cloud SQL for SQL Server using MCP.

The `cloud-sql-mssql-admin` server provides tools for managing your Cloud SQL
instances and interacting with your database:
* **create_instance**: Creates a new Cloud SQL for SQL Server instance.
* **get_instance**: Gets information about a Cloud SQL instance.
* **list_instances**: Lists Cloud SQL instances in a project.
* **create_database**: Creates a new database in a Cloud SQL instance.
* **list_databases**: Lists all databases for a Cloud SQL instance.
* **create_user**: Creates a new user in a Cloud SQL instance.
* **wait_for_operation**: Waits for a Cloud SQL operation to complete.

{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}

```

--------------------------------------------------------------------------------
/tests/cloudsql/cloudsql_wait_for_operation_test.go:
--------------------------------------------------------------------------------

```go
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package cloudsql

import (
	"bytes"
	"context"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
	"net/url"
	"reflect"
	"regexp"
	"strings"
	"sync"
	"testing"
	"time"

	"github.com/googleapis/genai-toolbox/internal/testutils"
	"github.com/googleapis/genai-toolbox/tests"

	_ "github.com/googleapis/genai-toolbox/internal/tools/cloudsql/cloudsqlwaitforoperation"
)

var (
	cloudsqlWaitToolKind = "cloud-sql-wait-for-operation"
)

type waitForOperationTransport struct {
	transport http.RoundTripper
	url       *url.URL
}

func (t *waitForOperationTransport) RoundTrip(req *http.Request) (*http.Response, error) {
	if strings.HasPrefix(req.URL.String(), "https://sqladmin.googleapis.com") {
		req.URL.Scheme = t.url.Scheme
		req.URL.Host = t.url.Host
	}
	return t.transport.RoundTrip(req)
}

type cloudsqlOperation struct {
	Name          string `json:"name"`
	Status        string `json:"status"`
	TargetLink    string `json:"targetLink"`
	OperationType string `json:"operationType"`
	Error         *struct {
		Errors []struct {
			Code    string `json:"code"`
			Message string `json:"message"`
		} `json:"errors"`
	} `json:"error,omitempty"`
}

type cloudsqlInstance struct {
	Region          string `json:"region"`
	DatabaseVersion string `json:"databaseVersion"`
}

type cloudsqlHandler struct {
	mu         sync.Mutex
	operations map[string]*cloudsqlOperation
	instances  map[string]*cloudsqlInstance
	t          *testing.T
}

func (h *cloudsqlHandler) ServeHTTP(w http.ResponseWriter, r *http.Request) {
	h.mu.Lock()
	defer h.mu.Unlock()

	if !strings.Contains(r.UserAgent(), "genai-toolbox/") {
		h.t.Errorf("User-Agent header not found")
	}

	if match, _ := regexp.MatchString("/v1/projects/p1/operations/.*", r.URL.Path); match {
		parts := regexp.MustCompile("/").Split(r.URL.Path, -1)
		opName := parts[len(parts)-1]

		op, ok := h.operations[opName]
		if !ok {
			http.NotFound(w, r)
			return
		}

		if op.Status != "DONE" {
			op.Status = "DONE"
		}

		w.Header().Set("Content-Type", "application/json")
		if err := json.NewEncoder(w).Encode(op); err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
		}
	} else if match, _ := regexp.MatchString("/v1/projects/p1/instances/.*", r.URL.Path); match {
		parts := regexp.MustCompile("/").Split(r.URL.Path, -1)
		instanceName := parts[len(parts)-1]

		instance, ok := h.instances[instanceName]
		if !ok {
			http.NotFound(w, r)
			return
		}
		w.Header().Set("Content-Type", "application/json")
		if err := json.NewEncoder(w).Encode(instance); err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
		}
	} else {
		http.NotFound(w, r)
	}
}

func TestCloudSQLWaitToolEndpoints(t *testing.T) {
	h := &cloudsqlHandler{
		operations: map[string]*cloudsqlOperation{
			"op1": {Name: "op1", Status: "PENDING", OperationType: "CREATE_DATABASE"},
			"op2": {Name: "op2", Status: "PENDING", OperationType: "CREATE_DATABASE", Error: &struct {
				Errors []struct {
					Code    string `json:"code"`
					Message string `json:"message"`
				} `json:"errors"`
			}{
				Errors: []struct {
					Code    string `json:"code"`
					Message string `json:"message"`
				}{
					{Code: "ERROR_CODE", Message: "failed"},
				},
			}},
			"op3": {Name: "op3", Status: "PENDING", OperationType: "CREATE"},
		},
		instances: map[string]*cloudsqlInstance{
			"i1": {Region: "r1", DatabaseVersion: "POSTGRES_13"},
		},
		t: t,
	}
	server := httptest.NewServer(h)
	defer server.Close()

	h.operations["op1"].TargetLink = "https://sqladmin.googleapis.com/v1/projects/p1/instances/i1/databases/d1"
	h.operations["op2"].TargetLink = "https://sqladmin.googleapis.com/v1/projects/p1/instances/i2/databases/d2"
	h.operations["op3"].TargetLink = "https://sqladmin.googleapis.com/v1/projects/p1/instances/i1"

	serverURL, err := url.Parse(server.URL)
	if err != nil {
		t.Fatalf("failed to parse server URL: %v", err)
	}

	originalTransport := http.DefaultClient.Transport
	if originalTransport == nil {
		originalTransport = http.DefaultTransport
	}
	http.DefaultClient.Transport = &waitForOperationTransport{
		transport: originalTransport,
		url:       serverURL,
	}
	t.Cleanup(func() {
		http.DefaultClient.Transport = originalTransport
	})

	ctx, cancel := context.WithTimeout(context.Background(), time.Minute)
	defer cancel()

	var args []string

	toolsFile := getCloudSQLWaitToolsConfig()
	cmd, cleanup, err := tests.StartCmd(ctx, toolsFile, args...)
	if err != nil {
		t.Fatalf("command initialization returned an error: %s", err)
	}
	defer cleanup()

	waitCtx, cancel := context.WithTimeout(ctx, 10*time.Second)
	defer cancel()
	out, err := testutils.WaitForString(waitCtx, regexp.MustCompile(`Server ready to serve`), cmd.Out)
	if err != nil {
		t.Logf("toolbox command logs: \n%s", out)
		t.Fatalf("toolbox didn't start successfully: %s", err)
	}

	tcs := []struct {
		name          string
		toolName      string
		body          string
		want          string
		expectError   bool
		wantSubstring bool
	}{
		{
			name:          "successful operation",
			toolName:      "wait-for-op1",
			body:          `{"project": "p1", "operation": "op1"}`,
			want:          "Your Cloud SQL resource is ready",
			wantSubstring: true,
		},
		{
			name:        "failed operation",
			toolName:    "wait-for-op2",
			body:        `{"project": "p1", "operation": "op2"}`,
			expectError: true,
		},
		{
			name:     "non-database create operation",
			toolName: "wait-for-op3",
			body:     `{"project": "p1", "operation": "op3"}`,
			want:     `{"name":"op3","status":"DONE","targetLink":"` + h.operations["op3"].TargetLink + `","operationType":"CREATE"}`,
		},
	}

	for _, tc := range tcs {
		t.Run(tc.name, func(t *testing.T) {
			api := fmt.Sprintf("http://127.0.0.1:5000/api/tool/%s/invoke", tc.toolName)
			req, err := http.NewRequest(http.MethodPost, api, bytes.NewBufferString(tc.body))
			if err != nil {
				t.Fatalf("unable to create request: %s", err)
			}
			req.Header.Add("Content-type", "application/json")
			resp, err := http.DefaultClient.Do(req)
			if err != nil {
				t.Fatalf("unable to send request: %s", err)
			}
			defer resp.Body.Close()

			if tc.expectError {
				if resp.StatusCode == http.StatusOK {
					t.Fatal("expected error but got status 200")
				}
				return
			}

			if resp.StatusCode != http.StatusOK {
				bodyBytes, _ := io.ReadAll(resp.Body)
				t.Fatalf("response status code is not 200, got %d: %s", resp.StatusCode, string(bodyBytes))
			}

			if tc.wantSubstring {
				var result struct {
					Result string `json:"result"`
				}
				if err := json.NewDecoder(resp.Body).Decode(&result); err != nil {
					t.Fatalf("failed to decode response: %v", err)
				}

				if !bytes.Contains([]byte(result.Result), []byte(tc.want)) {
					t.Fatalf("unexpected result: got %q, want substring %q", result.Result, tc.want)
				}
				return
			}

			var result struct {
				Result string `json:"result"`
			}
			if err := json.NewDecoder(resp.Body).Decode(&result); err != nil {
				t.Fatalf("failed to decode response: %v", err)
			}

			var tempString string
			if err := json.Unmarshal([]byte(result.Result), &tempString); err != nil {
				t.Fatalf("failed to unmarshal outer JSON string: %v", err)
			}

			var got, want map[string]any
			if err := json.Unmarshal([]byte(tempString), &got); err != nil {
				t.Fatalf("failed to unmarshal inner JSON object: %v", err)
			}

			if err := json.Unmarshal([]byte(tc.want), &want); err != nil {
				t.Fatalf("failed to unmarshal want: %v", err)
			}

			if !reflect.DeepEqual(got, want) {
				t.Fatalf("unexpected result: got %+v, want %+v", got, want)
			}
		})
	}
}

func getCloudSQLWaitToolsConfig() map[string]any {
	return map[string]any{
		"sources": map[string]any{
			"my-cloud-sql-source": map[string]any{
				"kind": "cloud-sql-admin",
			},
		},
		"tools": map[string]any{
			"wait-for-op1": map[string]any{
				"kind":        cloudsqlWaitToolKind,
				"source":      "my-cloud-sql-source",
				"description": "wait for op1",
			},
			"wait-for-op2": map[string]any{
				"kind":        cloudsqlWaitToolKind,
				"source":      "my-cloud-sql-source",
				"description": "wait for op2",
			},
			"wait-for-op3": map[string]any{
				"kind":        cloudsqlWaitToolKind,
				"source":      "my-cloud-sql-source",
				"description": "wait for op3",
			},
		},
	}
}

```

--------------------------------------------------------------------------------
/tests/oceanbase/oceanbase_integration_test.go:
--------------------------------------------------------------------------------

```go
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package oceanbase

import (
	"context"
	"database/sql"
	"fmt"
	"os"
	"regexp"
	"strings"
	"testing"
	"time"

	"github.com/google/uuid"
	"github.com/googleapis/genai-toolbox/internal/testutils"
	"github.com/googleapis/genai-toolbox/tests"

	_ "github.com/go-sql-driver/mysql"
)

var (
	OceanBaseSourceKind = "oceanbase"
	OceanBaseToolKind   = "oceanbase-sql"
	OceanBaseDatabase   = os.Getenv("OCEANBASE_DATABASE")
	OceanBaseHost       = os.Getenv("OCEANBASE_HOST")
	OceanBasePort       = os.Getenv("OCEANBASE_PORT")
	OceanBaseUser       = os.Getenv("OCEANBASE_USER")
	OceanBasePass       = os.Getenv("OCEANBASE_PASSWORD")
)

func getOceanBaseVars(t *testing.T) map[string]any {
	switch "" {
	case OceanBaseDatabase:
		t.Fatal("'OCEANBASE_DATABASE' not set")
	case OceanBaseHost:
		t.Fatal("'OCEANBASE_HOST' not set")
	case OceanBasePort:
		t.Fatal("'OCEANBASE_PORT' not set")
	case OceanBaseUser:
		t.Fatal("'OCEANBASE_USER' not set")
	case OceanBasePass:
		t.Fatal("'OCEANBASE_PASSWORD' not set")
	}

	return map[string]any{
		"kind":     OceanBaseSourceKind,
		"host":     OceanBaseHost,
		"port":     OceanBasePort,
		"database": OceanBaseDatabase,
		"user":     OceanBaseUser,
		"password": OceanBasePass,
	}
}

// Copied over from oceanbase.go
func initOceanBaseConnectionPool(host, port, user, pass, dbname string) (*sql.DB, error) {
	dsn := fmt.Sprintf("%s:%s@tcp(%s:%s)/%s?parseTime=true", user, pass, host, port, dbname)

	// Interact with the driver directly as you normally would
	pool, err := sql.Open("mysql", dsn)
	if err != nil {
		return nil, fmt.Errorf("sql.Open: %w", err)
	}
	return pool, nil
}

func TestOceanBaseToolEndpoints(t *testing.T) {
	sourceConfig := getOceanBaseVars(t)
	ctx, cancel := context.WithTimeout(context.Background(), time.Minute)
	defer cancel()

	var args []string

	pool, err := initOceanBaseConnectionPool(OceanBaseHost, OceanBasePort, OceanBaseUser, OceanBasePass, OceanBaseDatabase)
	if err != nil {
		t.Fatalf("unable to create OceanBase connection pool: %s", err)
	}

	// create table name with UUID
	tableNameParam := "param_table_" + strings.ReplaceAll(uuid.New().String(), "-", "")
	tableNameAuth := "auth_table_" + strings.ReplaceAll(uuid.New().String(), "-", "")
	tableNameTemplateParam := "template_param_table_" + strings.ReplaceAll(uuid.New().String(), "-", "")

	// set up data for param tool
	createParamTableStmt, insertParamTableStmt, paramToolStmt, idParamToolStmt, nameParamToolStmt, arrayToolStmt, paramTestParams := getOceanBaseParamToolInfo(tableNameParam)
	teardownTable1 := setupOceanBaseTable(t, ctx, pool, createParamTableStmt, insertParamTableStmt, tableNameParam, paramTestParams)
	defer teardownTable1(t)

	// set up data for auth tool
	createAuthTableStmt, insertAuthTableStmt, authToolStmt, authTestParams := getOceanBaseAuthToolInfo(tableNameAuth)
	teardownTable2 := setupOceanBaseTable(t, ctx, pool, createAuthTableStmt, insertAuthTableStmt, tableNameAuth, authTestParams)
	defer teardownTable2(t)

	// Write config into a file and pass it to command
	toolsFile := tests.GetToolsConfig(sourceConfig, OceanBaseToolKind, paramToolStmt, idParamToolStmt, nameParamToolStmt, arrayToolStmt, authToolStmt)
	toolsFile = addOceanBaseExecuteSqlConfig(t, toolsFile)
	tmplSelectCombined, tmplSelectFilterCombined := getOceanBaseTmplToolStatement()
	toolsFile = tests.AddTemplateParamConfig(t, toolsFile, OceanBaseToolKind, tmplSelectCombined, tmplSelectFilterCombined, "")

	cmd, cleanup, err := tests.StartCmd(ctx, toolsFile, args...)
	if err != nil {
		t.Fatalf("command initialization returned an error: %s", err)
	}
	defer cleanup()

	waitCtx, cancel := context.WithTimeout(ctx, 10*time.Second)
	defer cancel()
	out, err := testutils.WaitForString(waitCtx, regexp.MustCompile(`Server ready to serve`), cmd.Out)
	if err != nil {
		t.Logf("toolbox command logs: \n%s", out)
		t.Fatalf("toolbox didn't start successfully: %s", err)
	}

	// Get configs for tests
	select1Want, mcpMyFailToolWant, createTableStatement, mcpSelect1Want := getOceanBaseWants()

	// Run tests
	tests.RunToolGetTest(t)
	tests.RunToolInvokeTest(t, select1Want, tests.DisableArrayTest())
	tests.RunMCPToolCallMethod(t, mcpMyFailToolWant, mcpSelect1Want)
	tests.RunExecuteSqlToolInvokeTest(t, createTableStatement, select1Want)
	tests.RunToolInvokeWithTemplateParameters(t, tableNameTemplateParam)
}

// OceanBase specific parameter tool info
func getOceanBaseParamToolInfo(tableName string) (string, string, string, string, string, string, []any) {
	createStatement := fmt.Sprintf("CREATE TABLE %s (id INT NOT NULL AUTO_INCREMENT PRIMARY KEY, name VARCHAR(255));", tableName)
	insertStatement := fmt.Sprintf("INSERT INTO %s (name) VALUES (?), (?), (?), (?);", tableName)
	toolStatement := fmt.Sprintf("SELECT * FROM %s WHERE id = ? OR name = ?;", tableName)
	idParamStatement := fmt.Sprintf("SELECT * FROM %s WHERE id = ?;", tableName)
	nameParamStatement := fmt.Sprintf("SELECT * FROM %s WHERE name = ?;", tableName)
	arrayToolStatement := fmt.Sprintf("SELECT * FROM %s WHERE id = ANY(?) AND name = ANY(?);", tableName)
	params := []any{"Alice", "Jane", "Sid", nil}
	return createStatement, insertStatement, toolStatement, idParamStatement, nameParamStatement, arrayToolStatement, params
}

// OceanBase specific auth tool info
func getOceanBaseAuthToolInfo(tableName string) (string, string, string, []any) {
	createStatement := fmt.Sprintf("CREATE TABLE %s (id INT NOT NULL AUTO_INCREMENT PRIMARY KEY, name VARCHAR(255), email VARCHAR(255));", tableName)
	insertStatement := fmt.Sprintf("INSERT INTO %s (name, email) VALUES (?, ?), (?, ?)", tableName)
	toolStatement := fmt.Sprintf("SELECT name FROM %s WHERE email = ?;", tableName)
	params := []any{"Alice", tests.ServiceAccountEmail, "Jane", "[email protected]"}
	return createStatement, insertStatement, toolStatement, params
}

// OceanBase specific template tool statements
func getOceanBaseTmplToolStatement() (string, string) {
	tmplSelectCombined := "SELECT * FROM {{.tableName}} WHERE id = ?"
	tmplSelectFilterCombined := "SELECT * FROM {{.tableName}} WHERE {{.columnFilter}} = ?"
	return tmplSelectCombined, tmplSelectFilterCombined
}

// OceanBase specific expected results
func getOceanBaseWants() (string, string, string, string) {
	select1Want := "[{\"1\":1}]"
	mcpMyFailToolWant := `{"jsonrpc":"2.0","id":"invoke-fail-tool","result":{"content":[{"type":"text","text":"unable to execute query: Error 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your OceanBase version for the right syntax to use near 'SELEC 1;' at line 1"}],"isError":true}}`
	createTableStatement := `"CREATE TABLE t (id INT NOT NULL AUTO_INCREMENT PRIMARY KEY, name VARCHAR(255))"`
	mcpSelect1Want := `{"jsonrpc":"2.0","id":"invoke my-auth-required-tool","result":{"content":[{"type":"text","text":"{\"1\":1}"}]}}`
	return select1Want, mcpMyFailToolWant, createTableStatement, mcpSelect1Want
}

// Add OceanBase Execute SQL configuration
func addOceanBaseExecuteSqlConfig(t *testing.T, config map[string]any) map[string]any {
	tools, ok := config["tools"].(map[string]any)
	if !ok {
		t.Fatalf("unable to get tools from config")
	}
	tools["my-exec-sql-tool"] = map[string]any{
		"kind":        "oceanbase-execute-sql",
		"source":      "my-instance",
		"description": "Tool to execute sql",
	}
	tools["my-auth-exec-sql-tool"] = map[string]any{
		"kind":        "oceanbase-execute-sql",
		"source":      "my-instance",
		"description": "Tool to execute sql",
		"authRequired": []string{
			"my-google-auth",
		},
	}
	config["tools"] = tools
	return config
}

// Setup OceanBase table
func setupOceanBaseTable(t *testing.T, ctx context.Context, pool *sql.DB, createStatement, insertStatement, tableName string, params []any) func(*testing.T) {
	err := pool.PingContext(ctx)
	if err != nil {
		t.Fatalf("unable to connect to test database: %s", err)
	}

	// Create table
	_, err = pool.QueryContext(ctx, createStatement)
	if err != nil {
		t.Fatalf("unable to create test table %s: %s", tableName, err)
	}

	// Insert test data
	_, err = pool.QueryContext(ctx, insertStatement, params...)
	if err != nil {
		t.Fatalf("unable to insert test data: %s", err)
	}

	return func(t *testing.T) {
		// tear down test
		_, err = pool.ExecContext(ctx, fmt.Sprintf("DROP TABLE %s;", tableName))
		if err != nil {
			t.Errorf("Teardown failed: %s", err)
		}
	}
}

```

--------------------------------------------------------------------------------
/docs/en/how-to/connect-ide/cloud_sql_pg_admin_mcp.md:
--------------------------------------------------------------------------------

```markdown
---
title: "Cloud SQL for PostgreSQL Admin using MCP"
type: docs
weight: 3
description: >
  Create and manage Cloud SQL for PostgreSQL (Admin) using Toolbox.
---

This guide covers how to use [MCP Toolbox for Databases][toolbox] to expose your
developer assistant tools to create and manage Cloud SQL for PostgreSQL
instances, databases, and users:

* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline]  (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
* [Gemini CLI][geminicli]
* [Gemini Code Assist][geminicodeassist]

[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client

## Before you begin

1. In the Google Cloud console, on the [project selector
   page](https://console.cloud.google.com/projectselector2/home/dashboard),
   select or create a Google Cloud project.

1. [Make sure that billing is enabled for your Google Cloud
   project](https://cloud.google.com/billing/docs/how-to/verify-billing-enabled#confirm_billing_is_enabled_on_a_project).

1. Grant the necessary IAM roles to the user who will be running the MCP
   server; a sample `gcloud` command for this follows the list below. The tools
   available will depend on the roles granted:
    * `roles/cloudsql.viewer`: Provides read-only access to resources.
        * `get_instance`
        * `list_instances`
        * `list_databases`
        * `wait_for_operation`
    * `roles/cloudsql.editor`: Provides permissions to manage existing resources.
        * All `viewer` tools
        * `create_database`
    * `roles/cloudsql.admin`: Provides full control over all resources.
        * All `editor` and `viewer` tools
        * `create_instance`
        * `create_user`

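   As a minimal illustration (assuming the `gcloud` CLI is installed and
   authenticated, and substituting your own project ID and user email), one of
   these roles can be granted like so:

    ```bash
    # Grants the broadest role listed above; use roles/cloudsql.viewer or
    # roles/cloudsql.editor instead for more restricted access.
    gcloud projects add-iam-policy-binding PROJECT_ID \
      --member="user:USER_EMAIL" \
      --role="roles/cloudsql.admin"
    ```
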
## Install MCP Toolbox

1. Download the latest version of Toolbox as a binary. Select the [correct
   binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
   to your OS and CPU architecture. You must use Toolbox version v0.15.0 or
   later:

   <!-- {x-release-please-start-version} -->
   {{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/linux/amd64/toolbox
{{< /tab >}}

{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/arm64/toolbox
{{< /tab >}}

{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/amd64/toolbox
{{< /tab >}}

{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
    <!-- {x-release-please-end} -->

1. Make the binary executable:

    ```bash
    chmod +x toolbox
    ```

1. Verify the installation:

    ```bash
    ./toolbox --version
    ```

## Configure your MCP Client

{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}

1. Install [Claude
   Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-postgres-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-postgres-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

1. Restart Claude code to apply the new configuration.
{{% /tab %}}

{{% tab header="Claude desktop" lang="en" %}}

1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-postgres-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-postgres-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
   new MCP server available.
{{% /tab %}}

{{% tab header="Cline" lang="en" %}}

1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap
   the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-postgres-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-postgres-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

1. You should see a green active status after the server is successfully
   connected.
{{% /tab %}}

{{% tab header="Cursor" lang="en" %}}

1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-postgres-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-postgres-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
   Settings > MCP**. You should see a green active status after the server is
   successfully connected.
{{% /tab %}}

{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}

1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
   create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration and save:

    ```json
    {
      "servers": {
        "cloud-sql-postgres-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-postgres-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

{{% /tab %}}

{{% tab header="Windsurf" lang="en" %}}

1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
   Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-postgres-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-postgres-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```

{{% /tab %}}

{{% tab header="Gemini CLI" lang="en" %}}

1.  Install the [Gemini
    CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1.  In your working directory, create a folder named `.gemini`. Within it,
    create a `settings.json` file.
1.  Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-postgres-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-postgres-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```
{{% /tab %}}

{{% tab header="Gemini Code Assist" lang="en" %}}

1.  Install the [Gemini Code
    Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist)
    extension in Visual Studio Code.
1.  Enable Agent Mode in Gemini Code Assist chat.
1.  In your working directory, create a folder named `.gemini`. Within it,
    create a `settings.json` file.
1.  Add the following configuration and save:

    ```json
    {
      "mcpServers": {
        "cloud-sql-postgres-admin": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","cloud-sql-postgres-admin","--stdio"],
          "env": {
          }
        }
      }
    }
    ```
{{% /tab %}}
{{< /tabpane >}}

## Use Tools

Your AI tool is now connected to Cloud SQL for PostgreSQL using MCP.

The `cloud-sql-postgres-admin` server provides tools for managing your Cloud SQL
instances and interacting with your database:
* **create_instance**: Creates a new Cloud SQL for PostgreSQL instance.
* **get_instance**: Gets information about a Cloud SQL instance.
* **list_instances**: Lists Cloud SQL instances in a project.
* **create_database**: Creates a new database in a Cloud SQL instance.
* **list_databases**: Lists all databases for a Cloud SQL instance.
* **create_user**: Creates a new user in a Cloud SQL instance.
* **wait_for_operation**: Waits for a Cloud SQL operation to complete.

{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}

```

--------------------------------------------------------------------------------
/docs/en/resources/authServices/_index.md:
--------------------------------------------------------------------------------

```markdown
---
title: "AuthServices"
type: docs
weight: 1
description: >
  AuthServices represent services that handle authentication and authorization.
---

AuthServices represent services that handle authentication and authorization. They
can primarily be used by [Tools](../tools/) in two different ways:

- [**Authorized Invocation**][auth-invoke] is when a tool
  is validated by the auth service before the call can be invoked. Toolbox
  will reject any calls that fail to validate or have an invalid token.
- [**Authenticated Parameters**][auth-params] replace the value of a parameter
  with a field from an [OIDC][openid-claims] claim. Toolbox will automatically
  resolve the ID token provided by the client and replace the parameter in the
  tool call.

[openid-claims]: https://openid.net/specs/openid-connect-core-1_0.html#StandardClaims
[auth-invoke]: ../tools/#authorized-invocations
[auth-params]: ../tools/#authenticated-parameters

## Example

The following configurations are placed at the top level of a `tools.yaml` file.

{{< notice tip >}}
If you are accessing Toolbox with multiple applications, each
 application should register their own Client ID even if they use the same
 "kind" of auth provider.
{{< /notice >}}

```yaml
authServices:
  my_auth_app_1:
    kind: google
    clientId: ${YOUR_CLIENT_ID_1}
  my_auth_app_2:
    kind: google
    clientId: ${YOUR_CLIENT_ID_2}
```

{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
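
For example, you might export the referenced variables in your shell before
starting Toolbox; the values below are placeholders, and `--tools-file` points
Toolbox at the configuration file:

```bash
# Placeholder OAuth client IDs; replace with the Client IDs registered for
# each application.
export YOUR_CLIENT_ID_1="app-1-client-id.apps.googleusercontent.com"
export YOUR_CLIENT_ID_2="app-2-client-id.apps.googleusercontent.com"
./toolbox --tools-file tools.yaml
```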

After you've configured an `authService`, you'll need to reference it in the
configuration for each tool that should use it:

- **Authorized Invocations** for authorizing a tool call, [use the
  `authRequired` field in a tool config][auth-invoke]
- **Authenticated Parameters** for using the value from an OIDC claim, [use the
  `authServices` field in a parameter config][auth-params]

## Specifying ID Tokens from Clients

After [configuring](#example) your `authServices` section, use a Toolbox SDK to
add your ID tokens to the header of a Tool invocation request. When specifying a
token, you provide a function that returns an ID token. This function is called
when the tool is invoked, which allows you to cache and refresh the ID token as
needed.

The primary way to provide these getters is via the `auth_token_getters`
parameter when loading tools, or the `add_auth_token_getter()` /
`add_auth_token_getters()` methods on a loaded tool object.

### Specifying tokens during load

#### Python

Use the [Python SDK](https://github.com/googleapis/mcp-toolbox-sdk-python/tree/main).

{{< tabpane persist=header >}}
{{< tab header="Core" lang="Python" >}}
import asyncio
from toolbox_core import ToolboxClient

async def get_auth_token():
    # ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
    # This example just returns a placeholder. Replace with your actual token retrieval.
    return "YOUR_ID_TOKEN" # Placeholder

async def main():
    async with ToolboxClient("http://127.0.0.1:5000") as toolbox:
        auth_tool = await toolbox.load_tool(
            "get_sensitive_data",
            auth_token_getters={"my_auth_app_1": get_auth_token}
        )
        result = await auth_tool(param="value")
        print(result)

if __name__ == "__main__":
    asyncio.run(main())
{{< /tab >}}
{{< tab header="LangChain" lang="Python" >}}
import asyncio
from toolbox_langchain import ToolboxClient

async def get_auth_token():
    # ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
    # This example just returns a placeholder. Replace with your actual token retrieval.
    return "YOUR_ID_TOKEN" # Placeholder

async def main():
    toolbox = ToolboxClient("http://127.0.0.1:5000")

    auth_tool = await toolbox.aload_tool(
        "get_sensitive_data",
        auth_token_getters={"my_auth_app_1": get_auth_token}
    )
    result = await auth_tool.ainvoke({"param": "value"})
    print(result)

if __name__ == "__main__":
    asyncio.run(main())
{{< /tab >}}
{{< tab header="Llamaindex" lang="Python" >}}
import asyncio
from toolbox_llamaindex import ToolboxClient

async def get_auth_token():
    # ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
    # This example just returns a placeholder. Replace with your actual token retrieval.
    return "YOUR_ID_TOKEN" # Placeholder

async def main():
    toolbox = ToolboxClient("http://127.0.0.1:5000")

    auth_tool = await toolbox.aload_tool(
        "get_sensitive_data",
        auth_token_getters={"my_auth_app_1": get_auth_token}
    )
    # result = await auth_tool.acall(param="value")
    # print(result.content)

if __name__ == "__main__":
    asyncio.run(main())
{{< /tab >}}
{{< /tabpane >}}

#### Javascript/Typescript

Use the [JS SDK](https://github.com/googleapis/mcp-toolbox-sdk-js/tree/main).

```javascript
import { ToolboxClient } from '@toolbox-sdk/core';

async function getAuthToken() {
    // ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
    // This example just returns a placeholder. Replace with your actual token retrieval.
    return "YOUR_ID_TOKEN" // Placeholder
}

const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
const authTool = await client.loadTool("my-tool", {"my_auth_app_1": getAuthToken});
const result = await authTool({param:"value"});
console.log(result);
```

#### Go

Use the [Go SDK](https://github.com/googleapis/mcp-toolbox-sdk-go/tree/main).

```go
import "github.com/googleapis/mcp-toolbox-sdk-go/core"
import "fmt"

func getAuthToken() string {
	// ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
	// This example just returns a placeholder. Replace with your actual token retrieval.
	return "YOUR_ID_TOKEN" // Placeholder
}

func main() {
	ctx := context.Background()
	URL := "http://127.0.0.1:5000"
	client, err := core.NewToolboxClient(URL)
	if err != nil {
		log.Fatalf("Failed to create Toolbox client: %v", err)
	}
	dynamicTokenSource := core.NewCustomTokenSource(getAuthToken)
	authTool, err := client.LoadTool(
		"my-tool",
		ctx,
		core.WithAuthTokenSource("my_auth_app_1", dynamicTokenSource))
	if err != nil {
		log.Fatalf("Failed to load tool: %v", err)
	}
	inputs := map[string]any{"param": "value"}
	result, err := authTool.Invoke(ctx, inputs)
	if err != nil {
		log.Fatalf("Failed to invoke tool: %v", err)
	}
	fmt.Println(result)
}
```

### Specifying tokens for existing tools

#### Python

Use the [Python
SDK](https://github.com/googleapis/mcp-toolbox-sdk-python/tree/main).

{{< tabpane persist=header >}}
{{< tab header="Core" lang="Python" >}}
tools = await toolbox.load_toolset()

# for a single token

authorized_tool = tools[0].add_auth_token_getter("my_auth", get_auth_token)

# OR, if multiple tokens are needed

authorized_tool = tools[0].add_auth_token_getters({
  "my_auth1": get_auth1_token,
  "my_auth2": get_auth2_token,
})
{{< /tab >}}
{{< tab header="LangChain" lang="Python" >}}
tools = toolbox.load_toolset()

# for a single token

authorized_tool = tools[0].add_auth_token_getter("my_auth", get_auth_token)

# OR, if multiple tokens are needed

authorized_tool = tools[0].add_auth_token_getters({
  "my_auth1": get_auth1_token,
  "my_auth2": get_auth2_token,
})
{{< /tab >}}
{{< tab header="Llamaindex" lang="Python" >}}
tools = toolbox.load_toolset()

# for a single token

authorized_tool = tools[0].add_auth_token_getter("my_auth", get_auth_token)

# OR, if multiple tokens are needed

authorized_tool = tools[0].add_auth_token_getters({
  "my_auth1": get_auth1_token,
  "my_auth2": get_auth2_token,
})
{{< /tab >}}
{{< /tabpane >}}

#### Javascript/Typescript

Use the [JS SDK](https://github.com/googleapis/mcp-toolbox-sdk-js/tree/main).

```javascript
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
let tool = await client.loadTool("my-tool")

// for a single token
const authorizedTool = tool.addAuthTokenGetter("my_auth", get_auth_token)

// OR, if multiple tokens are needed
const multiAuthTool = tool.addAuthTokenGetters({
    "my_auth_1": getAuthToken1,
    "my_auth_2": getAuthToken2,
})

```

#### Go

Use the [Go SDK](https://github.com/googleapis/mcp-toolbox-sdk-go/tree/main).

```go
import "github.com/googleapis/mcp-toolbox-sdk-go/core"

func main() {
	ctx := context.Background()
	URL := "http://127.0.0.1:5000"
	client, err := core.NewToolboxClient(URL)
	if err != nil {
		log.Fatalf("Failed to create Toolbox client: %v", err)
	}
	tool, err := client.LoadTool("my-tool", ctx)
	if err != nil {
		log.Fatalf("Failed to load tool: %v", err)
	}
	dynamicTokenSource1 := core.NewCustomTokenSource(getAuthToken1)
	dynamicTokenSource2 := core.NewCustomTokenSource(getAuthToken2)

	// For a single token
	authTool, err := tool.ToolFrom(
		core.WithAuthTokenSource("my-auth", dynamicTokenSource1),
	)

	// OR, if multiple tokens are needed
	multiAuthTool, err := tool.ToolFrom(
		core.WithAuthTokenSource("my-auth_1", dynamicTokenSource1),
		core.WithAuthTokenSource("my-auth_2", dynamicTokenSource2),
	)
}
```

## Kinds of Auth Services

```

--------------------------------------------------------------------------------
/docs/en/how-to/connect-ide/neo4j_mcp.md:
--------------------------------------------------------------------------------

```markdown
---
title: Neo4j using MCP
type: docs
weight: 2
description: "Connect your IDE to Neo4j using Toolbox."
---

[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is
an open protocol for connecting Large Language Models (LLMs) to data sources
like Neo4j. This guide covers how to use [MCP Toolbox for Databases][toolbox] to
expose your developer assistant tools to a Neo4j instance:

* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
* [Gemini CLI][geminicli]
* [Gemini Code Assist][geminicodeassist]

[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client

## Set up the database

1.  [Create or select a Neo4j
    instance.](https://neo4j.com/cloud/platform/aura-graph-database/)

## Install MCP Toolbox

1. Download the latest version of Toolbox as a binary. Select the [correct
   binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
   to your OS and CPU architecture. You must use Toolbox version v0.15.0 or
   later:

   <!-- {x-release-please-start-version} -->
   {{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.18.0/linux/amd64/toolbox
{{< /tab >}}

{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.18.0/darwin/arm64/toolbox
{{< /tab >}}

{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.18.0/darwin/amd64/toolbox
{{< /tab >}}

{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.18.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
    <!-- {x-release-please-end} -->

1. Make the binary executable:

    ```bash
    chmod +x toolbox
    ```

1. Verify the installation:

    ```bash
    ./toolbox --version
    ```

## Configure your MCP Client

{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}

1.  Install [Claude
    Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview). 
1.  Create a `.mcp.json` file in your project root if it doesn't exist.
1.  Add the following configuration, replace the environment variables with your
    values, and save: 

    ```json
    {
      "mcpServers": {
        "neo4j": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","neo4j","--stdio"],
          "env": {
            "NEO4J_URI": "",
            "NEO4J_DATABASE": "",
            "NEO4J_USERNAME": "",
            "NEO4J_PASSWORD": ""
          }
        }
      }
    }
    ```

1.  Restart Claude code to apply the new configuration.
{{% /tab %}}
{{% tab header="Claude desktop" lang="en" %}}

1.  Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1.  Under the Developer tab, tap Edit Config to open the configuration file.
1.  Add the following configuration, replace the environment variables with your
    values, and save: 

    ```json
    {
      "mcpServers": {
        "neo4j": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","neo4j","--stdio"],
          "env": {
            "NEO4J_URI": "",
            "NEO4J_DATABASE": "",
            "NEO4J_USERNAME": "",
            "NEO4J_PASSWORD": ""
          }
        }
      }
    }
    ```

1.  Restart Claude desktop.
1.  From the new chat screen, you should see a hammer (MCP) icon appear with the
    new MCP server available. 
{{% /tab %}}
{{% tab header="Cline" lang="en" %}}

1.  Open the [Cline](https://github.com/cline/cline) extension in VS Code and
    tap the **MCP Servers** icon. 
1.  Tap Configure MCP Servers to open the configuration file.
1.  Add the following configuration, replace the environment variables with your
    values, and save: 

    ```json
    {
      "mcpServers": {
        "neo4j": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","neo4j","--stdio"],
          "env": {
            "NEO4J_URI": "",
            "NEO4J_DATABASE": "",
            "NEO4J_USERNAME": "",
            "NEO4J_PASSWORD": ""
          }
        }
      }
    }
    ```

1.  You should see a green active status after the server is successfully connected.
{{% /tab %}}
{{% tab header="Cursor" lang="en" %}}

1.  Create a `.cursor` directory in your project root if it doesn't exist.
1.  Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1.  Add the following configuration, replace the environment variables with your values, and save:

    ```json
    {
      "mcpServers": {
        "neo4j": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","neo4j","--stdio"],
          "env": {
            "NEO4J_URI": "",
            "NEO4J_DATABASE": "",
            "NEO4J_USERNAME": "",
            "NEO4J_PASSWORD": ""
          }
        }
      }
    }
    ```

1.  Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
    Settings > MCP**. You should see a green active status after the server is
    successfully connected.
{{% /tab %}}
{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}

1.  Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
    create a `.vscode` directory in your project root if it doesn't exist.
1.  Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1.  Add the following configuration, replace the environment variables with your
    values, and save:

    ```json
    {
    "mcp" : {
        "servers": {
        "neo4j": {
            "command": "./PATH/TO/toolbox",
            "args": ["--prebuilt","neo4j","--stdio"],
            "env": {
              "NEO4J_URI": "",
              "NEO4J_DATABASE": "",
              "NEO4J_USERNAME": "",
              "NEO4J_PASSWORD": ""
            }
         }
        }
      }
    }
    ```
{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}

1.  Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
    Cascade assistant.
1.  Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1.  Add the following configuration, replace the environment variables with your
    values, and save:

    ```json
    {
      "mcpServers": {
        "neo4j": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","neo4j","--stdio"],
          "env": {
            "NEO4J_URI": "",
            "NEO4J_DATABASE": "",
            "NEO4J_USERNAME": "",
            "NEO4J_PASSWORD": ""
          }
        }
      }
    }
    ```
{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}

1.  Install the [Gemini
    CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1.  In your working directory, create a folder named `.gemini`. Within it,
    create a `settings.json` file.
1.  Add the following configuration, replace the environment variables with your
    values, and then save:

    ```json
    {
      "mcpServers": {
        "neo4j": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","neo4j","--stdio"],
          "env": {
            "NEO4J_URI": "",
            "NEO4J_DATABASE": "",
            "NEO4J_USERNAME": "",
            "NEO4J_PASSWORD": ""
          }
        }
      }
    }
    ```
{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}

1.  Install the [Gemini Code
    Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist)
    extension in Visual Studio Code.
1.  Enable Agent Mode in Gemini Code Assist chat.
1.  In your working directory, create a folder named `.gemini`. Within it,
    create a `settings.json` file.
1.  Add the following configuration, replace the environment variables with your
    values, and then save:

    ```json
    {
      "mcpServers": {
        "neo4j": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","neo4j","--stdio"],
          "env": {
            "NEO4J_URI": "",
            "NEO4J_DATABASE": "",
            "NEO4J_USERNAME": "",
            "NEO4J_PASSWORD": ""
          }
        }
      }
    }
    ```
{{% /tab %}}
{{< /tabpane >}}

## Use Tools

Your AI tool is now connected to Neo4j using MCP. Try asking your AI assistant
to get the graph schema or execute Cypher statements.

The following tools are available to the LLM:

1.  **get_schema**: extracts the complete database schema, including details
    about node labels, relationships, properties, constraints, and indexes.
1.  **execute_cypher**: executes any arbitrary Cypher statement.

{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}
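
Under the hood, your MCP client calls these tools with standard JSON-RPC
`tools/call` requests over stdio. A minimal sketch of such a request for
`get_schema` (assuming the tool takes no arguments) looks roughly like this:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_schema",
    "arguments": {}
  }
}
```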

```

--------------------------------------------------------------------------------
/docs/en/how-to/connect-ide/mysql_mcp.md:
--------------------------------------------------------------------------------

```markdown
---
title: MySQL using MCP
type: docs
weight: 2
description: "Connect your IDE to MySQL using Toolbox."
---

[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is
an open protocol for connecting Large Language Models (LLMs) to data sources
like MySQL. This guide covers how to use [MCP Toolbox for Databases][toolbox] to
expose your developer assistant tools to a MySQL instance:

* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
* [Gemini CLI][geminicli]
* [Gemini Code Assist][geminicodeassist]

[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client

## Set up the database

1.  [Create or select a MySQL instance.](https://dev.mysql.com/downloads/installer/)

## Install MCP Toolbox

1. Download the latest version of Toolbox as a binary. Select the [correct
   binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
   to your OS and CPU architecture. You are required to use Toolbox version
   V0.10.0+:

   <!-- {x-release-please-start-version} -->
   {{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.18.0/linux/amd64/toolbox
{{< /tab >}}

{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.18.0/darwin/arm64/toolbox
{{< /tab >}}

{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.18.0/darwin/amd64/toolbox
{{< /tab >}}

{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.18.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
    <!-- {x-release-please-end} -->

1. Make the binary executable:

    ```bash
    chmod +x toolbox
    ```

1. Verify the installation:

    ```bash
    ./toolbox --version
    ```

## Configure your MCP Client

{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}

1.  Install [Claude
    Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1.  Create a `.mcp.json` file in your project root if it doesn't exist.
1.  Add the following configuration, replace the environment variables with your
    values, and save:

    ```json
    {
      "mcpServers": {
        "mysql": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt", "mysql", "--stdio"],
          "env": {
            "MYSQL_HOST": "",
            "MYSQL_PORT": "",
            "MYSQL_DATABASE": "",
            "MYSQL_USER": "",
            "MYSQL_PASSWORD": ""
          }
        }
      }
    }
    ```

1.  Restart Claude code to apply the new configuration.
{{% /tab %}}
{{% tab header="Claude desktop" lang="en" %}}

1.  Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1.  Under the Developer tab, tap Edit Config to open the configuration file.
1.  Add the following configuration, replace the environment variables with your
    values, and save:

    ```json
    {
      "mcpServers": {
        "mysql": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt", "mysql", "--stdio"],
          "env": {
            "MYSQL_HOST": "",
            "MYSQL_PORT": "",
            "MYSQL_DATABASE": "",
            "MYSQL_USER": "",
            "MYSQL_PASSWORD": ""
          }
        }
      }
    }
    ```

1.  Restart Claude desktop.
1.  From the new chat screen, you should see a hammer (MCP) icon appear with the
    new MCP server available.
{{% /tab %}}
{{% tab header="Cline" lang="en" %}}

1.  Open the [Cline](https://github.com/cline/cline) extension in VS Code and
    tap the **MCP Servers** icon.
1.  Tap Configure MCP Servers to open the configuration file.
1.  Add the following configuration, replace the environment variables with your
    values, and save:

    ```json
    {
      "mcpServers": {
        "mysql": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt", "mysql", "--stdio"],
          "env": {
            "MYSQL_HOST": "",
            "MYSQL_PORT": "",
            "MYSQL_DATABASE": "",
            "MYSQL_USER": "",
            "MYSQL_PASSWORD": ""
          }
        }
      }
    }
    ```

1.  You should see a green active status after the server is successfully
    connected.
{{% /tab %}}
{{% tab header="Cursor" lang="en" %}}

1.  Create a `.cursor` directory in your project root if it doesn't exist.
1.  Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1.  Add the following configuration, replace the environment variables with your
    values, and save:

    ```json
    {
      "mcpServers": {
        "mysql": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt", "mysql", "--stdio"],
          "env": {
            "MYSQL_HOST": "",
            "MYSQL_PORT": "",
            "MYSQL_DATABASE": "",
            "MYSQL_USER": "",
            "MYSQL_PASSWORD": ""
          }
        }
      }
    }
    ```

1.  Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
    Settings > MCP**. You should see a green active status after the server is
    successfully connected.
{{% /tab %}}
{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}

1.  Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
    create a `.vscode` directory in your project root if it doesn't exist.
1.  Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1.  Add the following configuration, replace the environment variables with your
    values, and save:

    ```json
    {
      "servers": {
        "mysql": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","mysql","--stdio"],
          "env": {
            "MYSQL_HOST": "",
            "MYSQL_PORT": "",
            "MYSQL_DATABASE": "",
            "MYSQL_USER": "",
            "MYSQL_PASSWORD": ""
          }
        }
      }
    }
    ```
{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}

1.  Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
    Cascade assistant.
1.  Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1.  Add the following configuration, replace the environment variables with your
    values, and save:

    ```json
    {
      "mcpServers": {
        "mysql": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","mysql","--stdio"],
          "env": {
            "MYSQL_HOST": "",
            "MYSQL_PORT": "",
            "MYSQL_DATABASE": "",
            "MYSQL_USER": "",
            "MYSQL_PASSWORD": ""
          }
        }
      }
    }
    ```
{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}

1.  Install the [Gemini
    CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1.  In your working directory, create a folder named `.gemini`. Within it,
    create a `settings.json` file.
1.  Add the following configuration, replace the environment variables with your
    values, and then save:

    ```json
    {
      "mcpServers": {
        "mysql": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","mysql","--stdio"],
          "env": {
            "MYSQL_HOST": "",
            "MYSQL_PORT": "",
            "MYSQL_DATABASE": "",
            "MYSQL_USER": "",
            "MYSQL_PASSWORD": ""
          }
        }
      }
    }
    ```
{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}

1.  Install the [Gemini Code
    Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist)
    extension in Visual Studio Code.
1.  Enable Agent Mode in Gemini Code Assist chat.
1.  In your working directory, create a folder named `.gemini`. Within it,
    create a `settings.json` file.
1.  Add the following configuration, replace the environment variables with your
    values, and then save:

    ```json
    {
      "mcpServers": {
        "mysql": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","mysql","--stdio"],
          "env": {
            "MYSQL_HOST": "",
            "MYSQL_PORT": "",
            "MYSQL_DATABASE": "",
            "MYSQL_USER": "",
            "MYSQL_PASSWORD": ""
          }
        }
      }
    }
    ```
{{% /tab %}}
{{< /tabpane >}}

## Use Tools

Your AI tool is now connected to MySQL using MCP. Try asking your AI assistant
to list tables, create a table, or define and execute other SQL statements.

The following tools are available to the LLM:

1.  **list_tables**: lists tables and descriptions
1.  **execute_sql**: executes any SQL statement

{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}
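
For reference, when your assistant runs one of these tools, the MCP client
sends a JSON-RPC `tools/call` request over stdio. A rough sketch for
`execute_sql` (the `sql` argument name here is illustrative and may differ
from the prebuilt tool's actual parameter name):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "execute_sql",
    "arguments": { "sql": "SELECT 1" }
  }
}
```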

```

--------------------------------------------------------------------------------
/docs/en/how-to/connect-ide/mssql_mcp.md:
--------------------------------------------------------------------------------

```markdown
---
title: SQL Server using MCP
type: docs
weight: 2
description: "Connect your IDE to SQL Server using Toolbox."
---

[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is
an open protocol for connecting Large Language Models (LLMs) to data sources
like SQL Server. This guide covers how to use [MCP Toolbox for
Databases][toolbox] to expose your developer assistant tools to a SQL Server
instance:

* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
* [Gemini CLI][geminicli]
* [Gemini Code Assist][geminicodeassist]

[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client

## Set up the database

1.  [Create or select a SQL Server
    instance.](https://www.microsoft.com/en-us/sql-server/sql-server-downloads)

## Install MCP Toolbox

1. Download the latest version of Toolbox as a binary. Select the [correct
   binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
   to your OS and CPU architecture. You are required to use Toolbox version
   V0.10.0+:

   <!-- {x-release-please-start-version} -->
   {{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.18.0/linux/amd64/toolbox
{{< /tab >}}

{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.18.0/darwin/arm64/toolbox
{{< /tab >}}

{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.18.0/darwin/amd64/toolbox
{{< /tab >}}

{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.18.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
    <!-- {x-release-please-end} -->

1. Make the binary executable:

    ```bash
    chmod +x toolbox
    ```

1. Verify the installation:

    ```bash
    ./toolbox --version
    ```

## Configure your MCP Client

{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}

1.  Install [Claude
    Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1.  Create a `.mcp.json` file in your project root if it doesn't exist.
1.  Add the following configuration, replace the environment variables with your
    values, and save:

    ```json
    {
      "mcpServers": {
        "sqlserver": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","mssql","--stdio"],
          "env": {
            "MSSQL_HOST": "",
            "MSSQL_PORT": "",
            "MSSQL_DATABASE": "",
            "MSSQL_USER": "",
            "MSSQL_PASSWORD": ""
          }
        }
      }
    }
    ```

1.  Restart Claude code to apply the new configuration.
{{% /tab %}}
{{% tab header="Claude desktop" lang="en" %}}

1.  Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1.  Under the Developer tab, tap Edit Config to open the configuration file.
1.  Add the following configuration, replace the environment variables with your
    values, and save:

    ```json
    {
      "mcpServers": {
        "sqlserver": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","mssql","--stdio"],
          "env": {
            "MSSQL_HOST": "",
            "MSSQL_PORT": "",
            "MSSQL_DATABASE": "",
            "MSSQL_USER": "",
            "MSSQL_PASSWORD": ""
          }
        }
      }
    }
    ```

1.  Restart Claude desktop.
1.  From the new chat screen, you should see a hammer (MCP) icon appear with the
    new MCP server available.
{{% /tab %}}
{{% tab header="Cline" lang="en" %}}

1.  Open the [Cline](https://github.com/cline/cline) extension in VS Code and
    tap the **MCP Servers** icon.
1.  Tap Configure MCP Servers to open the configuration file.
1.  Add the following configuration, replace the environment variables with your
    values, and save:

    ```json
    {
      "mcpServers": {
        "sqlserver": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","mssql","--stdio"],
          "env": {
            "MSSQL_HOST": "",
            "MSSQL_PORT": "",
            "MSSQL_DATABASE": "",
            "MSSQL_USER": "",
            "MSSQL_PASSWORD": ""
          }
        }
      }
    }
    ```

1.  You should see a green active status after the server is successfully
    connected.
{{% /tab %}}
{{% tab header="Cursor" lang="en" %}}

1.  Create a `.cursor` directory in your project root if it doesn't exist.
1.  Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1.  Add the following configuration, replace the environment variables with your
    values, and save:

    ```json
    {
      "mcpServers": {
        "sqlserver": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","mssql","--stdio"],
          "env": {
            "MSSQL_HOST": "",
            "MSSQL_PORT": "",
            "MSSQL_DATABASE": "",
            "MSSQL_USER": "",
            "MSSQL_PASSWORD": ""
          }
        }
      }
    }
    ```

1.  Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
    Settings > MCP**. You should see a green active status after the server is
    successfully connected.
{{% /tab %}}
{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}

1.  Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
    create a `.vscode` directory in your project root if it doesn't exist.
1.  Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1.  Add the following configuration, replace the environment variables with your
    values, and save:

    ```json
    {
      "servers": {
        "mssql": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","mssql","--stdio"],
          "env": {
            "MSSQL_HOST": "",
            "MSSQL_PORT": "",
            "MSSQL_DATABASE": "",
            "MSSQL_USER": "",
            "MSSQL_PASSWORD": ""
          }
        }
      }
    }
    ```
{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}

1.  Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
    Cascade assistant.
1.  Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1.  Add the following configuration, replace the environment variables with your
    values, and save:

    ```json
    {
      "mcpServers": {
        "sqlserver": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","mssql","--stdio"],
          "env": {
            "MSSQL_HOST": "",
            "MSSQL_PORT": "",
            "MSSQL_DATABASE": "",
            "MSSQL_USER": "",
            "MSSQL_PASSWORD": ""
          }
        }
      }
    }
    ```
{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}

1.  Install the [Gemini
    CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1.  In your working directory, create a folder named `.gemini`. Within it,
    create a `settings.json` file.
1.  Add the following configuration, replace the environment variables with your
    values, and then save:

    ```json
    {
      "mcpServers": {
        "sqlserver": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","mssql","--stdio"],
          "env": {
            "MSSQL_HOST": "",
            "MSSQL_PORT": "",
            "MSSQL_DATABASE": "",
            "MSSQL_USER": "",
            "MSSQL_PASSWORD": ""
          }
        }
      }
    }
    ```
{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}

1.  Install the [Gemini Code
    Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist)
    extension in Visual Studio Code.
1.  Enable Agent Mode in Gemini Code Assist chat.
1.  In your working directory, create a folder named `.gemini`. Within it,
    create a `settings.json` file.
1.  Add the following configuration, replace the environment variables with your
    values, and then save:

    ```json
    {
      "mcpServers": {
        "sqlserver": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","mssql","--stdio"],
          "env": {
            "MSSQL_HOST": "",
            "MSSQL_PORT": "",
            "MSSQL_DATABASE": "",
            "MSSQL_USER": "",
            "MSSQL_PASSWORD": ""
          }
        }
      }
    }
    ```
{{% /tab %}}
{{< /tabpane >}}

## Use Tools

Your AI tool is now connected to SQL Server using MCP. Try asking your AI
assistant to list tables, create a table, or define and execute other SQL
statements.

The following tools are available to the LLM:

1.  **list_tables**: lists tables and descriptions
1.  **execute_sql**: executes any SQL statement

{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}
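
If you want to confirm what the server exposes, note that MCP clients first
discover tools with a JSON-RPC `tools/list` request before calling them. A
minimal sketch of that message:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/list"
}
```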

```

--------------------------------------------------------------------------------
/tests/alloydbpg/alloydb_pg_integration_test.go:
--------------------------------------------------------------------------------

```go
// Copyright 2024 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package alloydbpg

import (
	"context"
	"fmt"
	"net"
	"os"
	"regexp"
	"strings"
	"testing"
	"time"

	"cloud.google.com/go/alloydbconn"
	"github.com/google/uuid"
	"github.com/googleapis/genai-toolbox/internal/testutils"
	"github.com/googleapis/genai-toolbox/tests"
	"github.com/jackc/pgx/v5/pgxpool"
)

var (
	AlloyDBPostgresSourceKind = "alloydb-postgres"
	AlloyDBPostgresToolKind   = "postgres-sql"
	AlloyDBPostgresProject    = os.Getenv("ALLOYDB_POSTGRES_PROJECT")
	AlloyDBPostgresRegion     = os.Getenv("ALLOYDB_POSTGRES_REGION")
	AlloyDBPostgresCluster    = os.Getenv("ALLOYDB_POSTGRES_CLUSTER")
	AlloyDBPostgresInstance   = os.Getenv("ALLOYDB_POSTGRES_INSTANCE")
	AlloyDBPostgresDatabase   = os.Getenv("ALLOYDB_POSTGRES_DATABASE")
	AlloyDBPostgresUser       = os.Getenv("ALLOYDB_POSTGRES_USER")
	AlloyDBPostgresPass       = os.Getenv("ALLOYDB_POSTGRES_PASS")
)

func getAlloyDBPgVars(t *testing.T) map[string]any {
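	// Switching on the empty string fails the test at the first required
	// environment variable that is unset.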
	switch "" {
	case AlloyDBPostgresProject:
		t.Fatal("'ALLOYDB_POSTGRES_PROJECT' not set")
	case AlloyDBPostgresRegion:
		t.Fatal("'ALLOYDB_POSTGRES_REGION' not set")
	case AlloyDBPostgresCluster:
		t.Fatal("'ALLOYDB_POSTGRES_CLUSTER' not set")
	case AlloyDBPostgresInstance:
		t.Fatal("'ALLOYDB_POSTGRES_INSTANCE' not set")
	case AlloyDBPostgresDatabase:
		t.Fatal("'ALLOYDB_POSTGRES_DATABASE' not set")
	case AlloyDBPostgresUser:
		t.Fatal("'ALLOYDB_POSTGRES_USER' not set")
	case AlloyDBPostgresPass:
		t.Fatal("'ALLOYDB_POSTGRES_PASS' not set")
	}
	return map[string]any{
		"kind":     AlloyDBPostgresSourceKind,
		"project":  AlloyDBPostgresProject,
		"cluster":  AlloyDBPostgresCluster,
		"instance": AlloyDBPostgresInstance,
		"region":   AlloyDBPostgresRegion,
		"database": AlloyDBPostgresDatabase,
		"user":     AlloyDBPostgresUser,
		"password": AlloyDBPostgresPass,
	}
}

// Copied over from  alloydb_pg.go
func getAlloyDBDialOpts(ipType string) ([]alloydbconn.DialOption, error) {
	switch strings.ToLower(ipType) {
	case "private":
		return []alloydbconn.DialOption{alloydbconn.WithPrivateIP()}, nil
	case "public":
		return []alloydbconn.DialOption{alloydbconn.WithPublicIP()}, nil
	default:
		return nil, fmt.Errorf("invalid ipType %s", ipType)
	}
}

// Copied over from  alloydb_pg.go
func initAlloyDBPgConnectionPool(project, region, cluster, instance, ipType, user, pass, dbname string) (*pgxpool.Pool, error) {
	// Configure the driver to connect to the database
	dsn := fmt.Sprintf("user=%s password=%s dbname=%s sslmode=disable", user, pass, dbname)
	config, err := pgxpool.ParseConfig(dsn)
	if err != nil {
		return nil, fmt.Errorf("unable to parse connection uri: %w", err)
	}

	// Create a new dialer with options
	dialOpts, err := getAlloyDBDialOpts(ipType)
	if err != nil {
		return nil, err
	}
	d, err := alloydbconn.NewDialer(context.Background(), alloydbconn.WithDefaultDialOptions(dialOpts...))
	if err != nil {
		return nil, fmt.Errorf("unable to parse connection uri: %w", err)
	}

	// Tell the driver to use the AlloyDB Go Connector to create connections
	i := fmt.Sprintf("projects/%s/locations/%s/clusters/%s/instances/%s", project, region, cluster, instance)
	config.ConnConfig.DialFunc = func(ctx context.Context, _ string, instance string) (net.Conn, error) {
		return d.Dial(ctx, i)
	}

	// Interact with the driver directly as you normally would
	pool, err := pgxpool.NewWithConfig(context.Background(), config)
	if err != nil {
		return nil, err
	}
	return pool, nil
}

func TestAlloyDBPgToolEndpoints(t *testing.T) {
	sourceConfig := getAlloyDBPgVars(t)
	ctx, cancel := context.WithTimeout(context.Background(), time.Minute)
	defer cancel()

	var args []string

	pool, err := initAlloyDBPgConnectionPool(AlloyDBPostgresProject, AlloyDBPostgresRegion, AlloyDBPostgresCluster, AlloyDBPostgresInstance, "public", AlloyDBPostgresUser, AlloyDBPostgresPass, AlloyDBPostgresDatabase)
	if err != nil {
		t.Fatalf("unable to create AlloyDB connection pool: %s", err)
	}

	// create table name with UUID
	tableNameParam := "param_table_" + strings.ReplaceAll(uuid.New().String(), "-", "")
	tableNameAuth := "auth_table_" + strings.ReplaceAll(uuid.New().String(), "-", "")
	tableNameTemplateParam := "template_param_table_" + strings.ReplaceAll(uuid.New().String(), "-", "")

	// set up data for param tool
	createParamTableStmt, insertParamTableStmt, paramToolStmt, idParamToolStmt, nameParamToolStmt, arrayToolStmt, paramTestParams := tests.GetPostgresSQLParamToolInfo(tableNameParam)
	teardownTable1 := tests.SetupPostgresSQLTable(t, ctx, pool, createParamTableStmt, insertParamTableStmt, tableNameParam, paramTestParams)
	defer teardownTable1(t)

	// set up data for auth tool
	createAuthTableStmt, insertAuthTableStmt, authToolStmt, authTestParams := tests.GetPostgresSQLAuthToolInfo(tableNameAuth)
	teardownTable2 := tests.SetupPostgresSQLTable(t, ctx, pool, createAuthTableStmt, insertAuthTableStmt, tableNameAuth, authTestParams)
	defer teardownTable2(t)

	// Write config into a file and pass it to command
	toolsFile := tests.GetToolsConfig(sourceConfig, AlloyDBPostgresToolKind, paramToolStmt, idParamToolStmt, nameParamToolStmt, arrayToolStmt, authToolStmt)
	toolsFile = tests.AddExecuteSqlConfig(t, toolsFile, "postgres-execute-sql")
	tmplSelectCombined, tmplSelectFilterCombined := tests.GetPostgresSQLTmplToolStatement()
	toolsFile = tests.AddTemplateParamConfig(t, toolsFile, AlloyDBPostgresToolKind, tmplSelectCombined, tmplSelectFilterCombined, "")

	cmd, cleanup, err := tests.StartCmd(ctx, toolsFile, args...)
	if err != nil {
		t.Fatalf("command initialization returned an error: %s", err)
	}
	defer cleanup()

	waitCtx, cancel := context.WithTimeout(ctx, 10*time.Second)
	defer cancel()
	out, err := testutils.WaitForString(waitCtx, regexp.MustCompile(`Server ready to serve`), cmd.Out)
	if err != nil {
		t.Logf("toolbox command logs: \n%s", out)
		t.Fatalf("toolbox didn't start successfully: %s", err)
	}

	// Get configs for tests
	select1Want, failInvocationWant, createTableStatement, mcpSelect1Want := tests.GetPostgresWants()

	// Run tests
	tests.RunToolGetTest(t)
	tests.RunToolInvokeTest(t, select1Want)
	tests.RunMCPToolCallMethod(t, failInvocationWant, mcpSelect1Want)
	tests.RunExecuteSqlToolInvokeTest(t, createTableStatement, select1Want)
	tests.RunToolInvokeWithTemplateParameters(t, tableNameTemplateParam)
}

// Test connection with different IP type
func TestAlloyDBPgIpConnection(t *testing.T) {
	sourceConfig := getAlloyDBPgVars(t)

	tcs := []struct {
		name   string
		ipType string
	}{
		{
			name:   "public ip",
			ipType: "public",
		},
		{
			name:   "private ip",
			ipType: "private",
		},
	}
	for _, tc := range tcs {
		t.Run(tc.name, func(t *testing.T) {
			sourceConfig["ipType"] = tc.ipType
			err := tests.RunSourceConnectionTest(t, sourceConfig, AlloyDBPostgresToolKind)
			if err != nil {
				t.Fatalf("Connection test failure: %s", err)
			}
		})
	}
}

// Test IAM connection
func TestAlloyDBPgIAMConnection(t *testing.T) {
	getAlloyDBPgVars(t)
	// service account email used for IAM should trim the suffix
	serviceAccountEmail := strings.TrimSuffix(tests.ServiceAccountEmail, ".gserviceaccount.com")

	noPassSourceConfig := map[string]any{
		"kind":     AlloyDBPostgresSourceKind,
		"project":  AlloyDBPostgresProject,
		"cluster":  AlloyDBPostgresCluster,
		"instance": AlloyDBPostgresInstance,
		"region":   AlloyDBPostgresRegion,
		"database": AlloyDBPostgresDatabase,
		"user":     serviceAccountEmail,
	}

	noUserSourceConfig := map[string]any{
		"kind":     AlloyDBPostgresSourceKind,
		"project":  AlloyDBPostgresProject,
		"cluster":  AlloyDBPostgresCluster,
		"instance": AlloyDBPostgresInstance,
		"region":   AlloyDBPostgresRegion,
		"database": AlloyDBPostgresDatabase,
		"password": "random",
	}

	noUserNoPassSourceConfig := map[string]any{
		"kind":     AlloyDBPostgresSourceKind,
		"project":  AlloyDBPostgresProject,
		"cluster":  AlloyDBPostgresCluster,
		"instance": AlloyDBPostgresInstance,
		"region":   AlloyDBPostgresRegion,
		"database": AlloyDBPostgresDatabase,
	}
	tcs := []struct {
		name         string
		sourceConfig map[string]any
		isErr        bool
	}{
		{
			name:         "no user no pass",
			sourceConfig: noUserNoPassSourceConfig,
			isErr:        false,
		},
		{
			name:         "no password",
			sourceConfig: noPassSourceConfig,
			isErr:        false,
		},
		{
			name:         "no user",
			sourceConfig: noUserSourceConfig,
			isErr:        true,
		},
	}
	for _, tc := range tcs {
		t.Run(tc.name, func(t *testing.T) {
			err := tests.RunSourceConnectionTest(t, tc.sourceConfig, AlloyDBPostgresToolKind)
			if err != nil {
				if tc.isErr {
					return
				}
				t.Fatalf("Connection test failure: %s", err)
			}
			if tc.isErr {
				t.Fatalf("Expected error but test passed.")
			}
		})
	}
}

```

--------------------------------------------------------------------------------
/internal/tools/bigquery/bigquerysearchcatalog/bigquerysearchcatalog.go:
--------------------------------------------------------------------------------

```go
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package bigquerysearchcatalog

import (
	"context"
	"fmt"
	"strings"

	dataplexapi "cloud.google.com/go/dataplex/apiv1"
	dataplexpb "cloud.google.com/go/dataplex/apiv1/dataplexpb"
	"github.com/goccy/go-yaml"
	"github.com/googleapis/genai-toolbox/internal/sources"
	bigqueryds "github.com/googleapis/genai-toolbox/internal/sources/bigquery"
	"github.com/googleapis/genai-toolbox/internal/tools"
	"google.golang.org/api/iterator"
)

const kind string = "bigquery-search-catalog"

func init() {
	if !tools.Register(kind, newConfig) {
		panic(fmt.Sprintf("tool kind %q already registered", kind))
	}
}

func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
	actual := Config{Name: name}
	if err := decoder.DecodeContext(ctx, &actual); err != nil {
		return nil, err
	}
	return actual, nil
}

type compatibleSource interface {
	MakeDataplexCatalogClient() func() (*dataplexapi.CatalogClient, bigqueryds.DataplexClientCreator, error)
	BigQueryProject() string
	UseClientAuthorization() bool
}

// validate compatible sources are still compatible
var _ compatibleSource = &bigqueryds.Source{}

var compatibleSources = [...]string{bigqueryds.SourceKind}

type Config struct {
	Name         string   `yaml:"name" validate:"required"`
	Kind         string   `yaml:"kind" validate:"required"`
	Source       string   `yaml:"source" validate:"required"`
	Description  string   `yaml:"description"`
	AuthRequired []string `yaml:"authRequired"`
}

// validate interface
var _ tools.ToolConfig = Config{}

func (cfg Config) ToolConfigKind() string {
	return kind
}

func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
	// Initialize the search configuration with the provided sources
	rawS, ok := srcs[cfg.Source]
	if !ok {
		return nil, fmt.Errorf("no source named %q configured", cfg.Source)
	}
	// verify the source is compatible
	s, ok := rawS.(compatibleSource)
	if !ok {
		return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", kind, compatibleSources)
	}

	// Get the Dataplex client using the method from the source
	makeCatalogClient := s.MakeDataplexCatalogClient()

	prompt := tools.NewStringParameter("prompt", "Prompt representing search intention. Do not rewrite the prompt.")
	datasetIds := tools.NewArrayParameterWithDefault("datasetIds", []any{}, "Array of dataset IDs.", tools.NewStringParameter("datasetId", "The IDs of the bigquery dataset."))
	projectIds := tools.NewArrayParameterWithDefault("projectIds", []any{}, "Array of project IDs.", tools.NewStringParameter("projectId", "The IDs of the bigquery project."))
	types := tools.NewArrayParameterWithDefault("types", []any{}, "Array of data types to filter by.", tools.NewStringParameter("type", "The type of the data. Accepted values are: CONNECTION, POLICY, DATASET, MODEL, ROUTINE, TABLE, VIEW."))
	pageSize := tools.NewIntParameterWithDefault("pageSize", 5, "Number of results in the search page.")
	parameters := tools.Parameters{prompt, datasetIds, projectIds, types, pageSize}

	description := "Use this tool to find tables, views, models, routines or connections."
	if cfg.Description != "" {
		description = cfg.Description
	}
	mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, parameters)

	t := Tool{
		Name:              cfg.Name,
		Kind:              kind,
		Parameters:        parameters,
		AuthRequired:      cfg.AuthRequired,
		UseClientOAuth:    s.UseClientAuthorization(),
		MakeCatalogClient: makeCatalogClient,
		ProjectID:         s.BigQueryProject(),
		manifest: tools.Manifest{
			Description:  cfg.Description,
			Parameters:   parameters.Manifest(),
			AuthRequired: cfg.AuthRequired,
		},
		mcpManifest: mcpManifest,
	}
	return t, nil
}

type Tool struct {
	Name              string
	Kind              string
	Parameters        tools.Parameters
	AuthRequired      []string
	UseClientOAuth    bool
	MakeCatalogClient func() (*dataplexapi.CatalogClient, bigqueryds.DataplexClientCreator, error)
	ProjectID         string
	manifest          tools.Manifest
	mcpManifest       tools.McpManifest
}

func (t Tool) Authorized(verifiedAuthServices []string) bool {
	return tools.IsAuthorized(t.AuthRequired, verifiedAuthServices)
}

func (t Tool) RequiresClientAuthorization() bool {
	return t.UseClientOAuth
}

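// constructSearchQueryHelper builds a single filter clause such as
// "projectid=foo" or "(projectid=foo OR projectid=bar)" from the given
// predicate, operator, and items; it returns an empty string when items is
// empty.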
func constructSearchQueryHelper(predicate string, operator string, items []string) string {
	if len(items) == 0 {
		return ""
	}

	if len(items) == 1 {
		return predicate + operator + items[0]
	}

	var builder strings.Builder
	builder.WriteString("(")
	for i, item := range items {
		if i > 0 {
			builder.WriteString(" OR ")
		}
		builder.WriteString(predicate)
		builder.WriteString(operator)
		builder.WriteString(item)
	}
	builder.WriteString(")")
	return builder.String()
}

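// constructSearchQuery joins the project, dataset, and type filters with
// " AND " and always scopes the search to system=bigquery.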
func constructSearchQuery(projectIds []string, datasetIds []string, types []string) string {
	queryParts := []string{}

	if clause := constructSearchQueryHelper("projectid", "=", projectIds); clause != "" {
		queryParts = append(queryParts, clause)
	}

	if clause := constructSearchQueryHelper("parent", "=", datasetIds); clause != "" {
		queryParts = append(queryParts, clause)
	}

	if clause := constructSearchQueryHelper("type", "=", types); clause != "" {
		queryParts = append(queryParts, clause)
	}
	queryParts = append(queryParts, "system=bigquery")

	return strings.Join(queryParts, " AND ")
}

type Response struct {
	DisplayName   string
	Description   string
	Type          string
	Resource      string
	DataplexEntry string
}

var typeMap = map[string]string{
	"bigquery-connection":  "CONNECTION",
	"bigquery-data-policy": "POLICY",
	"bigquery-dataset":     "DATASET",
	"bigquery-model":       "MODEL",
	"bigquery-routine":     "ROUTINE",
	"bigquery-table":       "TABLE",
	"bigquery-view":        "VIEW",
}

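// ExtractType maps the suffix after the last "/" of a Dataplex entry type
// string (for example "bigquery-table") to its BigQuery catalog type name
// (for example "TABLE").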
func ExtractType(resourceString string) string {
	lastIndex := strings.LastIndex(resourceString, "/")
	if lastIndex == -1 {
		// No "/" found, return the original string
		return resourceString
	}
	return typeMap[resourceString[lastIndex+1:]]
}

func (t Tool) Invoke(ctx context.Context, params tools.ParamValues, accessToken tools.AccessToken) (any, error) {
	paramsMap := params.AsMap()
	pageSize := int32(paramsMap["pageSize"].(int))
	prompt, _ := paramsMap["prompt"].(string)
	projectIdSlice, err := tools.ConvertAnySliceToTyped(paramsMap["projectIds"].([]any), "string")
	if err != nil {
		return nil, fmt.Errorf("can't convert projectIds to array of strings: %s", err)
	}
	projectIds := projectIdSlice.([]string)
	datasetIdSlice, err := tools.ConvertAnySliceToTyped(paramsMap["datasetIds"].([]any), "string")
	if err != nil {
		return nil, fmt.Errorf("can't convert datasetIds to array of strings: %s", err)
	}
	datasetIds := datasetIdSlice.([]string)
	typesSlice, err := tools.ConvertAnySliceToTyped(paramsMap["types"].([]any), "string")
	if err != nil {
		return nil, fmt.Errorf("can't convert types to array of strings: %s", err)
	}
	types := typesSlice.([]string)

	req := &dataplexpb.SearchEntriesRequest{
		Query:          fmt.Sprintf("%s %s", prompt, constructSearchQuery(projectIds, datasetIds, types)),
		Name:           fmt.Sprintf("projects/%s/locations/global", t.ProjectID),
		PageSize:       pageSize,
		SemanticSearch: true,
	}

	catalogClient, dataplexClientCreator, _ := t.MakeCatalogClient()

	if t.UseClientOAuth {
		tokenStr, err := accessToken.ParseBearerToken()
		if err != nil {
			return nil, fmt.Errorf("error parsing access token: %w", err)
		}
		catalogClient, err = dataplexClientCreator(tokenStr)
		if err != nil {
			return nil, fmt.Errorf("error creating client from OAuth access token: %w", err)
		}
	}

	it := catalogClient.SearchEntries(ctx, req)
	if it == nil {
		return nil, fmt.Errorf("failed to create search entries iterator for project %q", t.ProjectID)
	}

	var results []Response
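	// Collect entries until the iterator is exhausted; iteration also stops
	// silently on the first iterator error.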
	for {
		entry, err := it.Next()
		if err == iterator.Done {
			break
		}
		if err != nil {
			break
		}
		entrySource := entry.DataplexEntry.GetEntrySource()
		resp := Response{
			DisplayName:   entrySource.GetDisplayName(),
			Description:   entrySource.GetDescription(),
			Type:          ExtractType(entry.DataplexEntry.GetEntryType()),
			Resource:      entrySource.GetResource(),
			DataplexEntry: entry.DataplexEntry.GetName(),
		}
		results = append(results, resp)
	}
	return results, nil
}

func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (tools.ParamValues, error) {
	// Parse parameters from the provided data
	return tools.ParseParams(t.Parameters, data, claims)
}

func (t Tool) Manifest() tools.Manifest {
	// Returns the tool manifest
	return t.manifest
}

func (t Tool) McpManifest() tools.McpManifest {
	// Returns the tool MCP manifest
	return t.mcpManifest
}

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/http/http.md:
--------------------------------------------------------------------------------

```markdown
---
title: "http"
type: docs
weight: 1
description: >
  A "http" tool sends out an HTTP request to an HTTP endpoint.
aliases:
- /resources/tools/http
---


## About

The `http` tool allows you to make HTTP requests to APIs to retrieve data.
An HTTP request is the method by which a client communicates with a server to
retrieve or manipulate resources.
Toolbox allows you to configure the request URL, method, headers, query
parameters, and the request body for an HTTP Tool.

### URL

An HTTP request URL identifies the target the client wants to access.
Toolbox composes the request URL from 3 places:

1. The HTTP Source's `baseUrl`.
2. The HTTP Tool's `path` field.
3. The HTTP Tool's `pathParams` for dynamic path composed during Tool
   invocation.

For example, the following config allows you to reach different paths of the
same server using multiple Tools:

```yaml
sources:
    my-http-source:
        kind: http
        baseUrl: https://api.example.com

tools:
    my-post-tool:
        kind: http
        source: my-http-source
        method: POST
        path: /update
        description: Tool to update information to the example API

    my-get-tool:
        kind: http
        source: my-http-source
        method: GET
        path: /search
        description: Tool to search information from the example API

    my-dynamic-path-tool:
        kind: http
        source: my-http-source
        method: GET
        path: /{{.myPathParam}}/search
        description: Tool to reach endpoint based on the input to `myPathParam`
        pathParams:
            - name: myPathParam
              type: string
              description: The dynamic path parameter

```

### Headers

An HTTP request header is a key-value pair sent by a client to a server,
providing additional information about the request, such as the client's
preferences, the request body content type, and other metadata.
Headers specified by the HTTP Tool are combined with its HTTP Source headers for
the resulting HTTP request, and override the Source headers in case of conflict.
The HTTP Tool allows you to specify headers in two different ways:

- Static headers can be specified using the `headers` field, and will be the
  same for every invocation:

```yaml
my-http-tool:
    kind: http
    source: my-http-source
    method: GET
    path: /search
    description: Tool to search data from API
    headers:
      Authorization: API_KEY
      Content-Type: application/json
```

- Dynamic headers can be specified as parameters in the `headerParams` field.
  The `name` of the `headerParams` will be used as the header key, and the value
  is determined by the LLM input upon Tool invocation:

```yaml
my-http-tool:
    kind: http
    source: my-http-source
    method: GET
    path: /search
    description: some description
    headerParams:
      - name: Content-Type # Example LLM input: "application/json"
        description: request content type
        type: string
```

### Query parameters

Query parameters are key-value pairs appended to a URL after a question mark (?)
to provide additional information to the server for processing the request, like
filtering or sorting data.

- Static request query parameters should be specified in the `path` as part of
  the URL itself:

```yaml
my-http-tool:
    kind: http
    source: my-http-source
    method: GET
    path: /search?language=en&id=1
    description: Tool to search for item with ID 1 in English
```

- Dynamic request query parameters should be specified as parameters in the
  `queryParams` section:

```yaml
my-http-tool:
    kind: http
    source: my-http-source
    method: GET
    path: /search
    description: Tool to search for item with ID
    queryParams:
      - name: id
        description: item ID
        type: integer
```

### Request body

The request body payload is a string that supports parameter replacement
following [Go template][go-template-doc]'s annotations.
The parameter names in the `requestBody` should be preceded by "." and enclosed
by double curly brackets "{{}}". The values will be populated into the request
body payload upon Tool invocation.

Example:

```yaml
my-http-tool:
    kind: http
    source: my-http-source
    method: GET
    path: /search
    description: Tool to search for person with name and age
    requestBody: |
      {
        "age": {{.age}},
        "name": "{{.name}}"
      }
    bodyParams:
      - name: age
        description: age number
        type: integer
      - name: name
        description: name string
        type: string
```

#### Formatting Parameters

Some complex parameters (such as arrays) may require additional formatting to
match the expected output. For convenience, you can specify one of the following
pre-defined functions before the parameter name to format it:

##### JSON

The `json` keyword converts a parameter into a JSON format.

{{< notice note >}}
Using JSON may add quotes to the variable name for certain types (such as
strings).
{{< /notice >}}

Example:

```yaml
requestBody: |
  {
    "age": {{json .age}},
    "name": {{json .name}},
    "nickname": "{{json .nickname}}",
    "nameArray": {{json .nameArray}}
  }
```

will send the following output:

```yaml
{
  "age": 18,
  "name": "Katherine",
  "nickname": ""Kat"", # Duplicate quotes
  "nameArray": ["A", "B", "C"]
}
```

## Example

```yaml
my-http-tool:
    kind: http
    source: my-http-source
    method: GET
    path: /search
    description: some description
    authRequired:
      - my-google-auth-service
      - other-auth-service
    queryParams:
      - name: country
        description: some description
        type: string
    requestBody: |
      {
        "age": {{.age}},
        "city": "{{.city}}"
      }
    bodyParams:
      - name: age
        description: age number
        type: integer
      - name: city
        description: city string
        type: string
    headers:
      Authorization: API_KEY
      Content-Type: application/json
    headerParams:
      - name: Language
        description: language string
        type: string
```

## Reference

| **field**    |                  **type**                  | **required** | **description**                                                                                                                                                                                                            |
|--------------|:------------------------------------------:|:------------:|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind         |                   string                   |     true     | Must be "http".                                                                                                                                                                                                            |
| source       |                   string                   |     true     | Name of the source the HTTP request should be sent to.                                                                                                                                                                     |
| description  |                   string                   |     true     | Description of the tool that is passed to the LLM.                                                                                                                                                                         |
| path         |                   string                   |     true     | The path of the HTTP request. You can include static query parameters in the path string.                                                                                                                                  |
| method       |                   string                   |     true     | The HTTP method to use (e.g., GET, POST, PUT, DELETE).                                                                                                                                                                     |
| headers      |             map[string]string              |    false     | A map of headers to include in the HTTP request (overrides source headers).                                                                                                                                                |
| requestBody  |                   string                   |    false     | The request body payload. Use [go template][go-template-doc] with the parameter name as the placeholder (e.g., `{{.id}}` will be replaced with the value of the parameter that has name `id` in the `bodyParams` section). |
| queryParams  | [parameters](../#specifying-parameters) |    false     | List of [parameters](../#specifying-parameters) that will be inserted into the query string.                                                                                                                            |
| bodyParams   | [parameters](../#specifying-parameters) |    false     | List of [parameters](../#specifying-parameters) that will be inserted into the request body payload.                                                                                                                    |
| headerParams | [parameters](../#specifying-parameters) |    false     | List of [parameters](../#specifying-parameters) that will be inserted as the request headers.                                                                                                                           |

[go-template-doc]: <https://pkg.go.dev/text/template#pkg-overview>

```

--------------------------------------------------------------------------------
/docs/en/how-to/connect-ide/postgres_mcp.md:
--------------------------------------------------------------------------------

```markdown
---
title: "PostgreSQL using MCP"
type: docs
weight: 2
description: >
  Connect your IDE to PostgreSQL using Toolbox.
---

[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is
an open protocol for connecting Large Language Models (LLMs) to data sources
like Postgres. This guide covers how to use [MCP Toolbox for Databases][toolbox]
to expose your developer assistant tools to a Postgres instance:

* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
* [Gemini CLI][geminicli]
* [Gemini Code Assist][geminicodeassist]

[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client

{{< notice tip >}}
This guide can be used with [AlloyDB
Omni](https://cloud.google.com/alloydb/omni/current/docs/overview).
{{< /notice >}}

## Set up the database

1. Create or select a PostgreSQL instance.

    * [Install PostgreSQL locally](https://www.postgresql.org/download/)
    * [Install AlloyDB Omni](https://cloud.google.com/alloydb/omni/current/docs/quickstart)

1. Create or reuse [a database
   user](https://cloud.google.com/alloydb/omni/current/docs/database-users/manage-users)
   and have the username and password ready.

## Install MCP Toolbox

1. Download the latest version of Toolbox as a binary. Select the [correct
   binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
   to your OS and CPU architecture. You are required to use Toolbox version
   V0.6.0+:

   <!-- {x-release-please-start-version} -->
   {{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.18.0/linux/amd64/toolbox
{{< /tab >}}

{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.18.0/darwin/arm64/toolbox
{{< /tab >}}

{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.18.0/darwin/amd64/toolbox
{{< /tab >}}

{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.18.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
    <!-- {x-release-please-end} -->

1. Make the binary executable:

    ```bash
    chmod +x toolbox
    ```

1. Verify the installation:

    ```bash
    ./toolbox --version
    ```

## Configure your MCP Client

{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}

1. Install [Claude
   Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration, replace the environment variables with your
   values, and save:

    ```json
    {
      "mcpServers": {
        "postgres": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","postgres","--stdio"],
          "env": {
            "POSTGRES_HOST": "",
            "POSTGRES_PORT": "",
            "POSTGRES_DATABASE": "",
            "POSTGRES_USER": "",
            "POSTGRES_PASSWORD": ""
          }
        }
      }
    }
    ```

1. Restart Claude code to apply the new configuration.
{{% /tab %}}

{{% tab header="Claude desktop" lang="en" %}}

1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration, replace the environment variables with your
   values, and save:

    ```json
    {
      "mcpServers": {
        "postgres": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","postgres","--stdio"],
          "env": {
            "POSTGRES_HOST": "",
            "POSTGRES_PORT": "",
            "POSTGRES_DATABASE": "",
            "POSTGRES_USER": "",
            "POSTGRES_PASSWORD": ""
          }
        }
      }
    }
    ```

1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
   new MCP server available.
{{% /tab %}}

{{% tab header="Cline" lang="en" %}}

1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap
   the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration, replace the environment variables with your
   values, and save:

    ```json
    {
      "mcpServers": {
        "postgres": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","postgres","--stdio"],
          "env": {
            "POSTGRES_HOST": "",
            "POSTGRES_PORT": "",
            "POSTGRES_DATABASE": "",
            "POSTGRES_USER": "",
            "POSTGRES_PASSWORD": ""
          }
        }
      }
    }
    ```

1. You should see a green active status after the server is successfully
   connected.
{{% /tab %}}

{{% tab header="Cursor" lang="en" %}}

1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
   values, and save:

    ```json
    {
      "mcpServers": {
        "postgres": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","postgres","--stdio"],
          "env": {
            "POSTGRES_HOST": "",
            "POSTGRES_PORT": "",
            "POSTGRES_DATABASE": "",
            "POSTGRES_USER": "",
            "POSTGRES_PASSWORD": ""
          }
        }
      }
    }
    ```

1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
   Settings > MCP**. You should see a green active status after the server is
   successfully connected.
{{% /tab %}}

{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}

1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
   create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
   values, and save:

    ```json
    {
      "servers": {
        "postgres": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","postgres","--stdio"],
          "env": {
            "POSTGRES_HOST": "",
            "POSTGRES_PORT": "",
            "POSTGRES_DATABASE": "",
            "POSTGRES_USER": "",
            "POSTGRES_PASSWORD": ""
          }
        }
      }
    }
    ```

{{% /tab %}}

{{% tab header="Windsurf" lang="en" %}}

1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
   Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Add the following configuration, replace the environment variables with your
   values, and save:

    ```json
    {
      "mcpServers": {
        "postgres": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","postgres","--stdio"],
          "env": {
            "POSTGRES_HOST": "",
            "POSTGRES_PORT": "",
            "POSTGRES_DATABASE": "",
            "POSTGRES_USER": "",
            "POSTGRES_PASSWORD": ""
          }
        }
      }
    }
    ```

{{% /tab %}}

{{% tab header="Gemini CLI" lang="en" %}}

1.  Install the [Gemini CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1.  In your working directory, create a folder named `.gemini`. Within it, create a `settings.json` file.
1.  Add the following configuration, replace the environment variables with your values, and then save:

    ```json
    {
      "mcpServers": {
        "postgres": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","postgres","--stdio"],
          "env": {
            "POSTGRES_HOST": "",
            "POSTGRES_PORT": "",
            "POSTGRES_DATABASE": "",
            "POSTGRES_USER": "",
            "POSTGRES_PASSWORD": ""
          }
        }
      }
    }
    ```
{{% /tab %}}

{{% tab header="Gemini Code Assist" lang="en" %}}

1.  Install the [Gemini Code Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist) extension in Visual Studio Code.
1.  Enable Agent Mode in Gemini Code Assist chat.
1.  In your working directory, create a folder named `.gemini`. Within it, create a `settings.json` file.
1.  Add the following configuration, replace the environment variables with your values, and then save:

    ```json
    {
      "mcpServers": {
        "postgres": {
          "command": "./PATH/TO/toolbox",
          "args": ["--prebuilt","postgres","--stdio"],
          "env": {
            "POSTGRES_HOST": "",
            "POSTGRES_PORT": "",
            "POSTGRES_DATABASE": "",
            "POSTGRES_USER": "",
            "POSTGRES_PASSWORD": ""
          }
        }
      }
    }
    ```
{{% /tab %}}
{{< /tabpane >}}

## Use Tools

Your AI tool is now connected to Postgres over MCP. Try asking your AI
assistant to list tables, create a table, or run other SQL statements.

The following tools are available to the LLM:

1. **list_tables**: lists tables and their descriptions
1. **execute_sql**: executes an arbitrary SQL statement
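
If you want to double-check what the assistant did through **execute_sql**, one option is to inspect the database yourself. The following is a hedged sketch that assumes you have the `psql` client installed locally and reuses the same connection values you configured for the MCP server.

```bash
# Hypothetical verification step: list the tables in the database so you can
# confirm anything the assistant created or changed. The variables are
# placeholders for the values in your MCP client configuration; psql may
# prompt for the password (or set PGPASSWORD in the environment).
psql -h "$POSTGRES_HOST" -p "$POSTGRES_PORT" \
     -U "$POSTGRES_USER" -d "$POSTGRES_DATABASE" -c '\dt'
```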

{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}
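
If a client fails to connect, it can help to rule out the server itself. One way to exercise the same stdio transport your MCP client uses is the MCP Inspector; the snippet below is a hedged sketch that assumes Node.js is installed and uses placeholder connection values.

```bash
# Hypothetical troubleshooting step: launch the prebuilt Postgres server under
# the MCP Inspector so you can browse and invoke its tools (list_tables,
# execute_sql) interactively. Values are placeholders; Ctrl+C stops the session.
POSTGRES_HOST=localhost \
POSTGRES_PORT=5432 \
POSTGRES_DATABASE=mydb \
POSTGRES_USER=myuser \
POSTGRES_PASSWORD=mypassword \
npx @modelcontextprotocol/inspector ./toolbox --prebuilt postgres --stdio
```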
