This is page 3 of 76. Use http://codebase.md/googleapis/genai-toolbox?lines=true&page={x} to view the full context.
# Directory Structure
```
├── .ci
│ ├── continuous.release.cloudbuild.yaml
│ ├── generate_release_table.sh
│ ├── integration.cloudbuild.yaml
│ ├── quickstart_test
│ │ ├── go.integration.cloudbuild.yaml
│ │ ├── js.integration.cloudbuild.yaml
│ │ ├── py.integration.cloudbuild.yaml
│ │ ├── run_go_tests.sh
│ │ ├── run_js_tests.sh
│ │ ├── run_py_tests.sh
│ │ └── setup_hotels_sample.sql
│ ├── test_prompts_with_coverage.sh
│ ├── test_with_coverage.sh
│ └── versioned.release.cloudbuild.yaml
├── .gemini
│ └── config.yaml
├── .github
│ ├── auto-label.yaml
│ ├── blunderbuss.yml
│ ├── CODEOWNERS
│ ├── header-checker-lint.yml
│ ├── ISSUE_TEMPLATE
│ │ ├── bug_report.yml
│ │ ├── config.yml
│ │ ├── feature_request.yml
│ │ └── question.yml
│ ├── label-sync.yml
│ ├── labels.yaml
│ ├── PULL_REQUEST_TEMPLATE.md
│ ├── release-please.yml
│ ├── renovate.json5
│ ├── sync-repo-settings.yaml
│ ├── trusted-contribution.yml
│ └── workflows
│ ├── cloud_build_failure_reporter.yml
│ ├── deploy_dev_docs.yaml
│ ├── deploy_previous_version_docs.yaml
│ ├── deploy_versioned_docs.yaml
│ ├── docs_preview_clean.yaml
│ ├── docs_preview_deploy.yaml
│ ├── link_checker_workflow.yaml
│ ├── lint.yaml
│ ├── publish-mcp.yml
│ ├── schedule_reporter.yml
│ ├── sync-labels.yaml
│ └── tests.yaml
├── .gitignore
├── .gitmodules
├── .golangci.yaml
├── .hugo
│ ├── archetypes
│ │ └── default.md
│ ├── assets
│ │ ├── icons
│ │ │ └── logo.svg
│ │ └── scss
│ │ ├── _styles_project.scss
│ │ └── _variables_project.scss
│ ├── go.mod
│ ├── go.sum
│ ├── hugo.toml
│ ├── layouts
│ │ ├── _default
│ │ │ └── home.releases.releases
│ │ ├── index.llms-full.txt
│ │ ├── index.llms.txt
│ │ ├── partials
│ │ │ ├── hooks
│ │ │ │ └── head-end.html
│ │ │ ├── navbar-version-selector.html
│ │ │ ├── page-meta-links.html
│ │ │ └── td
│ │ │ └── render-heading.html
│ │ ├── robot.txt
│ │ └── shortcodes
│ │ ├── include.html
│ │ ├── ipynb.html
│ │ └── regionInclude.html
│ ├── package-lock.json
│ ├── package.json
│ └── static
│ ├── favicons
│ │ ├── android-chrome-192x192.png
│ │ ├── android-chrome-512x512.png
│ │ ├── apple-touch-icon.png
│ │ ├── favicon-16x16.png
│ │ ├── favicon-32x32.png
│ │ └── favicon.ico
│ └── js
│ └── w3.js
├── .lycheeignore
├── CHANGELOG.md
├── cmd
│ ├── options_test.go
│ ├── options.go
│ ├── root_test.go
│ ├── root.go
│ └── version.txt
├── CODE_OF_CONDUCT.md
├── CONTRIBUTING.md
├── DEVELOPER.md
├── Dockerfile
├── docs
│ ├── ALLOYDBADMIN_README.md
│ ├── ALLOYDBPG_README.md
│ ├── BIGQUERY_README.md
│ ├── CLOUDSQLMSSQL_README.md
│ ├── CLOUDSQLMSSQLADMIN_README.md
│ ├── CLOUDSQLMYSQL_README.md
│ ├── CLOUDSQLMYSQLADMIN_README.md
│ ├── CLOUDSQLPG_README.md
│ ├── CLOUDSQLPGADMIN_README.md
│ ├── DATAPLEX_README.md
│ ├── en
│ │ ├── _index.md
│ │ ├── about
│ │ │ ├── _index.md
│ │ │ └── faq.md
│ │ ├── blogs
│ │ │ └── _index.md
│ │ ├── concepts
│ │ │ ├── _index.md
│ │ │ └── telemetry
│ │ │ ├── index.md
│ │ │ ├── telemetry_flow.png
│ │ │ └── telemetry_traces.png
│ │ ├── getting-started
│ │ │ ├── _index.md
│ │ │ ├── colab_quickstart.ipynb
│ │ │ ├── configure.md
│ │ │ ├── introduction
│ │ │ │ ├── _index.md
│ │ │ │ └── architecture.png
│ │ │ ├── local_quickstart_go.md
│ │ │ ├── local_quickstart_js.md
│ │ │ ├── local_quickstart.md
│ │ │ ├── mcp_quickstart
│ │ │ │ ├── _index.md
│ │ │ │ ├── inspector_tools.png
│ │ │ │ └── inspector.png
│ │ │ ├── prompts_quickstart_gemini_cli.md
│ │ │ └── quickstart
│ │ │ ├── go
│ │ │ │ ├── adkgo
│ │ │ │ │ ├── go.mod
│ │ │ │ │ ├── go.sum
│ │ │ │ │ └── quickstart.go
│ │ │ │ ├── genAI
│ │ │ │ │ ├── go.mod
│ │ │ │ │ ├── go.sum
│ │ │ │ │ └── quickstart.go
│ │ │ │ ├── genkit
│ │ │ │ │ ├── go.mod
│ │ │ │ │ ├── go.sum
│ │ │ │ │ └── quickstart.go
│ │ │ │ ├── langchain
│ │ │ │ │ ├── go.mod
│ │ │ │ │ ├── go.sum
│ │ │ │ │ └── quickstart.go
│ │ │ │ ├── openAI
│ │ │ │ │ ├── go.mod
│ │ │ │ │ ├── go.sum
│ │ │ │ │ └── quickstart.go
│ │ │ │ └── quickstart_test.go
│ │ │ ├── golden.txt
│ │ │ ├── js
│ │ │ │ ├── adk
│ │ │ │ │ ├── package-lock.json
│ │ │ │ │ ├── package.json
│ │ │ │ │ └── quickstart.js
│ │ │ │ ├── genAI
│ │ │ │ │ ├── package-lock.json
│ │ │ │ │ ├── package.json
│ │ │ │ │ └── quickstart.js
│ │ │ │ ├── genkit
│ │ │ │ │ ├── package-lock.json
│ │ │ │ │ ├── package.json
│ │ │ │ │ └── quickstart.js
│ │ │ │ ├── langchain
│ │ │ │ │ ├── package-lock.json
│ │ │ │ │ ├── package.json
│ │ │ │ │ └── quickstart.js
│ │ │ │ ├── llamaindex
│ │ │ │ │ ├── package-lock.json
│ │ │ │ │ ├── package.json
│ │ │ │ │ └── quickstart.js
│ │ │ │ └── quickstart.test.js
│ │ │ ├── python
│ │ │ │ ├── __init__.py
│ │ │ │ ├── adk
│ │ │ │ │ ├── quickstart.py
│ │ │ │ │ └── requirements.txt
│ │ │ │ ├── core
│ │ │ │ │ ├── quickstart.py
│ │ │ │ │ └── requirements.txt
│ │ │ │ ├── langchain
│ │ │ │ │ ├── quickstart.py
│ │ │ │ │ └── requirements.txt
│ │ │ │ ├── llamaindex
│ │ │ │ │ ├── quickstart.py
│ │ │ │ │ └── requirements.txt
│ │ │ │ └── quickstart_test.py
│ │ │ └── shared
│ │ │ ├── cloud_setup.md
│ │ │ ├── configure_toolbox.md
│ │ │ └── database_setup.md
│ │ ├── how-to
│ │ │ ├── _index.md
│ │ │ ├── connect_via_geminicli.md
│ │ │ ├── connect_via_mcp.md
│ │ │ ├── connect-ide
│ │ │ │ ├── _index.md
│ │ │ │ ├── alloydb_pg_admin_mcp.md
│ │ │ │ ├── alloydb_pg_mcp.md
│ │ │ │ ├── bigquery_mcp.md
│ │ │ │ ├── cloud_sql_mssql_admin_mcp.md
│ │ │ │ ├── cloud_sql_mssql_mcp.md
│ │ │ │ ├── cloud_sql_mysql_admin_mcp.md
│ │ │ │ ├── cloud_sql_mysql_mcp.md
│ │ │ │ ├── cloud_sql_pg_admin_mcp.md
│ │ │ │ ├── cloud_sql_pg_mcp.md
│ │ │ │ ├── firestore_mcp.md
│ │ │ │ ├── looker_mcp.md
│ │ │ │ ├── mssql_mcp.md
│ │ │ │ ├── mysql_mcp.md
│ │ │ │ ├── neo4j_mcp.md
│ │ │ │ ├── postgres_mcp.md
│ │ │ │ ├── spanner_mcp.md
│ │ │ │ └── sqlite_mcp.md
│ │ │ ├── deploy_adk_agent.md
│ │ │ ├── deploy_docker.md
│ │ │ ├── deploy_gke.md
│ │ │ ├── deploy_toolbox.md
│ │ │ ├── export_telemetry.md
│ │ │ └── toolbox-ui
│ │ │ ├── edit-headers.gif
│ │ │ ├── edit-headers.png
│ │ │ ├── index.md
│ │ │ ├── optional-param-checked.png
│ │ │ ├── optional-param-unchecked.png
│ │ │ ├── run-tool.gif
│ │ │ ├── tools.png
│ │ │ └── toolsets.png
│ │ ├── reference
│ │ │ ├── _index.md
│ │ │ ├── cli.md
│ │ │ └── prebuilt-tools.md
│ │ ├── resources
│ │ │ ├── _index.md
│ │ │ ├── authServices
│ │ │ │ ├── _index.md
│ │ │ │ └── google.md
│ │ │ ├── embeddingModels
│ │ │ │ ├── _index.md
│ │ │ │ └── gemini.md
│ │ │ ├── prompts
│ │ │ │ ├── _index.md
│ │ │ │ └── custom
│ │ │ │ └── _index.md
│ │ │ ├── sources
│ │ │ │ ├── _index.md
│ │ │ │ ├── alloydb-admin.md
│ │ │ │ ├── alloydb-pg.md
│ │ │ │ ├── bigquery.md
│ │ │ │ ├── bigtable.md
│ │ │ │ ├── cassandra.md
│ │ │ │ ├── clickhouse.md
│ │ │ │ ├── cloud-gda.md
│ │ │ │ ├── cloud-healthcare.md
│ │ │ │ ├── cloud-monitoring.md
│ │ │ │ ├── cloud-sql-admin.md
│ │ │ │ ├── cloud-sql-mssql.md
│ │ │ │ ├── cloud-sql-mysql.md
│ │ │ │ ├── cloud-sql-pg.md
│ │ │ │ ├── couchbase.md
│ │ │ │ ├── dataplex.md
│ │ │ │ ├── dgraph.md
│ │ │ │ ├── elasticsearch.md
│ │ │ │ ├── firebird.md
│ │ │ │ ├── firestore.md
│ │ │ │ ├── http.md
│ │ │ │ ├── looker.md
│ │ │ │ ├── mariadb.md
│ │ │ │ ├── mindsdb.md
│ │ │ │ ├── mongodb.md
│ │ │ │ ├── mssql.md
│ │ │ │ ├── mysql.md
│ │ │ │ ├── neo4j.md
│ │ │ │ ├── oceanbase.md
│ │ │ │ ├── oracle.md
│ │ │ │ ├── postgres.md
│ │ │ │ ├── redis.md
│ │ │ │ ├── serverless-spark.md
│ │ │ │ ├── singlestore.md
│ │ │ │ ├── snowflake.md
│ │ │ │ ├── spanner.md
│ │ │ │ ├── sqlite.md
│ │ │ │ ├── tidb.md
│ │ │ │ ├── trino.md
│ │ │ │ ├── valkey.md
│ │ │ │ └── yugabytedb.md
│ │ │ └── tools
│ │ │ ├── _index.md
│ │ │ ├── alloydb
│ │ │ │ ├── _index.md
│ │ │ │ ├── alloydb-create-cluster.md
│ │ │ │ ├── alloydb-create-instance.md
│ │ │ │ ├── alloydb-create-user.md
│ │ │ │ ├── alloydb-get-cluster.md
│ │ │ │ ├── alloydb-get-instance.md
│ │ │ │ ├── alloydb-get-user.md
│ │ │ │ ├── alloydb-list-clusters.md
│ │ │ │ ├── alloydb-list-instances.md
│ │ │ │ ├── alloydb-list-users.md
│ │ │ │ └── alloydb-wait-for-operation.md
│ │ │ ├── alloydbainl
│ │ │ │ ├── _index.md
│ │ │ │ └── alloydb-ai-nl.md
│ │ │ ├── bigquery
│ │ │ │ ├── _index.md
│ │ │ │ ├── bigquery-analyze-contribution.md
│ │ │ │ ├── bigquery-conversational-analytics.md
│ │ │ │ ├── bigquery-execute-sql.md
│ │ │ │ ├── bigquery-forecast.md
│ │ │ │ ├── bigquery-get-dataset-info.md
│ │ │ │ ├── bigquery-get-table-info.md
│ │ │ │ ├── bigquery-list-dataset-ids.md
│ │ │ │ ├── bigquery-list-table-ids.md
│ │ │ │ ├── bigquery-search-catalog.md
│ │ │ │ └── bigquery-sql.md
│ │ │ ├── bigtable
│ │ │ │ ├── _index.md
│ │ │ │ └── bigtable-sql.md
│ │ │ ├── cassandra
│ │ │ │ ├── _index.md
│ │ │ │ └── cassandra-cql.md
│ │ │ ├── clickhouse
│ │ │ │ ├── _index.md
│ │ │ │ ├── clickhouse-execute-sql.md
│ │ │ │ ├── clickhouse-list-databases.md
│ │ │ │ ├── clickhouse-list-tables.md
│ │ │ │ └── clickhouse-sql.md
│ │ │ ├── cloudgda
│ │ │ │ ├── _index.md
│ │ │ │ └── cloud-gda-query.md
│ │ │ ├── cloudhealthcare
│ │ │ │ ├── _index.md
│ │ │ │ ├── cloud-healthcare-fhir-fetch-page.md
│ │ │ │ ├── cloud-healthcare-fhir-patient-everything.md
│ │ │ │ ├── cloud-healthcare-fhir-patient-search.md
│ │ │ │ ├── cloud-healthcare-get-dataset.md
│ │ │ │ ├── cloud-healthcare-get-dicom-store-metrics.md
│ │ │ │ ├── cloud-healthcare-get-dicom-store.md
│ │ │ │ ├── cloud-healthcare-get-fhir-resource.md
│ │ │ │ ├── cloud-healthcare-get-fhir-store-metrics.md
│ │ │ │ ├── cloud-healthcare-get-fhir-store.md
│ │ │ │ ├── cloud-healthcare-list-dicom-stores.md
│ │ │ │ ├── cloud-healthcare-list-fhir-stores.md
│ │ │ │ ├── cloud-healthcare-retrieve-rendered-dicom-instance.md
│ │ │ │ ├── cloud-healthcare-search-dicom-instances.md
│ │ │ │ ├── cloud-healthcare-search-dicom-series.md
│ │ │ │ └── cloud-healthcare-search-dicom-studies.md
│ │ │ ├── cloudmonitoring
│ │ │ │ ├── _index.md
│ │ │ │ └── cloud-monitoring-query-prometheus.md
│ │ │ ├── cloudsql
│ │ │ │ ├── _index.md
│ │ │ │ ├── cloudsqlcloneinstance.md
│ │ │ │ ├── cloudsqlcreatedatabase.md
│ │ │ │ ├── cloudsqlcreateusers.md
│ │ │ │ ├── cloudsqlgetinstances.md
│ │ │ │ ├── cloudsqllistdatabases.md
│ │ │ │ ├── cloudsqllistinstances.md
│ │ │ │ ├── cloudsqlmssqlcreateinstance.md
│ │ │ │ ├── cloudsqlmysqlcreateinstance.md
│ │ │ │ ├── cloudsqlpgcreateinstances.md
│ │ │ │ ├── cloudsqlpgupgradeprecheck.md
│ │ │ │ └── cloudsqlwaitforoperation.md
│ │ │ ├── couchbase
│ │ │ │ ├── _index.md
│ │ │ │ └── couchbase-sql.md
│ │ │ ├── dataform
│ │ │ │ ├── _index.md
│ │ │ │ └── dataform-compile-local.md
│ │ │ ├── dataplex
│ │ │ │ ├── _index.md
│ │ │ │ ├── dataplex-lookup-entry.md
│ │ │ │ ├── dataplex-search-aspect-types.md
│ │ │ │ └── dataplex-search-entries.md
│ │ │ ├── dgraph
│ │ │ │ ├── _index.md
│ │ │ │ └── dgraph-dql.md
│ │ │ ├── elasticsearch
│ │ │ │ ├── _index.md
│ │ │ │ └── elasticsearch-esql.md
│ │ │ ├── firebird
│ │ │ │ ├── _index.md
│ │ │ │ ├── firebird-execute-sql.md
│ │ │ │ └── firebird-sql.md
│ │ │ ├── firestore
│ │ │ │ ├── _index.md
│ │ │ │ ├── firestore-add-documents.md
│ │ │ │ ├── firestore-delete-documents.md
│ │ │ │ ├── firestore-get-documents.md
│ │ │ │ ├── firestore-get-rules.md
│ │ │ │ ├── firestore-list-collections.md
│ │ │ │ ├── firestore-query-collection.md
│ │ │ │ ├── firestore-query.md
│ │ │ │ ├── firestore-update-document.md
│ │ │ │ └── firestore-validate-rules.md
│ │ │ ├── http
│ │ │ │ ├── _index.md
│ │ │ │ └── http.md
│ │ │ ├── looker
│ │ │ │ ├── _index.md
│ │ │ │ ├── looker-add-dashboard-element.md
│ │ │ │ ├── looker-add-dashboard-filter.md
│ │ │ │ ├── looker-conversational-analytics.md
│ │ │ │ ├── looker-create-project-file.md
│ │ │ │ ├── looker-delete-project-file.md
│ │ │ │ ├── looker-dev-mode.md
│ │ │ │ ├── looker-generate-embed-url.md
│ │ │ │ ├── looker-get-connection-databases.md
│ │ │ │ ├── looker-get-connection-schemas.md
│ │ │ │ ├── looker-get-connection-table-columns.md
│ │ │ │ ├── looker-get-connection-tables.md
│ │ │ │ ├── looker-get-connections.md
│ │ │ │ ├── looker-get-dashboards.md
│ │ │ │ ├── looker-get-dimensions.md
│ │ │ │ ├── looker-get-explores.md
│ │ │ │ ├── looker-get-filters.md
│ │ │ │ ├── looker-get-looks.md
│ │ │ │ ├── looker-get-measures.md
│ │ │ │ ├── looker-get-models.md
│ │ │ │ ├── looker-get-parameters.md
│ │ │ │ ├── looker-get-project-file.md
│ │ │ │ ├── looker-get-project-files.md
│ │ │ │ ├── looker-get-projects.md
│ │ │ │ ├── looker-health-analyze.md
│ │ │ │ ├── looker-health-pulse.md
│ │ │ │ ├── looker-health-vacuum.md
│ │ │ │ ├── looker-make-dashboard.md
│ │ │ │ ├── looker-make-look.md
│ │ │ │ ├── looker-query-sql.md
│ │ │ │ ├── looker-query-url.md
│ │ │ │ ├── looker-query.md
│ │ │ │ ├── looker-run-dashboard.md
│ │ │ │ ├── looker-run-look.md
│ │ │ │ └── looker-update-project-file.md
│ │ │ ├── mindsdb
│ │ │ │ ├── _index.md
│ │ │ │ ├── mindsdb-execute-sql.md
│ │ │ │ └── mindsdb-sql.md
│ │ │ ├── mongodb
│ │ │ │ ├── _index.md
│ │ │ │ ├── mongodb-aggregate.md
│ │ │ │ ├── mongodb-delete-many.md
│ │ │ │ ├── mongodb-delete-one.md
│ │ │ │ ├── mongodb-find-one.md
│ │ │ │ ├── mongodb-find.md
│ │ │ │ ├── mongodb-insert-many.md
│ │ │ │ ├── mongodb-insert-one.md
│ │ │ │ ├── mongodb-update-many.md
│ │ │ │ └── mongodb-update-one.md
│ │ │ ├── mssql
│ │ │ │ ├── _index.md
│ │ │ │ ├── mssql-execute-sql.md
│ │ │ │ ├── mssql-list-tables.md
│ │ │ │ └── mssql-sql.md
│ │ │ ├── mysql
│ │ │ │ ├── _index.md
│ │ │ │ ├── mysql-execute-sql.md
│ │ │ │ ├── mysql-get-query-plan.md
│ │ │ │ ├── mysql-list-active-queries.md
│ │ │ │ ├── mysql-list-table-fragmentation.md
│ │ │ │ ├── mysql-list-tables-missing-unique-indexes.md
│ │ │ │ ├── mysql-list-tables.md
│ │ │ │ └── mysql-sql.md
│ │ │ ├── neo4j
│ │ │ │ ├── _index.md
│ │ │ │ ├── neo4j-cypher.md
│ │ │ │ ├── neo4j-execute-cypher.md
│ │ │ │ └── neo4j-schema.md
│ │ │ ├── oceanbase
│ │ │ │ ├── _index.md
│ │ │ │ ├── oceanbase-execute-sql.md
│ │ │ │ └── oceanbase-sql.md
│ │ │ ├── oracle
│ │ │ │ ├── _index.md
│ │ │ │ ├── oracle-execute-sql.md
│ │ │ │ └── oracle-sql.md
│ │ │ ├── postgres
│ │ │ │ ├── _index.md
│ │ │ │ ├── postgres-database-overview.md
│ │ │ │ ├── postgres-execute-sql.md
│ │ │ │ ├── postgres-get-column-cardinality.md
│ │ │ │ ├── postgres-list-active-queries.md
│ │ │ │ ├── postgres-list-available-extensions.md
│ │ │ │ ├── postgres-list-database-stats.md
│ │ │ │ ├── postgres-list-indexes.md
│ │ │ │ ├── postgres-list-installed-extensions.md
│ │ │ │ ├── postgres-list-locks.md
│ │ │ │ ├── postgres-list-pg-settings.md
│ │ │ │ ├── postgres-list-publication-tables.md
│ │ │ │ ├── postgres-list-query-stats.md
│ │ │ │ ├── postgres-list-roles.md
│ │ │ │ ├── postgres-list-schemas.md
│ │ │ │ ├── postgres-list-sequences.md
│ │ │ │ ├── postgres-list-stored-procedure.md
│ │ │ │ ├── postgres-list-table-stats.md
│ │ │ │ ├── postgres-list-tables.md
│ │ │ │ ├── postgres-list-tablespaces.md
│ │ │ │ ├── postgres-list-triggers.md
│ │ │ │ ├── postgres-list-views.md
│ │ │ │ ├── postgres-long-running-transactions.md
│ │ │ │ ├── postgres-replication-stats.md
│ │ │ │ └── postgres-sql.md
│ │ │ ├── redis
│ │ │ │ ├── _index.md
│ │ │ │ └── redis.md
│ │ │ ├── serverless-spark
│ │ │ │ ├── _index.md
│ │ │ │ ├── serverless-spark-cancel-batch.md
│ │ │ │ ├── serverless-spark-create-pyspark-batch.md
│ │ │ │ ├── serverless-spark-create-spark-batch.md
│ │ │ │ ├── serverless-spark-get-batch.md
│ │ │ │ └── serverless-spark-list-batches.md
│ │ │ ├── singlestore
│ │ │ │ ├── _index.md
│ │ │ │ ├── singlestore-execute-sql.md
│ │ │ │ └── singlestore-sql.md
│ │ │ ├── snowflake
│ │ │ │ ├── _index.md
│ │ │ │ ├── snowflake-execute-sql.md
│ │ │ │ └── snowflake-sql.md
│ │ │ ├── spanner
│ │ │ │ ├── _index.md
│ │ │ │ ├── spanner-execute-sql.md
│ │ │ │ ├── spanner-list-graphs.md
│ │ │ │ ├── spanner-list-tables.md
│ │ │ │ └── spanner-sql.md
│ │ │ ├── sqlite
│ │ │ │ ├── _index.md
│ │ │ │ ├── sqlite-execute-sql.md
│ │ │ │ └── sqlite-sql.md
│ │ │ ├── tidb
│ │ │ │ ├── _index.md
│ │ │ │ ├── tidb-execute-sql.md
│ │ │ │ └── tidb-sql.md
│ │ │ ├── trino
│ │ │ │ ├── _index.md
│ │ │ │ ├── trino-execute-sql.md
│ │ │ │ └── trino-sql.md
│ │ │ ├── utility
│ │ │ │ ├── _index.md
│ │ │ │ └── wait.md
│ │ │ ├── valkey
│ │ │ │ ├── _index.md
│ │ │ │ └── valkey.md
│ │ │ └── yuagbytedb
│ │ │ ├── _index.md
│ │ │ └── yugabytedb-sql.md
│ │ ├── samples
│ │ │ ├── _index.md
│ │ │ ├── alloydb
│ │ │ │ ├── _index.md
│ │ │ │ ├── ai-nl
│ │ │ │ │ ├── alloydb_ai_nl.ipynb
│ │ │ │ │ └── index.md
│ │ │ │ └── mcp_quickstart.md
│ │ │ ├── bigquery
│ │ │ │ ├── _index.md
│ │ │ │ ├── colab_quickstart_bigquery.ipynb
│ │ │ │ ├── local_quickstart.md
│ │ │ │ └── mcp_quickstart
│ │ │ │ ├── _index.md
│ │ │ │ ├── inspector_tools.png
│ │ │ │ └── inspector.png
│ │ │ ├── looker
│ │ │ │ ├── _index.md
│ │ │ │ ├── looker_gemini_oauth
│ │ │ │ │ ├── _index.md
│ │ │ │ │ ├── authenticated.png
│ │ │ │ │ ├── authorize.png
│ │ │ │ │ └── registration.png
│ │ │ │ ├── looker_gemini.md
│ │ │ │ └── looker_mcp_inspector
│ │ │ │ ├── _index.md
│ │ │ │ ├── inspector_tools.png
│ │ │ │ └── inspector.png
│ │ │ └── snowflake
│ │ │ ├── _index.md
│ │ │ ├── runme.py
│ │ │ ├── snowflake-config.yaml
│ │ │ ├── snowflake-env.sh
│ │ │ └── test-snowflake.sh
│ │ └── sdks
│ │ ├── _index.md
│ │ ├── go-sdk.md
│ │ ├── js-sdk.md
│ │ └── python-sdk.md
│ ├── LOOKER_README.md
│ ├── SPANNER_README.md
│ └── TOOLBOX_README.md
├── gemini-extension.json
├── go.mod
├── go.sum
├── internal
│ ├── auth
│ │ ├── auth.go
│ │ └── google
│ │ └── google.go
│ ├── embeddingmodels
│ │ ├── embeddingmodels.go
│ │ └── gemini
│ │ ├── gemini_test.go
│ │ └── gemini.go
│ ├── log
│ │ ├── handler.go
│ │ ├── log_test.go
│ │ ├── log.go
│ │ └── logger.go
│ ├── prebuiltconfigs
│ │ ├── prebuiltconfigs_test.go
│ │ ├── prebuiltconfigs.go
│ │ └── tools
│ │ ├── alloydb-postgres-admin.yaml
│ │ ├── alloydb-postgres-observability.yaml
│ │ ├── alloydb-postgres.yaml
│ │ ├── bigquery.yaml
│ │ ├── clickhouse.yaml
│ │ ├── cloud-healthcare.yaml
│ │ ├── cloud-sql-mssql-admin.yaml
│ │ ├── cloud-sql-mssql-observability.yaml
│ │ ├── cloud-sql-mssql.yaml
│ │ ├── cloud-sql-mysql-admin.yaml
│ │ ├── cloud-sql-mysql-observability.yaml
│ │ ├── cloud-sql-mysql.yaml
│ │ ├── cloud-sql-postgres-admin.yaml
│ │ ├── cloud-sql-postgres-observability.yaml
│ │ ├── cloud-sql-postgres.yaml
│ │ ├── dataplex.yaml
│ │ ├── elasticsearch.yaml
│ │ ├── firestore.yaml
│ │ ├── looker-conversational-analytics.yaml
│ │ ├── looker.yaml
│ │ ├── mindsdb.yaml
│ │ ├── mssql.yaml
│ │ ├── mysql.yaml
│ │ ├── neo4j.yaml
│ │ ├── oceanbase.yaml
│ │ ├── postgres.yaml
│ │ ├── serverless-spark.yaml
│ │ ├── singlestore.yaml
│ │ ├── snowflake.yaml
│ │ ├── spanner-postgres.yaml
│ │ ├── spanner.yaml
│ │ └── sqlite.yaml
│ ├── prompts
│ │ ├── arguments_test.go
│ │ ├── arguments.go
│ │ ├── custom
│ │ │ ├── custom_test.go
│ │ │ └── custom.go
│ │ ├── messages_test.go
│ │ ├── messages.go
│ │ ├── prompts_test.go
│ │ ├── prompts.go
│ │ ├── promptsets_test.go
│ │ └── promptsets.go
│ ├── server
│ │ ├── api_test.go
│ │ ├── api.go
│ │ ├── common_test.go
│ │ ├── config.go
│ │ ├── mcp
│ │ │ ├── jsonrpc
│ │ │ │ ├── jsonrpc_test.go
│ │ │ │ └── jsonrpc.go
│ │ │ ├── mcp.go
│ │ │ ├── util
│ │ │ │ └── lifecycle.go
│ │ │ ├── v20241105
│ │ │ │ ├── method.go
│ │ │ │ └── types.go
│ │ │ ├── v20250326
│ │ │ │ ├── method.go
│ │ │ │ └── types.go
│ │ │ └── v20250618
│ │ │ ├── method.go
│ │ │ └── types.go
│ │ ├── mcp_test.go
│ │ ├── mcp.go
│ │ ├── resources
│ │ │ ├── resources_test.go
│ │ │ └── resources.go
│ │ ├── server_test.go
│ │ ├── server.go
│ │ ├── static
│ │ │ ├── assets
│ │ │ │ └── mcptoolboxlogo.png
│ │ │ ├── css
│ │ │ │ └── style.css
│ │ │ ├── index.html
│ │ │ ├── js
│ │ │ │ ├── auth.js
│ │ │ │ ├── loadTools.js
│ │ │ │ ├── mainContent.js
│ │ │ │ ├── navbar.js
│ │ │ │ ├── runTool.js
│ │ │ │ ├── toolDisplay.js
│ │ │ │ ├── tools.js
│ │ │ │ └── toolsets.js
│ │ │ ├── tools.html
│ │ │ └── toolsets.html
│ │ ├── web_test.go
│ │ └── web.go
│ ├── sources
│ │ ├── alloydbadmin
│ │ │ ├── alloydbadmin_test.go
│ │ │ └── alloydbadmin.go
│ │ ├── alloydbpg
│ │ │ ├── alloydb_pg_test.go
│ │ │ └── alloydb_pg.go
│ │ ├── bigquery
│ │ │ ├── bigquery_test.go
│ │ │ ├── bigquery.go
│ │ │ └── cache.go
│ │ ├── bigtable
│ │ │ ├── bigtable_test.go
│ │ │ └── bigtable.go
│ │ ├── cassandra
│ │ │ ├── cassandra_test.go
│ │ │ └── cassandra.go
│ │ ├── clickhouse
│ │ │ ├── clickhouse_test.go
│ │ │ └── clickhouse.go
│ │ ├── cloudgda
│ │ │ ├── cloud_gda_test.go
│ │ │ └── cloud_gda.go
│ │ ├── cloudhealthcare
│ │ │ ├── cloud_healthcare_test.go
│ │ │ └── cloud_healthcare.go
│ │ ├── cloudmonitoring
│ │ │ ├── cloud_monitoring_test.go
│ │ │ └── cloud_monitoring.go
│ │ ├── cloudsqladmin
│ │ │ ├── cloud_sql_admin_test.go
│ │ │ └── cloud_sql_admin.go
│ │ ├── cloudsqlmssql
│ │ │ ├── cloud_sql_mssql_test.go
│ │ │ └── cloud_sql_mssql.go
│ │ ├── cloudsqlmysql
│ │ │ ├── cloud_sql_mysql_test.go
│ │ │ └── cloud_sql_mysql.go
│ │ ├── cloudsqlpg
│ │ │ ├── cloud_sql_pg_test.go
│ │ │ └── cloud_sql_pg.go
│ │ ├── couchbase
│ │ │ ├── couchbase_test.go
│ │ │ └── couchbase.go
│ │ ├── dataplex
│ │ │ ├── dataplex_test.go
│ │ │ └── dataplex.go
│ │ ├── dgraph
│ │ │ ├── dgraph_test.go
│ │ │ └── dgraph.go
│ │ ├── dialect.go
│ │ ├── elasticsearch
│ │ │ ├── elasticsearch_test.go
│ │ │ └── elasticsearch.go
│ │ ├── firebird
│ │ │ ├── firebird_test.go
│ │ │ └── firebird.go
│ │ ├── firestore
│ │ │ ├── firestore_test.go
│ │ │ └── firestore.go
│ │ ├── http
│ │ │ ├── http_test.go
│ │ │ └── http.go
│ │ ├── ip_type.go
│ │ ├── looker
│ │ │ ├── looker_test.go
│ │ │ └── looker.go
│ │ ├── mindsdb
│ │ │ ├── mindsdb_test.go
│ │ │ └── mindsdb.go
│ │ ├── mongodb
│ │ │ ├── mongodb_test.go
│ │ │ └── mongodb.go
│ │ ├── mssql
│ │ │ ├── mssql_test.go
│ │ │ └── mssql.go
│ │ ├── mysql
│ │ │ ├── mysql_test.go
│ │ │ └── mysql.go
│ │ ├── neo4j
│ │ │ ├── neo4j_test.go
│ │ │ └── neo4j.go
│ │ ├── oceanbase
│ │ │ ├── oceanbase_test.go
│ │ │ └── oceanbase.go
│ │ ├── oracle
│ │ │ ├── oracle_test.go
│ │ │ └── oracle.go
│ │ ├── postgres
│ │ │ ├── postgres_test.go
│ │ │ └── postgres.go
│ │ ├── redis
│ │ │ ├── redis_test.go
│ │ │ └── redis.go
│ │ ├── serverlessspark
│ │ │ ├── serverlessspark_test.go
│ │ │ ├── serverlessspark.go
│ │ │ ├── url_test.go
│ │ │ └── url.go
│ │ ├── singlestore
│ │ │ ├── singlestore_test.go
│ │ │ └── singlestore.go
│ │ ├── snowflake
│ │ │ ├── snowflake_test.go
│ │ │ └── snowflake.go
│ │ ├── sources.go
│ │ ├── spanner
│ │ │ ├── spanner_test.go
│ │ │ └── spanner.go
│ │ ├── sqlite
│ │ │ ├── sqlite_test.go
│ │ │ └── sqlite.go
│ │ ├── tidb
│ │ │ ├── tidb_test.go
│ │ │ └── tidb.go
│ │ ├── trino
│ │ │ ├── trino_test.go
│ │ │ └── trino.go
│ │ ├── util.go
│ │ ├── valkey
│ │ │ ├── valkey_test.go
│ │ │ └── valkey.go
│ │ └── yugabytedb
│ │ ├── yugabytedb_test.go
│ │ └── yugabytedb.go
│ ├── telemetry
│ │ ├── instrumentation.go
│ │ └── telemetry.go
│ ├── testutils
│ │ └── testutils.go
│ ├── tools
│ │ ├── alloydb
│ │ │ ├── alloydbcreatecluster
│ │ │ │ ├── alloydbcreatecluster_test.go
│ │ │ │ └── alloydbcreatecluster.go
│ │ │ ├── alloydbcreateinstance
│ │ │ │ ├── alloydbcreateinstance_test.go
│ │ │ │ └── alloydbcreateinstance.go
│ │ │ ├── alloydbcreateuser
│ │ │ │ ├── alloydbcreateuser_test.go
│ │ │ │ └── alloydbcreateuser.go
│ │ │ ├── alloydbgetcluster
│ │ │ │ ├── alloydbgetcluster_test.go
│ │ │ │ └── alloydbgetcluster.go
│ │ │ ├── alloydbgetinstance
│ │ │ │ ├── alloydbgetinstance_test.go
│ │ │ │ └── alloydbgetinstance.go
│ │ │ ├── alloydbgetuser
│ │ │ │ ├── alloydbgetuser_test.go
│ │ │ │ └── alloydbgetuser.go
│ │ │ ├── alloydblistclusters
│ │ │ │ ├── alloydblistclusters_test.go
│ │ │ │ └── alloydblistclusters.go
│ │ │ ├── alloydblistinstances
│ │ │ │ ├── alloydblistinstances_test.go
│ │ │ │ └── alloydblistinstances.go
│ │ │ ├── alloydblistusers
│ │ │ │ ├── alloydblistusers_test.go
│ │ │ │ └── alloydblistusers.go
│ │ │ └── alloydbwaitforoperation
│ │ │ ├── alloydbwaitforoperation_test.go
│ │ │ └── alloydbwaitforoperation.go
│ │ ├── alloydbainl
│ │ │ ├── alloydbainl_test.go
│ │ │ └── alloydbainl.go
│ │ ├── bigquery
│ │ │ ├── bigqueryanalyzecontribution
│ │ │ │ ├── bigqueryanalyzecontribution_test.go
│ │ │ │ └── bigqueryanalyzecontribution.go
│ │ │ ├── bigquerycommon
│ │ │ │ ├── table_name_parser_test.go
│ │ │ │ ├── table_name_parser.go
│ │ │ │ └── util.go
│ │ │ ├── bigqueryconversationalanalytics
│ │ │ │ ├── bigqueryconversationalanalytics_test.go
│ │ │ │ └── bigqueryconversationalanalytics.go
│ │ │ ├── bigqueryexecutesql
│ │ │ │ ├── bigqueryexecutesql_test.go
│ │ │ │ └── bigqueryexecutesql.go
│ │ │ ├── bigqueryforecast
│ │ │ │ ├── bigqueryforecast_test.go
│ │ │ │ └── bigqueryforecast.go
│ │ │ ├── bigquerygetdatasetinfo
│ │ │ │ ├── bigquerygetdatasetinfo_test.go
│ │ │ │ └── bigquerygetdatasetinfo.go
│ │ │ ├── bigquerygettableinfo
│ │ │ │ ├── bigquerygettableinfo_test.go
│ │ │ │ └── bigquerygettableinfo.go
│ │ │ ├── bigquerylistdatasetids
│ │ │ │ ├── bigquerylistdatasetids_test.go
│ │ │ │ └── bigquerylistdatasetids.go
│ │ │ ├── bigquerylisttableids
│ │ │ │ ├── bigquerylisttableids_test.go
│ │ │ │ └── bigquerylisttableids.go
│ │ │ ├── bigquerysearchcatalog
│ │ │ │ ├── bigquerysearchcatalog_test.go
│ │ │ │ └── bigquerysearchcatalog.go
│ │ │ └── bigquerysql
│ │ │ ├── bigquerysql_test.go
│ │ │ └── bigquerysql.go
│ │ ├── bigtable
│ │ │ ├── bigtable_test.go
│ │ │ └── bigtable.go
│ │ ├── cassandra
│ │ │ └── cassandracql
│ │ │ ├── cassandracql_test.go
│ │ │ └── cassandracql.go
│ │ ├── clickhouse
│ │ │ ├── clickhouseexecutesql
│ │ │ │ ├── clickhouseexecutesql_test.go
│ │ │ │ └── clickhouseexecutesql.go
│ │ │ ├── clickhouselistdatabases
│ │ │ │ ├── clickhouselistdatabases_test.go
│ │ │ │ └── clickhouselistdatabases.go
│ │ │ ├── clickhouselisttables
│ │ │ │ ├── clickhouselisttables_test.go
│ │ │ │ └── clickhouselisttables.go
│ │ │ └── clickhousesql
│ │ │ ├── clickhousesql_test.go
│ │ │ └── clickhousesql.go
│ │ ├── cloudgda
│ │ │ ├── cloudgda_test.go
│ │ │ ├── cloudgda.go
│ │ │ └── types.go
│ │ ├── cloudhealthcare
│ │ │ ├── cloudhealthcarefhirfetchpage
│ │ │ │ ├── cloudhealthcarefhirfetchpage_test.go
│ │ │ │ └── cloudhealthcarefhirfetchpage.go
│ │ │ ├── cloudhealthcarefhirpatienteverything
│ │ │ │ ├── cloudhealthcarefhirpatienteverything_test.go
│ │ │ │ └── cloudhealthcarefhirpatienteverything.go
│ │ │ ├── cloudhealthcarefhirpatientsearch
│ │ │ │ ├── cloudhealthcarefhirpatientsearch_test.go
│ │ │ │ └── cloudhealthcarefhirpatientsearch.go
│ │ │ ├── cloudhealthcaregetdataset
│ │ │ │ ├── cloudhealthcaregetdataset_test.go
│ │ │ │ └── cloudhealthcaregetdataset.go
│ │ │ ├── cloudhealthcaregetdicomstore
│ │ │ │ ├── cloudhealthcaregetdicomstore_test.go
│ │ │ │ └── cloudhealthcaregetdicomstore.go
│ │ │ ├── cloudhealthcaregetdicomstoremetrics
│ │ │ │ ├── cloudhealthcaregetdicomstoremetrics_test.go
│ │ │ │ └── cloudhealthcaregetdicomstoremetrics.go
│ │ │ ├── cloudhealthcaregetfhirresource
│ │ │ │ ├── cloudhealthcaregetfhirresource_test.go
│ │ │ │ └── cloudhealthcaregetfhirresource.go
│ │ │ ├── cloudhealthcaregetfhirstore
│ │ │ │ ├── cloudhealthcaregetfhirstore_test.go
│ │ │ │ └── cloudhealthcaregetfhirstore.go
│ │ │ ├── cloudhealthcaregetfhirstoremetrics
│ │ │ │ ├── cloudhealthcaregetfhirstoremetrics_test.go
│ │ │ │ └── cloudhealthcaregetfhirstoremetrics.go
│ │ │ ├── cloudhealthcarelistdicomstores
│ │ │ │ ├── cloudhealthcarelistdicomstores_test.go
│ │ │ │ └── cloudhealthcarelistdicomstores.go
│ │ │ ├── cloudhealthcarelistfhirstores
│ │ │ │ ├── cloudhealthcarelistfhirstores_test.go
│ │ │ │ └── cloudhealthcarelistfhirstores.go
│ │ │ ├── cloudhealthcareretrieverendereddicominstance
│ │ │ │ ├── cloudhealthcareretrieverendereddicominstance_test.go
│ │ │ │ └── cloudhealthcareretrieverendereddicominstance.go
│ │ │ ├── cloudhealthcaresearchdicominstances
│ │ │ │ ├── cloudhealthcaresearchdicominstances_test.go
│ │ │ │ └── cloudhealthcaresearchdicominstances.go
│ │ │ ├── cloudhealthcaresearchdicomseries
│ │ │ │ ├── cloudhealthcaresearchdicomseries_test.go
│ │ │ │ └── cloudhealthcaresearchdicomseries.go
│ │ │ ├── cloudhealthcaresearchdicomstudies
│ │ │ │ ├── cloudhealthcaresearchdicomstudies_test.go
│ │ │ │ └── cloudhealthcaresearchdicomstudies.go
│ │ │ └── common
│ │ │ └── util.go
│ │ ├── cloudmonitoring
│ │ │ ├── cloudmonitoring_test.go
│ │ │ └── cloudmonitoring.go
│ │ ├── cloudsql
│ │ │ ├── cloudsqlcloneinstance
│ │ │ │ ├── cloudsqlcloneinstance_test.go
│ │ │ │ └── cloudsqlcloneinstance.go
│ │ │ ├── cloudsqlcreatedatabase
│ │ │ │ ├── cloudsqlcreatedatabase_test.go
│ │ │ │ └── cloudsqlcreatedatabase.go
│ │ │ ├── cloudsqlcreateusers
│ │ │ │ ├── cloudsqlcreateusers_test.go
│ │ │ │ └── cloudsqlcreateusers.go
│ │ │ ├── cloudsqlgetinstances
│ │ │ │ ├── cloudsqlgetinstances_test.go
│ │ │ │ └── cloudsqlgetinstances.go
│ │ │ ├── cloudsqllistdatabases
│ │ │ │ ├── cloudsqllistdatabases_test.go
│ │ │ │ └── cloudsqllistdatabases.go
│ │ │ ├── cloudsqllistinstances
│ │ │ │ ├── cloudsqllistinstances_test.go
│ │ │ │ └── cloudsqllistinstances.go
│ │ │ └── cloudsqlwaitforoperation
│ │ │ ├── cloudsqlwaitforoperation_test.go
│ │ │ └── cloudsqlwaitforoperation.go
│ │ ├── cloudsqlmssql
│ │ │ └── cloudsqlmssqlcreateinstance
│ │ │ ├── cloudsqlmssqlcreateinstance_test.go
│ │ │ └── cloudsqlmssqlcreateinstance.go
│ │ ├── cloudsqlmysql
│ │ │ └── cloudsqlmysqlcreateinstance
│ │ │ ├── cloudsqlmysqlcreateinstance_test.go
│ │ │ └── cloudsqlmysqlcreateinstance.go
│ │ ├── cloudsqlpg
│ │ │ ├── cloudsqlpgcreateinstances
│ │ │ │ ├── cloudsqlpgcreateinstances_test.go
│ │ │ │ └── cloudsqlpgcreateinstances.go
│ │ │ └── cloudsqlpgupgradeprecheck
│ │ │ ├── cloudsqlpgupgradeprecheck_test.go
│ │ │ └── cloudsqlpgupgradeprecheck.go
│ │ ├── couchbase
│ │ │ ├── couchbase_test.go
│ │ │ └── couchbase.go
│ │ ├── dataform
│ │ │ └── dataformcompilelocal
│ │ │ ├── dataformcompilelocal_test.go
│ │ │ └── dataformcompilelocal.go
│ │ ├── dataplex
│ │ │ ├── dataplexlookupentry
│ │ │ │ ├── dataplexlookupentry_test.go
│ │ │ │ └── dataplexlookupentry.go
│ │ │ ├── dataplexsearchaspecttypes
│ │ │ │ ├── dataplexsearchaspecttypes_test.go
│ │ │ │ └── dataplexsearchaspecttypes.go
│ │ │ └── dataplexsearchentries
│ │ │ ├── dataplexsearchentries_test.go
│ │ │ └── dataplexsearchentries.go
│ │ ├── dgraph
│ │ │ ├── dgraph_test.go
│ │ │ └── dgraph.go
│ │ ├── elasticsearch
│ │ │ └── elasticsearchesql
│ │ │ ├── elasticsearchesql_test.go
│ │ │ └── elasticsearchesql.go
│ │ ├── firebird
│ │ │ ├── firebirdexecutesql
│ │ │ │ ├── firebirdexecutesql_test.go
│ │ │ │ └── firebirdexecutesql.go
│ │ │ └── firebirdsql
│ │ │ ├── firebirdsql_test.go
│ │ │ └── firebirdsql.go
│ │ ├── firestore
│ │ │ ├── firestoreadddocuments
│ │ │ │ ├── firestoreadddocuments_test.go
│ │ │ │ └── firestoreadddocuments.go
│ │ │ ├── firestoredeletedocuments
│ │ │ │ ├── firestoredeletedocuments_test.go
│ │ │ │ └── firestoredeletedocuments.go
│ │ │ ├── firestoregetdocuments
│ │ │ │ ├── firestoregetdocuments_test.go
│ │ │ │ └── firestoregetdocuments.go
│ │ │ ├── firestoregetrules
│ │ │ │ ├── firestoregetrules_test.go
│ │ │ │ └── firestoregetrules.go
│ │ │ ├── firestorelistcollections
│ │ │ │ ├── firestorelistcollections_test.go
│ │ │ │ └── firestorelistcollections.go
│ │ │ ├── firestorequery
│ │ │ │ ├── firestorequery_test.go
│ │ │ │ └── firestorequery.go
│ │ │ ├── firestorequerycollection
│ │ │ │ ├── firestorequerycollection_test.go
│ │ │ │ └── firestorequerycollection.go
│ │ │ ├── firestoreupdatedocument
│ │ │ │ ├── firestoreupdatedocument_test.go
│ │ │ │ └── firestoreupdatedocument.go
│ │ │ ├── firestorevalidaterules
│ │ │ │ ├── firestorevalidaterules_test.go
│ │ │ │ └── firestorevalidaterules.go
│ │ │ └── util
│ │ │ ├── converter_test.go
│ │ │ ├── converter.go
│ │ │ ├── validator_test.go
│ │ │ └── validator.go
│ │ ├── http
│ │ │ ├── http_test.go
│ │ │ └── http.go
│ │ ├── http_method.go
│ │ ├── looker
│ │ │ ├── lookeradddashboardelement
│ │ │ │ ├── lookeradddashboardelement_test.go
│ │ │ │ └── lookeradddashboardelement.go
│ │ │ ├── lookeradddashboardfilter
│ │ │ │ ├── lookeradddashboardfilter_test.go
│ │ │ │ └── lookeradddashboardfilter.go
│ │ │ ├── lookercommon
│ │ │ │ ├── lookercommon_test.go
│ │ │ │ └── lookercommon.go
│ │ │ ├── lookerconversationalanalytics
│ │ │ │ ├── lookerconversationalanalytics_test.go
│ │ │ │ └── lookerconversationalanalytics.go
│ │ │ ├── lookercreateprojectfile
│ │ │ │ ├── lookercreateprojectfile_test.go
│ │ │ │ └── lookercreateprojectfile.go
│ │ │ ├── lookerdeleteprojectfile
│ │ │ │ ├── lookerdeleteprojectfile_test.go
│ │ │ │ └── lookerdeleteprojectfile.go
│ │ │ ├── lookerdevmode
│ │ │ │ ├── lookerdevmode_test.go
│ │ │ │ └── lookerdevmode.go
│ │ │ ├── lookergenerateembedurl
│ │ │ │ ├── lookergenerateembedurl_test.go
│ │ │ │ └── lookergenerateembedurl.go
│ │ │ ├── lookergetconnectiondatabases
│ │ │ │ ├── lookergetconnectiondatabases_test.go
│ │ │ │ └── lookergetconnectiondatabases.go
│ │ │ ├── lookergetconnections
│ │ │ │ ├── lookergetconnections_test.go
│ │ │ │ └── lookergetconnections.go
│ │ │ ├── lookergetconnectionschemas
│ │ │ │ ├── lookergetconnectionschemas_test.go
│ │ │ │ └── lookergetconnectionschemas.go
│ │ │ ├── lookergetconnectiontablecolumns
│ │ │ │ ├── lookergetconnectiontablecolumns_test.go
│ │ │ │ └── lookergetconnectiontablecolumns.go
│ │ │ ├── lookergetconnectiontables
│ │ │ │ ├── lookergetconnectiontables_test.go
│ │ │ │ └── lookergetconnectiontables.go
│ │ │ ├── lookergetdashboards
│ │ │ │ ├── lookergetdashboards_test.go
│ │ │ │ └── lookergetdashboards.go
│ │ │ ├── lookergetdimensions
│ │ │ │ ├── lookergetdimensions_test.go
│ │ │ │ └── lookergetdimensions.go
│ │ │ ├── lookergetexplores
│ │ │ │ ├── lookergetexplores_test.go
│ │ │ │ └── lookergetexplores.go
│ │ │ ├── lookergetfilters
│ │ │ │ ├── lookergetfilters_test.go
│ │ │ │ └── lookergetfilters.go
│ │ │ ├── lookergetlooks
│ │ │ │ ├── lookergetlooks_test.go
│ │ │ │ └── lookergetlooks.go
│ │ │ ├── lookergetmeasures
│ │ │ │ ├── lookergetmeasures_test.go
│ │ │ │ └── lookergetmeasures.go
│ │ │ ├── lookergetmodels
│ │ │ │ ├── lookergetmodels_test.go
│ │ │ │ └── lookergetmodels.go
│ │ │ ├── lookergetparameters
│ │ │ │ ├── lookergetparameters_test.go
│ │ │ │ └── lookergetparameters.go
│ │ │ ├── lookergetprojectfile
│ │ │ │ ├── lookergetprojectfile_test.go
│ │ │ │ └── lookergetprojectfile.go
│ │ │ ├── lookergetprojectfiles
│ │ │ │ ├── lookergetprojectfiles_test.go
│ │ │ │ └── lookergetprojectfiles.go
│ │ │ ├── lookergetprojects
│ │ │ │ ├── lookergetprojects_test.go
│ │ │ │ └── lookergetprojects.go
│ │ │ ├── lookerhealthanalyze
│ │ │ │ ├── lookerhealthanalyze_test.go
│ │ │ │ └── lookerhealthanalyze.go
│ │ │ ├── lookerhealthpulse
│ │ │ │ ├── lookerhealthpulse_test.go
│ │ │ │ └── lookerhealthpulse.go
│ │ │ ├── lookerhealthvacuum
│ │ │ │ ├── lookerhealthvacuum_test.go
│ │ │ │ └── lookerhealthvacuum.go
│ │ │ ├── lookermakedashboard
│ │ │ │ ├── lookermakedashboard_test.go
│ │ │ │ └── lookermakedashboard.go
│ │ │ ├── lookermakelook
│ │ │ │ ├── lookermakelook_test.go
│ │ │ │ └── lookermakelook.go
│ │ │ ├── lookerquery
│ │ │ │ ├── lookerquery_test.go
│ │ │ │ └── lookerquery.go
│ │ │ ├── lookerquerysql
│ │ │ │ ├── lookerquerysql_test.go
│ │ │ │ └── lookerquerysql.go
│ │ │ ├── lookerqueryurl
│ │ │ │ ├── lookerqueryurl_test.go
│ │ │ │ └── lookerqueryurl.go
│ │ │ ├── lookerrundashboard
│ │ │ │ ├── lookerrundashboard_test.go
│ │ │ │ └── lookerrundashboard.go
│ │ │ ├── lookerrunlook
│ │ │ │ ├── lookerrunlook_test.go
│ │ │ │ └── lookerrunlook.go
│ │ │ └── lookerupdateprojectfile
│ │ │ ├── lookerupdateprojectfile_test.go
│ │ │ └── lookerupdateprojectfile.go
│ │ ├── mindsdb
│ │ │ ├── mindsdbexecutesql
│ │ │ │ ├── mindsdbexecutesql_test.go
│ │ │ │ └── mindsdbexecutesql.go
│ │ │ └── mindsdbsql
│ │ │ ├── mindsdbsql_test.go
│ │ │ └── mindsdbsql.go
│ │ ├── mongodb
│ │ │ ├── mongodbaggregate
│ │ │ │ ├── mongodbaggregate_test.go
│ │ │ │ └── mongodbaggregate.go
│ │ │ ├── mongodbdeletemany
│ │ │ │ ├── mongodbdeletemany_test.go
│ │ │ │ └── mongodbdeletemany.go
│ │ │ ├── mongodbdeleteone
│ │ │ │ ├── mongodbdeleteone_test.go
│ │ │ │ └── mongodbdeleteone.go
│ │ │ ├── mongodbfind
│ │ │ │ ├── mongodbfind_test.go
│ │ │ │ └── mongodbfind.go
│ │ │ ├── mongodbfindone
│ │ │ │ ├── mongodbfindone_test.go
│ │ │ │ └── mongodbfindone.go
│ │ │ ├── mongodbinsertmany
│ │ │ │ ├── mongodbinsertmany_test.go
│ │ │ │ └── mongodbinsertmany.go
│ │ │ ├── mongodbinsertone
│ │ │ │ ├── mongodbinsertone_test.go
│ │ │ │ └── mongodbinsertone.go
│ │ │ ├── mongodbupdatemany
│ │ │ │ ├── mongodbupdatemany_test.go
│ │ │ │ └── mongodbupdatemany.go
│ │ │ └── mongodbupdateone
│ │ │ ├── mongodbupdateone_test.go
│ │ │ └── mongodbupdateone.go
│ │ ├── mssql
│ │ │ ├── mssqlexecutesql
│ │ │ │ ├── mssqlexecutesql_test.go
│ │ │ │ └── mssqlexecutesql.go
│ │ │ ├── mssqllisttables
│ │ │ │ ├── mssqllisttables_test.go
│ │ │ │ └── mssqllisttables.go
│ │ │ └── mssqlsql
│ │ │ ├── mssqlsql_test.go
│ │ │ └── mssqlsql.go
│ │ ├── mysql
│ │ │ ├── mysqlcommon
│ │ │ │ └── mysqlcommon.go
│ │ │ ├── mysqlexecutesql
│ │ │ │ ├── mysqlexecutesql_test.go
│ │ │ │ └── mysqlexecutesql.go
│ │ │ ├── mysqlgetqueryplan
│ │ │ │ ├── mysqlgetqueryplan_test.go
│ │ │ │ └── mysqlgetqueryplan.go
│ │ │ ├── mysqllistactivequeries
│ │ │ │ ├── mysqllistactivequeries_test.go
│ │ │ │ └── mysqllistactivequeries.go
│ │ │ ├── mysqllisttablefragmentation
│ │ │ │ ├── mysqllisttablefragmentation_test.go
│ │ │ │ └── mysqllisttablefragmentation.go
│ │ │ ├── mysqllisttables
│ │ │ │ ├── mysqllisttables_test.go
│ │ │ │ └── mysqllisttables.go
│ │ │ ├── mysqllisttablesmissinguniqueindexes
│ │ │ │ ├── mysqllisttablesmissinguniqueindexes_test.go
│ │ │ │ └── mysqllisttablesmissinguniqueindexes.go
│ │ │ └── mysqlsql
│ │ │ ├── mysqlsql_test.go
│ │ │ └── mysqlsql.go
│ │ ├── neo4j
│ │ │ ├── neo4jcypher
│ │ │ │ ├── neo4jcypher_test.go
│ │ │ │ └── neo4jcypher.go
│ │ │ ├── neo4jexecutecypher
│ │ │ │ ├── classifier
│ │ │ │ │ ├── classifier_test.go
│ │ │ │ │ └── classifier.go
│ │ │ │ ├── neo4jexecutecypher_test.go
│ │ │ │ └── neo4jexecutecypher.go
│ │ │ └── neo4jschema
│ │ │ ├── cache
│ │ │ │ ├── cache_test.go
│ │ │ │ └── cache.go
│ │ │ ├── helpers
│ │ │ │ ├── helpers_test.go
│ │ │ │ └── helpers.go
│ │ │ ├── neo4jschema_test.go
│ │ │ ├── neo4jschema.go
│ │ │ └── types
│ │ │ └── types.go
│ │ ├── oceanbase
│ │ │ ├── oceanbaseexecutesql
│ │ │ │ ├── oceanbaseexecutesql_test.go
│ │ │ │ └── oceanbaseexecutesql.go
│ │ │ └── oceanbasesql
│ │ │ ├── oceanbasesql_test.go
│ │ │ └── oceanbasesql.go
│ │ ├── oracle
│ │ │ ├── oracleexecutesql
│ │ │ │ ├── oracleexecutesql_test.go
│ │ │ │ └── oracleexecutesql.go
│ │ │ └── oraclesql
│ │ │ ├── oraclesql_test.go
│ │ │ └── oraclesql.go
│ │ ├── postgres
│ │ │ ├── postgresdatabaseoverview
│ │ │ │ ├── postgresdatabaseoverview_test.go
│ │ │ │ └── postgresdatabaseoverview.go
│ │ │ ├── postgresexecutesql
│ │ │ │ ├── postgresexecutesql_test.go
│ │ │ │ └── postgresexecutesql.go
│ │ │ ├── postgresgetcolumncardinality
│ │ │ │ ├── postgresgetcolumncardinality_test.go
│ │ │ │ └── postgresgetcolumncardinality.go
│ │ │ ├── postgreslistactivequeries
│ │ │ │ ├── postgreslistactivequeries_test.go
│ │ │ │ └── postgreslistactivequeries.go
│ │ │ ├── postgreslistavailableextensions
│ │ │ │ ├── postgreslistavailableextensions_test.go
│ │ │ │ └── postgreslistavailableextensions.go
│ │ │ ├── postgreslistdatabasestats
│ │ │ │ ├── postgreslistdatabasestats_test.go
│ │ │ │ └── postgreslistdatabasestats.go
│ │ │ ├── postgreslistindexes
│ │ │ │ ├── postgreslistindexes_test.go
│ │ │ │ └── postgreslistindexes.go
│ │ │ ├── postgreslistinstalledextensions
│ │ │ │ ├── postgreslistinstalledextensions_test.go
│ │ │ │ └── postgreslistinstalledextensions.go
│ │ │ ├── postgreslistlocks
│ │ │ │ ├── postgreslistlocks_test.go
│ │ │ │ └── postgreslistlocks.go
│ │ │ ├── postgreslistpgsettings
│ │ │ │ ├── postgreslistpgsettings_test.go
│ │ │ │ └── postgreslistpgsettings.go
│ │ │ ├── postgreslistpublicationtables
│ │ │ │ ├── postgreslistpublicationtables_test.go
│ │ │ │ └── postgreslistpublicationtables.go
│ │ │ ├── postgreslistquerystats
│ │ │ │ ├── postgreslistquerystats_test.go
│ │ │ │ └── postgreslistquerystats.go
│ │ │ ├── postgreslistroles
│ │ │ │ ├── postgreslistroles_test.go
│ │ │ │ └── postgreslistroles.go
│ │ │ ├── postgreslistschemas
│ │ │ │ ├── postgreslistschemas_test.go
│ │ │ │ └── postgreslistschemas.go
│ │ │ ├── postgreslistsequences
│ │ │ │ ├── postgreslistsequences_test.go
│ │ │ │ └── postgreslistsequences.go
│ │ │ ├── postgresliststoredprocedure
│ │ │ │ ├── postgresliststoredprocedure_test.go
│ │ │ │ └── postgresliststoredprocedure.go
│ │ │ ├── postgreslisttables
│ │ │ │ ├── postgreslisttables_test.go
│ │ │ │ └── postgreslisttables.go
│ │ │ ├── postgreslisttablespaces
│ │ │ │ ├── postgreslisttablespaces_test.go
│ │ │ │ └── postgreslisttablespaces.go
│ │ │ ├── postgreslisttablestats
│ │ │ │ ├── postgreslisttablestats_test.go
│ │ │ │ └── postgreslisttablestats.go
│ │ │ ├── postgreslisttriggers
│ │ │ │ ├── postgreslisttriggers_test.go
│ │ │ │ └── postgreslisttriggers.go
│ │ │ ├── postgreslistviews
│ │ │ │ ├── postgreslistviews_test.go
│ │ │ │ └── postgreslistviews.go
│ │ │ ├── postgreslongrunningtransactions
│ │ │ │ ├── postgreslongrunningtransactions_test.go
│ │ │ │ └── postgreslongrunningtransactions.go
│ │ │ ├── postgresreplicationstats
│ │ │ │ ├── postgresreplicationstats_test.go
│ │ │ │ └── postgresreplicationstats.go
│ │ │ └── postgressql
│ │ │ ├── postgressql_test.go
│ │ │ └── postgressql.go
│ │ ├── redis
│ │ │ ├── redis_test.go
│ │ │ └── redis.go
│ │ ├── serverlessspark
│ │ │ ├── createbatch
│ │ │ │ ├── config.go
│ │ │ │ └── tool.go
│ │ │ ├── serverlesssparkcancelbatch
│ │ │ │ ├── serverlesssparkcancelbatch_test.go
│ │ │ │ └── serverlesssparkcancelbatch.go
│ │ │ ├── serverlesssparkcreatepysparkbatch
│ │ │ │ ├── serverlesssparkcreatepysparkbatch_test.go
│ │ │ │ └── serverlesssparkcreatepysparkbatch.go
│ │ │ ├── serverlesssparkcreatesparkbatch
│ │ │ │ ├── serverlesssparkcreatesparkbatch_test.go
│ │ │ │ └── serverlesssparkcreatesparkbatch.go
│ │ │ ├── serverlesssparkgetbatch
│ │ │ │ ├── serverlesssparkgetbatch_test.go
│ │ │ │ └── serverlesssparkgetbatch.go
│ │ │ ├── serverlesssparklistbatches
│ │ │ │ ├── serverlesssparklistbatches_test.go
│ │ │ │ └── serverlesssparklistbatches.go
│ │ │ └── testutils
│ │ │ └── testutils.go
│ │ ├── singlestore
│ │ │ ├── singlestoreexecutesql
│ │ │ │ ├── singlestoreexecutesql_test.go
│ │ │ │ └── singlestoreexecutesql.go
│ │ │ └── singlestoresql
│ │ │ ├── singlestoresql_test.go
│ │ │ └── singlestoresql.go
│ │ ├── snowflake
│ │ │ ├── snowflakeexecutesql
│ │ │ │ ├── snowflakeexecutesql_test.go
│ │ │ │ └── snowflakeexecutesql.go
│ │ │ └── snowflakesql
│ │ │ ├── snowflakesql_test.go
│ │ │ └── snowflakesql.go
│ │ ├── spanner
│ │ │ ├── spannerexecutesql
│ │ │ │ ├── spannerexecutesql_test.go
│ │ │ │ └── spannerexecutesql.go
│ │ │ ├── spannerlistgraphs
│ │ │ │ ├── spannerlistgraphs_test.go
│ │ │ │ └── spannerlistgraphs.go
│ │ │ ├── spannerlisttables
│ │ │ │ ├── spannerlisttables_test.go
│ │ │ │ └── spannerlisttables.go
│ │ │ └── spannersql
│ │ │ ├── spanner_test.go
│ │ │ └── spannersql.go
│ │ ├── sqlite
│ │ │ ├── sqliteexecutesql
│ │ │ │ ├── sqliteexecutesql_test.go
│ │ │ │ └── sqliteexecutesql.go
│ │ │ └── sqlitesql
│ │ │ ├── sqlitesql_test.go
│ │ │ └── sqlitesql.go
│ │ ├── tidb
│ │ │ ├── tidbexecutesql
│ │ │ │ ├── tidbexecutesql_test.go
│ │ │ │ └── tidbexecutesql.go
│ │ │ └── tidbsql
│ │ │ ├── tidbsql_test.go
│ │ │ └── tidbsql.go
│ │ ├── tools_test.go
│ │ ├── tools.go
│ │ ├── toolsets.go
│ │ ├── trino
│ │ │ ├── trinoexecutesql
│ │ │ │ ├── trinoexecutesql_test.go
│ │ │ │ └── trinoexecutesql.go
│ │ │ └── trinosql
│ │ │ ├── trinosql_test.go
│ │ │ └── trinosql.go
│ │ ├── utility
│ │ │ └── wait
│ │ │ ├── wait_test.go
│ │ │ └── wait.go
│ │ ├── valkey
│ │ │ ├── valkey_test.go
│ │ │ └── valkey.go
│ │ └── yugabytedbsql
│ │ ├── yugabytedbsql_test.go
│ │ └── yugabytedbsql.go
│ └── util
│ ├── orderedmap
│ │ ├── orderedmap_test.go
│ │ └── orderedmap.go
│ ├── parameters
│ │ ├── common_test.go
│ │ ├── common.go
│ │ ├── parameters_test.go
│ │ └── parameters.go
│ └── util.go
├── LICENSE
├── logo.png
├── main.go
├── MCP-TOOLBOX-EXTENSION.md
├── README.md
├── server.json
└── tests
├── alloydb
│ ├── alloydb_integration_test.go
│ └── alloydb_wait_for_operation_test.go
├── alloydbainl
│ └── alloydb_ai_nl_integration_test.go
├── alloydbpg
│ └── alloydb_pg_integration_test.go
├── auth.go
├── bigquery
│ └── bigquery_integration_test.go
├── bigtable
│ └── bigtable_integration_test.go
├── cassandra
│ └── cassandra_integration_test.go
├── clickhouse
│ └── clickhouse_integration_test.go
├── cloudgda
│ └── cloud_gda_integration_test.go
├── cloudhealthcare
│ └── cloud_healthcare_integration_test.go
├── cloudmonitoring
│ └── cloud_monitoring_integration_test.go
├── cloudsql
│ ├── cloud_sql_clone_instance_test.go
│ ├── cloud_sql_create_database_test.go
│ ├── cloud_sql_create_users_test.go
│ ├── cloud_sql_get_instances_test.go
│ ├── cloud_sql_list_databases_test.go
│ ├── cloudsql_list_instances_test.go
│ └── cloudsql_wait_for_operation_test.go
├── cloudsqlmssql
│ ├── cloud_sql_mssql_create_instance_integration_test.go
│ └── cloud_sql_mssql_integration_test.go
├── cloudsqlmysql
│ ├── cloud_sql_mysql_create_instance_integration_test.go
│ └── cloud_sql_mysql_integration_test.go
├── cloudsqlpg
│ ├── cloud_sql_pg_create_instances_test.go
│ ├── cloud_sql_pg_integration_test.go
│ └── cloud_sql_pg_upgrade_precheck_test.go
├── common.go
├── couchbase
│ └── couchbase_integration_test.go
├── dataform
│ └── dataform_integration_test.go
├── dataplex
│ └── dataplex_integration_test.go
├── dgraph
│ └── dgraph_integration_test.go
├── elasticsearch
│ └── elasticsearch_integration_test.go
├── firebird
│ └── firebird_integration_test.go
├── firestore
│ └── firestore_integration_test.go
├── http
│ └── http_integration_test.go
├── looker
│ └── looker_integration_test.go
├── mariadb
│ └── mariadb_integration_test.go
├── mindsdb
│ └── mindsdb_integration_test.go
├── mongodb
│ └── mongodb_integration_test.go
├── mssql
│ └── mssql_integration_test.go
├── mysql
│ └── mysql_integration_test.go
├── neo4j
│ └── neo4j_integration_test.go
├── oceanbase
│ └── oceanbase_integration_test.go
├── option.go
├── oracle
│ └── oracle_integration_test.go
├── postgres
│ └── postgres_integration_test.go
├── prompts
│ └── custom
│ └── prompts_integration_test.go
├── redis
│ └── redis_test.go
├── server.go
├── serverlessspark
│ └── serverless_spark_integration_test.go
├── singlestore
│ └── singlestore_integration_test.go
├── snowflake
│ └── snowflake_integration_test.go
├── source.go
├── spanner
│ └── spanner_integration_test.go
├── sqlite
│ └── sqlite_integration_test.go
├── tidb
│ └── tidb_integration_test.go
├── tool.go
├── trino
│ └── trino_integration_test.go
├── utility
│ └── wait_integration_test.go
├── valkey
│ └── valkey_test.go
└── yugabytedb
└── yugabytedb_integration_test.go
```
# Files
--------------------------------------------------------------------------------
/internal/util/orderedmap/orderedmap.go:
--------------------------------------------------------------------------------
```go
1 | // Copyright 2025 Google LLC
2 | //
3 | // Licensed under the Apache License, Version 2.0 (the "License");
4 | // you may not use this file except in compliance with the License.
5 | // You may obtain a copy of the License at
6 | //
7 | // http://www.apache.org/licenses/LICENSE-2.0
8 | //
9 | // Unless required by applicable law or agreed to in writing, software
10 | // distributed under the License is distributed on an "AS IS" BASIS,
11 | // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | // See the License for the specific language governing permissions and
13 | // limitations under the License.
14 | package orderedmap
15 |
16 | import (
17 |     "bytes"
18 |     "encoding/json"
19 | )
20 |
21 | // Column represents a single column in a row.
22 | type Column struct {
23 |     Name  string
24 |     Value any
25 | }
26 |
27 | // Row represents a row of data with columns in a specific order.
28 | type Row struct {
29 |     Columns []Column
30 | }
31 |
32 | // Add adds a new column to the row.
33 | func (r *Row) Add(name string, value any) {
34 |     r.Columns = append(r.Columns, Column{Name: name, Value: value})
35 | }
36 |
37 | // MarshalJSON implements the json.Marshaler interface for the Row struct.
38 | // It marshals the row into a JSON object, preserving the order of the columns.
39 | func (r Row) MarshalJSON() ([]byte, error) {
40 |     var buf bytes.Buffer
41 |     buf.WriteString("{")
42 |     for i, col := range r.Columns {
43 |         if i > 0 {
44 |             buf.WriteString(",")
45 |         }
46 |         // Marshal the key
47 |         key, err := json.Marshal(col.Name)
48 |         if err != nil {
49 |             return nil, err
50 |         }
51 |         buf.Write(key)
52 |         buf.WriteString(":")
53 |         // Marshal the value
54 |         val, err := json.Marshal(col.Value)
55 |         if err != nil {
56 |             return nil, err
57 |         }
58 |         buf.Write(val)
59 |     }
60 |     buf.WriteString("}")
61 |     return buf.Bytes(), nil
63 |
```
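The `MarshalJSON` implementation above exists because Go's `encoding/json` emits the keys of a `map[string]any` in alphabetical order, which would scramble column order in query results; `Row` preserves insertion order instead. A minimal usage sketch, assuming the import path implied by the repository layout and `go.mod` (as an `internal` package it is only importable from code inside the genai-toolbox module):

```go
package main

import (
	"encoding/json"
	"fmt"

	// Import path assumed from the repository's go.mod; internal packages
	// are only importable from within the genai-toolbox module itself.
	"github.com/googleapis/genai-toolbox/internal/util/orderedmap"
)

func main() {
	var row orderedmap.Row
	// Columns are marshaled in insertion order, unlike map[string]any,
	// whose keys encoding/json would emit alphabetically.
	row.Add("id", 42)
	row.Add("name", "alloydb")
	row.Add("created_at", "2025-01-01T00:00:00Z")

	out, err := json.Marshal(row)
	if err != nil {
		fmt.Println("marshal error:", err)
		return
	}
	fmt.Println(string(out)) // {"id":42,"name":"alloydb","created_at":"2025-01-01T00:00:00Z"}
}
```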
--------------------------------------------------------------------------------
/internal/embeddingmodels/embeddingmodels.go:
--------------------------------------------------------------------------------
```go
1 | // Copyright 2026 Google LLC
2 | //
3 | // Licensed under the Apache License, Version 2.0 (the "License");
4 | // you may not use this file except in compliance with the License.
5 | // You may obtain a copy of the License at
6 | //
7 | // http://www.apache.org/licenses/LICENSE-2.0
8 | //
9 | // Unless required by applicable law or agreed to in writing, software
10 | // distributed under the License is distributed on an "AS IS" BASIS,
11 | // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | // See the License for the specific language governing permissions and
13 | // limitations under the License.
14 |
15 | package embeddingmodels
16 |
17 | import (
18 |     "context"
19 |     "strconv"
20 |     "strings"
21 | )
22 |
23 | // EmbeddingModelConfig is the interface for configuring embedding models.
24 | type EmbeddingModelConfig interface {
25 |     EmbeddingModelConfigKind() string
26 |     Initialize(context.Context) (EmbeddingModel, error)
27 | }
28 |
29 | type EmbeddingModel interface {
30 |     EmbeddingModelKind() string
31 |     ToConfig() EmbeddingModelConfig
32 |     EmbedParameters(context.Context, []string) ([][]float32, error)
33 | }
34 |
35 | type VectorFormatter func(vectorFloats []float32) any
36 |
37 | // FormatVectorForPgvector converts a slice of floats into a PostgreSQL vector literal string: '[x, y, z]'
38 | func FormatVectorForPgvector(vectorFloats []float32) any {
39 |     if len(vectorFloats) == 0 {
40 |         return "[]"
41 |     }
42 |
43 |     // Pre-allocate the builder.
44 |     var b strings.Builder
45 |     b.Grow(len(vectorFloats) * 10)
46 |
47 |     b.WriteByte('[')
48 |     for i, f := range vectorFloats {
49 |         if i > 0 {
50 |             b.WriteString(", ")
51 |         }
52 |         b.Write(strconv.AppendFloat(nil, float64(f), 'g', -1, 32))
53 |     }
54 |     b.WriteByte(']')
55 |
56 |     return b.String()
57 | }
58 |
59 | var _ VectorFormatter = FormatVectorForPgvector
60 |
```
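`FormatVectorForPgvector` renders an embedding (for example, one returned by `EmbedParameters`) as the text literal that pgvector accepts, so it can be bound like an ordinary string parameter. A small sketch under the same assumption about the internal import path; the SQL fragment in the comment is illustrative, not taken from this repository:

```go
package main

import (
	"fmt"

	// Import path assumed from the repository's go.mod; only code inside
	// the genai-toolbox module can import this internal package.
	"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
)

func main() {
	embedding := []float32{0.12, -0.5, 3}

	// Prints "[0.12, -0.5, 3]"; the value could then be bound as a text
	// parameter in SQL such as: ORDER BY embedding <-> $1::vector (illustrative).
	literal := embeddingmodels.FormatVectorForPgvector(embedding)
	fmt.Println(literal)

	// FormatVectorForPgvector satisfies the VectorFormatter type, so a
	// formatter for a different vector syntax could be swapped in behind it.
	var formatter embeddingmodels.VectorFormatter = embeddingmodels.FormatVectorForPgvector
	_ = formatter
}
```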
--------------------------------------------------------------------------------
/internal/server/static/toolsets.html:
--------------------------------------------------------------------------------
```html
1 | <!DOCTYPE html>
2 | <html lang="en">
3 | <head>
4 |     <meta charset="UTF-8">
5 |     <meta name="viewport" content="width=device-width, initial-scale=1.0">
6 |     <title>Toolsets View</title>
7 |     <link rel="stylesheet" href="/ui/css/style.css">
8 |     <link href="https://fonts.googleapis.com/icon?family=Material+Icons" rel="stylesheet">
9 |     <script src="https://accounts.google.com/gsi/client" async defer></script>
10 | </head>
11 | <body>
12 |     <div id="navbar-container" data-active-nav="/ui/toolsets"></div>
13 |
14 |     <aside class="second-nav">
15 |         <h4>Retrieve Toolset</h4>
16 |         <div class="search-container">
17 |             <input type="text" id="toolset-search-input" placeholder="Enter toolset name...">
18 |             <button id="toolset-search-button" aria-label="Retrieve Tools">
19 |                 <span class="material-icons">search</span>
20 |             </button>
21 |         </div>
22 |         <div id="secondary-panel-content">
23 |             <p>Retrieve toolset to see available tools.</p>
24 |         </div>
25 |     </aside>
26 |
27 |     <div id="main-content-container"></div>
28 |
29 |     <script type="module" src="/ui/js/toolsets.js"></script>
30 |     <script src="/ui/js/navbar.js"></script>
31 |     <script src="/ui/js/mainContent.js"></script>
32 |     <script>
33 |         document.addEventListener('DOMContentLoaded', () => {
34 |             const navbarContainer = document.getElementById('navbar-container');
35 |             const activeNav = navbarContainer.getAttribute('data-active-nav');
36 |             renderNavbar('navbar-container', activeNav);
37 |             renderMainContent('main-content-container', 'tool-display-area', getToolsetInstructions());
38 |         });
39 |     </script>
40 | </body>
41 | </html>
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/postgres/postgres-list-views.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "postgres-list-views"
3 | type: docs
4 | weight: 1
5 | description: >
6 | The "postgres-list-views" tool lists views in a Postgres database, with a default limit of 50 rows.
7 | aliases:
8 | - /resources/tools/postgres-list-views
9 | ---
10 |
11 | ## About
12 |
13 | The `postgres-list-views` tool retrieves a list of the top N (default 50) views from
14 | a Postgres database, excluding those in system schemas (`pg_catalog`,
15 | `information_schema`). It's compatible with any of the following sources:
16 |
17 | - [alloydb-postgres](../../sources/alloydb-pg.md)
18 | - [cloud-sql-postgres](../../sources/cloud-sql-pg.md)
19 | - [postgres](../../sources/postgres.md)
20 |
21 | `postgres-list-views` lists detailed view information (schemaname, viewname,
22 | ownername, definition) as JSON for views in a database. The tool takes the following input
23 | parameters:
24 |
25 | - `view_name` (optional): A string pattern to filter view names. Default: `""`
26 | - `schema_name` (optional): A string pattern to filter schema names. Default: `""`
27 | - `limit` (optional): The maximum number of rows to return. Default: `50`.
28 |
29 | ## Example
30 |
31 | ```yaml
32 | tools:
33 |   list_views:
34 |     kind: postgres-list-views
35 |     source: cloudsql-pg-source
36 | ```
37 |
38 | ## Reference
39 |
40 | | **field** | **type** | **required** | **description** |
41 | |-------------|:--------:|:------------:|------------------------------------------------------|
42 | | kind | string | true | Must be "postgres-list-views". |
43 | | source | string | true | Name of the source the SQL should execute on. |
44 | | description | string | false | Description of the tool that is passed to the agent. |
45 |
```
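Beyond the minimal example in the page above, the reference table also allows an optional `description` field. A slightly fuller configuration sketch; the `alloydb-pg-source` name is hypothetical and would refer to a source defined elsewhere in the same config:

```yaml
tools:
  list_views:
    kind: postgres-list-views
    source: alloydb-pg-source   # hypothetical name of a Postgres-compatible source
    description: |
      Lists non-system views as JSON. Optional inputs: view_name,
      schema_name, and limit (default 50).
```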
--------------------------------------------------------------------------------
/internal/prebuiltconfigs/tools/cloud-sql-mssql-admin.yaml:
--------------------------------------------------------------------------------
```yaml
1 | # Copyright 2025 Google LLC
2 | #
3 | # Licensed under the Apache License, Version 2.0 (the "License");
4 | # you may not use this file except in compliance with the License.
5 | # You may obtain a copy of the License at
6 | #
7 | # http://www.apache.org/licenses/LICENSE-2.0
8 | #
9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 |
15 | sources:
16 |   cloud-sql-admin-source:
17 |     kind: cloud-sql-admin
18 |     defaultProject: ${CLOUD_SQL_MSSQL_PROJECT:}
19 |
20 | tools:
21 |   create_instance:
22 |     kind: cloud-sql-mssql-create-instance
23 |     source: cloud-sql-admin-source
24 |   get_instance:
25 |     kind: cloud-sql-get-instance
26 |     source: cloud-sql-admin-source
27 |   list_instances:
28 |     kind: cloud-sql-list-instances
29 |     source: cloud-sql-admin-source
30 |   create_database:
31 |     kind: cloud-sql-create-database
32 |     source: cloud-sql-admin-source
33 |   list_databases:
34 |     kind: cloud-sql-list-databases
35 |     source: cloud-sql-admin-source
36 |   create_user:
37 |     kind: cloud-sql-create-users
38 |     source: cloud-sql-admin-source
39 |   wait_for_operation:
40 |     kind: cloud-sql-wait-for-operation
41 |     source: cloud-sql-admin-source
42 |     multiplier: 4
43 |   clone_instance:
44 |     kind: cloud-sql-clone-instance
45 |     source: cloud-sql-admin-source
46 |
47 | toolsets:
48 |   cloud_sql_mssql_admin_tools:
49 |     - create_instance
50 |     - get_instance
51 |     - list_instances
52 |     - create_database
53 |     - list_databases
54 |     - create_user
55 |     - wait_for_operation
56 |     - clone_instance
57 |
```
--------------------------------------------------------------------------------
/internal/prebuiltconfigs/tools/cloud-sql-mysql-admin.yaml:
--------------------------------------------------------------------------------
```yaml
1 | # Copyright 2025 Google LLC
2 | #
3 | # Licensed under the Apache License, Version 2.0 (the "License");
4 | # you may not use this file except in compliance with the License.
5 | # You may obtain a copy of the License at
6 | #
7 | # http://www.apache.org/licenses/LICENSE-2.0
8 | #
9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 |
15 | sources:
16 | cloud-sql-admin-source:
17 | kind: cloud-sql-admin
18 | defaultProject: ${CLOUD_SQL_MYSQL_PROJECT:}
19 |
20 | tools:
21 | create_instance:
22 | kind: cloud-sql-mysql-create-instance
23 | source: cloud-sql-admin-source
24 | get_instance:
25 | kind: cloud-sql-get-instance
26 | source: cloud-sql-admin-source
27 | list_instances:
28 | kind: cloud-sql-list-instances
29 | source: cloud-sql-admin-source
30 | create_database:
31 | kind: cloud-sql-create-database
32 | source: cloud-sql-admin-source
33 | list_databases:
34 | kind: cloud-sql-list-databases
35 | source: cloud-sql-admin-source
36 | create_user:
37 | kind: cloud-sql-create-users
38 | source: cloud-sql-admin-source
39 | wait_for_operation:
40 | kind: cloud-sql-wait-for-operation
41 | source: cloud-sql-admin-source
42 | multiplier: 4
43 | clone_instance:
44 | kind: cloud-sql-clone-instance
45 | source: cloud-sql-admin-source
46 |
47 | toolsets:
48 | cloud_sql_mysql_admin_tools:
49 | - create_instance
50 | - get_instance
51 | - list_instances
52 | - create_database
53 | - list_databases
54 | - create_user
55 | - wait_for_operation
56 | - clone_instance
57 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-delete-project-file.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "looker-delete-project-file"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "looker-delete-project-file" tool deletes a LookML file in a project.
7 | aliases:
8 | - /resources/tools/looker-delete-project-file
9 | ---
10 |
11 | ## About
12 |
13 | A `looker-delete-project-file` tool deletes a LookML file in a project.
14 |
15 | It's compatible with the following sources:
16 |
17 | - [looker](../../sources/looker.md)
18 |
19 | `looker-delete-project-file` accepts a `project_id` parameter and a `file_path` parameter.
20 |
21 | ## Example
22 |
23 | ```yaml
24 | tools:
25 | delete_project_file:
26 | kind: looker-delete-project-file
27 | source: looker-source
28 | description: |
29 | This tool permanently deletes a specified LookML file from within a project.
30 | Use with caution, as this action cannot be undone through the API.
31 |
32 | Prerequisite: The Looker session must be in Development Mode. Use `dev_mode: true` first.
33 |
34 | Parameters:
35 | - project_id (required): The unique ID of the LookML project.
36 | - file_path (required): The exact path to the LookML file to delete within the project.
37 |
38 | Output:
39 | A confirmation message upon successful file deletion.
40 | ```
41 |
42 | ## Reference
43 |
44 | | **field** | **type** | **required** | **description** |
45 | |-------------|:--------:|:------------:|----------------------------------------------------|
46 | | kind | string | true | Must be "looker-delete-project-file". |
47 | | source | string | true | Name of the source Looker instance. |
48 | | description | string | true | Description of the tool that is passed to the LLM. |
49 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/alloydb/alloydb-get-instance.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: alloydb-get-instance
3 | type: docs
4 | weight: 1
5 | description: "The \"alloydb-get-instance\" tool retrieves details for a specific AlloyDB instance.\n"
6 | aliases: [/resources/tools/alloydb-get-instance]
7 | ---
8 |
9 | ## About
10 |
11 | The `alloydb-get-instance` tool retrieves detailed information for a single,
12 | specified AlloyDB instance. It is compatible with
13 | [alloydb-admin](../../sources/alloydb-admin.md) source.
14 |
15 | | Parameter | Type | Description | Required |
16 | |:-----------|:-------|:----------------------------------------------------|:---------|
17 | | `project`  | string | The GCP project ID of the instance to retrieve.     | Yes      |
18 | | `location` | string | The location of the instance (e.g., 'us-central1'). | Yes |
19 | | `cluster` | string | The ID of the cluster. | Yes |
20 | | `instance` | string | The ID of the instance to retrieve. | Yes |
21 |
22 | ## Example
23 |
24 | ```yaml
25 | tools:
26 | get_specific_instance:
27 | kind: alloydb-get-instance
28 | source: my-alloydb-admin-source
29 | description: Use this tool to retrieve details for a specific AlloyDB instance.
30 | ```
31 |
32 | ## Reference
33 |
34 | | **field** | **type** | **required** | **description** |
35 | |-------------|:--------:|:------------:|------------------------------------------------------|
36 | | kind | string | true | Must be alloydb-get-instance. |
37 | | source | string | true | The name of an `alloydb-admin` source. |
38 | | description | string | false | Description of the tool that is passed to the agent. |
39 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-get-connection-schemas.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "looker-get-connection-schemas"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "looker-get-connection-schemas" tool returns all the schemas in a connection.
7 | aliases:
8 | - /resources/tools/looker-get-connection-schemas
9 | ---
10 |
11 | ## About
12 |
13 | A `looker-get-connection-schemas` tool returns all the schemas in a connection.
14 |
15 | It's compatible with the following sources:
16 |
17 | - [looker](../../sources/looker.md)
18 |
19 | `looker-get-connection-schemas` accepts a `conn` parameter and an optional `db` parameter.
20 |
21 | ## Example
22 |
23 | ```yaml
24 | tools:
25 | get_connection_schemas:
26 | kind: looker-get-connection-schemas
27 | source: looker-source
28 | description: |
29 | This tool retrieves a list of database schemas available through a specified
30 | Looker connection.
31 |
32 | Parameters:
33 | - connection_name (required): The name of the database connection, obtained from `get_connections`.
34 | - database (optional): An optional database name to filter the schemas.
35 | Only applicable for connections that support multiple databases.
36 |
37 | Output:
38 | A JSON array of strings, where each string is the name of an available schema.
39 | ```
40 |
41 | ## Reference
42 |
43 | | **field** | **type** | **required** | **description** |
44 | |-------------|:--------:|:------------:|----------------------------------------------------|
45 | | kind | string | true | Must be "looker-get-connection-schemas". |
46 | | source | string | true | Name of the source Looker instance. |
47 | | description | string | true | Description of the tool that is passed to the LLM. |
48 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/trino/trino-execute-sql.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "trino-execute-sql"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "trino-execute-sql" tool executes a SQL statement against a Trino
7 | database.
8 | aliases:
9 | - /resources/tools/trino-execute-sql
10 | ---
11 |
12 | ## About
13 |
14 | A `trino-execute-sql` tool executes a SQL statement against a Trino
15 | database. It's compatible with any of the following sources:
16 |
17 | - [trino](../../sources/trino.md)
18 |
19 | `trino-execute-sql` takes one input parameter `sql` and runs the SQL
20 | statement against the `source`.
21 |
22 | > **Note:** This tool is intended for developer assistant workflows with
23 | > human-in-the-loop and shouldn't be used for production agents.
24 |
25 | ## Example
26 |
27 | ```yaml
28 | tools:
29 | execute_sql_tool:
30 | kind: trino-execute-sql
31 | source: my-trino-instance
32 |     description: Use this tool to execute a SQL statement.
33 | ```
34 |
35 | ## Reference
36 |
37 | | **field** | **type** | **required** | **description** |
38 | |-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
39 | | kind | string | true | Must be "trino-execute-sql". |
40 | | source | string | true | Name of the source the SQL should execute on. |
41 | | description | string | true | Description of the tool that is passed to the LLM. |
42 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/redis/redis.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "redis"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "redis" tool executes a set of pre-defined Redis commands against a Redis instance.
7 | aliases:
8 | - /resources/tools/redis
9 | ---
10 |
11 | ## About
12 |
13 | A `redis` tool executes a series of pre-defined Redis commands against a
14 | Redis source.
15 |
16 | The specified Redis commands are executed sequentially. Each command is
17 | represented as a string list, where the first element is the command name (e.g.,
18 | SET, GET, HGETALL) and subsequent elements are its arguments.
19 |
20 | ### Dynamic Command Parameters
21 |
22 | Command arguments can be templated using the `$variableName` annotation.
23 | Array-type parameters are expanded once into multiple arguments. Take the
24 | following config for example:
25 |
26 | ```yaml
27 | commands:
28 | - [SADD, userNames, $userNames] # Array will be flattened into multiple arguments.
29 | parameters:
30 | - name: userNames
31 | type: array
32 | description: The user names to be set.
33 | ```
34 |
35 | If the input is an array of strings `["Alice", "Sid", "Bob"]`, the final command
36 | to be executed after argument expansion will be `[SADD, userNames, Alice, Sid, Bob]`.
37 |
38 | ## Example
39 |
40 | ```yaml
41 | tools:
42 | user_data_tool:
43 | kind: redis
44 | source: my-redis-instance
45 | description: |
46 | Use this tool to interact with user data stored in Redis.
47 | It can set, retrieve, and delete user-specific information.
48 | commands:
49 | - [SADD, userNames, $userNames] # Array will be flattened into multiple arguments.
50 | - [GET, $userId]
51 | parameters:
52 | - name: userId
53 | type: string
54 | description: The unique identifier for the user.
55 | - name: userNames
56 | type: array
57 | description: The user names to be set.
58 | ```
59 |
```
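
As a worked illustration of the expansion described above, with illustrative parameter values `userId: "42"` and `userNames: ["Alice", "Sid", "Bob"]`, the configured commands would resolve roughly as follows (this shows the expansion result, not a config you would write):

```yaml
# Given parameter values (illustrative):
#   userId: "42"
#   userNames: ["Alice", "Sid", "Bob"]
# the configured commands expand to:
commands:
  - [SADD, userNames, Alice, Sid, Bob]   # array flattened into separate arguments
  - [GET, "42"]                          # scalar substituted in place
```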
--------------------------------------------------------------------------------
/.ci/quickstart_test/js.integration.cloudbuild.yaml:
--------------------------------------------------------------------------------
```yaml
1 | # Copyright 2025 Google LLC
2 | #
3 | # Licensed under the Apache License, Version 2.0 (the "License");
4 | # you may not use this file except in compliance with the License.
5 | # You may obtain a copy of the License at
6 | #
7 | # http://www.apache.org/licenses/LICENSE-2.0
8 | #
9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 |
15 | steps:
16 | - name: 'node:22'
17 | id: 'js-quickstart-test'
18 | entrypoint: 'bash'
19 | args:
20 | # The '-c' flag tells bash to execute the following string as a command.
21 | # The 'set -ex' enables debug output and exits on error for easier troubleshooting.
22 | - -c
23 | - |
24 | set -ex
25 | export VERSION=$(cat ./cmd/version.txt)
26 | chmod +x .ci/quickstart_test/run_js_tests.sh
27 | .ci/quickstart_test/run_js_tests.sh
28 | env:
29 | - 'CLOUD_SQL_INSTANCE=${_CLOUD_SQL_INSTANCE}'
30 | - 'GCP_PROJECT=${_GCP_PROJECT}'
31 | - 'DATABASE_NAME=${_DATABASE_NAME}'
32 | - 'DB_USER=${_DB_USER}'
33 | secretEnv: ['TOOLS_YAML_CONTENT', 'GOOGLE_API_KEY', 'DB_PASSWORD']
34 |
35 | availableSecrets:
36 | secretManager:
37 | - versionName: projects/${_GCP_PROJECT}/secrets/${_TOOLS_YAML_SECRET}/versions/6
38 | env: 'TOOLS_YAML_CONTENT'
39 | - versionName: projects/${_GCP_PROJECT_NUMBER}/secrets/${_API_KEY_SECRET}/versions/latest
40 | env: 'GOOGLE_API_KEY'
41 | - versionName: projects/${_GCP_PROJECT}/secrets/${_DB_PASS_SECRET}/versions/latest
42 | env: 'DB_PASSWORD'
43 |
44 | timeout: 1000s
45 |
46 | options:
47 | logging: CLOUD_LOGGING_ONLY
48 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/valkey/valkey.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "valkey"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "valkey" tool executes a set of pre-defined Valkey commands against a Valkey instance.
7 | aliases:
8 | - /resources/tools/valkey
9 | ---
10 |
11 | ## About
12 |
13 | A `valkey` tool executes a series of pre-defined Valkey commands against a
14 | Valkey instance.
15 |
16 | The specified Valkey commands are executed sequentially. Each command is
17 | represented as a string array, where the first element is the command name
18 | (e.g., SET, GET, HGETALL) and subsequent elements are its arguments.
19 |
20 | ### Dynamic Command Parameters
21 |
22 | Command arguments can be templated using the `$variableName` annotation.
23 | Array-type parameters are expanded once into multiple arguments. Take the
24 | following config for example:
25 |
26 | ```yaml
27 | commands:
28 | - [SADD, userNames, $userNames] # Array will be flattened into multiple arguments.
29 | parameters:
30 | - name: userNames
31 | type: array
32 | description: The user names to be set.
33 | ```
34 |
35 | If the input is an array of strings `["Alice", "Sid", "Bob"]`, the final command
36 | to be executed after argument expansion will be `[SADD, userNames, Alice, Sid, Bob]`.
37 |
38 | ## Example
39 |
40 | ```yaml
41 | tools:
42 | user_data_tool:
43 | kind: valkey
44 | source: my-valkey-instance
45 | description: |
46 | Use this tool to interact with user data stored in Valkey.
47 | It can set, retrieve, and delete user-specific information.
48 | commands:
49 | - [SADD, userNames, $userNames] # Array will be flattened into multiple arguments.
50 | - [GET, $userId]
51 | parameters:
52 | - name: userId
53 | type: string
54 | description: The unique identifier for the user.
55 | - name: userNames
56 | type: array
57 | description: The user names to be set.
58 | ```
59 |
```
--------------------------------------------------------------------------------
/.ci/quickstart_test/go.integration.cloudbuild.yaml:
--------------------------------------------------------------------------------
```yaml
1 | # Copyright 2025 Google LLC
2 | #
3 | # Licensed under the Apache License, Version 2.0 (the "License");
4 | # you may not use this file except in compliance with the License.
5 | # You may obtain a copy of the License at
6 | #
7 | # http://www.apache.org/licenses/LICENSE-2.0
8 | #
9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 |
15 | steps:
16 | - name: 'golang:1.25.1'
17 | id: 'go-quickstart-test'
18 | entrypoint: 'bash'
19 | args:
20 | # The '-c' flag tells bash to execute the following string as a command.
21 | # The 'set -ex' enables debug output and exits on error for easier troubleshooting.
22 | - -c
23 | - |
24 | set -ex
25 | export VERSION=$(cat ./cmd/version.txt)
26 | chmod +x .ci/quickstart_test/run_go_tests.sh
27 | .ci/quickstart_test/run_go_tests.sh
28 | env:
29 | - 'CLOUD_SQL_INSTANCE=${_CLOUD_SQL_INSTANCE}'
30 | - 'GCP_PROJECT=${_GCP_PROJECT}'
31 | - 'DATABASE_NAME=${_DATABASE_NAME}'
32 | - 'DB_USER=${_DB_USER}'
33 | secretEnv: ['TOOLS_YAML_CONTENT', 'GOOGLE_API_KEY', 'DB_PASSWORD']
34 |
35 | availableSecrets:
36 | secretManager:
37 | - versionName: projects/${_GCP_PROJECT}/secrets/${_TOOLS_YAML_SECRET}/versions/7
38 | env: 'TOOLS_YAML_CONTENT'
39 | - versionName: projects/${_GCP_PROJECT_NUMBER}/secrets/${_API_KEY_SECRET}/versions/latest
40 | env: 'GOOGLE_API_KEY'
41 | - versionName: projects/${_GCP_PROJECT}/secrets/${_DB_PASS_SECRET}/versions/latest
42 | env: 'DB_PASSWORD'
43 |
44 | timeout: 1000s
45 |
46 | options:
47 | logging: CLOUD_LOGGING_ONLY
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/mysql/mysql-get-query-plan.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "mysql-get-query-plan"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "mysql-get-query-plan" tool gets the execution plan for a SQL statement against a MySQL
7 | database.
8 | aliases:
9 | - /resources/tools/mysql-get-query-plan
10 | ---
11 |
12 | ## About
13 |
14 | A `mysql-get-query-plan` tool gets the execution plan for a SQL statement against a MySQL
15 | database. It's compatible with any of the following sources:
16 |
17 | - [cloud-sql-mysql](../../sources/cloud-sql-mysql.md)
18 | - [mysql](../../sources/mysql.md)
19 |
20 | `mysql-get-query-plan` takes one input parameter `sql_statement` and gets the execution plan for the SQL
21 | statement against the `source`.
22 |
23 | ## Example
24 |
25 | ```yaml
26 | tools:
27 | get_query_plan_tool:
28 | kind: mysql-get-query-plan
29 | source: my-mysql-instance
30 |     description: Use this tool to get the execution plan for a SQL statement.
31 | ```
32 |
33 | ## Reference
34 |
35 | | **field** | **type** | **required** | **description** |
36 | |-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
37 | | kind | string | true | Must be "mysql-get-query-plan". |
38 | | source | string | true | Name of the source the SQL should execute on. |
39 | | description | string | true | Description of the tool that is passed to the LLM. |
40 |
```
--------------------------------------------------------------------------------
/docs/TOOLBOX_README.md:
--------------------------------------------------------------------------------
```markdown
1 | # MCP Toolbox for Databases Server
2 |
3 | The MCP Toolbox for Databases Server gives AI-powered development tools the ability to work with your custom tools. It is designed to simplify and secure the development of tools for interacting with databases.
4 |
5 |
6 | ## Prerequisites
7 |
8 | * [Node.js](https://nodejs.org/) installed.
9 | * A Google Cloud project with relevant APIs enabled.
10 | * Ensure [Application Default Credentials](https://cloud.google.com/docs/authentication/gcloud) are available in your environment.
11 |
12 | ## Install & Configuration
13 |
14 | 1. In the Antigravity MCP Store, click the **Install** button. A configuration window will appear.
15 |
16 | 2. Create your [`tools.yaml` configuration file](https://googleapis.github.io/genai-toolbox/getting-started/configure/).
17 |
18 | 3. In the configuration window, enter the full absolute path to your `tools.yaml` file and click **Save**.
19 |
20 | > [!NOTE]
21 | > If you encounter issues with Windows Defender blocking the execution, you may need to configure an allowlist. See [Configure exclusions for Microsoft Defender Antivirus](https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/configure-exclusions-microsoft-defender-antivirus?view=o365-worldwide) for more details.
22 |
23 | ## Usage
24 |
25 | Interact with your custom tools using natural language.
26 |
27 | ## Custom MCP Server Configuration
28 |
29 | ```json
30 | {
31 | "mcpServers": {
32 | "mcp-toolbox": {
33 | "command": "npx",
34 | "args": ["-y", "@toolbox-sdk/server", "--tools-file", "your-tool-file.yaml"],
35 | "env": {
36 |         "ENV_VAR_NAME": "ENV_VAR_VALUE"
37 | }
38 | }
39 | }
40 | }
41 | ```
42 |
43 | ## Documentation
44 |
45 | For more information, visit the [MCP Toolbox for Databases documentation](https://googleapis.github.io/genai-toolbox/getting-started/introduction/).
46 |
```
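
For step 2, a minimal `tools.yaml` might look like the sketch below. The connection details, tool name, and toolset name are placeholders; the `postgres` source fields and the `postgres-execute-sql` kind follow the patterns used elsewhere in this repository, but verify them against the configuration guide linked above.

```yaml
sources:
  my-pg-source:
    kind: postgres
    host: 127.0.0.1
    port: 5432
    database: toolbox_db
    user: ${DB_USER}         # resolved from the environment at startup
    password: ${DB_PASSWORD}

tools:
  execute_sql:
    kind: postgres-execute-sql
    source: my-pg-source
    description: Use this tool to execute a SQL statement.

toolsets:
  my_toolset:
    - execute_sql
```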
--------------------------------------------------------------------------------
/docs/en/resources/tools/mysql/mysql-execute-sql.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "mysql-execute-sql"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "mysql-execute-sql" tool executes a SQL statement against a MySQL
7 | database.
8 | aliases:
9 | - /resources/tools/mysql-execute-sql
10 | ---
11 |
12 | ## About
13 |
14 | A `mysql-execute-sql` tool executes a SQL statement against a MySQL
15 | database. It's compatible with any of the following sources:
16 |
17 | - [cloud-sql-mysql](../../sources/cloud-sql-mysql.md)
18 | - [mysql](../../sources/mysql.md)
19 |
20 | `mysql-execute-sql` takes one input parameter `sql` and runs the SQL
21 | statement against the `source`.
22 |
23 | > **Note:** This tool is intended for developer assistant workflows with
24 | > human-in-the-loop and shouldn't be used for production agents.
25 |
26 | ## Example
27 |
28 | ```yaml
29 | tools:
30 | execute_sql_tool:
31 | kind: mysql-execute-sql
32 | source: my-mysql-instance
33 |     description: Use this tool to execute a SQL statement.
34 | ```
35 |
36 | ## Reference
37 |
38 | | **field** | **type** | **required** | **description** |
39 | |-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
40 | | kind | string | true | Must be "mysql-execute-sql". |
41 | | source | string | true | Name of the source the SQL should execute on. |
42 | | description | string | true | Description of the tool that is passed to the LLM. |
43 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-update-project-file.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "looker-update-project-file"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "looker-update-project-file" tool updates the content of a LookML file in a project.
7 | aliases:
8 | - /resources/tools/looker-update-project-file
9 | ---
10 |
11 | ## About
12 |
13 | A `looker-update-project-file` tool updates the content of a LookML file.
14 |
15 | It's compatible with the following sources:
16 |
17 | - [looker](../../sources/looker.md)
18 |
19 | `looker-update-project-file` accepts a `project_id` parameter and a `file_path` parameter
20 | as well as the new file content.
21 |
22 | ## Example
23 |
24 | ```yaml
25 | tools:
26 | update_project_file:
27 | kind: looker-update-project-file
28 | source: looker-source
29 | description: |
30 | This tool modifies the content of an existing LookML file within a specified project.
31 |
32 | Prerequisite: The Looker session must be in Development Mode. Use `dev_mode: true` first.
33 |
34 | Parameters:
35 | - project_id (required): The unique ID of the LookML project.
36 | - file_path (required): The exact path to the LookML file to modify within the project.
37 | - content (required): The new, complete LookML content to overwrite the existing file.
38 |
39 | Output:
40 | A confirmation message upon successful file modification.
41 | ```
42 |
43 | ## Reference
44 |
45 | | **field** | **type** | **required** | **description** |
46 | |-------------|:--------:|:------------:|----------------------------------------------------|
47 | | kind | string | true | Must be "looker-update-project-file". |
48 | | source | string | true | Name of the source Looker instance. |
49 | | description | string | true | Description of the tool that is passed to the LLM. |
50 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-create-project-file.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "looker-create-project-file"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "looker-create-project-file" tool creates a new LookML file in a project.
7 | aliases:
8 | - /resources/tools/looker-create-project-file
9 | ---
10 |
11 | ## About
12 |
13 | A `looker-create-project-file` tool creates a new LookML file in a project.
14 |
15 | It's compatible with the following sources:
16 |
17 | - [looker](../../sources/looker.md)
18 |
19 | `looker-create-project-file` accepts a `project_id` parameter and a `file_path` parameter
20 | as well as the file content.
21 |
22 | ## Example
23 |
24 | ```yaml
25 | tools:
26 | create_project_file:
27 | kind: looker-create-project-file
28 | source: looker-source
29 | description: |
30 | This tool creates a new LookML file within a specified project, populating
31 | it with the provided content.
32 |
33 | Prerequisite: The Looker session must be in Development Mode. Use `dev_mode: true` first.
34 |
35 | Parameters:
36 | - project_id (required): The unique ID of the LookML project.
37 | - file_path (required): The desired path and filename for the new file within the project.
38 | - content (required): The full LookML content to write into the new file.
39 |
40 | Output:
41 | A confirmation message upon successful file creation.
42 | ```
43 |
44 | ## Reference
45 |
46 | | **field** | **type** | **required** | **description** |
47 | |-------------|:--------:|:------------:|----------------------------------------------------|
48 | | kind | string | true | Must be "looker-create-project-file". |
49 | | source | string | true | Name of the source Looker instance. |
50 | | description | string | true | Description of the tool that is passed to the LLM. |
51 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/oracle/oracle-sql.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "oracle-sql"
3 | type: docs
4 | weight: 1
5 | description: >
6 | An "oracle-sql" tool executes a pre-defined SQL statement against an Oracle database.
7 | aliases:
8 | - /resources/tools/oracle-sql
9 | ---
10 |
11 | ## About
12 |
13 | An `oracle-sql` tool executes a pre-defined SQL statement against an
14 | Oracle database. It's compatible with the following source:
15 |
16 | - [oracle](../../sources/oracle.md)
17 |
18 | The specified SQL statement is executed using [prepared statements][oracle-stmt]
19 | for security and performance. It expects parameter placeholders in the SQL query
20 | to be in the native Oracle format (e.g., `:1`, `:2`).
21 |
22 | [oracle-stmt]: https://docs.oracle.com/javase/tutorial/jdbc/basics/prepared.html
23 |
24 | ## Example
25 |
26 | > **Note:** This tool uses parameterized queries to prevent SQL injections.
27 | > Query parameters can be used as substitutes for arbitrary expressions.
28 | > Parameters cannot be used as substitutes for identifiers, column names, table
29 | > names, or other parts of the query.
30 |
31 | ```yaml
32 | tools:
33 | search_flights_by_number:
34 | kind: oracle-sql
35 | source: my-oracle-instance
36 | statement: |
37 | SELECT * FROM flights
38 | WHERE airline = :1
39 | AND flight_number = :2
40 | FETCH FIRST 10 ROWS ONLY
41 | description: |
42 | Use this tool to get information for a specific flight.
43 | Takes an airline code and flight number and returns info on the flight.
44 | Do NOT use this tool with a flight id. Do NOT guess an airline code or flight number.
45 | Example:
46 | {{
47 | "airline": "CY",
48 | "flight_number": "888",
49 | }}
50 | parameters:
51 | - name: airline
52 | type: string
53 | description: Airline unique 2 letter identifier
54 | - name: flight_number
55 | type: string
56 | description: 1 to 4 digit number
57 | ```
58 |
```
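
In the example above, the binding between placeholders and parameters appears to be positional (assumed: the first entry under `parameters` binds to `:1`, the second to `:2`), so the order of the `parameters` list matters. A fragment annotating that assumed mapping:

```yaml
# Assumed positional binding for the example above:
#   :1  <-  airline        (first entry under parameters)
#   :2  <-  flight_number  (second entry under parameters)
statement: |
  SELECT * FROM flights
  WHERE airline = :1        -- bound to "airline"
    AND flight_number = :2  -- bound to "flight_number"
  FETCH FIRST 10 ROWS ONLY
```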
--------------------------------------------------------------------------------
/.ci/quickstart_test/py.integration.cloudbuild.yaml:
--------------------------------------------------------------------------------
```yaml
1 | # Copyright 2025 Google LLC
2 | #
3 | # Licensed under the Apache License, Version 2.0 (the "License");
4 | # you may not use this file except in compliance with the License.
5 | # You may obtain a copy of the License at
6 | #
7 | # http://www.apache.org/licenses/LICENSE-2.0
8 | #
9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 |
15 | steps:
16 | - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk:537.0.0'
17 | id: 'python-quickstart-test'
18 | entrypoint: 'bash'
19 | args:
20 | # The '-c' flag tells bash to execute the following string as a command.
21 | # The 'set -ex' enables debug output and exits on error for easier troubleshooting.
22 | - -c
23 | - |
24 | set -ex
25 | export VERSION=$(cat ./cmd/version.txt)
26 | chmod +x .ci/quickstart_test/run_py_tests.sh
27 | .ci/quickstart_test/run_py_tests.sh
28 | env:
29 | - 'CLOUD_SQL_INSTANCE=${_CLOUD_SQL_INSTANCE}'
30 | - 'GCP_PROJECT=${_GCP_PROJECT}'
31 | - 'DATABASE_NAME=${_DATABASE_NAME}'
32 | - 'DB_USER=${_DB_USER}'
33 | secretEnv: ['TOOLS_YAML_CONTENT', 'GOOGLE_API_KEY', 'DB_PASSWORD']
34 |
35 | availableSecrets:
36 | secretManager:
37 | - versionName: projects/${_GCP_PROJECT}/secrets/${_TOOLS_YAML_SECRET}/versions/5
38 | env: 'TOOLS_YAML_CONTENT'
39 | - versionName: projects/${_GCP_PROJECT_NUMBER}/secrets/${_API_KEY_SECRET}/versions/latest
40 | env: 'GOOGLE_API_KEY'
41 | - versionName: projects/${_GCP_PROJECT}/secrets/${_DB_PASS_SECRET}/versions/latest
42 | env: 'DB_PASSWORD'
43 |
44 | timeout: 1000s
45 |
46 | options:
47 | logging: CLOUD_LOGGING_ONLY
48 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-get-connections.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "looker-get-connections"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "looker-get-connections" tool returns all the connections in the source.
7 | aliases:
8 | - /resources/tools/looker-get-connections
9 | ---
10 |
11 | ## About
12 |
13 | A `looker-get-connections` tool returns all the connections in the source.
14 |
15 | It's compatible with the following sources:
16 |
17 | - [looker](../../sources/looker.md)
18 |
19 | `looker-get-connections` accepts no parameters.
20 |
21 | ## Example
22 |
23 | ```yaml
24 | tools:
25 | get_connections:
26 | kind: looker-get-connections
27 | source: looker-source
28 | description: |
29 | This tool retrieves a list of all database connections configured in the Looker system.
30 |
31 | Parameters:
32 | This tool takes no parameters.
33 |
34 | Output:
35 | A JSON array of objects, each representing a database connection and including details such as:
36 | - `name`: The connection's unique identifier.
37 | - `dialect`: The database dialect (e.g., "mysql", "postgresql", "bigquery").
38 | - `default_schema`: The default schema for the connection.
39 | - `database`: The associated database name (if applicable).
40 | - `supports_multiple_databases`: A boolean indicating if the connection can access multiple databases.
41 | ```
42 |
43 | ## Reference
44 |
45 | | **field** | **type** | **required** | **description** |
46 | |-------------|:--------:|:------------:|----------------------------------------------------|
47 | | kind | string | true | Must be "looker-get-connections". |
48 | | source | string | true | Name of the source Looker instance. |
49 | | description | string | true | Description of the tool that is passed to the LLM. |
50 |
```
--------------------------------------------------------------------------------
/docs/en/resources/sources/cloud-monitoring.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "Cloud Monitoring"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "cloud-monitoring" source provides a client for the Cloud Monitoring API.
7 | aliases:
8 | - /resources/sources/cloud-monitoring
9 | ---
10 |
11 | ## About
12 |
13 | The `cloud-monitoring` source provides a client to interact with the [Google
14 | Cloud Monitoring API](https://cloud.google.com/monitoring/api). This allows
15 | tools to access the Cloud Monitoring Metrics Explorer and run PromQL queries.
16 |
17 | Authentication can be handled in two ways:
18 |
19 | 1. **Application Default Credentials (ADC):** By default, the source uses ADC
20 | to authenticate with the API.
21 | 2. **Client-side OAuth:** If `useClientOAuth` is set to `true`, the source will
22 | expect an OAuth 2.0 access token to be provided by the client (e.g., a web
23 | browser) for each request.
24 |
25 | ## Example
26 |
27 | ```yaml
28 | sources:
29 | my-cloud-monitoring:
30 | kind: cloud-monitoring
31 |
32 | my-oauth-cloud-monitoring:
33 | kind: cloud-monitoring
34 | useClientOAuth: true
35 | ```
36 |
37 | ## Reference
38 |
39 | | **field** | **type** | **required** | **description** |
40 | |----------------|:--------:|:------------:|------------------------------------------------------------------------------------------------------------------------------------------------|
41 | | kind | string | true | Must be "cloud-monitoring". |
42 | | useClientOAuth | boolean | false | If true, the source will use client-side OAuth for authorization. Otherwise, it will use Application Default Credentials. Defaults to `false`. |
43 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/clickhouse/clickhouse-execute-sql.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "clickhouse-execute-sql"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "clickhouse-execute-sql" tool executes a SQL statement against a ClickHouse
7 | database.
8 | aliases:
9 | - /resources/tools/clickhouse-execute-sql
10 | ---
11 |
12 | ## About
13 |
14 | A `clickhouse-execute-sql` tool executes a SQL statement against a ClickHouse
15 | database. It's compatible with the [clickhouse](../../sources/clickhouse.md)
16 | source.
17 |
18 | `clickhouse-execute-sql` takes one input parameter `sql` and runs the SQL
19 | statement against the specified `source`. This tool includes query logging
20 | capabilities for monitoring and debugging purposes.
21 |
22 | > **Note:** This tool is intended for developer assistant workflows with
23 | > human-in-the-loop and shouldn't be used for production agents.
24 |
25 | ## Example
26 |
27 | ```yaml
28 | tools:
29 | execute_sql_tool:
30 | kind: clickhouse-execute-sql
31 | source: my-clickhouse-instance
32 | description: Use this tool to execute SQL statements against ClickHouse.
33 | ```
34 |
35 | ## Parameters
36 |
37 | | **parameter** | **type** | **required** | **description** |
38 | |---------------|:--------:|:------------:|---------------------------------------------------|
39 | | sql | string | true | The SQL statement to execute against the database |
40 |
41 | ## Reference
42 |
43 | | **field** | **type** | **required** | **description** |
44 | |-------------|:--------:|:------------:|-------------------------------------------------------|
45 | | kind | string | true | Must be "clickhouse-execute-sql". |
46 | | source | string | true | Name of the ClickHouse source to execute SQL against. |
47 | | description | string | true | Description of the tool that is passed to the LLM. |
48 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-get-connection-databases.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "looker-get-connection-databases"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "looker-get-connection-databases" tool returns all the databases in a connection.
7 | aliases:
8 | - /resources/tools/looker-get-connection-databases
9 | ---
10 |
11 | ## About
12 |
13 | A `looker-get-connection-databases` tool returns all the databases in a connection.
14 |
15 | It's compatible with the following sources:
16 |
17 | - [looker](../../sources/looker.md)
18 |
19 | `looker-get-connection-databases` accepts a `conn` parameter.
20 |
21 | ## Example
22 |
23 | ```yaml
24 | tools:
25 | get_connection_databases:
26 | kind: looker-get-connection-databases
27 | source: looker-source
28 | description: |
29 | This tool retrieves a list of databases available through a specified Looker connection.
30 | This is only applicable for connections that support multiple databases.
31 | Use `get_connections` to check if a connection supports multiple databases.
32 |
33 | Parameters:
34 | - connection_name (required): The name of the database connection, obtained from `get_connections`.
35 |
36 | Output:
37 | A JSON array of strings, where each string is the name of an available database.
38 | If the connection does not support multiple databases, an empty list or an error will be returned.
39 | ```
40 |
41 | ## Reference
42 |
43 | | **field** | **type** | **required** | **description** |
44 | |-------------|:--------:|:------------:|----------------------------------------------------|
45 | | kind | string | true | Must be "looker-get-connection-databases". |
46 | | source | string | true | Name of the source Looker instance. |
47 | | description | string | true | Description of the tool that is passed to the LLM. |
48 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-get-explores.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "looker-get-explores"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "looker-get-explores" tool returns all explores
7 | for the given model from the source.
8 | aliases:
9 | - /resources/tools/looker-get-explores
10 | ---
11 |
12 | ## About
13 |
14 | A `looker-get-explores` tool returns all explores
15 | for a given model from the source.
16 |
17 | It's compatible with the following sources:
18 |
19 | - [looker](../../sources/looker.md)
20 |
21 | `looker-get-explores` accepts one parameter, the
22 | `model` id.
23 |
24 | The return type is an array of maps; each map is formatted like:
25 |
26 | ```json
27 | {
28 | "name": "explore name",
29 | "description": "explore description",
30 | "label": "explore label",
31 | "group_label": "group label"
32 | }
33 | ```
34 |
35 | ## Example
36 |
37 | ```yaml
38 | tools:
39 | get_explores:
40 | kind: looker-get-explores
41 | source: looker-source
42 | description: |
43 | This tool retrieves a list of explores defined within a specific LookML model.
44 | Explores represent a curated view of your data, typically joining several
45 | tables together to allow for focused analysis on a particular subject area.
46 | The output provides details like the explore's `name` and `label`.
47 |
48 | Parameters:
49 | - model_name (required): The name of the LookML model, obtained from `get_models`.
50 | ```
51 |
52 | ## Reference
53 |
54 | | **field** | **type** | **required** | **description** |
55 | |-------------|:--------:|:------------:|----------------------------------------------------|
56 | | kind | string | true | Must be "looker-get-explores". |
57 | | source | string | true | Name of the source the SQL should execute on. |
58 | | description | string | true | Description of the tool that is passed to the LLM. |
59 |
```
--------------------------------------------------------------------------------
/internal/prebuiltconfigs/tools/cloud-sql-postgres-admin.yaml:
--------------------------------------------------------------------------------
```yaml
1 | # Copyright 2025 Google LLC
2 | #
3 | # Licensed under the Apache License, Version 2.0 (the "License");
4 | # you may not use this file except in compliance with the License.
5 | # You may obtain a copy of the License at
6 | #
7 | # http://www.apache.org/licenses/LICENSE-2.0
8 | #
9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 |
15 | sources:
16 | cloud-sql-admin-source:
17 | kind: cloud-sql-admin
18 | defaultProject: ${CLOUD_SQL_POSTGRES_PROJECT:}
19 |
20 | tools:
21 | create_instance:
22 | kind: cloud-sql-postgres-create-instance
23 | source: cloud-sql-admin-source
24 | get_instance:
25 | kind: cloud-sql-get-instance
26 | source: cloud-sql-admin-source
27 | list_instances:
28 | kind: cloud-sql-list-instances
29 | source: cloud-sql-admin-source
30 | create_database:
31 | kind: cloud-sql-create-database
32 | source: cloud-sql-admin-source
33 | list_databases:
34 | kind: cloud-sql-list-databases
35 | source: cloud-sql-admin-source
36 | create_user:
37 | kind: cloud-sql-create-users
38 | source: cloud-sql-admin-source
39 | wait_for_operation:
40 | kind: cloud-sql-wait-for-operation
41 | source: cloud-sql-admin-source
42 | multiplier: 4
43 | clone_instance:
44 | kind: cloud-sql-clone-instance
45 | source: cloud-sql-admin-source
46 | postgres_upgrade_precheck:
47 | kind: postgres-upgrade-precheck
48 | source: cloud-sql-admin-source
49 |
50 | toolsets:
51 | cloud_sql_postgres_admin_tools:
52 | - create_instance
53 | - get_instance
54 | - list_instances
55 | - create_database
56 | - list_databases
57 | - create_user
58 | - wait_for_operation
59 | - postgres_upgrade_precheck
60 | - clone_instance
61 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/dataform/dataform-compile-local.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "dataform-compile-local"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "dataform-compile-local" tool runs the `dataform compile` CLI command on a local project directory.
7 | aliases:
8 | - /resources/tools/dataform-compile-local
9 | ---
10 |
11 | ## About
12 |
13 | A `dataform-compile-local` tool runs the `dataform compile` command on a local
14 | Dataform project.
15 |
16 | It is a standalone tool and **is not** compatible with any sources.
17 |
18 | At invocation time, the tool executes `dataform compile --json` in the specified
19 | project directory and returns the resulting JSON object from the CLI.
20 |
21 | `dataform-compile-local` takes the following parameter:
22 |
23 | - `project_dir` (string): The absolute or relative path to the local Dataform
24 | project directory. The server process must have read access to this path.
25 |
26 | ## Requirements
27 |
28 | ### Dataform CLI
29 |
30 | This tool executes the `dataform` command-line interface (CLI) via a system
31 | call. You must have the **`dataform` CLI** installed and available in the
32 | server's system `PATH`.
33 |
34 | You can typically install the CLI via `npm`:
35 |
36 | ```bash
37 | npm install -g @dataform/cli
38 | ```
39 |
40 | See the [official Dataform
41 | documentation](https://cloud.google.com/dataform/docs/install-dataform-cli)
42 | for more details.
43 |
44 | ## Example
45 |
46 | ```yaml
47 | tools:
48 | my_dataform_compiler:
49 | kind: dataform-compile-local
50 | description: Use this tool to compile a local Dataform project.
51 | ```
52 |
53 | ## Reference
54 |
55 | | **field** | **type** | **required** | **description** |
56 | |:------------|:---------|:-------------|:---------------------------------------------------|
57 | | kind | string | true | Must be "dataform-compile-local". |
58 | | description | string | true | Description of the tool that is passed to the LLM. |
59 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/postgres/postgres-execute-sql.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "postgres-execute-sql"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "postgres-execute-sql" tool executes a SQL statement against a Postgres
7 | database.
8 | aliases:
9 | - /resources/tools/postgres-execute-sql
10 | ---
11 |
12 | ## About
13 |
14 | A `postgres-execute-sql` tool executes a SQL statement against a Postgres
15 | database. It's compatible with any of the following sources:
16 |
17 | - [alloydb-postgres](../../sources/alloydb-pg.md)
18 | - [cloud-sql-postgres](../../sources/cloud-sql-pg.md)
19 | - [postgres](../../sources/postgres.md)
20 |
21 | `postgres-execute-sql` takes one input parameter `sql` and runs the SQL
22 | statement against the `source`.
23 |
24 | > **Note:** This tool is intended for developer assistant workflows with
25 | > human-in-the-loop and shouldn't be used for production agents.
26 |
27 | ## Example
28 |
29 | ```yaml
30 | tools:
31 | execute_sql_tool:
32 | kind: postgres-execute-sql
33 | source: my-pg-instance
34 |     description: Use this tool to execute a SQL statement.
35 | ```
36 |
37 | ## Reference
38 |
39 | | **field** | **type** | **required** | **description** |
40 | |-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
41 | | kind | string | true | Must be "postgres-execute-sql". |
42 | | source | string | true | Name of the source the SQL should execute on. |
43 | | description | string | true | Description of the tool that is passed to the LLM. |
44 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/clickhouse/clickhouse-list-databases.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "clickhouse-list-databases"
3 | type: docs
4 | weight: 3
5 | description: >
6 | A "clickhouse-list-databases" tool lists all databases in a ClickHouse instance.
7 | aliases:
8 | - /resources/tools/clickhouse-list-databases
9 | ---
10 |
11 | ## About
12 |
13 | A `clickhouse-list-databases` tool lists all available databases in a ClickHouse
14 | instance. It's compatible with the [clickhouse](../../sources/clickhouse.md)
15 | source.
16 |
17 | This tool executes the `SHOW DATABASES` command and returns a list of all
18 | databases accessible to the configured user, making it useful for database
19 | discovery and exploration tasks.
20 |
21 | ## Example
22 |
23 | ```yaml
24 | tools:
25 | list_clickhouse_databases:
26 | kind: clickhouse-list-databases
27 | source: my-clickhouse-instance
28 | description: List all available databases in the ClickHouse instance
29 | ```
30 |
31 | ## Return Value
32 |
33 | The tool returns an array of objects, where each object contains:
34 |
35 | - `name`: The name of the database
36 |
37 | Example response:
38 |
39 | ```json
40 | [
41 | {"name": "default"},
42 | {"name": "system"},
43 | {"name": "analytics"},
44 | {"name": "user_data"}
45 | ]
46 | ```
47 |
48 | ## Reference
49 |
50 | | **field** | **type** | **required** | **description** |
51 | |--------------|:------------------:|:------------:|-------------------------------------------------------|
52 | | kind | string | true | Must be "clickhouse-list-databases". |
53 | | source | string | true | Name of the ClickHouse source to list databases from. |
54 | | description | string | true | Description of the tool that is passed to the LLM. |
55 | | authRequired | array of string | false | Authentication services required to use this tool. |
56 | | parameters | array of Parameter | false | Parameters for the tool (typically not used). |
57 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-conversational-analytics.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "looker-conversational-analytics"
3 | type: docs
4 | weight: 1
5 | description: >
6 | The "looker-conversational-analytics" tool will use the Conversational
7 |   Analytics API to analyze data from Looker
8 | aliases:
9 | - /resources/tools/looker-conversational-analytics
10 | ---
11 |
12 | ## About
13 |
14 | A `looker-conversational-analytics` tool allows you to ask questions about your
15 | Looker data.
16 |
17 | It's compatible with the following sources:
18 |
19 | - [looker](../../sources/looker.md)
20 |
21 | `looker-conversational-analytics` accepts two parameters:
22 |
23 | 1. `user_query_with_context`: The question asked of the Conversational Analytics
24 | system.
25 | 2. `explore_references`: A list of one to five explores that can be queried to
26 | answer the question. The form of the entry is `[{"model": "model name",
27 | "explore": "explore name"}, ...]`
28 |
29 | ## Example
30 |
31 | ```yaml
32 | tools:
33 | ask_data_insights:
34 | kind: looker-conversational-analytics
35 | source: looker-source
36 | description: |
37 | Use this tool to ask questions about your data using the Looker Conversational
38 | Analytics API. You must provide a natural language query and a list of
39 | 1 to 5 model and explore combinations (e.g. [{'model': 'the_model', 'explore': 'the_explore'}]).
40 | Use the 'get_models' and 'get_explores' tools to discover available models and explores.
41 | ```
42 |
43 | ## Reference
44 |
45 | | **field** | **type** | **required** | **description** |
46 | |-------------|:--------:|:------------:|----------------------------------------------------|
47 | | kind | string | true | Must be "lookerca-conversational-analytics". |
48 | | source | string | true | Name of the source the SQL should execute on. |
49 | | description | string | true | Description of the tool that is passed to the LLM. |
50 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-get-connection-tables.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "looker-get-connection-tables"
3 | type: docs
4 | weight: 1
5 | description: >
6 | A "looker-get-connection-tables" tool returns all the tables in a connection.
7 | aliases:
8 | - /resources/tools/looker-get-connection-tables
9 | ---
10 |
11 | ## About
12 |
13 | A `looker-get-connection-tables` tool returns all the tables in a connection.
14 |
15 | It's compatible with the following sources:
16 |
17 | - [looker](../../sources/looker.md)
18 |
19 | `looker-get-connection-tables` accepts a `conn` parameter, a `schema` parameter,
20 | and an optional `db` parameter.
21 |
22 | ## Example
23 |
24 | ```yaml
25 | tools:
26 | get_connection_tables:
27 | kind: looker-get-connection-tables
28 | source: looker-source
29 | description: |
30 | This tool retrieves a list of tables available within a specified database schema
31 | through a Looker connection.
32 |
33 | Parameters:
34 | - connection_name (required): The name of the database connection, obtained from `get_connections`.
35 | - schema (required): The name of the schema to list tables from, obtained from `get_connection_schemas`.
36 | - database (optional): The name of the database to filter by. Only applicable for connections
37 | that support multiple databases (check with `get_connections`).
38 |
39 | Output:
40 | A JSON array of strings, where each string is the name of an available table.
41 | ```
42 |
43 | ## Reference
44 |
45 | | **field** | **type** | **required** | **description** |
46 | |-------------|:--------:|:------------:|----------------------------------------------------|
47 | | kind | string | true | Must be "looker-get-connection-tables". |
48 | | source | string | true | Name of the source Looker instance. |
49 | | description | string | true | Description of the tool that is passed to the LLM. |
50 |
```
--------------------------------------------------------------------------------
/internal/prebuiltconfigs/tools/alloydb-postgres-admin.yaml:
--------------------------------------------------------------------------------
```yaml
1 | # Copyright 2025 Google LLC
2 | #
3 | # Licensed under the Apache License, Version 2.0 (the "License");
4 | # you may not use this file except in compliance with the License.
5 | # You may obtain a copy of the License at
6 | #
7 | # http://www.apache.org/licenses/LICENSE-2.0
8 | #
9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 |
15 | sources:
16 | alloydb-admin-source:
17 | kind: alloydb-admin
18 | defaultProject: ${ALLOYDB_POSTGRES_PROJECT:}
19 | tools:
20 | create_cluster:
21 | kind: alloydb-create-cluster
22 | source: alloydb-admin-source
23 | wait_for_operation:
24 | kind: alloydb-wait-for-operation
25 | source: alloydb-admin-source
26 | delay: 1s
27 | maxDelay: 4m
28 | multiplier: 2
29 | maxRetries: 10
30 | create_instance:
31 | kind: alloydb-create-instance
32 | source: alloydb-admin-source
33 | list_clusters:
34 | kind: alloydb-list-clusters
35 | source: alloydb-admin-source
36 | list_instances:
37 | kind: alloydb-list-instances
38 | source: alloydb-admin-source
39 | list_users:
40 | kind: alloydb-list-users
41 | source: alloydb-admin-source
42 | create_user:
43 | kind: alloydb-create-user
44 | source: alloydb-admin-source
45 | get_cluster:
46 | kind: alloydb-get-cluster
47 | source: alloydb-admin-source
48 | get_instance:
49 | kind: alloydb-get-instance
50 | source: alloydb-admin-source
51 | get_user:
52 | kind: alloydb-get-user
53 | source: alloydb-admin-source
54 |
55 | toolsets:
56 | alloydb_postgres_admin_tools:
57 | - create_cluster
58 | - wait_for_operation
59 | - create_instance
60 | - list_clusters
61 | - list_instances
62 | - list_users
63 | - create_user
64 | - get_cluster
65 | - get_instance
66 | - get_user
67 |
```
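
The `wait_for_operation` entry above configures polling with what appears to be exponential backoff. Below is an annotated sketch of the same fields; the interpretation is inferred from the field names rather than stated in this file.

```yaml
tools:
  wait_for_operation:
    kind: alloydb-wait-for-operation
    source: alloydb-admin-source
    delay: 1s        # initial wait before the first poll
    maxDelay: 4m     # upper bound on the wait between polls
    multiplier: 2    # wait grows ~2x after each poll (assumed exponential backoff)
    maxRetries: 10   # stop polling after this many attempts
```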
--------------------------------------------------------------------------------
/docs/en/resources/tools/mssql/mssql-list-tables.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "mssql-list-tables"
3 | type: docs
4 | weight: 1
5 | description: >
6 | The "mssql-list-tables" tool lists schema information for all or specified tables in a SQL server database.
7 | aliases:
8 | - /resources/tools/mssql-list-tables
9 | ---
10 |
11 | ## About
12 |
13 | The `mssql-list-tables` tool retrieves schema information for all or specified
14 | tables in a SQL server database. It is compatible with any of the following
15 | sources:
16 |
17 | - [cloud-sql-mssql](../../sources/cloud-sql-mssql.md)
18 | - [mssql](../../sources/mssql.md)
19 |
20 | `mssql-list-tables` lists detailed schema information (object type, columns,
21 | constraints, indexes, triggers, owner, comment) as JSON for user-created tables
22 | (ordinary or partitioned).
23 |
24 | The tool takes the following input parameters:
25 |
26 | - **`table_names`** (string, optional): Filters by a comma-separated list of
27 | names. By default, it lists all tables in user schemas. Default: `""`.
28 | - **`output_format`** (string, optional): Indicates the output format of the table
29 |   schema. `simple` returns only the table names; `detailed` returns the
30 | full table information. Default: `detailed`.
31 |
32 | ## Example
33 |
34 | ```yaml
35 | tools:
36 | mssql_list_tables:
37 | kind: mssql-list-tables
38 | source: mssql-source
39 | description: Use this tool to retrieve schema information for all or specified tables. Output format can be simple (only table names) or detailed.
40 | ```
41 |
42 | ## Reference
43 |
44 | | **field** | **type** | **required** | **description** |
45 | |-------------|:--------:|:------------:|------------------------------------------------------|
46 | | kind | string | true | Must be "mssql-list-tables". |
47 | | source | string | true | Name of the source the SQL should execute on. |
48 | | description | string | true | Description of the tool that is passed to the agent. |
49 |
```
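
For illustration, the two runtime parameters described above could be supplied on invocation as a mapping like the following; the table names are hypothetical.

```yaml
# Hypothetical invocation parameters for the mssql_list_tables tool above.
table_names: "Orders,Customers"   # comma-separated filter; omit to list every user table
output_format: "simple"           # "simple" = names only, "detailed" = full schema (default)
```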
--------------------------------------------------------------------------------
/internal/sources/sqlite/sqlite_test.go:
--------------------------------------------------------------------------------
```go
1 | // Copyright 2025 Google LLC
2 | //
3 | // Licensed under the Apache License, Version 2.0 (the "License");
4 | // you may not use this file except in compliance with the License.
5 | // You may obtain a copy of the License at
6 | //
7 | // http://www.apache.org/licenses/LICENSE-2.0
8 | //
9 | // Unless required by applicable law or agreed to in writing, software
10 | // distributed under the License is distributed on an "AS IS" BASIS,
11 | // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | // See the License for the specific language governing permissions and
13 | // limitations under the License.
14 |
15 | package sqlite_test
16 |
17 | import (
18 | "testing"
19 |
20 | yaml "github.com/goccy/go-yaml"
21 | "github.com/google/go-cmp/cmp"
22 | "github.com/googleapis/genai-toolbox/internal/server"
23 | "github.com/googleapis/genai-toolbox/internal/sources"
24 | "github.com/googleapis/genai-toolbox/internal/sources/sqlite"
25 | "github.com/googleapis/genai-toolbox/internal/testutils"
26 | )
27 |
28 | func TestParseFromYamlSQLite(t *testing.T) {
29 | tcs := []struct {
30 | desc string
31 | in string
32 | want server.SourceConfigs
33 | }{
34 | {
35 | desc: "basic example",
36 | in: `
37 | sources:
38 | my-sqlite-db:
39 | kind: sqlite
40 | database: /path/to/database.db
41 | `,
42 | want: map[string]sources.SourceConfig{
43 | "my-sqlite-db": sqlite.Config{
44 | Name: "my-sqlite-db",
45 | Kind: sqlite.SourceKind,
46 | Database: "/path/to/database.db",
47 | },
48 | },
49 | },
50 | }
51 | for _, tc := range tcs {
52 | t.Run(tc.desc, func(t *testing.T) {
53 | got := struct {
54 | Sources server.SourceConfigs `yaml:"sources"`
55 | }{}
56 | // Parse contents
57 | err := yaml.Unmarshal(testutils.FormatYaml(tc.in), &got)
58 | if err != nil {
59 | t.Fatalf("unable to unmarshal: %s", err)
60 | }
61 | if !cmp.Equal(tc.want, got.Sources) {
62 | t.Fatalf("incorrect parse: want %v, got %v", tc.want, got.Sources)
63 | }
64 | })
65 | }
66 | }
67 |
```
--------------------------------------------------------------------------------
/.ci/generate_release_table.sh:
--------------------------------------------------------------------------------
```bash
1 | #! /bin/bash
2 |
3 |
4 | # Check if VERSION has been set
5 | if [ -z "${VERSION}" ]; then
6 | echo "Error: VERSION env var is not set" >&2 # Print to stderr
7 | exit 1 # Exit with a non-zero status to indicate an error
8 | fi
9 |
10 |
11 | FILES=("linux.amd64" "darwin.arm64" "darwin.amd64" "windows.amd64")
12 | output_string=""
13 |
14 | # Define the descriptions - ensure this array's order matches FILES
15 | DESCRIPTIONS=(
16 | "For **Linux** systems running on **Intel/AMD 64-bit processors**."
17 | "For **macOS** systems running on **Apple Silicon** (M1, M2, M3, etc.) processors."
18 | "For **macOS** systems running on **Intel processors**."
19 | "For **Windows** systems running on **Intel/AMD 64-bit processors**."
20 | )
21 |
22 | # Write the table header
23 | ROW_FMT="| %-105s | %-120s | %-67s |\n"
24 | output_string+=$(printf "$ROW_FMT" "**OS/Architecture**" "**Description**" "**SHA256 Hash**")$'\n'
25 | output_string+=$(printf "$ROW_FMT" "$(printf -- '-%0.s' {1..105})" "$(printf -- '-%0.s' {1..120})" "$(printf -- '-%0.s' {1..67})")$'\n'
26 |
27 |
28 | # Loop through all files matching the pattern "toolbox.*.*"
29 | for i in "${!FILES[@]}"
30 | do
31 | file_key="${FILES[$i]}" # e.g., "linux.amd64"
32 | description_text="${DESCRIPTIONS[$i]}"
33 |
34 | # Extract OS and ARCH from the filename
35 | OS=$(echo "$file_key" | cut -d '.' -f 1)
36 | ARCH=$(echo "$file_key" | cut -d '.' -f 2)
37 |
38 | # Get release URL
39 | if [ "$OS" = 'windows' ];
40 | then
41 | URL="https://storage.googleapis.com/genai-toolbox/$VERSION/$OS/$ARCH/toolbox.exe"
42 | else
43 | URL="https://storage.googleapis.com/genai-toolbox/$VERSION/$OS/$ARCH/toolbox"
44 | fi
45 |
46 | curl "$URL" --fail --output toolbox || exit 1
47 |
48 | # Calculate the SHA256 checksum of the file
49 | SHA256=$(shasum -a 256 toolbox | awk '{print $1}')
50 |
51 | # Write the table row
52 | output_string+=$(printf "$ROW_FMT" "[$OS/$ARCH]($URL)" "$description_text" "$SHA256")$'\n'
53 |
54 | rm toolbox
55 | done
56 |
57 | printf "$output_string\n"
58 |
59 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/postgres/postgres-list-tables.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "postgres-list-tables"
3 | type: docs
4 | weight: 1
5 | description: >
6 | The "postgres-list-tables" tool lists schema information for all or specified
7 | tables in a Postgres database.
8 | aliases:
9 | - /resources/tools/postgres-list-tables
10 | ---
11 |
12 | ## About
13 |
14 | The `postgres-list-tables` tool retrieves schema information for all or
15 | specified tables in a Postgres database. It's compatible with any of the
16 | following sources:
17 |
18 | - [alloydb-postgres](../../sources/alloydb-pg.md)
19 | - [cloud-sql-postgres](../../sources/cloud-sql-pg.md)
20 | - [postgres](../../sources/postgres.md)
21 |
22 | `postgres-list-tables` lists detailed schema information (object type, columns,
23 | constraints, indexes, triggers, owner, comment) as JSON for user-created tables
24 | (ordinary or partitioned). The tool takes the following input parameters:
25 | 
26 | - **`table_names`** (optional): Filters by a comma-separated list of table
27 |   names. By default, it lists all tables in user schemas.
28 | - **`output_format`** (optional): Indicates the output format of the table schema.
29 |   `simple` returns only the table names; `detailed` returns the full table information. Default: `detailed`.
30 |
31 | ## Example
32 |
33 | ```yaml
34 | tools:
35 | postgres_list_tables:
36 | kind: postgres-list-tables
37 | source: postgres-source
38 | description: Use this tool to retrieve schema information for all or
39 | specified tables. Output format can be simple (only table names) or detailed.
40 | ```
41 |
42 | ## Reference
43 |
44 | | **field** | **type** | **required** | **description** |
45 | |-------------|:--------:|:------------:|------------------------------------------------------|
46 | | kind | string | true | Must be "postgres-list-tables". |
47 | | source | string | true | Name of the source the SQL should execute on. |
48 | | description | string | true | Description of the tool that is passed to the agent. |
49 |
```
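
A minimal end-to-end sketch that wires this tool to a named `postgres` source, following the `sources`/`tools` layout used elsewhere in this repository. The connection fields under the source are assumptions (typical Postgres settings); see the postgres source reference for the authoritative field list.

```yaml
sources:
  postgres-source:
    kind: postgres
    host: 127.0.0.1             # assumed connection fields; confirm against the postgres source docs
    port: "5432"
    database: my_database
    user: ${POSTGRES_USER}      # env-var substitution keeps credentials out of the file
    password: ${POSTGRES_PASS}

tools:
  postgres_list_tables:
    kind: postgres-list-tables
    source: postgres-source     # must match the source name declared above
    description: Use this tool to retrieve schema information for all or specified tables.
```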
--------------------------------------------------------------------------------
/internal/tools/clickhouse/clickhouseexecutesql/clickhouseexecutesql_test.go:
--------------------------------------------------------------------------------
```go
1 | // Copyright 2025 Google LLC
2 | //
3 | // Licensed under the Apache License, Version 2.0 (the "License");
4 | // you may not use this file except in compliance with the License.
5 | // You may obtain a copy of the License at
6 | //
7 | // http://www.apache.org/licenses/LICENSE-2.0
8 | //
9 | // Unless required by applicable law or agreed to in writing, software
10 | // distributed under the License is distributed on an "AS IS" BASIS,
11 | // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | // See the License for the specific language governing permissions and
13 | // limitations under the License.
14 |
15 | package clickhouse
16 |
17 | import (
18 | "testing"
19 |
20 | yaml "github.com/goccy/go-yaml"
21 | "github.com/google/go-cmp/cmp"
22 | "github.com/googleapis/genai-toolbox/internal/server"
23 | "github.com/googleapis/genai-toolbox/internal/testutils"
24 | )
25 |
26 | func TestParseFromYamlClickHouseExecuteSQL(t *testing.T) {
27 | ctx, err := testutils.ContextWithNewLogger()
28 | if err != nil {
29 | t.Fatalf("unexpected error: %s", err)
30 | }
31 | tcs := []struct {
32 | desc string
33 | in string
34 | want server.ToolConfigs
35 | }{
36 | {
37 | desc: "basic example",
38 | in: `
39 | tools:
40 | example_tool:
41 | kind: clickhouse-execute-sql
42 | source: my-instance
43 | description: some description
44 | `,
45 | want: server.ToolConfigs{
46 | "example_tool": Config{
47 | Name: "example_tool",
48 | Kind: "clickhouse-execute-sql",
49 | Source: "my-instance",
50 | Description: "some description",
51 | AuthRequired: []string{},
52 | },
53 | },
54 | },
55 | }
56 | for _, tc := range tcs {
57 | t.Run(tc.desc, func(t *testing.T) {
58 | got := struct {
59 | Tools server.ToolConfigs `yaml:"tools"`
60 | }{}
61 | err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
62 | if err != nil {
63 | t.Fatalf("unable to unmarshal: %s", err)
64 | }
65 | if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
66 | t.Fatalf("incorrect parse: diff %v", diff)
67 | }
68 | })
69 | }
70 | }
71 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/alloydb/alloydb-list-clusters.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: alloydb-list-clusters
3 | type: docs
4 | weight: 1
5 | description: "The \"alloydb-list-clusters\" tool lists the AlloyDB clusters in a given project and location.\n"
6 | aliases: [/resources/tools/alloydb-list-clusters]
7 | ---
8 |
9 | ## About
10 |
11 | The `alloydb-list-clusters` tool retrieves AlloyDB cluster information for all
12 | or specified locations in a given project. It is compatible with the
13 | [alloydb-admin](../../sources/alloydb-admin.md) source.
14 | 
15 | The `alloydb-list-clusters` tool lists detailed information about AlloyDB
16 | clusters (cluster name, state, configuration, etc.) for a given project and
17 | location. The tool takes the following input parameters:
18 |
19 | | Parameter | Type | Description | Required |
20 | | :--------- | :----- | :----------------------------------------------------------------------------------------------- | :------- |
21 | | `project` | string | The GCP project ID to list clusters for. | Yes |
22 | | `location` | string | The location to list clusters in (e.g., 'us-central1'). Use `-` for all locations. Default: `-`. | No |
23 |
24 | ## Example
25 |
26 | ```yaml
27 | tools:
28 | list_clusters:
29 | kind: alloydb-list-clusters
30 | source: alloydb-admin-source
31 | description: Use this tool to list all AlloyDB clusters in a given project and location.
32 | ```
33 |
34 | ## Reference
35 |
36 | | **field** | **type** | **required** | **description** |
37 | | ----------- | :------: | :----------: | ---------------------------------------------------- |
38 | | kind        | string   |     true     | Must be "alloydb-list-clusters".                      |
39 | | source | string | true | The name of an `alloydb-admin` source. |
40 | | description | string | false | Description of the tool that is passed to the agent. |
41 |
```
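
For illustration, an invocation of this tool might supply parameters like the following (the project ID is hypothetical); passing `-` for `location` asks for clusters across all locations.

```yaml
# Hypothetical invocation parameters for the list_clusters tool above.
project: my-gcp-project   # required GCP project ID
location: "-"             # "-" (the default) lists clusters in all locations
```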
--------------------------------------------------------------------------------
/internal/util/orderedmap/orderedmap_test.go:
--------------------------------------------------------------------------------
```go
1 | // Copyright 2025 Google LLC
2 | //
3 | // Licensed under the Apache License, Version 2.0 (the "License");
4 | // you may not use this file except in compliance with the License.
5 | // You may obtain a copy of the License at
6 | //
7 | // http://www.apache.org/licenses/LICENSE-2.0
8 | //
9 | // Unless required by applicable law or agreed to in writing, software
10 | // distributed under the License is distributed on an "AS IS" BASIS,
11 | // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | // See the License for the specific language governing permissions and
13 | // limitations under the License.
14 | package orderedmap
15 |
16 | import (
17 | "encoding/json"
18 | "testing"
19 | )
20 |
21 | func TestRowMarshalJSON(t *testing.T) {
22 | tests := []struct {
23 | name string
24 | row Row
25 | want string
26 | wantErr bool
27 | }{
28 | {
29 | name: "Simple row",
30 | row: Row{
31 | Columns: []Column{
32 | {Name: "A", Value: 1},
33 | {Name: "B", Value: "two"},
34 | {Name: "C", Value: true},
35 | },
36 | },
37 | want: `{"A":1,"B":"two","C":true}`,
38 | wantErr: false,
39 | },
40 | {
41 | name: "Row with different order",
42 | row: Row{
43 | Columns: []Column{
44 | {Name: "C", Value: true},
45 | {Name: "A", Value: 1},
46 | {Name: "B", Value: "two"},
47 | },
48 | },
49 | want: `{"C":true,"A":1,"B":"two"}`,
50 | wantErr: false,
51 | },
52 | {
53 | name: "Empty row",
54 | row: Row{},
55 | want: `{}`,
56 | wantErr: false,
57 | },
58 | {
59 | name: "Row with nil value",
60 | row: Row{
61 | Columns: []Column{
62 | {Name: "A", Value: 1},
63 | {Name: "B", Value: nil},
64 | },
65 | },
66 | want: `{"A":1,"B":null}`,
67 | wantErr: false,
68 | },
69 | }
70 |
71 | for _, tt := range tests {
72 | t.Run(tt.name, func(t *testing.T) {
73 | got, err := json.Marshal(tt.row)
74 | if (err != nil) != tt.wantErr {
75 | t.Errorf("Row.MarshalJSON() error = %v, wantErr %v", err, tt.wantErr)
76 | return
77 | }
78 | if string(got) != tt.want {
79 | t.Errorf("Row.MarshalJSON() = %s, want %s", string(got), tt.want)
80 | }
81 | })
82 | }
83 | }
84 |
```
--------------------------------------------------------------------------------
/internal/tools/cloudsql/cloudsqllistinstances/cloudsqllistinstances_test.go:
--------------------------------------------------------------------------------
```go
1 | // Copyright 2025 Google LLC
2 | //
3 | // Licensed under the Apache License, Version 2.0 (the "License");
4 | // you may not use this file except in compliance with the License.
5 | // You may obtain a copy of the License at
6 | //
7 | // http://www.apache.org/licenses/LICENSE-2.0
8 | //
9 | // Unless required by applicable law or agreed to in writing, software
10 | // distributed under the License is distributed on an "AS IS" BASIS,
11 | // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | // See the License for the specific language governing permissions and
13 | // limitations under the License.
14 |
15 | package cloudsqllistinstances
16 |
17 | import (
18 | "testing"
19 |
20 | "github.com/goccy/go-yaml"
21 | "github.com/google/go-cmp/cmp"
22 | "github.com/googleapis/genai-toolbox/internal/server"
23 | "github.com/googleapis/genai-toolbox/internal/testutils"
24 | )
25 |
26 | func TestParseFromYaml(t *testing.T) {
27 | ctx, err := testutils.ContextWithNewLogger()
28 | if err != nil {
29 | t.Fatalf("unexpected error: %s", err)
30 | }
31 | tcs := []struct {
32 | desc string
33 | in string
34 | want server.ToolConfigs
35 | }{
36 | {
37 | desc: "basic example",
38 | in: `
39 | tools:
40 | list-my-instances:
41 | kind: cloud-sql-list-instances
42 | description: some description
43 | source: some-source
44 | `,
45 | want: server.ToolConfigs{
46 | "list-my-instances": Config{
47 | Name: "list-my-instances",
48 | Kind: "cloud-sql-list-instances",
49 | Description: "some description",
50 | AuthRequired: []string{},
51 | Source: "some-source",
52 | },
53 | },
54 | },
55 | }
56 | for _, tc := range tcs {
57 | t.Run(tc.desc, func(t *testing.T) {
58 | got := struct {
59 | Tools server.ToolConfigs `yaml:"tools"`
60 | }{}
61 | // Parse contents
62 | err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
63 | if err != nil {
64 | t.Fatalf("unable to unmarshal: %s", err)
65 | }
66 | if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
67 | t.Fatalf("incorrect parse: diff %v", diff)
68 | }
69 | })
70 | }
71 | }
72 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/serverless-spark/serverless-spark-cancel-batch.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "serverless-spark-cancel-batch"
3 | type: docs
4 | weight: 2
5 | description: >
6 | A "serverless-spark-cancel-batch" tool cancels a running Spark batch operation.
7 | aliases:
8 | - /resources/tools/serverless-spark-cancel-batch
9 | ---
10 |
11 | ## About
12 |
13 | The `serverless-spark-cancel-batch` tool cancels a running Spark batch operation in
14 | a Google Cloud Serverless for Apache Spark source. The cancellation request is
15 | asynchronous, so the batch state will not change immediately after the tool
16 | returns; it can take a minute or so for the cancellation to be reflected.
17 |
18 | It's compatible with the following sources:
19 |
20 | - [serverless-spark](../../sources/serverless-spark.md)
21 |
22 | `serverless-spark-cancel-batch` accepts the following parameters:
23 |
24 | - **`operation`** (required): The name of the operation to cancel. For example,
25 | for `projects/my-project/locations/us-central1/operations/my-operation`, you
26 | would pass `my-operation`.
27 |
28 | The tool inherits the `project` and `location` from the source configuration.
29 |
30 | ## Example
31 |
32 | ```yaml
33 | tools:
34 | cancel_spark_batch:
35 | kind: serverless-spark-cancel-batch
36 | source: my-serverless-spark-source
37 | description: Use this tool to cancel a running serverless spark batch operation.
38 | ```
39 |
40 | ## Response Format
41 |
42 | ```json
43 | "Cancelled [projects/my-project/regions/us-central1/operations/my-operation]."
44 | ```
45 |
46 | ## Reference
47 |
48 | | **field** | **type** | **required** | **description** |
49 | | ------------ | :------: | :----------: | -------------------------------------------------- |
50 | | kind | string | true | Must be "serverless-spark-cancel-batch". |
51 | | source | string | true | Name of the source the tool should use. |
52 | | description | string | true | Description of the tool that is passed to the LLM. |
53 | | authRequired | string[] | false | List of auth services required to invoke this tool |
54 |
```
--------------------------------------------------------------------------------
/Dockerfile:
--------------------------------------------------------------------------------
```dockerfile
1 | # Copyright 2024 Google LLC
2 | #
3 | # Licensed under the Apache License, Version 2.0 (the "License");
4 | # you may not use this file except in compliance with the License.
5 | # You may obtain a copy of the License at
6 | #
7 | # http://www.apache.org/licenses/LICENSE-2.0
8 | #
9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | FROM --platform=$BUILDPLATFORM golang:1 AS build
15 |
16 | # Install Zig for CGO cross-compilation
17 | RUN apt-get update && apt-get install -y xz-utils
18 | RUN curl -fL "https://ziglang.org/download/0.15.2/zig-x86_64-linux-0.15.2.tar.xz" -o zig.tar.xz && \
19 | mkdir -p /zig && \
20 | tar -xf zig.tar.xz -C /zig --strip-components=1 && \
21 | rm zig.tar.xz
22 |
23 | WORKDIR /go/src/genai-toolbox
24 | COPY . .
25 |
26 | ARG TARGETOS
27 | ARG TARGETARCH
28 | ARG BUILD_TYPE="container.dev"
29 | ARG COMMIT_SHA=""
30 |
31 | RUN go get ./...
32 |
33 | RUN export ZIG_TARGET="" && \
34 | case "${TARGETARCH}" in \
35 | ("amd64") ZIG_TARGET="x86_64-linux-gnu" ;; \
36 | ("arm64") ZIG_TARGET="aarch64-linux-gnu" ;; \
37 | (*) echo "Unsupported architecture: ${TARGETARCH}" && exit 1 ;; \
38 | esac && \
39 | CGO_ENABLED=1 GOOS=${TARGETOS} GOARCH=${TARGETARCH} \
40 | CC="/zig/zig cc -target ${ZIG_TARGET}" \
41 | CXX="/zig/zig c++ -target ${ZIG_TARGET}" \
42 | go build \
43 | -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=${BUILD_TYPE} -X github.com/googleapis/genai-toolbox/cmd.commitSha=${COMMIT_SHA}" \
44 | -o genai-toolbox .
45 |
46 | # Final Stage
47 | FROM gcr.io/distroless/cc-debian12:nonroot
48 |
49 | WORKDIR /app
50 | COPY --from=build --chown=nonroot /go/src/genai-toolbox/genai-toolbox /toolbox
51 | USER nonroot
52 |
53 | LABEL io.modelcontextprotocol.server.name="io.github.googleapis/genai-toolbox"
54 |
55 | ENTRYPOINT ["/toolbox"]
56 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-make-dashboard.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "looker-make-dashboard"
3 | type: docs
4 | weight: 1
5 | description: >
6 | "looker-make-dashboard" generates a Looker dashboard in the users personal folder in
7 | Looker
8 | aliases:
9 | - /resources/tools/looker-make-dashboard
10 | ---
11 |
12 | ## About
13 |
14 | The `looker-make-dashboard` tool creates a dashboard in the user's
15 | personal Looker folder.
16 |
17 | It's compatible with the following sources:
18 |
19 | - [looker](../../sources/looker.md)
20 |
21 | `looker-make-dashboard` takes three parameters:
22 |
23 | 1. the `title`
24 | 2. the `description`
25 | 3. an optional `folder` id. If not provided, the user's default folder will be used.
26 |
27 | ## Example
28 |
29 | ```yaml
30 | tools:
31 | make_dashboard:
32 | kind: looker-make-dashboard
33 | source: looker-source
34 | description: |
35 | This tool creates a new, empty dashboard in Looker. Dashboards are stored
36 | in the user's personal folder, and the dashboard name must be unique.
37 | After creation, use `add_dashboard_filter` to add filters and
38 | `add_dashboard_element` to add content tiles.
39 |
40 | Required Parameters:
41 | - title (required): A unique title for the new dashboard.
42 | - description (required): A brief description of the dashboard's purpose.
43 |
44 | Output:
45 | A JSON object containing a link (`url`) to the newly created dashboard and
46 | its unique `id`. This `dashboard_id` is crucial for subsequent calls to
47 | `add_dashboard_filter` and `add_dashboard_element`.
48 | ```
49 |
50 | ## Reference
51 |
52 | | **field** | **type** | **required** | **description** |
53 | |-------------|:--------:|:------------:|----------------------------------------------------|
54 | | kind | string | true | Must be "looker-make-dashboard" |
55 | | source | string | true | Name of the source the SQL should execute on. |
56 | | description | string | true | Description of the tool that is passed to the LLM. |
57 |
```
--------------------------------------------------------------------------------
/docs/en/resources/tools/cloudhealthcare/cloud-healthcare-get-fhir-resource.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "cloud-healthcare-get-fhir-resource"
3 | linkTitle: "cloud-healthcare-get-fhir-resource"
4 | type: docs
5 | weight: 1
6 | description: >
7 | A "cloud-healthcare-get-fhir-resource" tool retrieves a specific FHIR resource.
8 | aliases:
9 | - /resources/tools/cloud-healthcare-get-fhir-resource
10 | ---
11 |
12 | ## About
13 |
14 | A `cloud-healthcare-get-fhir-resource` tool retrieves a specific FHIR resource
15 | from a FHIR store.
16 | It's compatible with the following sources:
17 |
18 | - [cloud-healthcare](../../sources/cloud-healthcare.md)
19 |
20 | `cloud-healthcare-get-fhir-resource` returns a single FHIR resource, identified
21 | by its type and ID.
22 |
23 | ## Example
24 |
25 | ```yaml
26 | tools:
27 | get_fhir_resource:
28 | kind: cloud-healthcare-get-fhir-resource
29 | source: my-healthcare-source
30 | description: Use this tool to retrieve a specific FHIR resource.
31 | ```
32 |
33 | ## Reference
34 |
35 | | **field** | **type** | **required** | **description** |
36 | |-------------|:--------:|:------------:|----------------------------------------------------|
37 | | kind | string | true | Must be "cloud-healthcare-get-fhir-resource". |
38 | | source | string | true | Name of the healthcare source. |
39 | | description | string | true | Description of the tool that is passed to the LLM. |
40 |
41 | ### Parameters
42 |
43 | | **field** | **type** | **required** | **description** |
44 | |--------------|:--------:|:------------:|------------------------------------------------------------------|
45 | | resourceType | string | true | The FHIR resource type to retrieve (e.g., Patient, Observation). |
46 | | resourceID | string | true | The ID of the FHIR resource to retrieve. |
47 | | storeID | string | true* | The FHIR store ID to retrieve the resource from. |
48 |
49 | *If `allowedFHIRStores` in the source contains exactly one store, then the
50 | `storeID` parameter is not needed.
51 |
```
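
For illustration, an invocation might supply parameters like the following (IDs are hypothetical). The `storeID` line can be dropped when the source's `allowedFHIRStores` contains exactly one store.

```yaml
# Hypothetical invocation parameters for the get_fhir_resource tool above.
resourceType: Patient     # FHIR resource type
resourceID: "12345"       # ID of the resource to fetch
storeID: my-fhir-store    # optional when the source allows exactly one store
```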
--------------------------------------------------------------------------------
/internal/tools/dataform/dataformcompilelocal/dataformcompilelocal_test.go:
--------------------------------------------------------------------------------
```go
1 | // Copyright 2025 Google LLC
2 | //
3 | // Licensed under the Apache License, Version 2.0 (the "License");
4 | // you may not use this file except in compliance with the License.
5 | // You may obtain a copy of the License at
6 | //
7 | // http://www.apache.org/licenses/LICENSE-2.0
8 | //
9 | // Unless required by applicable law or agreed to in writing, software
10 | // distributed under the License is distributed on an "AS IS" BASIS,
11 | // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | // See the License for the specific language governing permissions and
13 | // limitations under the License.
14 |
15 | package dataformcompilelocal_test
16 |
17 | import (
18 | "testing"
19 |
20 | yaml "github.com/goccy/go-yaml"
21 | "github.com/google/go-cmp/cmp"
22 | "github.com/googleapis/genai-toolbox/internal/server"
23 | "github.com/googleapis/genai-toolbox/internal/testutils"
24 | "github.com/googleapis/genai-toolbox/internal/tools/dataform/dataformcompilelocal"
25 | )
26 |
27 | func TestParseFromYamlDataformCompile(t *testing.T) {
28 | ctx, err := testutils.ContextWithNewLogger()
29 | if err != nil {
30 | t.Fatalf("unexpected error: %s", err)
31 | }
32 | tcs := []struct {
33 | desc string
34 | in string
35 | want server.ToolConfigs
36 | }{
37 | {
38 | desc: "basic example",
39 | in: `
40 | tools:
41 | example_tool:
42 | kind: dataform-compile-local
43 | description: some description
44 | `,
45 | want: server.ToolConfigs{
46 | "example_tool": dataformcompilelocal.Config{
47 | Name: "example_tool",
48 | Kind: "dataform-compile-local",
49 | Description: "some description",
50 | AuthRequired: []string{},
51 | },
52 | },
53 | },
54 | }
55 | for _, tc := range tcs {
56 | t.Run(tc.desc, func(t *testing.T) {
57 | got := struct {
58 | Tools server.ToolConfigs `yaml:"tools"`
59 | }{}
60 | // Parse contents
61 | err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
62 | if err != nil {
63 | t.Fatalf("unable to unmarshal: %s", err)
64 | }
65 | if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
66 | t.Fatalf("incorrect parse: diff %v", diff)
67 | }
68 | })
69 | }
70 |
71 | }
72 |
```
--------------------------------------------------------------------------------
/docs/en/resources/sources/cloud-sql-admin.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: Cloud SQL Admin
3 | type: docs
4 | weight: 1
5 | description: "A \"cloud-sql-admin\" source provides a client for the Cloud SQL Admin API.\n"
6 | aliases: [/resources/sources/cloud-sql-admin]
7 | ---
8 |
9 | ## About
10 |
11 | The `cloud-sql-admin` source provides a client to interact with the [Google
12 | Cloud SQL Admin API](https://cloud.google.com/sql/docs/mysql/admin-api). This
13 | allows tools to perform administrative tasks on Cloud SQL instances, such as
14 | creating users and databases.
15 |
16 | Authentication can be handled in two ways:
17 |
18 | 1. **Application Default Credentials (ADC):** By default, the source uses ADC
19 | to authenticate with the API.
20 | 2. **Client-side OAuth:** If `useClientOAuth` is set to `true`, the source will
21 | expect an OAuth 2.0 access token to be provided by the client (e.g., a web
22 | browser) for each request.
23 |
24 | ## Example
25 |
26 | ```yaml
27 | sources:
28 | my-cloud-sql-admin:
29 | kind: cloud-sql-admin
30 |
31 | my-oauth-cloud-sql-admin:
32 | kind: cloud-sql-admin
33 | useClientOAuth: true
34 | ```
35 |
36 | ## Reference
37 |
38 | | **field** | **type** | **required** | **description** |
39 | | -------------- | :------: | :----------: | ---------------------------------------------------------------------------------------------------------------------------------------------- |
40 | | kind | string | true | Must be "cloud-sql-admin". |
41 | | defaultProject | string | false | The Google Cloud project ID to use for Cloud SQL infrastructure tools. |
42 | | useClientOAuth | boolean | false | If true, the source will use client-side OAuth for authorization. Otherwise, it will use Application Default Credentials. Defaults to `false`. |
43 |
```
--------------------------------------------------------------------------------
/.github/workflows/docs_preview_clean.yaml:
--------------------------------------------------------------------------------
```yaml
1 | # Copyright 2025 Google LLC
2 | #
3 | # Licensed under the Apache License, Version 2.0 (the "License");
4 | # you may not use this file except in compliance with the License.
5 | # You may obtain a copy of the License at
6 | #
7 | # http://www.apache.org/licenses/LICENSE-2.0
8 | #
9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 |
15 | name: "docs"
16 |
17 | permissions:
18 | contents: write
19 | pull-requests: write
20 |
21 | # This Workflow depends on 'github.event.number',
22 | # not compatible with branch or manual triggers.
23 | on:
24 | pull_request:
25 | types:
26 | - closed
27 |
28 | jobs:
29 | clean:
30 | if: ${{ !github.event.pull_request.head.repo.fork }}
31 | runs-on: ubuntu-24.04
32 | concurrency:
33 |       # Shared concurrency group with preview staging.
34 | group: "preview-${{ github.event.number }}"
35 | cancel-in-progress: true
36 | steps:
37 | - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6
38 | with:
39 | ref: versioned-gh-pages
40 |
41 | - name: Remove Preview
42 | run: |
43 | rm -Rf ./previews/PR-${{ github.event.number }}
44 | git config user.name 'github-actions[bot]'
45 | git config user.email 'github-actions[bot]@users.noreply.github.com'
46 | git add -u previews/PR-${{ github.event.number }}
47 | git commit --message "cleanup: previews/PR-${{ github.event.number }}"
48 | git push
49 |
50 | - name: Comment
51 | uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
52 | with:
53 | script: |
54 | github.rest.issues.createComment({
55 | issue_number: context.payload.number,
56 | owner: context.repo.owner,
57 | repo: context.repo.repo,
58 | body: "🧨 Preview deployments removed."
59 | })
60 |
```
--------------------------------------------------------------------------------
/docs/en/resources/sources/alloydb-admin.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: AlloyDB Admin
3 | linkTitle: AlloyDB Admin
4 | type: docs
5 | weight: 1
6 | description: "The \"alloydb-admin\" source provides a client for the AlloyDB API.\n"
7 | aliases: [/resources/sources/alloydb-admin]
8 | ---
9 |
10 | ## About
11 |
12 | The `alloydb-admin` source provides a client to interact with the [Google
13 | AlloyDB API](https://cloud.google.com/alloydb/docs/reference/rest). This allows
14 | tools to perform administrative tasks on AlloyDB resources, such as managing
15 | clusters, instances, and users.
16 |
17 | Authentication can be handled in two ways:
18 |
19 | 1. **Application Default Credentials (ADC):** By default, the source uses ADC
20 | to authenticate with the API.
21 | 2. **Client-side OAuth:** If `useClientOAuth` is set to `true`, the source will
22 | expect an OAuth 2.0 access token to be provided by the client (e.g., a web
23 | browser) for each request.
24 |
25 | ## Example
26 |
27 | ```yaml
28 | sources:
29 | my-alloydb-admin:
30 |     kind: alloydb-admin
31 |
32 | my-oauth-alloydb-admin:
33 | kind: alloydb-admin
34 | useClientOAuth: true
35 | ```
36 |
37 | ## Reference
38 |
39 | | **field** | **type** | **required** | **description** |
40 | | -------------- | :------: | :----------: | ---------------------------------------------------------------------------------------------------------------------------------------------- |
41 | | kind | string | true | Must be "alloydb-admin". |
42 | | defaultProject | string | false | The Google Cloud project ID to use for AlloyDB infrastructure tools. |
43 | | useClientOAuth | boolean | false | If true, the source will use client-side OAuth for authorization. Otherwise, it will use Application Default Credentials. Defaults to `false`. |
44 |
```
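
The reference above lists a `defaultProject` field that the example does not show. A minimal sketch that sets it via environment-variable substitution, mirroring the prebuilt AlloyDB admin configuration shown earlier in this page:

```yaml
sources:
  my-alloydb-admin:
    kind: alloydb-admin
    defaultProject: ${ALLOYDB_POSTGRES_PROJECT:}   # env-var substitution, as in the prebuilt AlloyDB admin config
```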
--------------------------------------------------------------------------------
/.github/release-please.yml:
--------------------------------------------------------------------------------
```yaml
1 | # Copyright 2024 Google LLC
2 | #
3 | # Licensed under the Apache License, Version 2.0 (the "License");
4 | # you may not use this file except in compliance with the License.
5 | # You may obtain a copy of the License at
6 | #
7 | # http://www.apache.org/licenses/LICENSE-2.0
8 | #
9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 |
15 | handleGHRelease: true
16 | packageName: genai-toolbox
17 | releaseType: simple
18 | versionFile: "cmd/version.txt"
19 | extraFiles: [
20 | "README.md",
21 | "docs/en/getting-started/colab_quickstart.ipynb",
22 | "docs/en/getting-started/introduction/_index.md",
23 | "docs/en/getting-started/mcp_quickstart/_index.md",
24 | "docs/en/getting-started/quickstart/shared/configure_toolbox.md",
25 | "docs/en/samples/alloydb/_index.md",
26 | "docs/en/samples/alloydb/mcp_quickstart.md",
27 | "docs/en/samples/alloydb/ai-nl/alloydb_ai_nl.ipynb",
28 | "docs/en/samples/bigquery/local_quickstart.md",
29 | "docs/en/samples/bigquery/mcp_quickstart/_index.md",
30 | "docs/en/samples/bigquery/colab_quickstart_bigquery.ipynb",
31 | "docs/en/samples/looker/looker_gemini.md",
32 | "docs/en/samples/looker/looker_gemini_oauth/_index.md",
33 | "docs/en/samples/looker/looker_mcp_inspector/_index.md",
34 | "docs/en/how-to/connect-ide/looker_mcp.md",
35 | "docs/en/how-to/connect-ide/mysql_mcp.md",
36 | "docs/en/how-to/connect-ide/mssql_mcp.md",
37 | "docs/en/how-to/connect-ide/postgres_mcp.md",
38 | "docs/en/how-to/connect-ide/neo4j_mcp.md",
39 | "docs/en/how-to/connect-ide/sqlite_mcp.md",
40 | "gemini-extension.json",
41 | {
42 | "type": "json",
43 | "path": "server.json",
44 | "jsonpath": "$.version"
45 | },
46 | {
47 | "type": "json",
48 | "path": "server.json",
49 | "jsonpath": "$.packages[0].identifier"
50 | },
51 | ]
52 |
```
--------------------------------------------------------------------------------
/docs/en/resources/authServices/google.md:
--------------------------------------------------------------------------------
```markdown
1 | ---
2 | title: "Google Sign-In"
3 | type: docs
4 | weight: 1
5 | description: >
6 |   Use Google Sign-In for the OAuth 2.0 flow and token lifecycle.
7 | ---
8 |
9 | ## Getting Started
10 |
11 | Google Sign-In manages the OAuth 2.0 flow and token lifecycle. To integrate the
12 | Google Sign-In workflow into your web app, [follow this guide][gsi-setup].
13 |
14 | After setting up the Google Sign-In workflow, you should have registered your
15 | application and retrieved a [Client ID][client-id]. Configure your auth service
16 | with the `Client ID`.
17 |
18 | [gsi-setup]: https://developers.google.com/identity/sign-in/web/sign-in
19 | [client-id]: https://developers.google.com/identity/sign-in/web/sign-in#create_authorization_credentials
20 |
21 | ## Behavior
22 |
23 | ### Authorized Invocations
24 |
25 | When using [Authorized Invocations][auth-invoke], a tool invocation is
26 | considered authorized if the request includes a valid OAuth 2.0 token that
27 | matches the Client ID.
28 |
29 | [auth-invoke]: ../tools/#authorized-invocations
30 |
31 | ### Authenticated Parameters
32 |
33 | When using [Authenticated Parameters][auth-params], any [claim provided by the
34 | id-token][provided-claims] can be used for the parameter.
35 |
36 | [auth-params]: ../tools/#authenticated-parameters
37 | [provided-claims]:
38 | https://developers.google.com/identity/openid-connect/openid-connect#obtaininguserprofileinformation
39 |
40 | ## Example
41 |
42 | ```yaml
43 | authServices:
44 | my-google-auth:
45 | kind: google
46 | clientId: ${YOUR_GOOGLE_CLIENT_ID}
47 | ```
48 |
49 | {{< notice tip >}}
50 | Use environment variable replacement with the format ${ENV_NAME}
51 | instead of hardcoding your secrets into the configuration file.
52 | {{< /notice >}}
53 |
54 | ## Reference
55 |
56 | | **field** | **type** | **required** | **description** |
57 | |-----------|:--------:|:------------:|------------------------------------------------------------------|
58 | | kind | string | true | Must be "google". |
59 | | clientId | string | true | Client ID of your application from registering your application. |
60 |
```
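
A minimal sketch of requiring this auth service on a tool via `authRequired` (the field listed in the tool reference tables elsewhere in these docs); the tool shown reuses the serverless Spark example from earlier in this page.

```yaml
authServices:
  my-google-auth:
    kind: google
    clientId: ${YOUR_GOOGLE_CLIENT_ID}

tools:
  cancel_spark_batch:
    kind: serverless-spark-cancel-batch
    source: my-serverless-spark-source
    description: Use this tool to cancel a running serverless spark batch operation.
    authRequired:
      - my-google-auth   # invocations must present a valid OAuth 2.0 token for this Client ID
```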