This is page 2 of 50. Use http://codebase.md/googleapis/genai-toolbox?page={x} to view the full context.

# Directory Structure

```
├── .ci
│   ├── continuous.release.cloudbuild.yaml
│   ├── generate_release_table.sh
│   ├── integration.cloudbuild.yaml
│   ├── quickstart_test
│   │   ├── go.integration.cloudbuild.yaml
│   │   ├── js.integration.cloudbuild.yaml
│   │   ├── py.integration.cloudbuild.yaml
│   │   ├── run_go_tests.sh
│   │   ├── run_js_tests.sh
│   │   ├── run_py_tests.sh
│   │   └── setup_hotels_sample.sql
│   ├── test_prompts_with_coverage.sh
│   ├── test_with_coverage.sh
│   └── versioned.release.cloudbuild.yaml
├── .github
│   ├── auto-label.yaml
│   ├── blunderbuss.yml
│   ├── CODEOWNERS
│   ├── header-checker-lint.yml
│   ├── ISSUE_TEMPLATE
│   │   ├── bug_report.yml
│   │   ├── config.yml
│   │   ├── feature_request.yml
│   │   └── question.yml
│   ├── label-sync.yml
│   ├── labels.yaml
│   ├── PULL_REQUEST_TEMPLATE.md
│   ├── release-please.yml
│   ├── renovate.json5
│   ├── sync-repo-settings.yaml
│   ├── trusted-contribution.yml
│   └── workflows
│       ├── cloud_build_failure_reporter.yml
│       ├── deploy_dev_docs.yaml
│       ├── deploy_previous_version_docs.yaml
│       ├── deploy_versioned_docs.yaml
│       ├── docs_preview_clean.yaml
│       ├── docs_preview_deploy.yaml
│       ├── lint.yaml
│       ├── publish-mcp.yml
│       ├── schedule_reporter.yml
│       ├── sync-labels.yaml
│       └── tests.yaml
├── .gitignore
├── .gitmodules
├── .golangci.yaml
├── .hugo
│   ├── archetypes
│   │   └── default.md
│   ├── assets
│   │   ├── icons
│   │   │   └── logo.svg
│   │   └── scss
│   │       ├── _styles_project.scss
│   │       └── _variables_project.scss
│   ├── go.mod
│   ├── go.sum
│   ├── hugo.toml
│   ├── layouts
│   │   ├── _default
│   │   │   └── home.releases.releases
│   │   ├── index.llms-full.txt
│   │   ├── index.llms.txt
│   │   ├── partials
│   │   │   ├── hooks
│   │   │   │   └── head-end.html
│   │   │   ├── navbar-version-selector.html
│   │   │   ├── page-meta-links.html
│   │   │   └── td
│   │   │       └── render-heading.html
│   │   ├── robot.txt
│   │   └── shortcodes
│   │       ├── include.html
│   │       ├── ipynb.html
│   │       └── regionInclude.html
│   ├── package-lock.json
│   ├── package.json
│   └── static
│       ├── favicons
│       │   ├── android-chrome-192x192.png
│       │   ├── android-chrome-512x512.png
│       │   ├── apple-touch-icon.png
│       │   ├── favicon-16x16.png
│       │   ├── favicon-32x32.png
│       │   └── favicon.ico
│       └── js
│           └── w3.js
├── CHANGELOG.md
├── cmd
│   ├── options_test.go
│   ├── options.go
│   ├── root_test.go
│   ├── root.go
│   └── version.txt
├── CODE_OF_CONDUCT.md
├── CONTRIBUTING.md
├── DEVELOPER.md
├── Dockerfile
├── docs
│   ├── ALLOYDBADMIN_README.md
│   ├── ALLOYDBPG_README.md
│   ├── BIGQUERY_README.md
│   ├── CLOUDSQLMSSQL_README.md
│   ├── CLOUDSQLMSSQLADMIN_README.md
│   ├── CLOUDSQLMYSQL_README.md
│   ├── CLOUDSQLMYSQLADMIN_README.md
│   ├── CLOUDSQLPG_README.md
│   ├── CLOUDSQLPGADMIN_README.md
│   ├── DATAPLEX_README.md
│   ├── en
│   │   ├── _index.md
│   │   ├── about
│   │   │   ├── _index.md
│   │   │   └── faq.md
│   │   ├── concepts
│   │   │   ├── _index.md
│   │   │   └── telemetry
│   │   │       ├── index.md
│   │   │       ├── telemetry_flow.png
│   │   │       └── telemetry_traces.png
│   │   ├── getting-started
│   │   │   ├── _index.md
│   │   │   ├── colab_quickstart.ipynb
│   │   │   ├── configure.md
│   │   │   ├── introduction
│   │   │   │   ├── _index.md
│   │   │   │   └── architecture.png
│   │   │   ├── local_quickstart_go.md
│   │   │   ├── local_quickstart_js.md
│   │   │   ├── local_quickstart.md
│   │   │   ├── mcp_quickstart
│   │   │   │   ├── _index.md
│   │   │   │   ├── inspector_tools.png
│   │   │   │   └── inspector.png
│   │   │   └── quickstart
│   │   │       ├── go
│   │   │       │   ├── adkgo
│   │   │       │   │   ├── go.mod
│   │   │       │   │   ├── go.sum
│   │   │       │   │   └── quickstart.go
│   │   │       │   ├── genAI
│   │   │       │   │   ├── go.mod
│   │   │       │   │   ├── go.sum
│   │   │       │   │   └── quickstart.go
│   │   │       │   ├── genkit
│   │   │       │   │   ├── go.mod
│   │   │       │   │   ├── go.sum
│   │   │       │   │   └── quickstart.go
│   │   │       │   ├── langchain
│   │   │       │   │   ├── go.mod
│   │   │       │   │   ├── go.sum
│   │   │       │   │   └── quickstart.go
│   │   │       │   ├── openAI
│   │   │       │   │   ├── go.mod
│   │   │       │   │   ├── go.sum
│   │   │       │   │   └── quickstart.go
│   │   │       │   └── quickstart_test.go
│   │   │       ├── golden.txt
│   │   │       ├── js
│   │   │       │   ├── genAI
│   │   │       │   │   ├── package-lock.json
│   │   │       │   │   ├── package.json
│   │   │       │   │   └── quickstart.js
│   │   │       │   ├── genkit
│   │   │       │   │   ├── package-lock.json
│   │   │       │   │   ├── package.json
│   │   │       │   │   └── quickstart.js
│   │   │       │   ├── langchain
│   │   │       │   │   ├── package-lock.json
│   │   │       │   │   ├── package.json
│   │   │       │   │   └── quickstart.js
│   │   │       │   ├── llamaindex
│   │   │       │   │   ├── package-lock.json
│   │   │       │   │   ├── package.json
│   │   │       │   │   └── quickstart.js
│   │   │       │   └── quickstart.test.js
│   │   │       ├── python
│   │   │       │   ├── __init__.py
│   │   │       │   ├── adk
│   │   │       │   │   ├── quickstart.py
│   │   │       │   │   └── requirements.txt
│   │   │       │   ├── core
│   │   │       │   │   ├── quickstart.py
│   │   │       │   │   └── requirements.txt
│   │   │       │   ├── langchain
│   │   │       │   │   ├── quickstart.py
│   │   │       │   │   └── requirements.txt
│   │   │       │   ├── llamaindex
│   │   │       │   │   ├── quickstart.py
│   │   │       │   │   └── requirements.txt
│   │   │       │   └── quickstart_test.py
│   │   │       └── shared
│   │   │           ├── cloud_setup.md
│   │   │           ├── configure_toolbox.md
│   │   │           └── database_setup.md
│   │   ├── how-to
│   │   │   ├── _index.md
│   │   │   ├── connect_via_geminicli.md
│   │   │   ├── connect_via_mcp.md
│   │   │   ├── connect-ide
│   │   │   │   ├── _index.md
│   │   │   │   ├── alloydb_pg_admin_mcp.md
│   │   │   │   ├── alloydb_pg_mcp.md
│   │   │   │   ├── bigquery_mcp.md
│   │   │   │   ├── cloud_sql_mssql_admin_mcp.md
│   │   │   │   ├── cloud_sql_mssql_mcp.md
│   │   │   │   ├── cloud_sql_mysql_admin_mcp.md
│   │   │   │   ├── cloud_sql_mysql_mcp.md
│   │   │   │   ├── cloud_sql_pg_admin_mcp.md
│   │   │   │   ├── cloud_sql_pg_mcp.md
│   │   │   │   ├── firestore_mcp.md
│   │   │   │   ├── looker_mcp.md
│   │   │   │   ├── mssql_mcp.md
│   │   │   │   ├── mysql_mcp.md
│   │   │   │   ├── neo4j_mcp.md
│   │   │   │   ├── postgres_mcp.md
│   │   │   │   ├── spanner_mcp.md
│   │   │   │   └── sqlite_mcp.md
│   │   │   ├── deploy_adk_agent.md
│   │   │   ├── deploy_docker.md
│   │   │   ├── deploy_gke.md
│   │   │   ├── deploy_toolbox.md
│   │   │   ├── export_telemetry.md
│   │   │   └── toolbox-ui
│   │   │       ├── edit-headers.gif
│   │   │       ├── edit-headers.png
│   │   │       ├── index.md
│   │   │       ├── optional-param-checked.png
│   │   │       ├── optional-param-unchecked.png
│   │   │       ├── run-tool.gif
│   │   │       ├── tools.png
│   │   │       └── toolsets.png
│   │   ├── reference
│   │   │   ├── _index.md
│   │   │   ├── cli.md
│   │   │   └── prebuilt-tools.md
│   │   ├── resources
│   │   │   ├── _index.md
│   │   │   ├── authServices
│   │   │   │   ├── _index.md
│   │   │   │   └── google.md
│   │   │   ├── prompts
│   │   │   │   ├── _index.md
│   │   │   │   └── custom
│   │   │   │       └── _index.md
│   │   │   ├── sources
│   │   │   │   ├── _index.md
│   │   │   │   ├── alloydb-admin.md
│   │   │   │   ├── alloydb-pg.md
│   │   │   │   ├── bigquery.md
│   │   │   │   ├── bigtable.md
│   │   │   │   ├── cassandra.md
│   │   │   │   ├── clickhouse.md
│   │   │   │   ├── cloud-healthcare.md
│   │   │   │   ├── cloud-monitoring.md
│   │   │   │   ├── cloud-sql-admin.md
│   │   │   │   ├── cloud-sql-mssql.md
│   │   │   │   ├── cloud-sql-mysql.md
│   │   │   │   ├── cloud-sql-pg.md
│   │   │   │   ├── couchbase.md
│   │   │   │   ├── dataplex.md
│   │   │   │   ├── dgraph.md
│   │   │   │   ├── elasticsearch.md
│   │   │   │   ├── firebird.md
│   │   │   │   ├── firestore.md
│   │   │   │   ├── http.md
│   │   │   │   ├── looker.md
│   │   │   │   ├── mindsdb.md
│   │   │   │   ├── mongodb.md
│   │   │   │   ├── mssql.md
│   │   │   │   ├── mysql.md
│   │   │   │   ├── neo4j.md
│   │   │   │   ├── oceanbase.md
│   │   │   │   ├── oracle.md
│   │   │   │   ├── postgres.md
│   │   │   │   ├── redis.md
│   │   │   │   ├── serverless-spark.md
│   │   │   │   ├── singlestore.md
│   │   │   │   ├── spanner.md
│   │   │   │   ├── sqlite.md
│   │   │   │   ├── tidb.md
│   │   │   │   ├── trino.md
│   │   │   │   ├── valkey.md
│   │   │   │   └── yugabytedb.md
│   │   │   └── tools
│   │   │       ├── _index.md
│   │   │       ├── alloydb
│   │   │       │   ├── _index.md
│   │   │       │   ├── alloydb-create-cluster.md
│   │   │       │   ├── alloydb-create-instance.md
│   │   │       │   ├── alloydb-create-user.md
│   │   │       │   ├── alloydb-get-cluster.md
│   │   │       │   ├── alloydb-get-instance.md
│   │   │       │   ├── alloydb-get-user.md
│   │   │       │   ├── alloydb-list-clusters.md
│   │   │       │   ├── alloydb-list-instances.md
│   │   │       │   ├── alloydb-list-users.md
│   │   │       │   └── alloydb-wait-for-operation.md
│   │   │       ├── alloydbainl
│   │   │       │   ├── _index.md
│   │   │       │   └── alloydb-ai-nl.md
│   │   │       ├── bigquery
│   │   │       │   ├── _index.md
│   │   │       │   ├── bigquery-analyze-contribution.md
│   │   │       │   ├── bigquery-conversational-analytics.md
│   │   │       │   ├── bigquery-execute-sql.md
│   │   │       │   ├── bigquery-forecast.md
│   │   │       │   ├── bigquery-get-dataset-info.md
│   │   │       │   ├── bigquery-get-table-info.md
│   │   │       │   ├── bigquery-list-dataset-ids.md
│   │   │       │   ├── bigquery-list-table-ids.md
│   │   │       │   ├── bigquery-search-catalog.md
│   │   │       │   └── bigquery-sql.md
│   │   │       ├── bigtable
│   │   │       │   ├── _index.md
│   │   │       │   └── bigtable-sql.md
│   │   │       ├── cassandra
│   │   │       │   ├── _index.md
│   │   │       │   └── cassandra-cql.md
│   │   │       ├── clickhouse
│   │   │       │   ├── _index.md
│   │   │       │   ├── clickhouse-execute-sql.md
│   │   │       │   ├── clickhouse-list-databases.md
│   │   │       │   ├── clickhouse-list-tables.md
│   │   │       │   └── clickhouse-sql.md
│   │   │       ├── cloudhealthcare
│   │   │       │   ├── _index.md
│   │   │       │   ├── cloud-healthcare-fhir-fetch-page.md
│   │   │       │   ├── cloud-healthcare-fhir-patient-everything.md
│   │   │       │   ├── cloud-healthcare-fhir-patient-search.md
│   │   │       │   ├── cloud-healthcare-get-dataset.md
│   │   │       │   ├── cloud-healthcare-get-dicom-store-metrics.md
│   │   │       │   ├── cloud-healthcare-get-dicom-store.md
│   │   │       │   ├── cloud-healthcare-get-fhir-resource.md
│   │   │       │   ├── cloud-healthcare-get-fhir-store-metrics.md
│   │   │       │   ├── cloud-healthcare-get-fhir-store.md
│   │   │       │   ├── cloud-healthcare-list-dicom-stores.md
│   │   │       │   ├── cloud-healthcare-list-fhir-stores.md
│   │   │       │   ├── cloud-healthcare-retrieve-rendered-dicom-instance.md
│   │   │       │   ├── cloud-healthcare-search-dicom-instances.md
│   │   │       │   ├── cloud-healthcare-search-dicom-series.md
│   │   │       │   └── cloud-healthcare-search-dicom-studies.md
│   │   │       ├── cloudmonitoring
│   │   │       │   ├── _index.md
│   │   │       │   └── cloud-monitoring-query-prometheus.md
│   │   │       ├── cloudsql
│   │   │       │   ├── _index.md
│   │   │       │   ├── cloudsqlcreatedatabase.md
│   │   │       │   ├── cloudsqlcreateusers.md
│   │   │       │   ├── cloudsqlgetinstances.md
│   │   │       │   ├── cloudsqllistdatabases.md
│   │   │       │   ├── cloudsqllistinstances.md
│   │   │       │   ├── cloudsqlmssqlcreateinstance.md
│   │   │       │   ├── cloudsqlmysqlcreateinstance.md
│   │   │       │   ├── cloudsqlpgcreateinstances.md
│   │   │       │   ├── cloudsqlpgupgradeprecheck.md
│   │   │       │   └── cloudsqlwaitforoperation.md
│   │   │       ├── couchbase
│   │   │       │   ├── _index.md
│   │   │       │   └── couchbase-sql.md
│   │   │       ├── dataform
│   │   │       │   ├── _index.md
│   │   │       │   └── dataform-compile-local.md
│   │   │       ├── dataplex
│   │   │       │   ├── _index.md
│   │   │       │   ├── dataplex-lookup-entry.md
│   │   │       │   ├── dataplex-search-aspect-types.md
│   │   │       │   └── dataplex-search-entries.md
│   │   │       ├── dgraph
│   │   │       │   ├── _index.md
│   │   │       │   └── dgraph-dql.md
│   │   │       ├── elasticsearch
│   │   │       │   ├── _index.md
│   │   │       │   └── elasticsearch-esql.md
│   │   │       ├── firebird
│   │   │       │   ├── _index.md
│   │   │       │   ├── firebird-execute-sql.md
│   │   │       │   └── firebird-sql.md
│   │   │       ├── firestore
│   │   │       │   ├── _index.md
│   │   │       │   ├── firestore-add-documents.md
│   │   │       │   ├── firestore-delete-documents.md
│   │   │       │   ├── firestore-get-documents.md
│   │   │       │   ├── firestore-get-rules.md
│   │   │       │   ├── firestore-list-collections.md
│   │   │       │   ├── firestore-query-collection.md
│   │   │       │   ├── firestore-query.md
│   │   │       │   ├── firestore-update-document.md
│   │   │       │   └── firestore-validate-rules.md
│   │   │       ├── http
│   │   │       │   ├── _index.md
│   │   │       │   └── http.md
│   │   │       ├── looker
│   │   │       │   ├── _index.md
│   │   │       │   ├── looker-add-dashboard-element.md
│   │   │       │   ├── looker-conversational-analytics.md
│   │   │       │   ├── looker-create-project-file.md
│   │   │       │   ├── looker-delete-project-file.md
│   │   │       │   ├── looker-dev-mode.md
│   │   │       │   ├── looker-generate-embed-url.md
│   │   │       │   ├── looker-get-connection-databases.md
│   │   │       │   ├── looker-get-connection-schemas.md
│   │   │       │   ├── looker-get-connection-table-columns.md
│   │   │       │   ├── looker-get-connection-tables.md
│   │   │       │   ├── looker-get-connections.md
│   │   │       │   ├── looker-get-dashboards.md
│   │   │       │   ├── looker-get-dimensions.md
│   │   │       │   ├── looker-get-explores.md
│   │   │       │   ├── looker-get-filters.md
│   │   │       │   ├── looker-get-looks.md
│   │   │       │   ├── looker-get-measures.md
│   │   │       │   ├── looker-get-models.md
│   │   │       │   ├── looker-get-parameters.md
│   │   │       │   ├── looker-get-project-file.md
│   │   │       │   ├── looker-get-project-files.md
│   │   │       │   ├── looker-get-projects.md
│   │   │       │   ├── looker-health-analyze.md
│   │   │       │   ├── looker-health-pulse.md
│   │   │       │   ├── looker-health-vacuum.md
│   │   │       │   ├── looker-make-dashboard.md
│   │   │       │   ├── looker-make-look.md
│   │   │       │   ├── looker-query-sql.md
│   │   │       │   ├── looker-query-url.md
│   │   │       │   ├── looker-query.md
│   │   │       │   ├── looker-run-dashboard.md
│   │   │       │   ├── looker-run-look.md
│   │   │       │   └── looker-update-project-file.md
│   │   │       ├── mindsdb
│   │   │       │   ├── _index.md
│   │   │       │   ├── mindsdb-execute-sql.md
│   │   │       │   └── mindsdb-sql.md
│   │   │       ├── mongodb
│   │   │       │   ├── _index.md
│   │   │       │   ├── mongodb-aggregate.md
│   │   │       │   ├── mongodb-delete-many.md
│   │   │       │   ├── mongodb-delete-one.md
│   │   │       │   ├── mongodb-find-one.md
│   │   │       │   ├── mongodb-find.md
│   │   │       │   ├── mongodb-insert-many.md
│   │   │       │   ├── mongodb-insert-one.md
│   │   │       │   ├── mongodb-update-many.md
│   │   │       │   └── mongodb-update-one.md
│   │   │       ├── mssql
│   │   │       │   ├── _index.md
│   │   │       │   ├── mssql-execute-sql.md
│   │   │       │   ├── mssql-list-tables.md
│   │   │       │   └── mssql-sql.md
│   │   │       ├── mysql
│   │   │       │   ├── _index.md
│   │   │       │   ├── mysql-execute-sql.md
│   │   │       │   ├── mysql-list-active-queries.md
│   │   │       │   ├── mysql-list-table-fragmentation.md
│   │   │       │   ├── mysql-list-tables-missing-unique-indexes.md
│   │   │       │   ├── mysql-list-tables.md
│   │   │       │   └── mysql-sql.md
│   │   │       ├── neo4j
│   │   │       │   ├── _index.md
│   │   │       │   ├── neo4j-cypher.md
│   │   │       │   ├── neo4j-execute-cypher.md
│   │   │       │   └── neo4j-schema.md
│   │   │       ├── oceanbase
│   │   │       │   ├── _index.md
│   │   │       │   ├── oceanbase-execute-sql.md
│   │   │       │   └── oceanbase-sql.md
│   │   │       ├── oracle
│   │   │       │   ├── _index.md
│   │   │       │   ├── oracle-execute-sql.md
│   │   │       │   └── oracle-sql.md
│   │   │       ├── postgres
│   │   │       │   ├── _index.md
│   │   │       │   ├── postgres-database-overview.md
│   │   │       │   ├── postgres-execute-sql.md
│   │   │       │   ├── postgres-get-column-cardinality.md
│   │   │       │   ├── postgres-list-active-queries.md
│   │   │       │   ├── postgres-list-available-extensions.md
│   │   │       │   ├── postgres-list-indexes.md
│   │   │       │   ├── postgres-list-installed-extensions.md
│   │   │       │   ├── postgres-list-locks.md
│   │   │       │   ├── postgres-list-query-stats.md
│   │   │       │   ├── postgres-list-schemas.md
│   │   │       │   ├── postgres-list-sequences.md
│   │   │       │   ├── postgres-list-tables.md
│   │   │       │   ├── postgres-list-triggers.md
│   │   │       │   ├── postgres-list-views.md
│   │   │       │   ├── postgres-long-running-transactions.md
│   │   │       │   ├── postgres-replication-stats.md
│   │   │       │   └── postgres-sql.md
│   │   │       ├── redis
│   │   │       │   ├── _index.md
│   │   │       │   └── redis.md
│   │   │       ├── serverless-spark
│   │   │       │   ├── _index.md
│   │   │       │   ├── serverless-spark-cancel-batch.md
│   │   │       │   ├── serverless-spark-get-batch.md
│   │   │       │   └── serverless-spark-list-batches.md
│   │   │       ├── singlestore
│   │   │       │   ├── _index.md
│   │   │       │   ├── singlestore-execute-sql.md
│   │   │       │   └── singlestore-sql.md
│   │   │       ├── spanner
│   │   │       │   ├── _index.md
│   │   │       │   ├── spanner-execute-sql.md
│   │   │       │   ├── spanner-list-graphs.md
│   │   │       │   ├── spanner-list-tables.md
│   │   │       │   └── spanner-sql.md
│   │   │       ├── sqlite
│   │   │       │   ├── _index.md
│   │   │       │   ├── sqlite-execute-sql.md
│   │   │       │   └── sqlite-sql.md
│   │   │       ├── tidb
│   │   │       │   ├── _index.md
│   │   │       │   ├── tidb-execute-sql.md
│   │   │       │   └── tidb-sql.md
│   │   │       ├── trino
│   │   │       │   ├── _index.md
│   │   │       │   ├── trino-execute-sql.md
│   │   │       │   └── trino-sql.md
│   │   │       ├── utility
│   │   │       │   ├── _index.md
│   │   │       │   └── wait.md
│   │   │       ├── valkey
│   │   │       │   ├── _index.md
│   │   │       │   └── valkey.md
│   │   │       └── yuagbytedb
│   │   │           ├── _index.md
│   │   │           └── yugabytedb-sql.md
│   │   ├── samples
│   │   │   ├── _index.md
│   │   │   ├── alloydb
│   │   │   │   ├── _index.md
│   │   │   │   ├── ai-nl
│   │   │   │   │   ├── alloydb_ai_nl.ipynb
│   │   │   │   │   └── index.md
│   │   │   │   └── mcp_quickstart.md
│   │   │   ├── bigquery
│   │   │   │   ├── _index.md
│   │   │   │   ├── colab_quickstart_bigquery.ipynb
│   │   │   │   ├── local_quickstart.md
│   │   │   │   └── mcp_quickstart
│   │   │   │       ├── _index.md
│   │   │   │       ├── inspector_tools.png
│   │   │   │       └── inspector.png
│   │   │   └── looker
│   │   │       ├── _index.md
│   │   │       ├── looker_gemini_oauth
│   │   │       │   ├── _index.md
│   │   │       │   ├── authenticated.png
│   │   │       │   ├── authorize.png
│   │   │       │   └── registration.png
│   │   │       ├── looker_gemini.md
│   │   │       └── looker_mcp_inspector
│   │   │           ├── _index.md
│   │   │           ├── inspector_tools.png
│   │   │           └── inspector.png
│   │   └── sdks
│   │       ├── _index.md
│   │       ├── go-sdk.md
│   │       ├── js-sdk.md
│   │       └── python-sdk.md
│   ├── LOOKER_README.md
│   ├── SPANNER_README.md
│   └── TOOLBOX_README.md
├── gemini-extension.json
├── go.mod
├── go.sum
├── internal
│   ├── auth
│   │   ├── auth.go
│   │   └── google
│   │       └── google.go
│   ├── log
│   │   ├── handler.go
│   │   ├── log_test.go
│   │   ├── log.go
│   │   └── logger.go
│   ├── prebuiltconfigs
│   │   ├── prebuiltconfigs_test.go
│   │   ├── prebuiltconfigs.go
│   │   └── tools
│   │       ├── alloydb-postgres-admin.yaml
│   │       ├── alloydb-postgres-observability.yaml
│   │       ├── alloydb-postgres.yaml
│   │       ├── bigquery.yaml
│   │       ├── clickhouse.yaml
│   │       ├── cloud-healthcare.yaml
│   │       ├── cloud-sql-mssql-admin.yaml
│   │       ├── cloud-sql-mssql-observability.yaml
│   │       ├── cloud-sql-mssql.yaml
│   │       ├── cloud-sql-mysql-admin.yaml
│   │       ├── cloud-sql-mysql-observability.yaml
│   │       ├── cloud-sql-mysql.yaml
│   │       ├── cloud-sql-postgres-admin.yaml
│   │       ├── cloud-sql-postgres-observability.yaml
│   │       ├── cloud-sql-postgres.yaml
│   │       ├── dataplex.yaml
│   │       ├── elasticsearch.yaml
│   │       ├── firestore.yaml
│   │       ├── looker-conversational-analytics.yaml
│   │       ├── looker.yaml
│   │       ├── mindsdb.yaml
│   │       ├── mssql.yaml
│   │       ├── mysql.yaml
│   │       ├── neo4j.yaml
│   │       ├── oceanbase.yaml
│   │       ├── postgres.yaml
│   │       ├── serverless-spark.yaml
│   │       ├── singlestore.yaml
│   │       ├── spanner-postgres.yaml
│   │       ├── spanner.yaml
│   │       └── sqlite.yaml
│   ├── prompts
│   │   ├── arguments_test.go
│   │   ├── arguments.go
│   │   ├── custom
│   │   │   ├── custom_test.go
│   │   │   └── custom.go
│   │   ├── messages_test.go
│   │   ├── messages.go
│   │   ├── prompts_test.go
│   │   ├── prompts.go
│   │   ├── promptsets_test.go
│   │   └── promptsets.go
│   ├── server
│   │   ├── api_test.go
│   │   ├── api.go
│   │   ├── common_test.go
│   │   ├── config.go
│   │   ├── mcp
│   │   │   ├── jsonrpc
│   │   │   │   ├── jsonrpc_test.go
│   │   │   │   └── jsonrpc.go
│   │   │   ├── mcp.go
│   │   │   ├── util
│   │   │   │   └── lifecycle.go
│   │   │   ├── v20241105
│   │   │   │   ├── method.go
│   │   │   │   └── types.go
│   │   │   ├── v20250326
│   │   │   │   ├── method.go
│   │   │   │   └── types.go
│   │   │   └── v20250618
│   │   │       ├── method.go
│   │   │       └── types.go
│   │   ├── mcp_test.go
│   │   ├── mcp.go
│   │   ├── server_test.go
│   │   ├── server.go
│   │   ├── static
│   │   │   ├── assets
│   │   │   │   └── mcptoolboxlogo.png
│   │   │   ├── css
│   │   │   │   └── style.css
│   │   │   ├── index.html
│   │   │   ├── js
│   │   │   │   ├── auth.js
│   │   │   │   ├── loadTools.js
│   │   │   │   ├── mainContent.js
│   │   │   │   ├── navbar.js
│   │   │   │   ├── runTool.js
│   │   │   │   ├── toolDisplay.js
│   │   │   │   ├── tools.js
│   │   │   │   └── toolsets.js
│   │   │   ├── tools.html
│   │   │   └── toolsets.html
│   │   ├── web_test.go
│   │   └── web.go
│   ├── sources
│   │   ├── alloydbadmin
│   │   │   ├── alloydbadmin_test.go
│   │   │   └── alloydbadmin.go
│   │   ├── alloydbpg
│   │   │   ├── alloydb_pg_test.go
│   │   │   └── alloydb_pg.go
│   │   ├── bigquery
│   │   │   ├── bigquery_test.go
│   │   │   ├── bigquery.go
│   │   │   └── cache.go
│   │   ├── bigtable
│   │   │   ├── bigtable_test.go
│   │   │   └── bigtable.go
│   │   ├── cassandra
│   │   │   ├── cassandra_test.go
│   │   │   └── cassandra.go
│   │   ├── clickhouse
│   │   │   ├── clickhouse_test.go
│   │   │   └── clickhouse.go
│   │   ├── cloudhealthcare
│   │   │   ├── cloud_healthcare_test.go
│   │   │   └── cloud_healthcare.go
│   │   ├── cloudmonitoring
│   │   │   ├── cloud_monitoring_test.go
│   │   │   └── cloud_monitoring.go
│   │   ├── cloudsqladmin
│   │   │   ├── cloud_sql_admin_test.go
│   │   │   └── cloud_sql_admin.go
│   │   ├── cloudsqlmssql
│   │   │   ├── cloud_sql_mssql_test.go
│   │   │   └── cloud_sql_mssql.go
│   │   ├── cloudsqlmysql
│   │   │   ├── cloud_sql_mysql_test.go
│   │   │   └── cloud_sql_mysql.go
│   │   ├── cloudsqlpg
│   │   │   ├── cloud_sql_pg_test.go
│   │   │   └── cloud_sql_pg.go
│   │   ├── couchbase
│   │   │   ├── couchbase_test.go
│   │   │   └── couchbase.go
│   │   ├── dataplex
│   │   │   ├── dataplex_test.go
│   │   │   └── dataplex.go
│   │   ├── dgraph
│   │   │   ├── dgraph_test.go
│   │   │   └── dgraph.go
│   │   ├── dialect.go
│   │   ├── elasticsearch
│   │   │   ├── elasticsearch_test.go
│   │   │   └── elasticsearch.go
│   │   ├── firebird
│   │   │   ├── firebird_test.go
│   │   │   └── firebird.go
│   │   ├── firestore
│   │   │   ├── firestore_test.go
│   │   │   └── firestore.go
│   │   ├── http
│   │   │   ├── http_test.go
│   │   │   └── http.go
│   │   ├── ip_type.go
│   │   ├── looker
│   │   │   ├── looker_test.go
│   │   │   └── looker.go
│   │   ├── mindsdb
│   │   │   ├── mindsdb_test.go
│   │   │   └── mindsdb.go
│   │   ├── mongodb
│   │   │   ├── mongodb_test.go
│   │   │   └── mongodb.go
│   │   ├── mssql
│   │   │   ├── mssql_test.go
│   │   │   └── mssql.go
│   │   ├── mysql
│   │   │   ├── mysql_test.go
│   │   │   └── mysql.go
│   │   ├── neo4j
│   │   │   ├── neo4j_test.go
│   │   │   └── neo4j.go
│   │   ├── oceanbase
│   │   │   ├── oceanbase_test.go
│   │   │   └── oceanbase.go
│   │   ├── oracle
│   │   │   └── oracle.go
│   │   ├── postgres
│   │   │   ├── postgres_test.go
│   │   │   └── postgres.go
│   │   ├── redis
│   │   │   ├── redis_test.go
│   │   │   └── redis.go
│   │   ├── serverlessspark
│   │   │   ├── serverlessspark_test.go
│   │   │   └── serverlessspark.go
│   │   ├── singlestore
│   │   │   ├── singlestore_test.go
│   │   │   └── singlestore.go
│   │   ├── sources.go
│   │   ├── spanner
│   │   │   ├── spanner_test.go
│   │   │   └── spanner.go
│   │   ├── sqlite
│   │   │   ├── sqlite_test.go
│   │   │   └── sqlite.go
│   │   ├── tidb
│   │   │   ├── tidb_test.go
│   │   │   └── tidb.go
│   │   ├── trino
│   │   │   ├── trino_test.go
│   │   │   └── trino.go
│   │   ├── util.go
│   │   ├── valkey
│   │   │   ├── valkey_test.go
│   │   │   └── valkey.go
│   │   └── yugabytedb
│   │       ├── yugabytedb_test.go
│   │       └── yugabytedb.go
│   ├── telemetry
│   │   ├── instrumentation.go
│   │   └── telemetry.go
│   ├── testutils
│   │   └── testutils.go
│   ├── tools
│   │   ├── alloydb
│   │   │   ├── alloydbcreatecluster
│   │   │   │   ├── alloydbcreatecluster_test.go
│   │   │   │   └── alloydbcreatecluster.go
│   │   │   ├── alloydbcreateinstance
│   │   │   │   ├── alloydbcreateinstance_test.go
│   │   │   │   └── alloydbcreateinstance.go
│   │   │   ├── alloydbcreateuser
│   │   │   │   ├── alloydbcreateuser_test.go
│   │   │   │   └── alloydbcreateuser.go
│   │   │   ├── alloydbgetcluster
│   │   │   │   ├── alloydbgetcluster_test.go
│   │   │   │   └── alloydbgetcluster.go
│   │   │   ├── alloydbgetinstance
│   │   │   │   ├── alloydbgetinstance_test.go
│   │   │   │   └── alloydbgetinstance.go
│   │   │   ├── alloydbgetuser
│   │   │   │   ├── alloydbgetuser_test.go
│   │   │   │   └── alloydbgetuser.go
│   │   │   ├── alloydblistclusters
│   │   │   │   ├── alloydblistclusters_test.go
│   │   │   │   └── alloydblistclusters.go
│   │   │   ├── alloydblistinstances
│   │   │   │   ├── alloydblistinstances_test.go
│   │   │   │   └── alloydblistinstances.go
│   │   │   ├── alloydblistusers
│   │   │   │   ├── alloydblistusers_test.go
│   │   │   │   └── alloydblistusers.go
│   │   │   └── alloydbwaitforoperation
│   │   │       ├── alloydbwaitforoperation_test.go
│   │   │       └── alloydbwaitforoperation.go
│   │   ├── alloydbainl
│   │   │   ├── alloydbainl_test.go
│   │   │   └── alloydbainl.go
│   │   ├── bigquery
│   │   │   ├── bigqueryanalyzecontribution
│   │   │   │   ├── bigqueryanalyzecontribution_test.go
│   │   │   │   └── bigqueryanalyzecontribution.go
│   │   │   ├── bigquerycommon
│   │   │   │   ├── table_name_parser_test.go
│   │   │   │   ├── table_name_parser.go
│   │   │   │   └── util.go
│   │   │   ├── bigqueryconversationalanalytics
│   │   │   │   ├── bigqueryconversationalanalytics_test.go
│   │   │   │   └── bigqueryconversationalanalytics.go
│   │   │   ├── bigqueryexecutesql
│   │   │   │   ├── bigqueryexecutesql_test.go
│   │   │   │   └── bigqueryexecutesql.go
│   │   │   ├── bigqueryforecast
│   │   │   │   ├── bigqueryforecast_test.go
│   │   │   │   └── bigqueryforecast.go
│   │   │   ├── bigquerygetdatasetinfo
│   │   │   │   ├── bigquerygetdatasetinfo_test.go
│   │   │   │   └── bigquerygetdatasetinfo.go
│   │   │   ├── bigquerygettableinfo
│   │   │   │   ├── bigquerygettableinfo_test.go
│   │   │   │   └── bigquerygettableinfo.go
│   │   │   ├── bigquerylistdatasetids
│   │   │   │   ├── bigquerylistdatasetids_test.go
│   │   │   │   └── bigquerylistdatasetids.go
│   │   │   ├── bigquerylisttableids
│   │   │   │   ├── bigquerylisttableids_test.go
│   │   │   │   └── bigquerylisttableids.go
│   │   │   ├── bigquerysearchcatalog
│   │   │   │   ├── bigquerysearchcatalog_test.go
│   │   │   │   └── bigquerysearchcatalog.go
│   │   │   └── bigquerysql
│   │   │       ├── bigquerysql_test.go
│   │   │       └── bigquerysql.go
│   │   ├── bigtable
│   │   │   ├── bigtable_test.go
│   │   │   └── bigtable.go
│   │   ├── cassandra
│   │   │   └── cassandracql
│   │   │       ├── cassandracql_test.go
│   │   │       └── cassandracql.go
│   │   ├── clickhouse
│   │   │   ├── clickhouseexecutesql
│   │   │   │   ├── clickhouseexecutesql_test.go
│   │   │   │   └── clickhouseexecutesql.go
│   │   │   ├── clickhouselistdatabases
│   │   │   │   ├── clickhouselistdatabases_test.go
│   │   │   │   └── clickhouselistdatabases.go
│   │   │   ├── clickhouselisttables
│   │   │   │   ├── clickhouselisttables_test.go
│   │   │   │   └── clickhouselisttables.go
│   │   │   └── clickhousesql
│   │   │       ├── clickhousesql_test.go
│   │   │       └── clickhousesql.go
│   │   ├── cloudhealthcare
│   │   │   ├── cloudhealthcarefhirfetchpage
│   │   │   │   ├── cloudhealthcarefhirfetchpage_test.go
│   │   │   │   └── cloudhealthcarefhirfetchpage.go
│   │   │   ├── cloudhealthcarefhirpatienteverything
│   │   │   │   ├── cloudhealthcarefhirpatienteverything_test.go
│   │   │   │   └── cloudhealthcarefhirpatienteverything.go
│   │   │   ├── cloudhealthcarefhirpatientsearch
│   │   │   │   ├── cloudhealthcarefhirpatientsearch_test.go
│   │   │   │   └── cloudhealthcarefhirpatientsearch.go
│   │   │   ├── cloudhealthcaregetdataset
│   │   │   │   ├── cloudhealthcaregetdataset_test.go
│   │   │   │   └── cloudhealthcaregetdataset.go
│   │   │   ├── cloudhealthcaregetdicomstore
│   │   │   │   ├── cloudhealthcaregetdicomstore_test.go
│   │   │   │   └── cloudhealthcaregetdicomstore.go
│   │   │   ├── cloudhealthcaregetdicomstoremetrics
│   │   │   │   ├── cloudhealthcaregetdicomstoremetrics_test.go
│   │   │   │   └── cloudhealthcaregetdicomstoremetrics.go
│   │   │   ├── cloudhealthcaregetfhirresource
│   │   │   │   ├── cloudhealthcaregetfhirresource_test.go
│   │   │   │   └── cloudhealthcaregetfhirresource.go
│   │   │   ├── cloudhealthcaregetfhirstore
│   │   │   │   ├── cloudhealthcaregetfhirstore_test.go
│   │   │   │   └── cloudhealthcaregetfhirstore.go
│   │   │   ├── cloudhealthcaregetfhirstoremetrics
│   │   │   │   ├── cloudhealthcaregetfhirstoremetrics_test.go
│   │   │   │   └── cloudhealthcaregetfhirstoremetrics.go
│   │   │   ├── cloudhealthcarelistdicomstores
│   │   │   │   ├── cloudhealthcarelistdicomstores_test.go
│   │   │   │   └── cloudhealthcarelistdicomstores.go
│   │   │   ├── cloudhealthcarelistfhirstores
│   │   │   │   ├── cloudhealthcarelistfhirstores_test.go
│   │   │   │   └── cloudhealthcarelistfhirstores.go
│   │   │   ├── cloudhealthcareretrieverendereddicominstance
│   │   │   │   ├── cloudhealthcareretrieverendereddicominstance_test.go
│   │   │   │   └── cloudhealthcareretrieverendereddicominstance.go
│   │   │   ├── cloudhealthcaresearchdicominstances
│   │   │   │   ├── cloudhealthcaresearchdicominstances_test.go
│   │   │   │   └── cloudhealthcaresearchdicominstances.go
│   │   │   ├── cloudhealthcaresearchdicomseries
│   │   │   │   ├── cloudhealthcaresearchdicomseries_test.go
│   │   │   │   └── cloudhealthcaresearchdicomseries.go
│   │   │   ├── cloudhealthcaresearchdicomstudies
│   │   │   │   ├── cloudhealthcaresearchdicomstudies_test.go
│   │   │   │   └── cloudhealthcaresearchdicomstudies.go
│   │   │   └── common
│   │   │       └── util.go
│   │   ├── cloudmonitoring
│   │   │   ├── cloudmonitoring_test.go
│   │   │   └── cloudmonitoring.go
│   │   ├── cloudsql
│   │   │   ├── cloudsqlcreatedatabase
│   │   │   │   ├── cloudsqlcreatedatabase_test.go
│   │   │   │   └── cloudsqlcreatedatabase.go
│   │   │   ├── cloudsqlcreateusers
│   │   │   │   ├── cloudsqlcreateusers_test.go
│   │   │   │   └── cloudsqlcreateusers.go
│   │   │   ├── cloudsqlgetinstances
│   │   │   │   ├── cloudsqlgetinstances_test.go
│   │   │   │   └── cloudsqlgetinstances.go
│   │   │   ├── cloudsqllistdatabases
│   │   │   │   ├── cloudsqllistdatabases_test.go
│   │   │   │   └── cloudsqllistdatabases.go
│   │   │   ├── cloudsqllistinstances
│   │   │   │   ├── cloudsqllistinstances_test.go
│   │   │   │   └── cloudsqllistinstances.go
│   │   │   └── cloudsqlwaitforoperation
│   │   │       ├── cloudsqlwaitforoperation_test.go
│   │   │       └── cloudsqlwaitforoperation.go
│   │   ├── cloudsqlmssql
│   │   │   └── cloudsqlmssqlcreateinstance
│   │   │       ├── cloudsqlmssqlcreateinstance_test.go
│   │   │       └── cloudsqlmssqlcreateinstance.go
│   │   ├── cloudsqlmysql
│   │   │   └── cloudsqlmysqlcreateinstance
│   │   │       ├── cloudsqlmysqlcreateinstance_test.go
│   │   │       └── cloudsqlmysqlcreateinstance.go
│   │   ├── cloudsqlpg
│   │   │   ├── cloudsqlpgcreateinstances
│   │   │   │   ├── cloudsqlpgcreateinstances_test.go
│   │   │   │   └── cloudsqlpgcreateinstances.go
│   │   │   └── cloudsqlpgupgradeprecheck
│   │   │       ├── cloudsqlpgupgradeprecheck_test.go
│   │   │       └── cloudsqlpgupgradeprecheck.go
│   │   ├── couchbase
│   │   │   ├── couchbase_test.go
│   │   │   └── couchbase.go
│   │   ├── dataform
│   │   │   └── dataformcompilelocal
│   │   │       ├── dataformcompilelocal_test.go
│   │   │       └── dataformcompilelocal.go
│   │   ├── dataplex
│   │   │   ├── dataplexlookupentry
│   │   │   │   ├── dataplexlookupentry_test.go
│   │   │   │   └── dataplexlookupentry.go
│   │   │   ├── dataplexsearchaspecttypes
│   │   │   │   ├── dataplexsearchaspecttypes_test.go
│   │   │   │   └── dataplexsearchaspecttypes.go
│   │   │   └── dataplexsearchentries
│   │   │       ├── dataplexsearchentries_test.go
│   │   │       └── dataplexsearchentries.go
│   │   ├── dgraph
│   │   │   ├── dgraph_test.go
│   │   │   └── dgraph.go
│   │   ├── elasticsearch
│   │   │   └── elasticsearchesql
│   │   │       ├── elasticsearchesql_test.go
│   │   │       └── elasticsearchesql.go
│   │   ├── firebird
│   │   │   ├── firebirdexecutesql
│   │   │   │   ├── firebirdexecutesql_test.go
│   │   │   │   └── firebirdexecutesql.go
│   │   │   └── firebirdsql
│   │   │       ├── firebirdsql_test.go
│   │   │       └── firebirdsql.go
│   │   ├── firestore
│   │   │   ├── firestoreadddocuments
│   │   │   │   ├── firestoreadddocuments_test.go
│   │   │   │   └── firestoreadddocuments.go
│   │   │   ├── firestoredeletedocuments
│   │   │   │   ├── firestoredeletedocuments_test.go
│   │   │   │   └── firestoredeletedocuments.go
│   │   │   ├── firestoregetdocuments
│   │   │   │   ├── firestoregetdocuments_test.go
│   │   │   │   └── firestoregetdocuments.go
│   │   │   ├── firestoregetrules
│   │   │   │   ├── firestoregetrules_test.go
│   │   │   │   └── firestoregetrules.go
│   │   │   ├── firestorelistcollections
│   │   │   │   ├── firestorelistcollections_test.go
│   │   │   │   └── firestorelistcollections.go
│   │   │   ├── firestorequery
│   │   │   │   ├── firestorequery_test.go
│   │   │   │   └── firestorequery.go
│   │   │   ├── firestorequerycollection
│   │   │   │   ├── firestorequerycollection_test.go
│   │   │   │   └── firestorequerycollection.go
│   │   │   ├── firestoreupdatedocument
│   │   │   │   ├── firestoreupdatedocument_test.go
│   │   │   │   └── firestoreupdatedocument.go
│   │   │   ├── firestorevalidaterules
│   │   │   │   ├── firestorevalidaterules_test.go
│   │   │   │   └── firestorevalidaterules.go
│   │   │   └── util
│   │   │       ├── converter_test.go
│   │   │       ├── converter.go
│   │   │       ├── validator_test.go
│   │   │       └── validator.go
│   │   ├── http
│   │   │   ├── http_test.go
│   │   │   └── http.go
│   │   ├── http_method.go
│   │   ├── looker
│   │   │   ├── lookeradddashboardelement
│   │   │   │   ├── lookeradddashboardelement_test.go
│   │   │   │   └── lookeradddashboardelement.go
│   │   │   ├── lookercommon
│   │   │   │   ├── lookercommon_test.go
│   │   │   │   └── lookercommon.go
│   │   │   ├── lookerconversationalanalytics
│   │   │   │   ├── lookerconversationalanalytics_test.go
│   │   │   │   └── lookerconversationalanalytics.go
│   │   │   ├── lookercreateprojectfile
│   │   │   │   ├── lookercreateprojectfile_test.go
│   │   │   │   └── lookercreateprojectfile.go
│   │   │   ├── lookerdeleteprojectfile
│   │   │   │   ├── lookerdeleteprojectfile_test.go
│   │   │   │   └── lookerdeleteprojectfile.go
│   │   │   ├── lookerdevmode
│   │   │   │   ├── lookerdevmode_test.go
│   │   │   │   └── lookerdevmode.go
│   │   │   ├── lookergenerateembedurl
│   │   │   │   ├── lookergenerateembedurl_test.go
│   │   │   │   └── lookergenerateembedurl.go
│   │   │   ├── lookergetconnectiondatabases
│   │   │   │   ├── lookergetconnectiondatabases_test.go
│   │   │   │   └── lookergetconnectiondatabases.go
│   │   │   ├── lookergetconnections
│   │   │   │   ├── lookergetconnections_test.go
│   │   │   │   └── lookergetconnections.go
│   │   │   ├── lookergetconnectionschemas
│   │   │   │   ├── lookergetconnectionschemas_test.go
│   │   │   │   └── lookergetconnectionschemas.go
│   │   │   ├── lookergetconnectiontablecolumns
│   │   │   │   ├── lookergetconnectiontablecolumns_test.go
│   │   │   │   └── lookergetconnectiontablecolumns.go
│   │   │   ├── lookergetconnectiontables
│   │   │   │   ├── lookergetconnectiontables_test.go
│   │   │   │   └── lookergetconnectiontables.go
│   │   │   ├── lookergetdashboards
│   │   │   │   ├── lookergetdashboards_test.go
│   │   │   │   └── lookergetdashboards.go
│   │   │   ├── lookergetdimensions
│   │   │   │   ├── lookergetdimensions_test.go
│   │   │   │   └── lookergetdimensions.go
│   │   │   ├── lookergetexplores
│   │   │   │   ├── lookergetexplores_test.go
│   │   │   │   └── lookergetexplores.go
│   │   │   ├── lookergetfilters
│   │   │   │   ├── lookergetfilters_test.go
│   │   │   │   └── lookergetfilters.go
│   │   │   ├── lookergetlooks
│   │   │   │   ├── lookergetlooks_test.go
│   │   │   │   └── lookergetlooks.go
│   │   │   ├── lookergetmeasures
│   │   │   │   ├── lookergetmeasures_test.go
│   │   │   │   └── lookergetmeasures.go
│   │   │   ├── lookergetmodels
│   │   │   │   ├── lookergetmodels_test.go
│   │   │   │   └── lookergetmodels.go
│   │   │   ├── lookergetparameters
│   │   │   │   ├── lookergetparameters_test.go
│   │   │   │   └── lookergetparameters.go
│   │   │   ├── lookergetprojectfile
│   │   │   │   ├── lookergetprojectfile_test.go
│   │   │   │   └── lookergetprojectfile.go
│   │   │   ├── lookergetprojectfiles
│   │   │   │   ├── lookergetprojectfiles_test.go
│   │   │   │   └── lookergetprojectfiles.go
│   │   │   ├── lookergetprojects
│   │   │   │   ├── lookergetprojects_test.go
│   │   │   │   └── lookergetprojects.go
│   │   │   ├── lookerhealthanalyze
│   │   │   │   ├── lookerhealthanalyze_test.go
│   │   │   │   └── lookerhealthanalyze.go
│   │   │   ├── lookerhealthpulse
│   │   │   │   ├── lookerhealthpulse_test.go
│   │   │   │   └── lookerhealthpulse.go
│   │   │   ├── lookerhealthvacuum
│   │   │   │   ├── lookerhealthvacuum_test.go
│   │   │   │   └── lookerhealthvacuum.go
│   │   │   ├── lookermakedashboard
│   │   │   │   ├── lookermakedashboard_test.go
│   │   │   │   └── lookermakedashboard.go
│   │   │   ├── lookermakelook
│   │   │   │   ├── lookermakelook_test.go
│   │   │   │   └── lookermakelook.go
│   │   │   ├── lookerquery
│   │   │   │   ├── lookerquery_test.go
│   │   │   │   └── lookerquery.go
│   │   │   ├── lookerquerysql
│   │   │   │   ├── lookerquerysql_test.go
│   │   │   │   └── lookerquerysql.go
│   │   │   ├── lookerqueryurl
│   │   │   │   ├── lookerqueryurl_test.go
│   │   │   │   └── lookerqueryurl.go
│   │   │   ├── lookerrundashboard
│   │   │   │   ├── lookerrundashboard_test.go
│   │   │   │   └── lookerrundashboard.go
│   │   │   ├── lookerrunlook
│   │   │   │   ├── lookerrunlook_test.go
│   │   │   │   └── lookerrunlook.go
│   │   │   └── lookerupdateprojectfile
│   │   │       ├── lookerupdateprojectfile_test.go
│   │   │       └── lookerupdateprojectfile.go
│   │   ├── mindsdb
│   │   │   ├── mindsdbexecutesql
│   │   │   │   ├── mindsdbexecutesql_test.go
│   │   │   │   └── mindsdbexecutesql.go
│   │   │   └── mindsdbsql
│   │   │       ├── mindsdbsql_test.go
│   │   │       └── mindsdbsql.go
│   │   ├── mongodb
│   │   │   ├── mongodbaggregate
│   │   │   │   ├── mongodbaggregate_test.go
│   │   │   │   └── mongodbaggregate.go
│   │   │   ├── mongodbdeletemany
│   │   │   │   ├── mongodbdeletemany_test.go
│   │   │   │   └── mongodbdeletemany.go
│   │   │   ├── mongodbdeleteone
│   │   │   │   ├── mongodbdeleteone_test.go
│   │   │   │   └── mongodbdeleteone.go
│   │   │   ├── mongodbfind
│   │   │   │   ├── mongodbfind_test.go
│   │   │   │   └── mongodbfind.go
│   │   │   ├── mongodbfindone
│   │   │   │   ├── mongodbfindone_test.go
│   │   │   │   └── mongodbfindone.go
│   │   │   ├── mongodbinsertmany
│   │   │   │   ├── mongodbinsertmany_test.go
│   │   │   │   └── mongodbinsertmany.go
│   │   │   ├── mongodbinsertone
│   │   │   │   ├── mongodbinsertone_test.go
│   │   │   │   └── mongodbinsertone.go
│   │   │   ├── mongodbupdatemany
│   │   │   │   ├── mongodbupdatemany_test.go
│   │   │   │   └── mongodbupdatemany.go
│   │   │   └── mongodbupdateone
│   │   │       ├── mongodbupdateone_test.go
│   │   │       └── mongodbupdateone.go
│   │   ├── mssql
│   │   │   ├── mssqlexecutesql
│   │   │   │   ├── mssqlexecutesql_test.go
│   │   │   │   └── mssqlexecutesql.go
│   │   │   ├── mssqllisttables
│   │   │   │   ├── mssqllisttables_test.go
│   │   │   │   └── mssqllisttables.go
│   │   │   └── mssqlsql
│   │   │       ├── mssqlsql_test.go
│   │   │       └── mssqlsql.go
│   │   ├── mysql
│   │   │   ├── mysqlcommon
│   │   │   │   └── mysqlcommon.go
│   │   │   ├── mysqlexecutesql
│   │   │   │   ├── mysqlexecutesql_test.go
│   │   │   │   └── mysqlexecutesql.go
│   │   │   ├── mysqllistactivequeries
│   │   │   │   ├── mysqllistactivequeries_test.go
│   │   │   │   └── mysqllistactivequeries.go
│   │   │   ├── mysqllisttablefragmentation
│   │   │   │   ├── mysqllisttablefragmentation_test.go
│   │   │   │   └── mysqllisttablefragmentation.go
│   │   │   ├── mysqllisttables
│   │   │   │   ├── mysqllisttables_test.go
│   │   │   │   └── mysqllisttables.go
│   │   │   ├── mysqllisttablesmissinguniqueindexes
│   │   │   │   ├── mysqllisttablesmissinguniqueindexes_test.go
│   │   │   │   └── mysqllisttablesmissinguniqueindexes.go
│   │   │   └── mysqlsql
│   │   │       ├── mysqlsql_test.go
│   │   │       └── mysqlsql.go
│   │   ├── neo4j
│   │   │   ├── neo4jcypher
│   │   │   │   ├── neo4jcypher_test.go
│   │   │   │   └── neo4jcypher.go
│   │   │   ├── neo4jexecutecypher
│   │   │   │   ├── classifier
│   │   │   │   │   ├── classifier_test.go
│   │   │   │   │   └── classifier.go
│   │   │   │   ├── neo4jexecutecypher_test.go
│   │   │   │   └── neo4jexecutecypher.go
│   │   │   └── neo4jschema
│   │   │       ├── cache
│   │   │       │   ├── cache_test.go
│   │   │       │   └── cache.go
│   │   │       ├── helpers
│   │   │       │   ├── helpers_test.go
│   │   │       │   └── helpers.go
│   │   │       ├── neo4jschema_test.go
│   │   │       ├── neo4jschema.go
│   │   │       └── types
│   │   │           └── types.go
│   │   ├── oceanbase
│   │   │   ├── oceanbaseexecutesql
│   │   │   │   ├── oceanbaseexecutesql_test.go
│   │   │   │   └── oceanbaseexecutesql.go
│   │   │   └── oceanbasesql
│   │   │       ├── oceanbasesql_test.go
│   │   │       └── oceanbasesql.go
│   │   ├── oracle
│   │   │   ├── oracleexecutesql
│   │   │   │   └── oracleexecutesql.go
│   │   │   └── oraclesql
│   │   │       └── oraclesql.go
│   │   ├── postgres
│   │   │   ├── postgresdatabaseoverview
│   │   │   │   ├── postgresdatabaseoverview_test.go
│   │   │   │   └── postgresdatabaseoverview.go
│   │   │   ├── postgresexecutesql
│   │   │   │   ├── postgresexecutesql_test.go
│   │   │   │   └── postgresexecutesql.go
│   │   │   ├── postgresgetcolumncardinality
│   │   │   │   ├── postgresgetcolumncardinality_test.go
│   │   │   │   └── postgresgetcolumncardinality.go
│   │   │   ├── postgreslistactivequeries
│   │   │   │   ├── postgreslistactivequeries_test.go
│   │   │   │   └── postgreslistactivequeries.go
│   │   │   ├── postgreslistavailableextensions
│   │   │   │   ├── postgreslistavailableextensions_test.go
│   │   │   │   └── postgreslistavailableextensions.go
│   │   │   ├── postgreslistindexes
│   │   │   │   ├── postgreslistindexes_test.go
│   │   │   │   └── postgreslistindexes.go
│   │   │   ├── postgreslistinstalledextensions
│   │   │   │   ├── postgreslistinstalledextensions_test.go
│   │   │   │   └── postgreslistinstalledextensions.go
│   │   │   ├── postgreslistlocks
│   │   │   │   ├── postgreslistlocks_test.go
│   │   │   │   └── postgreslistlocks.go
│   │   │   ├── postgreslistquerystats
│   │   │   │   ├── postgreslistquerystats_test.go
│   │   │   │   └── postgreslistquerystats.go
│   │   │   ├── postgreslistschemas
│   │   │   │   ├── postgreslistschemas_test.go
│   │   │   │   └── postgreslistschemas.go
│   │   │   ├── postgreslistsequences
│   │   │   │   ├── postgreslistsequences_test.go
│   │   │   │   └── postgreslistsequences.go
│   │   │   ├── postgreslisttables
│   │   │   │   ├── postgreslisttables_test.go
│   │   │   │   └── postgreslisttables.go
│   │   │   ├── postgreslisttriggers
│   │   │   │   ├── postgreslisttriggers_test.go
│   │   │   │   └── postgreslisttriggers.go
│   │   │   ├── postgreslistviews
│   │   │   │   ├── postgreslistviews_test.go
│   │   │   │   └── postgreslistviews.go
│   │   │   ├── postgreslongrunningtransactions
│   │   │   │   ├── postgreslongrunningtransactions_test.go
│   │   │   │   └── postgreslongrunningtransactions.go
│   │   │   ├── postgresreplicationstats
│   │   │   │   ├── postgresreplicationstats_test.go
│   │   │   │   └── postgresreplicationstats.go
│   │   │   └── postgressql
│   │   │       ├── postgressql_test.go
│   │   │       └── postgressql.go
│   │   ├── redis
│   │   │   ├── redis_test.go
│   │   │   └── redis.go
│   │   ├── serverlessspark
│   │   │   ├── serverlesssparkcancelbatch
│   │   │   │   ├── serverlesssparkcancelbatch_test.go
│   │   │   │   └── serverlesssparkcancelbatch.go
│   │   │   ├── serverlesssparkgetbatch
│   │   │   │   ├── serverlesssparkgetbatch_test.go
│   │   │   │   └── serverlesssparkgetbatch.go
│   │   │   └── serverlesssparklistbatches
│   │   │       ├── serverlesssparklistbatches_test.go
│   │   │       └── serverlesssparklistbatches.go
│   │   ├── singlestore
│   │   │   ├── singlestoreexecutesql
│   │   │   │   ├── singlestoreexecutesql_test.go
│   │   │   │   └── singlestoreexecutesql.go
│   │   │   └── singlestoresql
│   │   │       ├── singlestoresql_test.go
│   │   │       └── singlestoresql.go
│   │   ├── spanner
│   │   │   ├── spannerexecutesql
│   │   │   │   ├── spannerexecutesql_test.go
│   │   │   │   └── spannerexecutesql.go
│   │   │   ├── spannerlistgraphs
│   │   │   │   ├── spannerlistgraphs_test.go
│   │   │   │   └── spannerlistgraphs.go
│   │   │   ├── spannerlisttables
│   │   │   │   ├── spannerlisttables_test.go
│   │   │   │   └── spannerlisttables.go
│   │   │   └── spannersql
│   │   │       ├── spanner_test.go
│   │   │       └── spannersql.go
│   │   ├── sqlite
│   │   │   ├── sqliteexecutesql
│   │   │   │   ├── sqliteexecutesql_test.go
│   │   │   │   └── sqliteexecutesql.go
│   │   │   └── sqlitesql
│   │   │       ├── sqlitesql_test.go
│   │   │       └── sqlitesql.go
│   │   ├── tidb
│   │   │   ├── tidbexecutesql
│   │   │   │   ├── tidbexecutesql_test.go
│   │   │   │   └── tidbexecutesql.go
│   │   │   └── tidbsql
│   │   │       ├── tidbsql_test.go
│   │   │       └── tidbsql.go
│   │   ├── tools_test.go
│   │   ├── tools.go
│   │   ├── toolsets.go
│   │   ├── trino
│   │   │   ├── trinoexecutesql
│   │   │   │   ├── trinoexecutesql_test.go
│   │   │   │   └── trinoexecutesql.go
│   │   │   └── trinosql
│   │   │       ├── trinosql_test.go
│   │   │       └── trinosql.go
│   │   ├── utility
│   │   │   └── wait
│   │   │       ├── wait_test.go
│   │   │       └── wait.go
│   │   ├── valkey
│   │   │   ├── valkey_test.go
│   │   │   └── valkey.go
│   │   └── yugabytedbsql
│   │       ├── yugabytedbsql_test.go
│   │       └── yugabytedbsql.go
│   └── util
│       ├── orderedmap
│       │   ├── orderedmap_test.go
│       │   └── orderedmap.go
│       ├── parameters
│       │   ├── common_test.go
│       │   ├── common.go
│       │   ├── parameters_test.go
│       │   └── parameters.go
│       └── util.go
├── LICENSE
├── logo.png
├── main.go
├── MCP-TOOLBOX-EXTENSION.md
├── README.md
├── server.json
└── tests
    ├── alloydb
    │   ├── alloydb_integration_test.go
    │   └── alloydb_wait_for_operation_test.go
    ├── alloydbainl
    │   └── alloydb_ai_nl_integration_test.go
    ├── alloydbpg
    │   └── alloydb_pg_integration_test.go
    ├── auth.go
    ├── bigquery
    │   └── bigquery_integration_test.go
    ├── bigtable
    │   └── bigtable_integration_test.go
    ├── cassandra
    │   └── cassandra_integration_test.go
    ├── clickhouse
    │   └── clickhouse_integration_test.go
    ├── cloudhealthcare
    │   └── cloud_healthcare_integration_test.go
    ├── cloudmonitoring
    │   └── cloud_monitoring_integration_test.go
    ├── cloudsql
    │   ├── cloud_sql_create_database_test.go
    │   ├── cloud_sql_create_users_test.go
    │   ├── cloud_sql_get_instances_test.go
    │   ├── cloud_sql_list_databases_test.go
    │   ├── cloudsql_list_instances_test.go
    │   └── cloudsql_wait_for_operation_test.go
    ├── cloudsqlmssql
    │   ├── cloud_sql_mssql_create_instance_integration_test.go
    │   └── cloud_sql_mssql_integration_test.go
    ├── cloudsqlmysql
    │   ├── cloud_sql_mysql_create_instance_integration_test.go
    │   └── cloud_sql_mysql_integration_test.go
    ├── cloudsqlpg
    │   ├── cloud_sql_pg_create_instances_test.go
    │   ├── cloud_sql_pg_integration_test.go
    │   └── cloud_sql_pg_upgrade_precheck_test.go
    ├── common.go
    ├── couchbase
    │   └── couchbase_integration_test.go
    ├── dataform
    │   └── dataform_integration_test.go
    ├── dataplex
    │   └── dataplex_integration_test.go
    ├── dgraph
    │   └── dgraph_integration_test.go
    ├── elasticsearch
    │   └── elasticsearch_integration_test.go
    ├── firebird
    │   └── firebird_integration_test.go
    ├── firestore
    │   └── firestore_integration_test.go
    ├── http
    │   └── http_integration_test.go
    ├── looker
    │   └── looker_integration_test.go
    ├── mindsdb
    │   └── mindsdb_integration_test.go
    ├── mongodb
    │   └── mongodb_integration_test.go
    ├── mssql
    │   └── mssql_integration_test.go
    ├── mysql
    │   └── mysql_integration_test.go
    ├── neo4j
    │   └── neo4j_integration_test.go
    ├── oceanbase
    │   └── oceanbase_integration_test.go
    ├── option.go
    ├── oracle
    │   └── oracle_integration_test.go
    ├── postgres
    │   └── postgres_integration_test.go
    ├── prompts
    │   └── custom
    │       └── prompts_integration_test.go
    ├── redis
    │   └── redis_test.go
    ├── server.go
    ├── serverlessspark
    │   └── serverless_spark_integration_test.go
    ├── singlestore
    │   └── singlestore_integration_test.go
    ├── source.go
    ├── spanner
    │   └── spanner_integration_test.go
    ├── sqlite
    │   └── sqlite_integration_test.go
    ├── tidb
    │   └── tidb_integration_test.go
    ├── tool.go
    ├── trino
    │   └── trino_integration_test.go
    ├── utility
    │   └── wait_integration_test.go
    ├── valkey
    │   └── valkey_test.go
    └── yugabytedb
        └── yugabytedb_integration_test.go
```

# Files

--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-delete-project-file.md:
--------------------------------------------------------------------------------

```markdown
---
title: "looker-delete-project-file"
type: docs
weight: 1
description: >
  A "looker-delete-project-file" tool deletes a LookML file in a project.
aliases:
- /resources/tools/looker-delete-project-file
---

## About

A `looker-delete-project-file` tool deletes a LookML file in a project.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-delete-project-file` accepts a `project_id` parameter and a `file_path` parameter.
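For illustration, an invocation of this tool might supply parameters shaped like the following (the project ID and file path shown are hypothetical; the exact request envelope depends on the client):

```json
{
  "project_id": "my_lookml_project",
  "file_path": "views/users.view.lkml"
}
```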

## Example

```yaml
tools:
    delete_project_file:
        kind: looker-delete-project-file
        source: looker-source
        description: |
          delete_project_file Tool

          Given a project_id and a file path within the project, this tool will delete
          the file from the project.

          This tool must be called after the dev_mode tool has changed the session to
          dev mode.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "looker-delete-project-file".              |
| source      |  string  |     true     | Name of the source Looker instance.                |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

```

--------------------------------------------------------------------------------
/internal/prebuiltconfigs/tools/clickhouse.yaml:
--------------------------------------------------------------------------------

```yaml
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
sources:
  clickhouse-source:
    kind: clickhouse
    host: ${CLICKHOUSE_HOST}
    port: ${CLICKHOUSE_PORT}
    user: ${CLICKHOUSE_USER}
    password: ${CLICKHOUSE_PASSWORD}
    database: ${CLICKHOUSE_DATABASE}
    protocol: ${CLICKHOUSE_PROTOCOL}

tools:
  execute_sql:
    kind: clickhouse-execute-sql
    source: clickhouse-source
    description: Use this tool to execute SQL.

  list_databases:
    kind: clickhouse-list-databases
    source: clickhouse-source
    description: Use this tool to list all databases in ClickHouse.

  list_tables:
    kind: clickhouse-list-tables
    source: clickhouse-source
    description: Use this tool to list all tables in a specific ClickHouse database.

toolsets:
  clickhouse_database_tools:
    - execute_sql
    - list_databases
    - list_tables

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/firestore/firestore-get-documents.md:
--------------------------------------------------------------------------------

```markdown
---
title: "firestore-get-documents"
type: docs
weight: 1
description: >
  A "firestore-get-documents" tool retrieves multiple documents from Firestore by their paths.
aliases:
- /resources/tools/firestore-get-documents
---

## About

A `firestore-get-documents` tool retrieves multiple documents from Firestore by
their paths.
It's compatible with the following sources:

- [firestore](../../sources/firestore.md)

`firestore-get-documents` takes one input parameter `documentPaths` which is an
array of document paths, and returns the documents' data along with metadata
such as existence status, creation time, update time, and read time.
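For illustration, the `documentPaths` input might look like the following (the paths shown are hypothetical; the exact request envelope depends on the client):

```json
{
  "documentPaths": [
    "users/user123",
    "users/user123/orders/order456"
  ]
}
```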

## Example

```yaml
tools:
  get_user_documents:
    kind: firestore-get-documents
    source: my-firestore-source
    description: Use this tool to retrieve multiple documents from Firestore.
```

## Reference

| **field**   |    **type**    | **required** | **description**                                            |
|-------------|:--------------:|:------------:|------------------------------------------------------------|
| kind        |     string     |     true     | Must be "firestore-get-documents".                         |
| source      |     string     |     true     | Name of the Firestore source to retrieve documents from.   |
| description |     string     |     true     | Description of the tool that is passed to the LLM.         |

```

--------------------------------------------------------------------------------
/tests/auth.go:
--------------------------------------------------------------------------------

```go
// Copyright 2024 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//	http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package tests

import (
	"context"
	"os"
	"os/exec"
	"strings"

	"google.golang.org/api/idtoken"
)

var ServiceAccountEmail = os.Getenv("SERVICE_ACCOUNT_EMAIL")
var ClientId = os.Getenv("CLIENT_ID")

// GetGoogleIdToken retrieves and returns a Google ID token for the given audience
func GetGoogleIdToken(audience string) (string, error) {
	// For local testing - use gcloud command to print personal ID token
	cmd := exec.Command("gcloud", "auth", "print-identity-token")
	output, err := cmd.Output()
	if err == nil {
		return strings.TrimSpace(string(output)), nil
	}
	// For Cloud Build testing - retrieve ID token from GCE metadata server
	ts, err := idtoken.NewTokenSource(context.Background(), ClientId)
	if err != nil {
		return "", err
	}
	token, err := ts.Token()
	if err != nil {
		return "", err
	}
	// The idtoken package returns the ID token in the AccessToken field.
	return token.AccessToken, nil
}

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/firestore/firestore-delete-documents.md:
--------------------------------------------------------------------------------

```markdown
---
title: "firestore-delete-documents"
type: docs
weight: 1
description: >
  A "firestore-delete-documents" tool deletes multiple documents from Firestore by their paths.
aliases:
- /resources/tools/firestore-delete-documents
---

## About

A `firestore-delete-documents` tool deletes multiple documents from Firestore by
their paths.
It's compatible with the following sources:

- [firestore](../../sources/firestore.md)

`firestore-delete-documents` takes one input parameter `documentPaths` which is
an array of document paths to delete. The tool uses Firestore's BulkWriter for
efficient batch deletion and returns the success status for each document.
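For illustration, the `documentPaths` input might look like the following (the paths shown are hypothetical; the exact request envelope depends on the client):

```json
{
  "documentPaths": [
    "users/user123",
    "sessions/session456"
  ]
}
```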

## Example

```yaml
tools:
  delete_user_documents:
    kind: firestore-delete-documents
    source: my-firestore-source
    description: Use this tool to delete multiple documents from Firestore.
```

## Reference

| **field**   |     **type**   | **required** | **description**                                          |
|-------------|:--------------:|:------------:|----------------------------------------------------------|
| kind        |     string     |     true     | Must be "firestore-delete-documents".                    |
| source      |     string     |     true     | Name of the Firestore source to delete documents from.   |
| description |     string     |     true     | Description of the tool that is passed to the LLM.       |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/cloudhealthcare/cloud-healthcare-get-dataset.md:
--------------------------------------------------------------------------------

```markdown
---
title: "cloud-healthcare-get-dataset"
type: docs
weight: 1
description: >
  A "cloud-healthcare-get-dataset" tool retrieves metadata for the Healthcare dataset in the source.
aliases:
- /resources/tools/cloud-healthcare-get-dataset
---

## About

A `cloud-healthcare-get-dataset` tool retrieves metadata for a Healthcare dataset.
It's compatible with the following sources:

- [cloud-healthcare](../../sources/cloud-healthcare.md)

`cloud-healthcare-get-dataset` returns the metadata of the healthcare dataset
configured in the source. It takes no extra parameters.

## Example

```yaml
tools:
  get_dataset:
    kind: cloud-healthcare-get-dataset
    source: my-healthcare-source
    description: Use this tool to get healthcare dataset metadata.
```

## Reference

| **field**   |                  **type**                  | **required** | **description**                                    |
|-------------|:------------------------------------------:|:------------:|----------------------------------------------------|
| kind        |                   string                   |     true     | Must be "cloud-healthcare-get-dataset".            |
| source      |                   string                   |     true     | Name of the healthcare source.                     |
| description |                   string                   |     true     | Description of the tool that is passed to the LLM. |

```

--------------------------------------------------------------------------------
/internal/server/web.go:
--------------------------------------------------------------------------------

```go
package server

import (
	"bytes"
	"embed"
	"fmt"
	"io"
	"io/fs"
	"net/http"

	"github.com/go-chi/chi/v5"
	"github.com/go-chi/chi/v5/middleware"
)

//go:embed all:static
var staticContent embed.FS

// webRouter creates a router that represents the routes under /ui
func webRouter() (chi.Router, error) {
	r := chi.NewRouter()
	r.Use(middleware.StripSlashes)

	// direct routes for html pages to provide clean URLs
	r.Get("/", func(w http.ResponseWriter, r *http.Request) { serveHTML(w, r, "static/index.html") })
	r.Get("/tools", func(w http.ResponseWriter, r *http.Request) { serveHTML(w, r, "static/tools.html") })
	r.Get("/toolsets", func(w http.ResponseWriter, r *http.Request) { serveHTML(w, r, "static/toolsets.html") })

	// handler for all other static files/assets
	staticFS, err := fs.Sub(staticContent, "static")
	if err != nil {
		return nil, fmt.Errorf("unable to load static assets: %w", err)
	}
	r.Handle("/*", http.StripPrefix("/ui", http.FileServer(http.FS(staticFS))))

	return r, nil
}

func serveHTML(w http.ResponseWriter, r *http.Request, filepath string) {
	file, err := staticContent.Open(filepath)
	if err != nil {
		http.Error(w, "File not found", http.StatusNotFound)
		return
	}
	defer file.Close()

	fileBytes, err := io.ReadAll(file)
	if err != nil {
		http.Error(w, fmt.Sprintf("Error reading file: %v", err), http.StatusInternalServerError)
		return
	}

	fileInfo, err := file.Stat()
	if err != nil {
		http.Error(w, fmt.Sprintf("Error reading file info: %v", err), http.StatusInternalServerError)
		return
	}
	http.ServeContent(w, r, fileInfo.Name(), fileInfo.ModTime(), bytes.NewReader(fileBytes))
}

```
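The `fs.Sub`/`http.StripPrefix`/`http.FileServer` combination above is easy to misconfigure. A minimal, self-contained sketch of the same serving pattern, using the standard library's `testing/fstest` in place of the embedded directory (the `app.css` file name and `serveStatic` helper are made up for illustration):

```go
package main

import (
	"fmt"
	"io"
	"io/fs"
	"net/http"
	"net/http/httptest"
	"testing/fstest"
)

// serveStatic mounts a FileServer the same way webRouter does and
// returns the status code and body for one request path.
func serveStatic(path string) (int, string) {
	// stand-in for the embedded "static" tree
	content := fstest.MapFS{
		"static/app.css": &fstest.MapFile{Data: []byte("body{}")},
	}

	// re-root the FS at "static" so request paths omit the directory name
	staticFS, err := fs.Sub(content, "static")
	if err != nil {
		panic(err)
	}

	// strip the /ui mount prefix before the file server resolves the path
	h := http.StripPrefix("/ui", http.FileServer(http.FS(staticFS)))

	req := httptest.NewRequest(http.MethodGet, path, nil)
	rec := httptest.NewRecorder()
	h.ServeHTTP(rec, req)

	body, _ := io.ReadAll(rec.Result().Body)
	return rec.Code, string(body)
}

func main() {
	code, body := serveStatic("/ui/app.css")
	fmt.Println(code, body)
}
```

Without the `StripPrefix`, the file server would look for `ui/app.css` inside the re-rooted FS and return 404.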

--------------------------------------------------------------------------------
/internal/tools/mysql/mysqlcommon/mysqlcommon.go:
--------------------------------------------------------------------------------

```go
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package mysqlcommon

import (
	"database/sql"
	"encoding/json"
	"reflect"
)

// ConvertToType handles casting mysql returns to the right type
// types for mysql driver: https://github.com/go-sql-driver/mysql/blob/v1.9.3/fields.go
// all numeric or unknown types will be returned as is.
func ConvertToType(t *sql.ColumnType, v any) (any, error) {
	switch t.ScanType() {
	case reflect.TypeOf(""), reflect.TypeOf([]byte{}), reflect.TypeOf(sql.NullString{}):
		if t.DatabaseTypeName() == "JSON" {
			// unmarshal JSON data before returning to prevent double marshaling
			var unmarshaledData any
			err := json.Unmarshal(v.([]byte), &unmarshaledData)
			if err != nil {
				return nil, err
			}
			return unmarshaledData, nil
		}
		return string(v.([]byte)), nil
	default:
		return v, nil
	}
}

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-create-project-file.md:
--------------------------------------------------------------------------------

```markdown
---
title: "looker-create-project-file"
type: docs
weight: 1
description: >
  A "looker-create-project-file" tool creates a new LookML file in a project.
aliases:
- /resources/tools/looker-create-project-file
---

## About

A `looker-create-project-file` tool creates a new LookML file in a project.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-create-project-file` accepts a project_id parameter and a file_path parameter
as well as the file content.

## Example

```yaml
tools:
    create_project_file:
        kind: looker-create-project-file
        source: looker-source
        description: |
          create_project_file Tool

          Given a project_id and a file path within the project, as well as the content
          of a LookML file, this tool will create a new file within the project.

          This tool must be called after the dev_mode tool has changed the session to
          dev mode.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "looker-create-project-file".              |
| source      |  string  |     true     | Name of the source Looker instance.                |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

```

--------------------------------------------------------------------------------
/.hugo/layouts/shortcodes/regionInclude.html:
--------------------------------------------------------------------------------

```html
{{/*
  snippet.html
  Usage:
    {{< regionInclude "filename.md" "region_name" >}}
    {{< regionInclude "filename.python" "region_name" "python" >}}
*/}}

{{ $file := .Get 0 }}
{{ $region := .Get 1 }}
{{ $lang := .Get 2 | default "text" }}
{{ $path := printf "%s%s" .Page.File.Dir $file }}

{{ if or (not $file) (eq $file "") }}
  {{ errorf "The file parameter (first argument) is required and must be non-empty in %s" .Page.File.Path }}
{{ end }}
{{ if or (not $region) (eq $region "") }}
  {{ errorf "The region parameter (second argument) is required and must be non-empty in %s" .Page.File.Path }}
{{ end }}
{{ if not (fileExists $path) }}
  {{ errorf "File %q not found (referenced in %s)" $path .Page.File.Path }}
{{ end }}

{{ $content := readFile $path }}
{{ $start_tag := printf "[START %s]" $region }}
{{ $end_tag := printf "[END %s]" $region }}

{{ $snippet := "" }}
{{ $in_snippet := false }}
{{ range split $content "\n" }}
  {{ if $in_snippet }}
    {{ if in . $end_tag }}
      {{ $in_snippet = false }}
    {{ else }}
      {{ $snippet = printf "%s%s\n" $snippet . }}
    {{ end }}
  {{ else if in . $start_tag }}
    {{ $in_snippet = true }}
  {{ end }}
{{ end }}

{{ if eq (trim $snippet "") "" }}
  {{ errorf "Region %q not found or empty in file %s (referenced in %s)" $region $file .Page.File.Path }}
{{ end }}

{{ if eq $lang "text" }}
  {{ $snippet | markdownify }}
{{ else }}
  {{ highlight (trim $snippet "\n") $lang "" }}
{{ end }}

```
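The line scan between `[START region]` and `[END region]` tags in the shortcode above can be expressed outside Hugo as well; a sketch of the same extraction logic in Go (the `extractRegion` helper is hypothetical):

```go
package main

import (
	"fmt"
	"strings"
)

// extractRegion returns the lines between "[START name]" and "[END name]",
// mirroring the shortcode's line-by-line scan: the tag lines themselves
// are excluded, and everything outside the region is skipped.
func extractRegion(content, name string) string {
	startTag := fmt.Sprintf("[START %s]", name)
	endTag := fmt.Sprintf("[END %s]", name)

	var b strings.Builder
	in := false
	for _, line := range strings.Split(content, "\n") {
		switch {
		case in && strings.Contains(line, endTag):
			in = false
		case in:
			b.WriteString(line + "\n")
		case strings.Contains(line, startTag):
			in = true
		}
	}
	return b.String()
}

func main() {
	doc := "intro\n# [START demo]\nkept line\n# [END demo]\noutro\n"
	fmt.Print(extractRegion(doc, "demo"))
}
```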

--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-update-project-file.md:
--------------------------------------------------------------------------------

```markdown
---
title: "looker-update-project-file"
type: docs
weight: 1
description: >
  A "looker-update-project-file" tool updates the content of a LookML file in a project.
aliases:
- /resources/tools/looker-update-project-file
---

## About

A `looker-update-project-file` tool updates the content of a LookML file.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-update-project-file` accepts a project_id parameter and a file_path parameter
as well as the new file content.

## Example

```yaml
tools:
    update_project_file:
        kind: looker-update-project-file
        source: looker-source
        description: |
          update_project_file Tool

          Given a project_id and a file path within the project, as well as the content
          of a LookML file, this tool will modify the file within the project.

          This tool must be called after the dev_mode tool has changed the session to
          dev mode.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "looker-update-project-file".              |
| source      |  string  |     true     | Name of the source Looker instance.                |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-get-explores.md:
--------------------------------------------------------------------------------

```markdown
---
title: "looker-get-explores"
type: docs
weight: 1
description: >
  A "looker-get-explores" tool returns all explores
  for the given model from the source.
aliases:
- /resources/tools/looker-get-explores
---

## About

A `looker-get-explores` tool returns all explores
for a given model from the source.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-get-explores` accepts one parameter, the
`model` id.

The return type is an array of maps, each formatted like:

```json
{
    "name": "explore name",
    "description": "explore description",
    "label": "explore label",
    "group_label": "group label"
}
```
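If a Go client needs to work with this payload, a small decoding sketch (the struct and `parseExplores` helper are hypothetical, mirroring only the sample shape above):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Explore mirrors the map shape shown above; field names follow the
// sample payload, not a published schema.
type Explore struct {
	Name        string `json:"name"`
	Description string `json:"description"`
	Label       string `json:"label"`
	GroupLabel  string `json:"group_label"`
}

// parseExplores decodes the tool's array-of-maps return value.
func parseExplores(data []byte) ([]Explore, error) {
	var explores []Explore
	if err := json.Unmarshal(data, &explores); err != nil {
		return nil, err
	}
	return explores, nil
}

func main() {
	payload := []byte(`[{"name":"orders","description":"Order facts","label":"Orders","group_label":"Sales"}]`)
	explores, err := parseExplores(payload)
	if err != nil {
		panic(err)
	}
	fmt.Println(explores[0].Name, explores[0].GroupLabel)
}
```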

## Example

```yaml
tools:
    get_explores:
        kind: looker-get-explores
        source: looker-source
        description: |
          The get_explores tool retrieves the list of explores defined in a LookML model
          in the Looker system.

          It takes one parameter, the model_name looked up from get_models.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "looker-get-explores".                     |
| source      |  string  |     true     | Name of the source the SQL should execute on.      |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/mssql/mssql-execute-sql.md:
--------------------------------------------------------------------------------

```markdown
---
title: "mssql-execute-sql"
type: docs
weight: 1
description: >
  A "mssql-execute-sql" tool executes a SQL statement against a SQL Server
  database.
aliases:
- /resources/tools/mssql-execute-sql
---

## About

A `mssql-execute-sql` tool executes a SQL statement against a SQL Server
database. It's compatible with any of the following sources:

- [cloud-sql-mssql](../../sources/cloud-sql-mssql.md)
- [mssql](../../sources/mssql.md)

`mssql-execute-sql` takes one input parameter `sql` and runs the SQL
statement against the `source`.

> **Note:** This tool is intended for developer assistant workflows with
> human-in-the-loop and shouldn't be used for production agents.

## Example

```yaml
tools:
 execute_sql_tool:
    kind: mssql-execute-sql
    source: my-mssql-instance
    description: Use this tool to execute sql statement.
```

## Reference

| **field**   |                  **type**                  | **required** | **description**                                    |
|-------------|:------------------------------------------:|:------------:|----------------------------------------------------|
| kind        |                   string                   |     true     | Must be "mssql-execute-sql".                       |
| source      |                   string                   |     true     | Name of the source the SQL should execute on.      |
| description |                   string                   |     true     | Description of the tool that is passed to the LLM. |

```

--------------------------------------------------------------------------------
/internal/prebuiltconfigs/tools/cloud-sql-mssql-admin.yaml:
--------------------------------------------------------------------------------

```yaml
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

sources:
  cloud-sql-admin-source:
    kind: cloud-sql-admin
    defaultProject: ${CLOUD_SQL_MSSQL_PROJECT:}

tools:
  create_instance:
    kind: cloud-sql-mssql-create-instance
    source: cloud-sql-admin-source
  get_instance:
    kind: cloud-sql-get-instance
    source: cloud-sql-admin-source
  list_instances:
    kind: cloud-sql-list-instances
    source: cloud-sql-admin-source
  create_database:
    kind: cloud-sql-create-database
    source: cloud-sql-admin-source
  list_databases:
    kind: cloud-sql-list-databases
    source: cloud-sql-admin-source
  create_user:
    kind: cloud-sql-create-users
    source: cloud-sql-admin-source
  wait_for_operation:
    kind: cloud-sql-wait-for-operation
    source: cloud-sql-admin-source

toolsets:
  cloud_sql_mssql_admin_tools:
    - create_instance
    - get_instance
    - list_instances
    - create_database
    - list_databases
    - create_user
    - wait_for_operation

```

--------------------------------------------------------------------------------
/internal/prebuiltconfigs/tools/cloud-sql-mysql-admin.yaml:
--------------------------------------------------------------------------------

```yaml
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

sources:
  cloud-sql-admin-source:
    kind: cloud-sql-admin
    defaultProject: ${CLOUD_SQL_MYSQL_PROJECT:}

tools:
  create_instance:
    kind: cloud-sql-mysql-create-instance
    source: cloud-sql-admin-source
  get_instance:
    kind: cloud-sql-get-instance
    source: cloud-sql-admin-source
  list_instances:
    kind: cloud-sql-list-instances
    source: cloud-sql-admin-source
  create_database:
    kind: cloud-sql-create-database
    source: cloud-sql-admin-source
  list_databases:
    kind: cloud-sql-list-databases
    source: cloud-sql-admin-source
  create_user:
    kind: cloud-sql-create-users
    source: cloud-sql-admin-source
  wait_for_operation:
    kind: cloud-sql-wait-for-operation
    source: cloud-sql-admin-source

toolsets:
  cloud_sql_mysql_admin_tools:
    - create_instance
    - get_instance
    - list_instances
    - create_database
    - list_databases
    - create_user
    - wait_for_operation

```

--------------------------------------------------------------------------------
/.github/sync-repo-settings.yaml:
--------------------------------------------------------------------------------

```yaml
# Copyright 2024 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Synchronize repository settings from a centralized config
# https://github.com/googleapis/repo-automation-bots/tree/main/packages/sync-repo-settings
# Install: https://github.com/apps/sync-repo-settings

# Disable merge commits
rebaseMergeAllowed: true
squashMergeAllowed: true
mergeCommitAllowed: false
# Enable branch protection
branchProtectionRules:
  - pattern: main
    isAdminEnforced: true
    requiredStatusCheckContexts:
      - "cla/google"
      - "lint"
      - "conventionalcommits.org"
      - "header-check"
    # - Add required status checks like presubmit tests
      - "unit tests (ubuntu-latest)"
      - "unit tests (windows-latest)"
      - "unit tests (macos-latest)"
      - "integration-test-pr (toolbox-testing-438616)"
    requiredApprovingReviewCount: 1
    requiresCodeOwnerReviews: true
    requiresStrictStatusChecks: true

# Set team access
permissionRules:
  - team: senseai-eco
    permission: admin

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/postgres/postgres-list-installed-extensions.md:
--------------------------------------------------------------------------------

```markdown
---
title: "postgres-list-installed-extensions"
type: docs
weight: 1
description: >
  The "postgres-list-installed-extensions" tool retrieves all PostgreSQL
  extensions installed on a Postgres database.
aliases:
- /resources/tools/postgres-list-installed-extensions
---

## About

The `postgres-list-installed-extensions` tool retrieves all PostgreSQL
extensions installed on a Postgres database. It's compatible with any of the
following sources:

- [alloydb-postgres](../../sources/alloydb-pg.md)
- [cloud-sql-postgres](../../sources/cloud-sql-pg.md)
- [postgres](../../sources/postgres.md)

`postgres-list-installed-extensions` lists all installed PostgreSQL extensions
(extension name, version, schema, owner, description) as JSON. The tool does
not support any input parameters.

## Example

```yaml
tools:
  list_installed_extensions:
    kind: postgres-list-installed-extensions
    source: postgres-source
    description: List all installed PostgreSQL extensions with their name, version, schema, owner, and description.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "postgres-list-installed-extensions".      |
| source      |  string  |     true     | Name of the source the SQL should execute on.      |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-get-connection-table-columns.md:
--------------------------------------------------------------------------------

```markdown
---
title: "looker-get-connection-table-columns"
type: docs
weight: 1
description: >
  A "looker-get-connection-table-columns" tool returns all the columns for each table specified.
aliases:
- /resources/tools/looker-get-connection-table-columns
---

## About

A `looker-get-connection-table-columns` tool returns all the columns for each table specified.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-get-connection-table-columns` accepts a `conn` parameter, a `schema` parameter, a `tables` parameter with a comma separated list of tables, and an optional `db` parameter.

## Example

```yaml
tools:
    get_connection_table_columns:
        kind: looker-get-connection-table-columns
        source: looker-source
        description: |
          get_connection_table_columns Tool

          This tool will list the columns available from a connection, for all the tables
          given in a comma separated list of table names, filtered by the 
          schema name and optional database name.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "looker-get-connection-table-columns".     |
| source      |  string  |     true     | Name of the source Looker instance.                |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/cloudhealthcare/cloud-healthcare-get-fhir-store.md:
--------------------------------------------------------------------------------

```markdown
---
title: "cloud-healthcare-get-fhir-store"
type: docs
weight: 1
description: >
  A "cloud-healthcare-get-fhir-store" tool retrieves information about a FHIR store.
aliases:
- /resources/tools/cloud-healthcare-get-fhir-store
---

## About

A `cloud-healthcare-get-fhir-store` tool retrieves information about a FHIR store. It's
compatible with the following sources:

- [cloud-healthcare](../../sources/cloud-healthcare.md)

`cloud-healthcare-get-fhir-store` returns the details of a FHIR store.

## Example

```yaml
tools:
  get_fhir_store:
    kind: cloud-healthcare-get-fhir-store
    source: my-healthcare-source
    description: Use this tool to get information about a FHIR store.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "cloud-healthcare-get-fhir-store".         |
| source      |  string  |     true     | Name of the healthcare source.                     |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

### Parameters

| **field** | **type** | **required** | **description**                       |
|-----------|:--------:|:------------:|---------------------------------------|
| storeID   |  string  |    true*     | The FHIR store ID to get details for. |

*If the `allowedFHIRStores` in the source has length 1, then the `storeID` parameter is not needed.

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/alloydb/alloydb-get-cluster.md:
--------------------------------------------------------------------------------

```markdown
---
title: alloydb-get-cluster
type: docs
weight: 1
description: "The \"alloydb-get-cluster\" tool retrieves details for a specific AlloyDB cluster.\n"
aliases: [/resources/tools/alloydb-get-cluster]
---

## About

The `alloydb-get-cluster` tool retrieves detailed information for a single,
specified AlloyDB cluster. It is compatible with the
[alloydb-admin](../../sources/alloydb-admin.md) source.
The tool takes the following input parameters:

| Parameter  | Type   | Description                                        | Required |
| :--------- | :----- | :------------------------------------------------- | :------- |
| `project`  | string | The GCP project ID to get the cluster for.         | Yes      |
| `location` | string | The location of the cluster (e.g., 'us-central1'). | Yes      |
| `cluster`  | string | The ID of the cluster to retrieve.                 | Yes      |

## Example

```yaml
tools:
  get_specific_cluster:
    kind: alloydb-get-cluster
    source: my-alloydb-admin-source
    description: Use this tool to retrieve details for a specific AlloyDB cluster.
```

## Reference

| **field**   | **type** | **required** | **description**                                      |
| ----------- | :------: | :----------: | ---------------------------------------------------- |
| kind        |  string  |     true     | Must be alloydb-get-cluster.                         |
| source      |  string  |     true     | The name of an `alloydb-admin` source.               |
| description |  string  |     false    | Description of the tool that is passed to the agent. |

```

--------------------------------------------------------------------------------
/internal/tools/http_method.go:
--------------------------------------------------------------------------------

```go
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//	http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package tools

import (
	"context"
	"fmt"
	"net/http"
	"strings"
)

// HTTPMethod is a string of a valid HTTP method (e.g. "GET")
type HTTPMethod string

// isValidHTTPMethod checks if the input string matches one of the method constants defined in the net/http package
func isValidHTTPMethod(method string) bool {

	switch method {
	case http.MethodGet, http.MethodPost, http.MethodPut, http.MethodDelete,
		http.MethodPatch, http.MethodHead, http.MethodOptions, http.MethodTrace,
		http.MethodConnect:
		return true
	}
	return false
}

func (i *HTTPMethod) UnmarshalYAML(ctx context.Context, unmarshal func(interface{}) error) error {
	var httpMethod string
	if err := unmarshal(&httpMethod); err != nil {
		return fmt.Errorf(`error unmarshalling HTTP method: %s`, err)
	}
	httpMethod = strings.ToUpper(httpMethod)
	if !isValidHTTPMethod(httpMethod) {
		return fmt.Errorf(`%s is not a valid http method`, httpMethod)
	}
	*i = HTTPMethod(httpMethod)
	return nil
}

```
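The unmarshal hook above normalizes case before validating, so `get` and `GET` are both accepted. A standalone sketch of the same normalize-then-validate logic (the `normalize` helper is hypothetical):

```go
package main

import (
	"fmt"
	"net/http"
	"strings"
)

// normalize mirrors UnmarshalYAML: upper-case the input, then check it
// against the method constants defined in net/http.
func normalize(method string) (string, error) {
	m := strings.ToUpper(method)
	switch m {
	case http.MethodGet, http.MethodPost, http.MethodPut, http.MethodDelete,
		http.MethodPatch, http.MethodHead, http.MethodOptions, http.MethodTrace,
		http.MethodConnect:
		return m, nil
	}
	return "", fmt.Errorf("%s is not a valid http method", m)
}

func main() {
	m, err := normalize("get")
	fmt.Println(m, err)

	_, err = normalize("FETCH")
	fmt.Println(err)
}
```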

--------------------------------------------------------------------------------
/docs/en/resources/tools/alloydb/alloydb-list-users.md:
--------------------------------------------------------------------------------

```markdown
---
title: alloydb-list-users
type: docs
weight: 1
description: "The \"alloydb-list-users\" tool lists all database users within an AlloyDB cluster.\n"
aliases: [/resources/tools/alloydb-list-users]
---

## About

The `alloydb-list-users` tool lists all database users within an AlloyDB
cluster. It is compatible with the
[alloydb-admin](../../sources/alloydb-admin.md) source.
The tool takes the following input parameters:

| Parameter  | Type   | Description                                        | Required |
| :--------- | :----- | :------------------------------------------------- | :------- |
| `project`  | string | The GCP project ID to list users for.              | Yes      |
| `cluster`  | string | The ID of the cluster to list users from.          | Yes      |
| `location` | string | The location of the cluster (e.g., 'us-central1'). | Yes      |

## Example

```yaml
tools:
  list_users:
    kind: alloydb-list-users
    source: alloydb-admin-source
    description: Use this tool to list all database users within an AlloyDB cluster
```

## Reference

| **field**   | **type** | **required** | **description**                                      |
| ----------- | :------: | :----------: | ---------------------------------------------------- |
| kind        |  string  |     true     | Must be alloydb-list-users.                          |
| source      |  string  |     true     | The name of an `alloydb-admin` source.               |
| description |  string  |     false    | Description of the tool that is passed to the agent. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/cloudhealthcare/cloud-healthcare-get-dicom-store.md:
--------------------------------------------------------------------------------

```markdown
---
title: "cloud-healthcare-get-dicom-store"
type: docs
weight: 1
description: >
  A "cloud-healthcare-get-dicom-store" tool retrieves information about a DICOM store.
aliases:
- /resources/tools/cloud-healthcare-get-dicom-store
---

## About

A `cloud-healthcare-get-dicom-store` tool retrieves information about a DICOM store. It's
compatible with the following sources:

- [cloud-healthcare](../../sources/cloud-healthcare.md)

`cloud-healthcare-get-dicom-store` returns the details of a DICOM store.

## Example

```yaml
tools:
  get_dicom_store:
    kind: cloud-healthcare-get-dicom-store
    source: my-healthcare-source
    description: Use this tool to get information about a DICOM store.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "cloud-healthcare-get-dicom-store".        |
| source      |  string  |     true     | Name of the healthcare source.                     |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

### Parameters

| **field** | **type** | **required** | **description**                        |
|-----------|:--------:|:------------:|----------------------------------------|
| storeID   |  string  |    true*     | The DICOM store ID to get details for. |

*If the `allowedDICOMStores` in the source has length 1, then the `storeID` parameter is not needed.

```

--------------------------------------------------------------------------------
/internal/prebuiltconfigs/tools/cloud-sql-mssql.yaml:
--------------------------------------------------------------------------------

```yaml
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

sources:
    cloudsql-mssql-source:
        kind: cloud-sql-mssql
        project: ${CLOUD_SQL_MSSQL_PROJECT}
        region: ${CLOUD_SQL_MSSQL_REGION}
        instance: ${CLOUD_SQL_MSSQL_INSTANCE}
        database: ${CLOUD_SQL_MSSQL_DATABASE}
        user: ${CLOUD_SQL_MSSQL_USER}
        password: ${CLOUD_SQL_MSSQL_PASSWORD}
        ipType: ${CLOUD_SQL_MSSQL_IP_TYPE:public}
tools:
    execute_sql:
        kind: mssql-execute-sql
        source: cloudsql-mssql-source
        description: Use this tool to execute SQL.

    list_tables:
        kind: mssql-list-tables
        source: cloudsql-mssql-source
        description: "Lists detailed schema information (object type, columns, constraints, indexes, triggers, comment) as JSON for user-created tables (ordinary or partitioned). Filters by a comma-separated list of names. If names are omitted, lists all tables in user schemas."

toolsets:
    cloud_sql_mssql_database_tools:
        - execute_sql
        - list_tables

```

--------------------------------------------------------------------------------
/.ci/quickstart_test/setup_hotels_sample.sql:
--------------------------------------------------------------------------------

```sql
-- Copyright 2025 Google LLC
-- 
-- Licensed under the Apache License, Version 2.0 (the "License");
-- you may not use this file except in compliance with the License.
-- You may obtain a copy of the License at
-- 
--      http://www.apache.org/licenses/LICENSE-2.0
-- 
-- Unless required by applicable law or agreed to in writing, software
-- distributed under the License is distributed on an "AS IS" BASIS,
-- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-- See the License for the specific language governing permissions and
-- limitations under the License.

TRUNCATE TABLE $TABLE_NAME;

INSERT INTO $TABLE_NAME (id, name, location, price_tier, checkin_date, checkout_date, booked)
VALUES
  (1, 'Hilton Basel', 'Basel', 'Luxury', '2024-04-22', '2024-04-20', B'0'),
  (2, 'Marriott Zurich', 'Zurich', 'Upscale', '2024-04-14', '2024-04-21', B'0'),
  (3, 'Hyatt Regency Basel', 'Basel', 'Upper Upscale', '2024-04-02', '2024-04-20', B'0'),
  (4, 'Radisson Blu Lucerne', 'Lucerne', 'Midscale', '2024-04-24', '2024-04-05', B'0'),
  (5, 'Best Western Bern', 'Bern', 'Upper Midscale', '2024-04-23', '2024-04-01', B'0'),
  (6, 'InterContinental Geneva', 'Geneva', 'Luxury', '2024-04-23', '2024-04-28', B'0'),
  (7, 'Sheraton Zurich', 'Zurich', 'Upper Upscale', '2024-04-27', '2024-04-02', B'0'),
  (8, 'Holiday Inn Basel', 'Basel', 'Upper Midscale', '2024-04-24', '2024-04-09', B'0'),
  (9, 'Courtyard Zurich', 'Zurich', 'Upscale', '2024-04-03', '2024-04-13', B'0'),
  (10, 'Comfort Inn Bern', 'Bern', 'Midscale', '2024-04-04', '2024-04-16', B'0');
```

--------------------------------------------------------------------------------
/.hugo/layouts/shortcodes/ipynb.html:
--------------------------------------------------------------------------------

```html
{{ $notebookFile := .Get 0 }}
{{ with .Page.Resources.Get $notebookFile }}

  {{ $content := .Content | transform.Unmarshal }}
  {{ range $content.cells }}

    {{ if eq .cell_type "markdown" }}
      <div class="notebook-markdown">
        {{ $markdown := "" }}
        {{ range .source }}{{ $markdown = print $markdown . }}{{ end }}
        {{ $markdown | markdownify }}
      </div>
    {{ end }}

    {{ if eq .cell_type "code" }}
      <div class="notebook-code">
        {{ $code := "" }}
        {{ range .source }}{{ $code = print $code . }}{{ end }}
        {{ highlight $code "python" "" }}

        {{ range .outputs }}
          <div class="notebook-output">
            {{ with .text }}
              <pre class="notebook-stream"><code>{{- range . }}{{ . }}{{ end -}}</code></pre>
            {{ end }}

            {{ with .data }}
              {{ with index . "image/png" }}
                <img src="data:image/png;base64,{{ . }}" alt="Notebook output image">
              {{ end }}
              {{ with index . "image/jpeg" }}
                <img src="data:image/jpeg;base64,{{ . }}" alt="Notebook output image">
              {{ end }}
              {{ with index . "text/html" }}
                {{ $html := "" }}
                {{ range . }}{{ $html = print $html . }}{{ end }}
                {{ $html | safeHTML }}
              {{ end }}
            {{ end }}
          </div>
        {{ end }}
      </div>
    {{ end }}
  {{ end }}
{{ else }}
  <p style="color: red;">Error: Notebook '{{ $notebookFile }}' not found in page resources.</p>
{{ end }}
```

--------------------------------------------------------------------------------
/docs/en/resources/tools/cloudhealthcare/cloud-healthcare-get-fhir-store-metrics.md:
--------------------------------------------------------------------------------

```markdown
---
title: "cloud-healthcare-get-fhir-store-metrics"
type: docs
weight: 1
description: >
  A "cloud-healthcare-get-fhir-store-metrics" tool retrieves metrics for a FHIR store.
aliases:
- /resources/tools/cloud-healthcare-get-fhir-store-metrics
---

## About

A `cloud-healthcare-get-fhir-store-metrics` tool retrieves metrics for a FHIR store. It's
compatible with the following sources:

- [cloud-healthcare](../../sources/cloud-healthcare.md)

`cloud-healthcare-get-fhir-store-metrics` returns the metrics of a FHIR store.

## Example

```yaml
tools:
  get_fhir_store_metrics:
    kind: cloud-healthcare-get-fhir-store-metrics
    source: my-healthcare-source
    description: Use this tool to get metrics for a FHIR store.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "cloud-healthcare-get-fhir-store-metrics". |
| source      |  string  |     true     | Name of the healthcare source.                     |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

### Parameters

| **field** | **type** | **required** | **description**                       |
|-----------|:--------:|:------------:|---------------------------------------|
| storeID   |  string  |    true*     | The FHIR store ID to get metrics for. |

*If the `allowedFHIRStores` in the source has length 1, then the `storeID` parameter is not needed.

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/cloudsql/cloudsqlcreatedatabase.md:
--------------------------------------------------------------------------------

```markdown
---
title: cloud-sql-create-database
type: docs
weight: 10
description: >
  Create a new database in a Cloud SQL instance.
---

The `cloud-sql-create-database` tool creates a new database in a specified Cloud
SQL instance.

{{< notice info >}}
This tool uses a `source` of kind `cloud-sql-admin`.
{{< /notice >}}

## Example

```yaml
tools:
  create-cloud-sql-database:
    kind: cloud-sql-create-database
    source: my-cloud-sql-admin-source
    description: "Creates a new database in a Cloud SQL instance."
```

## Reference

| **field**   | **type** | **required** | **description**                                  |
| ----------- | :------: | :----------: | ------------------------------------------------ |
| kind        |  string  |     true     | Must be "cloud-sql-create-database".             |
| source      |  string  |     true     | The name of the `cloud-sql-admin` source to use. |
| description |  string  |     false    | A description of the tool.                       |

## Input Parameters

| **parameter** | **type** | **required** | **description**                                                    |
| ------------- | :------: | :----------: | ------------------------------------------------------------------ |
| project       |  string  |     true     | The project ID.                                                    |
| instance      |  string  |     true     | The ID of the instance where the database will be created.         |
| name          |  string  |     true     | The name for the new database. Must be unique within the instance. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/cloudsql/cloudsqllistinstances.md:
--------------------------------------------------------------------------------

```markdown
---
title: Cloud SQL List Instances
type: docs
weight: 1
description: "List Cloud SQL instances in a project.\n"
---

The `cloud-sql-list-instances` tool lists all Cloud SQL instances in a specified
Google Cloud project.

{{< notice info >}}
This tool uses the `cloud-sql-admin` source, which automatically handles
authentication on behalf of the user.
{{< /notice >}}

## Configuration

Here is an example of how to configure the `cloud-sql-list-instances` tool in
your `tools.yaml` file:

```yaml
sources:
  my-cloud-sql-admin-source:
    kind: cloud-sql-admin

tools:
  list_my_instances:
    kind: cloud-sql-list-instances
    source: my-cloud-sql-admin-source
    description: Use this tool to list all Cloud SQL instances in a project.
```

## Parameters

The `cloud-sql-list-instances` tool has one required parameter:

| **field** | **type** | **required** | **description**              |
| --------- | :------: | :----------: | ---------------------------- |
| project   |  string  |     true     | The Google Cloud project ID. |

## Reference

| **field**   | **type** | **required** | **description**                                                |
|-------------|:--------:|:------------:|----------------------------------------------------------------|
| kind        |  string  |     true     | Must be "cloud-sql-list-instances".                            |
| description |  string  |    false     | Description of the tool that is passed to the agent.           |
| source      |  string  |     true     | The name of the `cloud-sql-admin` source to use for this tool. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/postgres/postgres-list-views.md:
--------------------------------------------------------------------------------

```markdown
---
title: "postgres-list-views"
type: docs
weight: 1
description: >
  The "postgres-list-views" tool lists views in a Postgres database, with a default limit of 50 rows.
aliases:
- /resources/tools/postgres-list-views
---

## About

The `postgres-list-views` tool retrieves a list of top N (default 50) views from
a Postgres database, excluding those in system schemas (`pg_catalog`,
`information_schema`). It's compatible with any of the following sources:

- [alloydb-postgres](../../sources/alloydb-pg.md)
- [cloud-sql-postgres](../../sources/cloud-sql-pg.md)
- [postgres](../../sources/postgres.md)

`postgres-list-views` lists detailed view information (schemaname, viewname,
ownername) as JSON for views in a database. The tool takes the following input
parameters:

- `viewname` (optional): A string pattern to filter view names. The search uses
  the SQL `LIKE` operator to filter the views. Default: `""`
- `limit` (optional): The maximum number of rows to return. Default: `50`.

## Example

```yaml
tools:
  list_views:
    kind: postgres-list-views
    source: cloudsql-pg-source
```

## Reference

| **field**   | **type** | **required** | **description**                                      |
|-------------|:--------:|:------------:|------------------------------------------------------|
| kind        |  string  |     true     | Must be "postgres-list-views".                       |
| source      |  string  |     true     | Name of the source the SQL should execute on.        |
| description |  string  |    false     | Description of the tool that is passed to the agent. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-conversational-analytics.md:
--------------------------------------------------------------------------------

```markdown
---
title: "looker-conversational-analytics"
type: docs
weight: 1
description: >
  The "looker-conversational-analytics" tool uses the Conversational
  Analytics API to analyze data from Looker
aliases:
- /resources/tools/looker-conversational-analytics
---

## About

A `looker-conversational-analytics` tool allows you to ask questions about your
Looker data.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-conversational-analytics` accepts two parameters:

1. `user_query_with_context`: The question asked of the Conversational Analytics
   system.
2. `explore_references`: A list of one to five explores that can be queried to
   answer the question. The form of the entry is `[{"model": "model name",
   "explore": "explore name"}, ...]`

## Example

```yaml
tools:
    ask_data_insights:
        kind: looker-conversational-analytics
        source: looker-source
        description: |
          Use this tool to perform data analysis, get insights,
          or answer complex questions about the contents of specific
          Looker explores.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "looker-conversational-analytics".         |
| source      |  string  |     true     | Name of the Looker source to use.                  |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/cloudhealthcare/cloud-healthcare-get-dicom-store-metrics.md:
--------------------------------------------------------------------------------

```markdown
---
title: "cloud-healthcare-get-dicom-store-metrics"
type: docs
weight: 1
description: >
  A "cloud-healthcare-get-dicom-store-metrics" tool retrieves metrics for a DICOM store.
aliases:
- /resources/tools/cloud-healthcare-get-dicom-store-metrics
---

## About

A `cloud-healthcare-get-dicom-store-metrics` tool retrieves metrics for a DICOM
store. It's compatible with the following sources:

- [cloud-healthcare](../../sources/cloud-healthcare.md)

`cloud-healthcare-get-dicom-store-metrics` returns the metrics of a DICOM store.

## Example

```yaml
tools:
  get_dicom_store_metrics:
    kind: cloud-healthcare-get-dicom-store-metrics
    source: my-healthcare-source
    description: Use this tool to get metrics for a DICOM store.
```

## Reference

| **field**   | **type** | **required** | **description**                                     |
|-------------|:--------:|:------------:|-----------------------------------------------------|
| kind        |  string  |     true     | Must be "cloud-healthcare-get-dicom-store-metrics". |
| source      |  string  |     true     | Name of the healthcare source.                      |
| description |  string  |     true     | Description of the tool that is passed to the LLM.  |

### Parameters

| **field** | **type** | **required** | **description**                        |
|-----------|:--------:|:------------:|----------------------------------------|
| storeID   |  string  |    true*     | The DICOM store ID to get metrics for. |

*If the `allowedDICOMStores` in the source has length 1, then the `storeID`
parameter is not needed.

```

--------------------------------------------------------------------------------
/internal/prebuiltconfigs/tools/spanner.yaml:
--------------------------------------------------------------------------------

```yaml
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

sources:
  spanner-source:
    kind: spanner
    project: ${SPANNER_PROJECT}
    instance: ${SPANNER_INSTANCE}
    database: ${SPANNER_DATABASE}
    dialect: ${SPANNER_DIALECT:googlesql}

tools:
  execute_sql:
    kind: spanner-execute-sql
    source: spanner-source
    description: Use this tool to execute DML SQL. Please use the ${SPANNER_DIALECT:googlesql} interface for Spanner.

  execute_sql_dql:
    kind: spanner-execute-sql
    source: spanner-source
    description: Use this tool to execute DQL SQL. Please use the ${SPANNER_DIALECT:googlesql} interface for Spanner.
    readOnly: true

  list_tables:
    kind: spanner-list-tables
    source: spanner-source
    description: "Lists detailed schema information (object type, columns, constraints, indexes) as JSON for user-created tables (ordinary or partitioned). Filters by a comma-separated list of names. If names are omitted, lists all tables in user schemas."

toolsets:
  spanner-database-tools:
    - execute_sql
    - execute_sql_dql
    - list_tables

```

--------------------------------------------------------------------------------
/cmd/options_test.go:
--------------------------------------------------------------------------------

```go
// Copyright 2024 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package cmd

import (
	"errors"
	"io"
	"testing"

	"github.com/spf13/cobra"
)

func TestCommandOptions(t *testing.T) {
	w := io.Discard
	tcs := []struct {
		desc    string
		isValid func(*Command) error
		option  Option
	}{
		{
			desc: "with logger",
			isValid: func(c *Command) error {
				if c.outStream != w || c.errStream != w {
					return errors.New("loggers do not match")
				}
				return nil
			},
			option: WithStreams(w, w),
		},
	}
	for _, tc := range tcs {
		t.Run(tc.desc, func(t *testing.T) {
			got, err := invokeProxyWithOption(tc.option)
			if err != nil {
				t.Fatal(err)
			}
			if err := tc.isValid(got); err != nil {
				t.Errorf("option did not initialize command correctly: %v", err)
			}
		})
	}
}

func invokeProxyWithOption(o Option) (*Command, error) {
	c := NewCommand(o)
	// Keep the test output quiet
	c.SilenceUsage = true
	c.SilenceErrors = true
	// Disable execute behavior
	c.RunE = func(*cobra.Command, []string) error {
		return nil
	}

	err := c.Execute()
	return c, err
}

```
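The test above exercises Toolbox's functional-options pattern (`NewCommand(WithStreams(w, w))`). A generic, self-contained sketch of that pattern follows; the names here (`Server`, `WithPort`, `WithHost`) are hypothetical stand-ins, not the real `cmd.Command`/`cmd.Option` types:

```go
package main

import "fmt"

// Server is an illustrative type configured via functional options.
type Server struct {
	host string
	port int
}

// Option mutates a Server during construction, mirroring the shape of
// cmd.Option in the package under test.
type Option func(*Server)

func WithHost(h string) Option {
	return func(s *Server) { s.host = h }
}

func WithPort(p int) Option {
	return func(s *Server) { s.port = p }
}

// NewServer applies defaults first, then each option in order.
func NewServer(opts ...Option) *Server {
	s := &Server{host: "127.0.0.1", port: 5000}
	for _, o := range opts {
		o(s)
	}
	return s
}

func main() {
	s := NewServer(WithPort(8080))
	fmt.Println(s.host, s.port) // defaults survive unless overridden
}
```

The pattern keeps the constructor signature stable as new knobs are added, which is why the test can validate each `Option` independently.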

--------------------------------------------------------------------------------
/docs/en/resources/tools/alloydb/alloydb-get-user.md:
--------------------------------------------------------------------------------

```markdown
---
title: alloydb-get-user
type: docs
weight: 1
description: "The \"alloydb-get-user\" tool retrieves details for a specific AlloyDB user.\n"
aliases: [/resources/tools/alloydb-get-user]
---

## About

The `alloydb-get-user` tool retrieves detailed information for a single,
specified AlloyDB user. It is compatible with
[alloydb-admin](../../sources/alloydb-admin.md) source.

| Parameter  | Type   | Description                                        | Required |
| :--------- | :----- | :------------------------------------------------- | :------- |
| `project`  | string | The GCP project ID to get user for.                | Yes      |
| `location` | string | The location of the cluster (e.g., 'us-central1'). | Yes      |
| `cluster`  | string | The ID of the cluster to retrieve the user from.   | Yes      |
| `user`     | string | The ID of the user to retrieve.                    | Yes      |

## Example

```yaml
tools:
  get_specific_user:
    kind: alloydb-get-user
    source: my-alloydb-admin-source
    description: Use this tool to retrieve details for a specific AlloyDB user.
```

## Reference

| **field**   | **type** | **required** | **description**                                      |
| ----------- | :------: | :----------: | ---------------------------------------------------- |
| kind        |  string  |     true     | Must be alloydb-get-user.                            |
| source      |  string  |     true     | The name of an `alloydb-admin` source.               |
| description |  string  |     false    | Description of the tool that is passed to the agent. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/cloudsql/cloudsqllistdatabases.md:
--------------------------------------------------------------------------------

```markdown
---
title: cloud-sql-list-databases
type: docs
weight: 1
description: List Cloud SQL databases in an instance.
---

The `cloud-sql-list-databases` tool lists all Cloud SQL databases in a specified
Google Cloud project and instance.

{{< notice info >}}
This tool uses the `cloud-sql-admin` source.
{{< /notice >}}

## Configuration

Here is an example of how to configure the `cloud-sql-list-databases` tool in your
`tools.yaml` file:

```yaml
sources:
  my-cloud-sql-admin-source:
    kind: cloud-sql-admin

tools:
  list_my_databases:
    kind: cloud-sql-list-databases
    source: my-cloud-sql-admin-source
    description: Use this tool to list all Cloud SQL databases in an instance.
```

## Parameters

The `cloud-sql-list-databases` tool has two required parameters:

| **field** | **type** | **required** | **description**              |
| --------- | :------: | :----------: | ---------------------------- |
| project   |  string  |     true     | The Google Cloud project ID. |
| instance  |  string  |     true     | The Cloud SQL instance ID.   |

## Reference

| **field**   | **type** | **required** | **description**                                                |
| ----------- | :------: | :----------: | -------------------------------------------------------------- |
| kind        |  string  |     true     | Must be "cloud-sql-list-databases".                            |
| source      |  string  |     true     | The name of the `cloud-sql-admin` source to use for this tool. |
| description |  string  |     false    | Description of the tool that is passed to the agent.           |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/spanner/spanner-execute-sql.md:
--------------------------------------------------------------------------------

```markdown
---
title: "spanner-execute-sql"
type: docs
weight: 1
description: >
  A "spanner-execute-sql" tool executes a SQL statement against a Spanner
  database.
aliases:
- /resources/tools/spanner-execute-sql
---

## About

A `spanner-execute-sql` tool executes a SQL statement against a Spanner
database. It's compatible with any of the following sources:

- [spanner](../../sources/spanner.md)

`spanner-execute-sql` takes one input parameter, `sql`, and runs the SQL
statement against the `source`.

> **Note:** This tool is intended for developer assistant workflows with
> human-in-the-loop and shouldn't be used for production agents.

## Example

```yaml
tools:
  execute_sql_tool:
    kind: spanner-execute-sql
    source: my-spanner-instance
    description: Use this tool to execute SQL statements.
```

## Reference

| **field**   | **type** | **required** | **description**                                                                          |
|-------------|:--------:|:------------:|------------------------------------------------------------------------------------------|
| kind        |  string  |     true     | Must be "spanner-execute-sql".                                                           |
| source      |  string  |     true     | Name of the source the SQL should execute on.                                            |
| description |  string  |     true     | Description of the tool that is passed to the LLM.                                       |
| readOnly    |   bool   |    false     | When set to `true`, the `statement` is run as a read-only transaction. Default: `false`. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/firestore/firestore-list-collections.md:
--------------------------------------------------------------------------------

```markdown
---
title: "firestore-list-collections"
type: docs
weight: 1
description: >
  A "firestore-list-collections" tool lists collections in Firestore, either at the root level or as subcollections of a document.
aliases:
- /resources/tools/firestore-list-collections
---

## About

A `firestore-list-collections` tool lists
[collections](https://firebase.google.com/docs/firestore/data-model#collections)
in Firestore, either at the root level or as
[subcollections](https://firebase.google.com/docs/firestore/data-model#subcollections)
of a specific document.
It's compatible with the following sources:

- [firestore](../../sources/firestore.md)

`firestore-list-collections` takes an optional `parentPath` parameter to specify
a document path. If provided, it lists all subcollections of that document. If
not provided, it lists all root-level collections in the database.

## Example

```yaml
tools:
  list_firestore_collections:
    kind: firestore-list-collections
    source: my-firestore-source
    description: Use this tool to list collections in Firestore.
```

## Reference

| **field**   |      **type**    | **required** | **description**                                        |
|-------------|:----------------:|:------------:|--------------------------------------------------------|
| kind        |      string      |     true     | Must be "firestore-list-collections".                  |
| source      |      string      |     true     | Name of the Firestore source to list collections from. |
| description |      string      |     true     | Description of the tool that is passed to the LLM.     |

```

--------------------------------------------------------------------------------
/internal/util/orderedmap/orderedmap.go:
--------------------------------------------------------------------------------

```go
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//	http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package orderedmap

import (
	"bytes"
	"encoding/json"
)

// Column represents a single column in a row.
type Column struct {
	Name  string
	Value any
}

// Row represents a row of data with columns in a specific order.
type Row struct {
	Columns []Column
}

// Add adds a new column to the row.
func (r *Row) Add(name string, value any) {
	r.Columns = append(r.Columns, Column{Name: name, Value: value})
}

// MarshalJSON implements the json.Marshaler interface for the Row struct.
// It marshals the row into a JSON object, preserving the order of the columns.
func (r Row) MarshalJSON() ([]byte, error) {
	var buf bytes.Buffer
	buf.WriteString("{")
	for i, col := range r.Columns {
		if i > 0 {
			buf.WriteString(",")
		}
		// Marshal the key
		key, err := json.Marshal(col.Name)
		if err != nil {
			return nil, err
		}
		buf.Write(key)
		buf.WriteString(":")
		// Marshal the value
		val, err := json.Marshal(col.Value)
		if err != nil {
			return nil, err
		}
		buf.Write(val)
	}
	buf.WriteString("}")
	return buf.Bytes(), nil
}

```

--------------------------------------------------------------------------------
/internal/server/static/toolsets.html:
--------------------------------------------------------------------------------

```html
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Toolsets View</title>
    <link rel="stylesheet" href="/ui/css/style.css">
    <link href="https://fonts.googleapis.com/icon?family=Material+Icons" rel="stylesheet">
    <script src="https://accounts.google.com/gsi/client" async defer></script>
</head>
<body>
    <div id="navbar-container" data-active-nav="/ui/toolsets"></div>

    <aside class="second-nav">
        <h4>Retrieve Toolset</h4>
        <div class="search-container">
            <input type="text" id="toolset-search-input" placeholder="Enter toolset name...">
            <button id="toolset-search-button" aria-label="Retrieve Tools">
                <span class="material-icons">search</span>
            </button>
        </div>
        <div id="secondary-panel-content">
            <p>Retrieve toolset to see available tools.</p>
        </div>
    </aside>

    <div id="main-content-container"></div>

    <script type="module" src="/ui/js/toolsets.js"></script>
    <script src="/ui/js/navbar.js"></script>
    <script src="/ui/js/mainContent.js"></script>
    <script>
        document.addEventListener('DOMContentLoaded', () => {
            const navbarContainer = document.getElementById('navbar-container');
            const activeNav = navbarContainer.getAttribute('data-active-nav');
            renderNavbar('navbar-container', activeNav);
            renderMainContent('main-content-container', 'tool-display-area', getToolsetInstructions());
        });
    </script>
</body>
</html>
```

--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-make-dashboard.md:
--------------------------------------------------------------------------------

```markdown
---
title: "looker-make-dashboard"
type: docs
weight: 1
description: >
  "looker-make-dashboard" generates a Looker dashboard in the user's personal
  folder in Looker
aliases:
- /resources/tools/looker-make-dashboard
---

## About

The `looker-make-dashboard` tool creates a dashboard in the user's
personal folder in Looker.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-make-dashboard` takes one parameter:

1. `title`: The title of the new dashboard.

## Example

```yaml
tools:
    make_dashboard:
        kind: looker-make-dashboard
        source: looker-source
        description: |
          make_dashboard Tool

          This tool creates a new dashboard in Looker. The dashboard is
          initially empty and the add_dashboard_element tool is used to
          add content to the dashboard.

          The newly created dashboard will be created in the user's
          personal folder in Looker. The dashboard name must be unique.

          The result is a json document with a link to the newly
          created dashboard and the id of the dashboard. Use the id
          when calling add_dashboard_element.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "looker-make-dashboard"                    |
| source      |  string  |     true     | Name of the Looker source to use.                  |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/alloydb/alloydb-get-instance.md:
--------------------------------------------------------------------------------

```markdown
---
title: alloydb-get-instance
type: docs
weight: 1
description: "The \"alloydb-get-instance\" tool retrieves details for a specific AlloyDB instance.\n"
aliases: [/resources/tools/alloydb-get-instance]
---

## About

The `alloydb-get-instance` tool retrieves detailed information for a single,
specified AlloyDB instance. It is compatible with
[alloydb-admin](../../sources/alloydb-admin.md) source.

| Parameter  | Type   | Description                                         | Required |
|:-----------|:-------|:----------------------------------------------------|:---------|
| `project`  | string | The GCP project ID to get instance for.             | Yes      |
| `location` | string | The location of the instance (e.g., 'us-central1'). | Yes      |
| `cluster`  | string | The ID of the cluster.                              | Yes      |
| `instance` | string | The ID of the instance to retrieve.                 | Yes      |

## Example

```yaml
tools:
  get_specific_instance:
    kind: alloydb-get-instance
    source: my-alloydb-admin-source
    description: Use this tool to retrieve details for a specific AlloyDB instance.
```

## Reference

| **field**   | **type** | **required** | **description**                                      |
|-------------|:--------:|:------------:|------------------------------------------------------|
| kind        |  string  |     true     | Must be alloydb-get-instance.                        |
| source      |  string  |     true     | The name of an `alloydb-admin` source.               |
| description |  string  |    false     | Description of the tool that is passed to the agent. |

```

--------------------------------------------------------------------------------
/internal/prebuiltconfigs/tools/cloud-sql-postgres-admin.yaml:
--------------------------------------------------------------------------------

```yaml
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

sources:
  cloud-sql-admin-source:
    kind: cloud-sql-admin
    defaultProject: ${CLOUD_SQL_POSTGRES_PROJECT:}

tools:
  create_instance:
    kind: cloud-sql-postgres-create-instance
    source: cloud-sql-admin-source
  get_instance:
    kind: cloud-sql-get-instance
    source: cloud-sql-admin-source
  list_instances:
    kind: cloud-sql-list-instances
    source: cloud-sql-admin-source
  create_database:
    kind: cloud-sql-create-database
    source: cloud-sql-admin-source
  list_databases:
    kind: cloud-sql-list-databases
    source: cloud-sql-admin-source
  create_user:
    kind: cloud-sql-create-users
    source: cloud-sql-admin-source
  wait_for_operation:
    kind: cloud-sql-wait-for-operation
    source: cloud-sql-admin-source
  postgres_upgrade_precheck:
    kind: postgres-upgrade-precheck
    source: cloud-sql-admin-source

toolsets:
  cloud_sql_postgres_admin_tools:
    - create_instance
    - get_instance
    - list_instances
    - create_database
    - list_databases
    - create_user
    - wait_for_operation
    - postgres_upgrade_precheck

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/trino/trino-execute-sql.md:
--------------------------------------------------------------------------------

```markdown
---
title: "trino-execute-sql"
type: docs
weight: 1
description: >
  A "trino-execute-sql" tool executes a SQL statement against a Trino
  database.
aliases:
- /resources/tools/trino-execute-sql
---

## About

A `trino-execute-sql` tool executes a SQL statement against a Trino
database. It's compatible with any of the following sources:

- [trino](../../sources/trino.md)

`trino-execute-sql` takes one input parameter `sql` and runs the SQL
statement against the `source`.

> **Note:** This tool is intended for developer assistant workflows with
> human-in-the-loop and shouldn't be used for production agents.

## Example

```yaml
tools:
  execute_sql_tool:
    kind: trino-execute-sql
    source: my-trino-instance
    description: Use this tool to execute SQL statements.
```
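
When invoked, the tool receives the statement through its single `sql` parameter. A hypothetical call (the catalog and table are illustrative):

```json
{
  "sql": "SELECT * FROM tpch.tiny.nation LIMIT 5"
}
```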

## Reference

| **field**   |                  **type**                  | **required** | **description**                                                                                  |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind        |                   string                   |     true     | Must be "trino-execute-sql".                                                                     |
| source      |                   string                   |     true     | Name of the source the SQL should execute on.                                                    |
| description |                   string                   |     true     | Description of the tool that is passed to the LLM.                                               |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/redis/redis.md:
--------------------------------------------------------------------------------

```markdown
---
title: "redis"
type: docs
weight: 1
description: > 
  A "redis" tool executes a set of pre-defined Redis commands against a Redis instance.
aliases:
- /resources/tools/redis
---

## About

A `redis` tool executes a series of pre-defined Redis commands against a
Redis source.

The specified Redis commands are executed sequentially. Each command is
represented as a string list, where the first element is the command name (e.g.,
SET, GET, HGETALL) and subsequent elements are its arguments.

### Dynamic Command Parameters

Command arguments can be templated using the `$variableName` annotation.
Array-type parameters are expanded in place into multiple arguments. Take the
following config for example:

```yaml
  commands:
      - [SADD, userNames, $userNames] # Array will be flattened into multiple arguments.
  parameters:
    - name: userNames
      type: array
      description: The user names to be set.  
```

If the input is an array of strings `["Alice", "Sid", "Bob"]`, the final command
to be executed after argument expansion will be `[SADD, userNames, Alice, Sid, Bob]`.

## Example

```yaml
tools:
  user_data_tool:
    kind: redis
    source: my-redis-instance
    description: |
      Use this tool to interact with user data stored in Redis.
      It can set, retrieve, and delete user-specific information.
    commands:
      - [SADD, userNames, $userNames] # Array will be flattened into multiple arguments.
      - [GET, $userId]
    parameters:
      - name: userId
        type: string
        description: The unique identifier for the user.
      - name: userNames
        type: array
        description: The user names to be set.  
```
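
For example, invoking `user_data_tool` with the input below (values are illustrative):

```json
{
  "userId": "user:1001",
  "userNames": ["Alice", "Sid", "Bob"]
}
```

expands to the commands `[SADD, userNames, Alice, Sid, Bob]` followed by `[GET, user:1001]`.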

```

--------------------------------------------------------------------------------
/.ci/quickstart_test/js.integration.cloudbuild.yaml:
--------------------------------------------------------------------------------

```yaml
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

steps:
  - name: 'node:20'
    id: 'js-quickstart-test'
    entrypoint: 'bash'
    args:
      # The '-c' flag tells bash to execute the following string as a command.
      # The 'set -ex' enables debug output and exits on error for easier troubleshooting.
      - -c
      - |
        set -ex
        export VERSION=$(cat ./cmd/version.txt)
        chmod +x .ci/quickstart_test/run_js_tests.sh
        .ci/quickstart_test/run_js_tests.sh
    env:
      - 'CLOUD_SQL_INSTANCE=${_CLOUD_SQL_INSTANCE}'
      - 'GCP_PROJECT=${_GCP_PROJECT}'
      - 'DATABASE_NAME=${_DATABASE_NAME}'
      - 'DB_USER=${_DB_USER}'
    secretEnv: ['TOOLS_YAML_CONTENT', 'GOOGLE_API_KEY', 'DB_PASSWORD']

availableSecrets:
  secretManager:
  - versionName: projects/${_GCP_PROJECT}/secrets/${_TOOLS_YAML_SECRET}/versions/6
    env: 'TOOLS_YAML_CONTENT'
  - versionName: projects/${_GCP_PROJECT_NUMBER}/secrets/${_API_KEY_SECRET}/versions/latest
    env: 'GOOGLE_API_KEY'
  - versionName: projects/${_GCP_PROJECT}/secrets/${_DB_PASS_SECRET}/versions/latest
    env: 'DB_PASSWORD'

timeout: 1000s

options:
  logging: CLOUD_LOGGING_ONLY
```

--------------------------------------------------------------------------------
/docs/en/resources/tools/valkey/valkey.md:
--------------------------------------------------------------------------------

```markdown
---
title: "valkey"
type: docs
weight: 1
description: > 
  A "valkey" tool executes a set of pre-defined Valkey commands against a Valkey instance.
aliases:
- /resources/tools/valkey
---

## About

A `valkey` tool executes a series of pre-defined Valkey commands against a
Valkey instance.

The specified Valkey commands are executed sequentially. Each command is
represented as a string array, where the first element is the command name
(e.g., SET, GET, HGETALL) and subsequent elements are its arguments.

### Dynamic Command Parameters

Command arguments can be templated using the `$variableName` annotation.
Array-type parameters are expanded in place into multiple arguments. Take the
following config for example:

```yaml
  commands:
      - [SADD, userNames, $userNames] # Array will be flattened into multiple arguments.
  parameters:
    - name: userNames
      type: array
      description: The user names to be set.  
```

If the input is an array of strings `["Alice", "Sid", "Bob"]`, the final command
to be executed after argument expansion will be `[SADD, userNames, Alice, Sid, Bob]`.

## Example

```yaml
tools:
  user_data_tool:
    kind: valkey
    source: my-valkey-instance
    description: |
      Use this tool to interact with user data stored in Valkey.
      It can set, retrieve, and delete user-specific information.
    commands:
      - [SADD, userNames, $userNames] # Array will be flattened into multiple arguments.
      - [GET, $userId]
    parameters:
      - name: userId
        type: string
        description: The unique identifier for the user.
      - name: userNames
        type: array
        description: The user names to be set.  
```
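
For example, invoking `user_data_tool` with the input below (values are illustrative):

```json
{
  "userId": "user:1001",
  "userNames": ["Alice", "Sid", "Bob"]
}
```

expands to the commands `[SADD, userNames, Alice, Sid, Bob]` followed by `[GET, user:1001]`.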

```

--------------------------------------------------------------------------------
/.ci/quickstart_test/go.integration.cloudbuild.yaml:
--------------------------------------------------------------------------------

```yaml
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

steps:
  - name: 'golang:1.25.1'
    id: 'go-quickstart-test'
    entrypoint: 'bash'
    args:
      # The '-c' flag tells bash to execute the following string as a command.
      # The 'set -ex' enables debug output and exits on error for easier troubleshooting.
      - -c
      - |
        set -ex
        export VERSION=$(cat ./cmd/version.txt)
        chmod +x .ci/quickstart_test/run_go_tests.sh
        .ci/quickstart_test/run_go_tests.sh
    env:
      - 'CLOUD_SQL_INSTANCE=${_CLOUD_SQL_INSTANCE}'
      - 'GCP_PROJECT=${_GCP_PROJECT}'
      - 'DATABASE_NAME=${_DATABASE_NAME}'
      - 'DB_USER=${_DB_USER}'
    secretEnv: ['TOOLS_YAML_CONTENT', 'GOOGLE_API_KEY', 'DB_PASSWORD']

availableSecrets:
  secretManager:
  - versionName: projects/${_GCP_PROJECT}/secrets/${_TOOLS_YAML_SECRET}/versions/7
    env: 'TOOLS_YAML_CONTENT'
  - versionName: projects/${_GCP_PROJECT_NUMBER}/secrets/${_API_KEY_SECRET}/versions/latest
    env: 'GOOGLE_API_KEY'
  - versionName: projects/${_GCP_PROJECT}/secrets/${_DB_PASS_SECRET}/versions/latest
    env: 'DB_PASSWORD'

timeout: 1000s

options:
  logging: CLOUD_LOGGING_ONLY
```

--------------------------------------------------------------------------------
/docs/en/resources/tools/mysql/mysql-execute-sql.md:
--------------------------------------------------------------------------------

```markdown
---
title: "mysql-execute-sql"
type: docs
weight: 1
description: >
  A "mysql-execute-sql" tool executes a SQL statement against a MySQL
  database.
aliases:
- /resources/tools/mysql-execute-sql
---

## About

A `mysql-execute-sql` tool executes a SQL statement against a MySQL
database. It's compatible with any of the following sources:

- [cloud-sql-mysql](../../sources/cloud-sql-mysql.md)
- [mysql](../../sources/mysql.md)

`mysql-execute-sql` takes one input parameter `sql` and runs the SQL
statement against the `source`.

> **Note:** This tool is intended for developer assistant workflows with
> human-in-the-loop and shouldn't be used for production agents.

## Example

```yaml
tools:
  execute_sql_tool:
    kind: mysql-execute-sql
    source: my-mysql-instance
    description: Use this tool to execute SQL statements.
```
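
When invoked, the tool receives the statement through its single `sql` parameter. A hypothetical call (table and columns are illustrative):

```json
{
  "sql": "SELECT id, name FROM users LIMIT 10"
}
```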

## Reference

| **field**   |                  **type**                  | **required** | **description**                                                                                  |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind        |                   string                   |     true     | Must be "mysql-execute-sql".                                                                     |
| source      |                   string                   |     true     | Name of the source the SQL should execute on.                                                    |
| description |                   string                   |     true     | Description of the tool that is passed to the LLM.                                               |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/oracle/oracle-sql.md:
--------------------------------------------------------------------------------

```markdown
---
title: "oracle-sql"
type: docs
weight: 1
description: > 
  An "oracle-sql" tool executes a pre-defined SQL statement against an Oracle database.
aliases:
- /resources/tools/oracle-sql
---

## About

An `oracle-sql` tool executes a pre-defined SQL statement against an
Oracle database. It's compatible with the following source:

- [oracle](../../sources/oracle.md)

The specified SQL statement is executed using [prepared statements][oracle-stmt]
for security and performance. It expects parameter placeholders in the SQL query
to be in the native Oracle format (e.g., `:1`, `:2`).

[oracle-stmt]: https://docs.oracle.com/javase/tutorial/jdbc/basics/prepared.html

## Example

> **Note:** This tool uses parameterized queries to prevent SQL injections.
> Query parameters can be used as substitutes for arbitrary expressions.
> Parameters cannot be used as substitutes for identifiers, column names, table
> names, or other parts of the query.

```yaml
tools:
  search_flights_by_number:
    kind: oracle-sql
    source: my-oracle-instance
    statement: |
      SELECT * FROM flights
      WHERE airline = :1
      AND flight_number = :2
      FETCH FIRST 10 ROWS ONLY
    description: |
      Use this tool to get information for a specific flight.
      Takes an airline code and flight number and returns info on the flight.
      Do NOT use this tool with a flight id. Do NOT guess an airline code or flight number.
      Example:
      {{
          "airline": "CY",
          "flight_number": "888",
      }}
    parameters:
      - name: airline
        type: string
        description: Airline unique 2 letter identifier
      - name: flight_number
        type: string
        description: 1 to 4 digit number
```
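
For illustration, an invocation of `search_flights_by_number` supplies both declared parameters, which in this configuration are bound to the `:1` and `:2` placeholders in order:

```json
{
  "airline": "CY",
  "flight_number": "888"
}
```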

```

--------------------------------------------------------------------------------
/.ci/quickstart_test/py.integration.cloudbuild.yaml:
--------------------------------------------------------------------------------

```yaml
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

steps:
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk:537.0.0'
    id: 'python-quickstart-test'
    entrypoint: 'bash'
    args:
      # The '-c' flag tells bash to execute the following string as a command.
      # The 'set -ex' enables debug output and exits on error for easier troubleshooting.
      - -c
      - |
        set -ex
        export VERSION=$(cat ./cmd/version.txt)
        chmod +x .ci/quickstart_test/run_py_tests.sh
        .ci/quickstart_test/run_py_tests.sh
    env:
      - 'CLOUD_SQL_INSTANCE=${_CLOUD_SQL_INSTANCE}'
      - 'GCP_PROJECT=${_GCP_PROJECT}'
      - 'DATABASE_NAME=${_DATABASE_NAME}'
      - 'DB_USER=${_DB_USER}'
    secretEnv: ['TOOLS_YAML_CONTENT', 'GOOGLE_API_KEY', 'DB_PASSWORD']

availableSecrets:
  secretManager:
  - versionName: projects/${_GCP_PROJECT}/secrets/${_TOOLS_YAML_SECRET}/versions/5
    env: 'TOOLS_YAML_CONTENT'
  - versionName: projects/${_GCP_PROJECT_NUMBER}/secrets/${_API_KEY_SECRET}/versions/latest
    env: 'GOOGLE_API_KEY'
  - versionName: projects/${_GCP_PROJECT}/secrets/${_DB_PASS_SECRET}/versions/latest
    env: 'DB_PASSWORD'

timeout: 1000s

options:
  logging: CLOUD_LOGGING_ONLY

```

--------------------------------------------------------------------------------
/docs/en/resources/sources/cloud-monitoring.md:
--------------------------------------------------------------------------------

```markdown
---
title: "Cloud Monitoring"
type: docs
weight: 1
description: >
  A "cloud-monitoring" source provides a client for the Cloud Monitoring API.
aliases:
- /resources/sources/cloud-monitoring
---

## About

The `cloud-monitoring` source provides a client to interact with the [Google
Cloud Monitoring API](https://cloud.google.com/monitoring/api). This allows
tools to access the Metrics Explorer and run PromQL queries.

Authentication can be handled in two ways:

1.  **Application Default Credentials (ADC):** By default, the source uses ADC
    to authenticate with the API.
2.  **Client-side OAuth:** If `useClientOAuth` is set to `true`, the source will
    expect an OAuth 2.0 access token to be provided by the client (e.g., a web
    browser) for each request.

## Example

```yaml
sources:
    my-cloud-monitoring:
        kind: cloud-monitoring

    my-oauth-cloud-monitoring:
        kind: cloud-monitoring
        useClientOAuth: true
```

## Reference

| **field**      | **type** | **required** | **description**                                                                                                                                |
|----------------|:--------:|:------------:|------------------------------------------------------------------------------------------------------------------------------------------------|
| kind           |  string  |     true     | Must be "cloud-monitoring".                                                                                                                    |
| useClientOAuth | boolean  |    false     | If true, the source will use client-side OAuth for authorization. Otherwise, it will use Application Default Credentials. Defaults to `false`. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/clickhouse/clickhouse-execute-sql.md:
--------------------------------------------------------------------------------

```markdown
---
title: "clickhouse-execute-sql"
type: docs
weight: 1
description: >
  A "clickhouse-execute-sql" tool executes a SQL statement against a ClickHouse
  database.
aliases:
- /resources/tools/clickhouse-execute-sql
---

## About

A `clickhouse-execute-sql` tool executes a SQL statement against a ClickHouse
database. It's compatible with the [clickhouse](../../sources/clickhouse.md)
source.

`clickhouse-execute-sql` takes one input parameter `sql` and runs the SQL
statement against the specified `source`. This tool includes query logging
capabilities for monitoring and debugging purposes.

> **Note:** This tool is intended for developer assistant workflows with
> human-in-the-loop and shouldn't be used for production agents.

## Example

```yaml
tools:
  execute_sql_tool:
    kind: clickhouse-execute-sql
    source: my-clickhouse-instance
    description: Use this tool to execute SQL statements against ClickHouse.
```
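
A hypothetical invocation passes the statement through the `sql` parameter:

```json
{
  "sql": "SELECT name FROM system.tables WHERE database = 'default'"
}
```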

## Parameters

| **parameter** | **type** | **required** | **description**                                   |
|---------------|:--------:|:------------:|---------------------------------------------------|
| sql           |  string  |     true     | The SQL statement to execute against the database |

## Reference

| **field**   | **type** | **required** | **description**                                       |
|-------------|:--------:|:------------:|-------------------------------------------------------|
| kind        |  string  |     true     | Must be "clickhouse-execute-sql".                     |
| source      |  string  |     true     | Name of the ClickHouse source to execute SQL against. |
| description |  string  |     true     | Description of the tool that is passed to the LLM.    |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-get-filters.md:
--------------------------------------------------------------------------------

```markdown
---
title: "looker-get-filters"
type: docs
weight: 1
description: >
  A "looker-get-filters" tool returns all the filters from a given explore
  in a given model in the source.
aliases:
- /resources/tools/looker-get-filters
---

## About

A `looker-get-filters` tool returns all the filters from a given explore
in a given model in the source.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-get-filters` accepts two parameters, the `model` and the `explore`.

## Example

```yaml
tools:
    get_filters:
        kind: looker-get-filters
        source: looker-source
        description: |
          The get_filters tool retrieves the list of filters defined in
          an explore.

          It takes two parameters, the model_name looked up from get_models and the
          explore_name looked up from get_explores.
```
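
A hypothetical invocation, using a model and explore discovered via the `get_models` and `get_explores` tools (names are illustrative):

```json
{
  "model": "thelook",
  "explore": "order_items"
}
```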

The response is a JSON array whose elements have the following structure:

```json
{
  "name": "field name",
  "description": "field description",
  "type": "field type",
  "label": "field label",
  "label_short": "field short label",
  "tags": ["tags", ...],
  "synonyms": ["synonyms", ...],
  "suggestions": ["suggestion", ...],
  "suggest_explore": "explore",
  "suggest_dimension": "dimension"
}
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "looker-get-filters".                      |
| source      |  string  |     true     | Name of the source the SQL should execute on.      |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-generate-embed-url.md:
--------------------------------------------------------------------------------

```markdown
---
title: "looker-generate-embed-url"
type: docs
weight: 1
description: >
  "looker-generate-embed-url" generates an embeddable URL for Looker content.
aliases:
- /resources/tools/looker-generate-embed-url
---

## About

The `looker-generate-embed-url` tool generates an embeddable URL for a given
piece of Looker content. The URL is generated for the user authenticated to the
Looker source. When opened in a browser, it creates a Looker Embed session.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-generate-embed-url` takes two parameters:

1. the `type` of content (e.g., "dashboards", "looks", "query-visualization")
2. the `id` of the content

It's recommended to use other tools from the Looker MCP toolbox alongside this
one to fetch dashboard IDs, generate queries, and so on, whose results can then
be supplied to this tool.

## Example

```yaml
tools:
    generate_embed_url:
        kind: looker-generate-embed-url
        source: looker-source
        description: |
          generate_embed_url Tool

          This tool generates an embeddable URL for Looker content.
          You need to provide the type of content (e.g., 'dashboards', 'looks', 'query-visualization')
          and the ID of the content.
```
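
A hypothetical invocation for embedding a dashboard (the ID is illustrative):

```json
{
  "type": "dashboards",
  "id": "42"
}
```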

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "looker-generate-embed-url".               |
| source      |  string  |     true     | Name of the source the SQL should execute on.      |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-get-parameters.md:
--------------------------------------------------------------------------------

```markdown
---
title: "looker-get-parameters"
type: docs
weight: 1
description: >
  A "looker-get-parameters" tool returns all the parameters from a given explore
  in a given model in the source.
aliases:
- /resources/tools/looker-get-parameters
---

## About

A `looker-get-parameters` tool returns all the parameters from a given explore
in a given model in the source.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-get-parameters` accepts two parameters, the `model` and the `explore`.

## Example

```yaml
tools:
    get_parameters:
        kind: looker-get-parameters
        source: looker-source
        description: |
          The get_parameters tool retrieves the list of parameters defined in
          an explore.

          It takes two parameters, the model_name looked up from get_models and the
          explore_name looked up from get_explores.
```
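
A hypothetical invocation, using a model and explore discovered via the `get_models` and `get_explores` tools (names are illustrative):

```json
{
  "model": "thelook",
  "explore": "order_items"
}
```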

The response is a JSON array whose elements have the following structure:

```json
{
  "name": "field name",
  "description": "field description",
  "type": "field type",
  "label": "field label",
  "label_short": "field short label",
  "tags": ["tags", ...],
  "synonyms": ["synonyms", ...],
  "suggestions": ["suggestion", ...],
  "suggest_explore": "explore",
  "suggest_dimension": "dimension"
}
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "looker-get-parameters".                   |
| source      |  string  |     true     | Name of the source the SQL should execute on.      |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/dataform/dataform-compile-local.md:
--------------------------------------------------------------------------------

```markdown
---
title: "dataform-compile-local"
type: docs
weight: 1
description: > 
  A "dataform-compile-local" tool runs the `dataform compile` CLI command on a local project directory.
aliases:
- /resources/tools/dataform-compile-local
---

## About

A `dataform-compile-local` tool runs the `dataform compile` command on a local
Dataform project.

It is a standalone tool and **is not** compatible with any sources.

At invocation time, the tool executes `dataform compile --json` in the specified
project directory and returns the resulting JSON object from the CLI.

`dataform-compile-local` takes the following parameter:

- `project_dir` (string): The absolute or relative path to the local Dataform
  project directory. The server process must have read access to this path.

## Requirements

### Dataform CLI

This tool executes the `dataform` command-line interface (CLI) via a system
call. You must have the **`dataform` CLI** installed and available in the
server's system `PATH`.

You can typically install the CLI via `npm`:

```bash
npm install -g @dataform/cli
```

See the [official Dataform
documentation](https://cloud.google.com/dataform/docs/install-dataform-cli)
for more details.

## Example

```yaml
tools:  
  my_dataform_compiler:  
    kind: dataform-compile-local  
    description: Use this tool to compile a local Dataform project.
```
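
A hypothetical invocation (the path is illustrative):

```json
{
  "project_dir": "/home/user/my-dataform-project"
}
```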

## Reference

| **field**   | **type** | **required** | **description**                                    |
|:------------|:---------|:-------------|:---------------------------------------------------|
| kind        | string   | true         | Must be "dataform-compile-local".                  |
| description | string   | true         | Description of the tool that is passed to the LLM. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/postgres/postgres-execute-sql.md:
--------------------------------------------------------------------------------

```markdown
---
title: "postgres-execute-sql"
type: docs
weight: 1
description: >
  A "postgres-execute-sql" tool executes a SQL statement against a Postgres
  database.
aliases:
- /resources/tools/postgres-execute-sql
---

## About

A `postgres-execute-sql` tool executes a SQL statement against a Postgres
database. It's compatible with any of the following sources:

- [alloydb-postgres](../../sources/alloydb-pg.md)
- [cloud-sql-postgres](../../sources/cloud-sql-pg.md)
- [postgres](../../sources/postgres.md)

`postgres-execute-sql` takes one input parameter `sql` and runs the SQL
statement against the `source`.

> **Note:** This tool is intended for developer assistant workflows with
> human-in-the-loop and shouldn't be used for production agents.

## Example

```yaml
tools:
  execute_sql_tool:
    kind: postgres-execute-sql
    source: my-pg-instance
    description: Use this tool to execute SQL statements.
```
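
When invoked, the tool receives the statement through its single `sql` parameter. A hypothetical call:

```json
{
  "sql": "SELECT schemaname, tablename FROM pg_catalog.pg_tables LIMIT 10"
}
```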

## Reference

| **field**   |                  **type**                  | **required** | **description**                                                                                  |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind        |                   string                   |     true     | Must be "postgres-execute-sql".                                                                  |
| source      |                   string                   |     true     | Name of the source the SQL should execute on.                                                    |
| description |                   string                   |     true     | Description of the tool that is passed to the LLM.                                               |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/clickhouse/clickhouse-list-databases.md:
--------------------------------------------------------------------------------

```markdown
---
title: "clickhouse-list-databases"
type: docs
weight: 3
description: >
  A "clickhouse-list-databases" tool lists all databases in a ClickHouse instance.
aliases:
- /resources/tools/clickhouse-list-databases
---

## About

A `clickhouse-list-databases` tool lists all available databases in a ClickHouse
instance. It's compatible with the [clickhouse](../../sources/clickhouse.md)
source.

This tool executes the `SHOW DATABASES` command and returns a list of all
databases accessible to the configured user, making it useful for database
discovery and exploration tasks.

## Example

```yaml
tools:
  list_clickhouse_databases:
    kind: clickhouse-list-databases
    source: my-clickhouse-instance
    description: List all available databases in the ClickHouse instance
```

## Return Value

The tool returns an array of objects, where each object contains:

- `name`: The name of the database

Example response:

```json
[
  {"name": "default"},
  {"name": "system"},
  {"name": "analytics"},
  {"name": "user_data"}
]
```
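A client consuming this response can read the database names directly from the
`name` field of each object. A minimal Python sketch, using the sample response
above as input (illustrative only; the values are the example data, not live
output):

```python
import json

# Sample response from the example above (not live ClickHouse output).
response = '''[
  {"name": "default"},
  {"name": "system"},
  {"name": "analytics"},
  {"name": "user_data"}
]'''

# Extract just the database names from the returned array of objects.
names = [row["name"] for row in json.loads(response)]
print(names)  # ['default', 'system', 'analytics', 'user_data']
```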

## Reference

| **field**    |      **type**      | **required** | **description**                                       |
|--------------|:------------------:|:------------:|-------------------------------------------------------|
| kind         |       string       |     true     | Must be "clickhouse-list-databases".                  |
| source       |       string       |     true     | Name of the ClickHouse source to list databases from. |
| description  |       string       |     true     | Description of the tool that is passed to the LLM.    |
| authRequired |  array of string   |    false     | Authentication services required to use this tool.    |
| parameters   | array of Parameter |    false     | Parameters for the tool (typically not used).         |

```

--------------------------------------------------------------------------------
/internal/prebuiltconfigs/tools/alloydb-postgres-admin.yaml:
--------------------------------------------------------------------------------

```yaml
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

sources:
  alloydb-admin-source:
    kind: alloydb-admin
    defaultProject: ${ALLOYDB_POSTGRES_PROJECT:}
tools:
  create_cluster:
    kind: alloydb-create-cluster
    source: alloydb-admin-source
  wait_for_operation:
    kind: alloydb-wait-for-operation
    source: alloydb-admin-source
    delay: 1s
    maxDelay: 4m
    multiplier: 2
    maxRetries: 10
  create_instance:
    kind: alloydb-create-instance
    source: alloydb-admin-source
  list_clusters:
    kind: alloydb-list-clusters
    source: alloydb-admin-source
  list_instances:
    kind: alloydb-list-instances
    source: alloydb-admin-source
  list_users:
    kind: alloydb-list-users
    source: alloydb-admin-source
  create_user:
    kind: alloydb-create-user
    source: alloydb-admin-source
  get_cluster:
    kind: alloydb-get-cluster
    source: alloydb-admin-source
  get_instance:
    kind: alloydb-get-instance
    source: alloydb-admin-source
  get_user:
    kind: alloydb-get-user
    source: alloydb-admin-source

toolsets:
  alloydb_postgres_admin_tools:
    - create_cluster
    - wait_for_operation
    - create_instance
    - list_clusters
    - list_instances
    - list_users
    - create_user
    - get_cluster
    - get_instance
    - get_user

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-make-look.md:
--------------------------------------------------------------------------------

```markdown
---
title: "looker-make-look"
type: docs
weight: 1
description: >
  "looker-make-look" generates a Looker look in the users personal folder in
  Looker
aliases:
- /resources/tools/looker-make-look
---

## About

The `looker-make-look` tool creates a saved Look in the user's
personal folder in Looker.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-make-look` takes eleven parameters:

1. the `model`
2. the `explore`
3. the `fields` list
4. an optional set of `filters`
5. an optional set of `pivots`
6. an optional set of `sorts`
7. an optional `limit`
8. an optional `tz`
9. an optional `vis_config`
10. the `title`
11. an optional `description`

## Example

```yaml
tools:
    make_look:
        kind: looker-make-look
        source: looker-source
        description: |
          make_look Tool

          This tool creates a new look in Looker, using the query
          parameters and the vis_config specified.

          Most of the parameters are the same as the query_url
          tool. In addition, there is a title and a description
          that must be provided.

          The newly created look will be created in the user's
          personal folder in looker. The look name must be unique.

          The result is a json document with a link to the newly
          created look.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "looker-make-look"                         |
| source      |  string  |     true     | Name of the source the tool should use.            |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/mssql/mssql-list-tables.md:
--------------------------------------------------------------------------------

```markdown
---
title: "mssql-list-tables"
type: docs
weight: 1
description: >
  The "mssql-list-tables" tool lists schema information for all or specified tables in a SQL server database.
aliases:
- /resources/tools/mssql-list-tables
---

## About

The `mssql-list-tables` tool retrieves schema information for all or specified
tables in a SQL Server database. It is compatible with any of the following
sources:

- [cloud-sql-mssql](../../sources/cloud-sql-mssql.md)
- [mssql](../../sources/mssql.md)

`mssql-list-tables` lists detailed schema information (object type, columns,
constraints, indexes, triggers, owner, comment) as JSON for user-created tables
(ordinary or partitioned).

The tool takes the following input parameters:

- **`table_names`** (string, optional): Filters by a comma-separated list of
  names. By default, it lists all tables in user schemas. Default: `""`.
- **`output_format`** (string, optional): Indicates the output format of the
  table schema. `simple` returns only the table names; `detailed` returns the
  full table information. Default: `detailed`.

## Example

```yaml
tools:
  mssql_list_tables:
    kind: mssql-list-tables
    source: mssql-source
    description: Use this tool to retrieve schema information for all or specified tables. Output format can be simple (only table names) or detailed.
```

## Reference

| **field**   | **type** | **required** | **description**                                      |
|-------------|:--------:|:------------:|------------------------------------------------------|
| kind        |  string  |     true     | Must be "mssql-list-tables".                         |
| source      |  string  |     true     | Name of the source the SQL should execute on.        |
| description |  string  |     true     | Description of the tool that is passed to the agent. |

```

--------------------------------------------------------------------------------
/internal/sources/elasticsearch/elasticsearch_test.go:
--------------------------------------------------------------------------------

```go
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package elasticsearch_test

import (
	"testing"

	yaml "github.com/goccy/go-yaml"
	"github.com/google/go-cmp/cmp"
	"github.com/googleapis/genai-toolbox/internal/server"
	"github.com/googleapis/genai-toolbox/internal/sources/elasticsearch"
)

func TestParseFromYamlElasticsearch(t *testing.T) {
	tcs := []struct {
		desc string
		in   string
		want server.SourceConfigs
	}{
		{
			desc: "basic example",
			in: `
            sources:
              my-es-instance:
                kind: elasticsearch
                addresses:
                  - http://localhost:9200
                apikey: somekey
            `,
			want: server.SourceConfigs{
				"my-es-instance": elasticsearch.Config{
					Name:      "my-es-instance",
					Kind:      elasticsearch.SourceKind,
					Addresses: []string{"http://localhost:9200"},
					APIKey:    "somekey",
				},
			},
		},
	}
	for _, tc := range tcs {
		t.Run(tc.desc, func(t *testing.T) {
			got := struct {
				Sources server.SourceConfigs `yaml:"sources"`
			}{}
			err := yaml.Unmarshal([]byte(tc.in), &got)
			if err != nil {
				t.Fatalf("failed to parse yaml: %v", err)
			}
			if diff := cmp.Diff(tc.want, got.Sources); diff != "" {
				t.Errorf("unexpected config diff (-want +got):\n%s", diff)
			}
		})
	}
}

```

--------------------------------------------------------------------------------
/internal/sources/sqlite/sqlite_test.go:
--------------------------------------------------------------------------------

```go
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package sqlite_test

import (
	"testing"

	yaml "github.com/goccy/go-yaml"
	"github.com/google/go-cmp/cmp"
	"github.com/googleapis/genai-toolbox/internal/server"
	"github.com/googleapis/genai-toolbox/internal/sources"
	"github.com/googleapis/genai-toolbox/internal/sources/sqlite"
	"github.com/googleapis/genai-toolbox/internal/testutils"
)

func TestParseFromYamlSQLite(t *testing.T) {
	tcs := []struct {
		desc string
		in   string
		want server.SourceConfigs
	}{
		{
			desc: "basic example",
			in: `
            sources:
                my-sqlite-db:
                    kind: sqlite
                    database: /path/to/database.db
            `,
			want: map[string]sources.SourceConfig{
				"my-sqlite-db": sqlite.Config{
					Name:     "my-sqlite-db",
					Kind:     sqlite.SourceKind,
					Database: "/path/to/database.db",
				},
			},
		},
	}
	for _, tc := range tcs {
		t.Run(tc.desc, func(t *testing.T) {
			got := struct {
				Sources server.SourceConfigs `yaml:"sources"`
			}{}
			// Parse contents
			err := yaml.Unmarshal(testutils.FormatYaml(tc.in), &got)
			if err != nil {
				t.Fatalf("unable to unmarshal: %s", err)
			}
			if !cmp.Equal(tc.want, got.Sources) {
				t.Fatalf("incorrect parse: want %v, got %v", tc.want, got.Sources)
			}
		})
	}
}

```

--------------------------------------------------------------------------------
/.ci/generate_release_table.sh:
--------------------------------------------------------------------------------

```bash
#! /bin/bash

# Generates the markdown release-asset table for a given release.
# Usage: VERSION=<release version> ./generate_release_table.sh

# Check if VERSION has been set
if [ -z "${VERSION}" ]; then
  echo "Error: VERSION env var is not set" >&2  # Print to stderr
  exit 1  # Exit with a non-zero status to indicate an error
fi


FILES=("linux.amd64" "darwin.arm64" "darwin.amd64" "windows.amd64")
output_string=""

# Define the descriptions - ensure this array's order matches FILES
DESCRIPTIONS=(
    "For **Linux** systems running on **Intel/AMD 64-bit processors**."
    "For **macOS** systems running on **Apple Silicon** (M1, M2, M3, etc.) processors."
    "For **macOS** systems running on **Intel processors**."
    "For **Windows** systems running on **Intel/AMD 64-bit processors**."
)

# Write the table header
ROW_FMT="| %-105s | %-120s | %-67s |\n"
output_string+=$(printf "$ROW_FMT" "**OS/Architecture**" "**Description**" "**SHA256 Hash**")$'\n'
output_string+=$(printf "$ROW_FMT" "$(printf -- '-%0.s' {1..105})" "$(printf -- '-%0.s' {1..120})" "$(printf -- '-%0.s' {1..67})")$'\n'


# Loop through all files matching the pattern "toolbox.*.*"
for i in "${!FILES[@]}"
do
    file_key="${FILES[$i]}" # e.g., "linux.amd64"
    description_text="${DESCRIPTIONS[$i]}"

    # Extract OS and ARCH from the filename
    OS=$(echo "$file_key" | cut -d '.' -f 1)
    ARCH=$(echo "$file_key" | cut -d '.' -f 2)

    # Get release URL
    if [ "$OS" = 'windows' ];
    then
        URL="https://storage.googleapis.com/genai-toolbox/$VERSION/$OS/$ARCH/toolbox.exe"
    else
        URL="https://storage.googleapis.com/genai-toolbox/$VERSION/$OS/$ARCH/toolbox"
    fi

    curl "$URL" --fail --output toolbox || exit 1

    # Calculate the SHA256 checksum of the file
    SHA256=$(shasum -a 256 toolbox | awk '{print $1}')

    # Write the table row
    output_string+=$(printf "$ROW_FMT" "[$OS/$ARCH]($URL)" "$description_text" "$SHA256")$'\n'

    rm toolbox
done

printf "$output_string\n"


```

--------------------------------------------------------------------------------
/docs/en/resources/tools/postgres/postgres-list-tables.md:
--------------------------------------------------------------------------------

```markdown
---
title: "postgres-list-tables"
type: docs
weight: 1
description: >
  The "postgres-list-tables" tool lists schema information for all or specified
  tables in a Postgres database.
aliases:
- /resources/tools/postgres-list-tables
---

## About

The `postgres-list-tables` tool retrieves schema information for all or
specified tables in a Postgres database. It's compatible with any of the
following sources:

- [alloydb-postgres](../../sources/alloydb-pg.md)
- [cloud-sql-postgres](../../sources/cloud-sql-pg.md)
- [postgres](../../sources/postgres.md)

`postgres-list-tables` lists detailed schema information (object type, columns,
constraints, indexes, triggers, owner, comment) as JSON for user-created tables
(ordinary or partitioned).

The tool takes the following input parameters:

- **`table_names`** (optional): Filters by a comma-separated list of names. By
  default, it lists all tables in user schemas.
- **`output_format`** (optional): Indicates the output format of the table
  schema. `simple` returns only the table names; `detailed` returns the full
  table information. Default: `detailed`.

## Example

```yaml
tools:
  postgres_list_tables:
    kind: postgres-list-tables
    source: postgres-source
    description: >-
      Use this tool to retrieve schema information for all or specified tables.
      Output format can be simple (only table names) or detailed.
```

## Reference

| **field**   | **type** | **required** | **description**                                      |
|-------------|:--------:|:------------:|------------------------------------------------------|
| kind        |  string  |     true     | Must be "postgres-list-tables".                      |
| source      |  string  |     true     | Name of the source the SQL should execute on.        |
| description |  string  |     true     | Description of the tool that is passed to the agent. |

```

--------------------------------------------------------------------------------
/internal/tools/clickhouse/clickhouseexecutesql/clickhouseexecutesql_test.go:
--------------------------------------------------------------------------------

```go
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package clickhouse

import (
	"testing"

	yaml "github.com/goccy/go-yaml"
	"github.com/google/go-cmp/cmp"
	"github.com/googleapis/genai-toolbox/internal/server"
	"github.com/googleapis/genai-toolbox/internal/testutils"
)

func TestParseFromYamlClickHouseExecuteSQL(t *testing.T) {
	ctx, err := testutils.ContextWithNewLogger()
	if err != nil {
		t.Fatalf("unexpected error: %s", err)
	}
	tcs := []struct {
		desc string
		in   string
		want server.ToolConfigs
	}{
		{
			desc: "basic example",
			in: `
			tools:
				example_tool:
					kind: clickhouse-execute-sql
					source: my-instance
					description: some description
			`,
			want: server.ToolConfigs{
				"example_tool": Config{
					Name:         "example_tool",
					Kind:         "clickhouse-execute-sql",
					Source:       "my-instance",
					Description:  "some description",
					AuthRequired: []string{},
				},
			},
		},
	}
	for _, tc := range tcs {
		t.Run(tc.desc, func(t *testing.T) {
			got := struct {
				Tools server.ToolConfigs `yaml:"tools"`
			}{}
			err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
			if err != nil {
				t.Fatalf("unable to unmarshal: %s", err)
			}
			if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
				t.Fatalf("incorrect parse: diff %v", diff)
			}
		})
	}
}

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/alloydb/alloydb-list-clusters.md:
--------------------------------------------------------------------------------

```markdown
---
title: alloydb-list-clusters
type: docs
weight: 1
description: "The \"alloydb-list-clusters\" tool lists the AlloyDB clusters in a given project and location.\n"
aliases: [/resources/tools/alloydb-list-clusters]
---

## About

The `alloydb-list-clusters` tool retrieves AlloyDB cluster information for all
or specified locations in a given project. It is compatible with the
[alloydb-admin](../../sources/alloydb-admin.md) source.

The `alloydb-list-clusters` tool lists detailed information about AlloyDB
clusters (cluster name, state, configuration, etc.) for a given project and
location. The tool takes the following input parameters:

| Parameter  | Type   | Description                                                                                      | Required |
| :--------- | :----- | :----------------------------------------------------------------------------------------------- | :------- |
| `project`  | string | The GCP project ID to list clusters for.                                                         | Yes      |
| `location` | string | The location to list clusters in (e.g., 'us-central1'). Use `-` for all locations. Default: `-`. | No       |

## Example

```yaml
tools:
  list_clusters:
    kind: alloydb-list-clusters
    source: alloydb-admin-source
    description: Use this tool to list all AlloyDB clusters in a given project and location.
```

## Reference

| **field**   | **type** | **required** | **description**                                      |
| ----------- | :------: | :----------: | ---------------------------------------------------- |
| kind        |  string  |     true     | Must be alloydb-list-clusters.                       |
| source      |  string  |     true     | The name of an `alloydb-admin` source.               |
| description |  string  |     false    | Description of the tool that is passed to the agent. |

```

--------------------------------------------------------------------------------
/internal/util/orderedmap/orderedmap_test.go:
--------------------------------------------------------------------------------

```go
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//	http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package orderedmap

import (
	"encoding/json"
	"testing"
)

func TestRowMarshalJSON(t *testing.T) {
	tests := []struct {
		name    string
		row     Row
		want    string
		wantErr bool
	}{
		{
			name: "Simple row",
			row: Row{
				Columns: []Column{
					{Name: "A", Value: 1},
					{Name: "B", Value: "two"},
					{Name: "C", Value: true},
				},
			},
			want:    `{"A":1,"B":"two","C":true}`,
			wantErr: false,
		},
		{
			name: "Row with different order",
			row: Row{
				Columns: []Column{
					{Name: "C", Value: true},
					{Name: "A", Value: 1},
					{Name: "B", Value: "two"},
				},
			},
			want:    `{"C":true,"A":1,"B":"two"}`,
			wantErr: false,
		},
		{
			name:    "Empty row",
			row:     Row{},
			want:    `{}`,
			wantErr: false,
		},
		{
			name: "Row with nil value",
			row: Row{
				Columns: []Column{
					{Name: "A", Value: 1},
					{Name: "B", Value: nil},
				},
			},
			want:    `{"A":1,"B":null}`,
			wantErr: false,
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got, err := json.Marshal(tt.row)
			if (err != nil) != tt.wantErr {
				t.Errorf("Row.MarshalJSON() error = %v, wantErr %v", err, tt.wantErr)
				return
			}
			if string(got) != tt.want {
				t.Errorf("Row.MarshalJSON() = %s, want %s", string(got), tt.want)
			}
		})
	}
}

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-add-dashboard-element.md:
--------------------------------------------------------------------------------

```markdown
---
title: "looker-add-dashboard-element"
type: docs
weight: 1
description: >
  "looker-add-dashboard-element" creates a dashboard element in the given dashboard.
aliases:
- /resources/tools/looker-add-dashboard-element
---

## About

The `looker-add-dashboard-element` tool creates a dashboard element
in the given dashboard.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-add-dashboard-element` takes eleven parameters:

1. the `model`
2. the `explore`
3. the `fields` list
4. an optional set of `filters`
5. an optional set of `pivots`
6. an optional set of `sorts`
7. an optional `limit`
8. an optional `tz`
9. an optional `vis_config`
10. the `title`
11. the `dashboard_id`

## Example

```yaml
tools:
    add_dashboard_element:
        kind: looker-add-dashboard-element
        source: looker-source
        description: |
          add_dashboard_element Tool

          This tool creates a new tile in a Looker dashboard using
          the query parameters and the vis_config specified.

          Most of the parameters are the same as the query_url
          tool. In addition, there is a title that may be provided.
          The dashboard_id must be specified. That is obtained
          from calling make_dashboard.

          This tool can be called many times for one dashboard_id
          and the resulting tiles will be added in order.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "looker-add-dashboard-element"             |
| source      |  string  |     true     | Name of the source the tool should use.            |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

```

--------------------------------------------------------------------------------
/internal/tools/cloudsql/cloudsqllistinstances/cloudsqllistinstances_test.go:
--------------------------------------------------------------------------------

```go
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package cloudsqllistinstances

import (
	"testing"

	"github.com/goccy/go-yaml"
	"github.com/google/go-cmp/cmp"
	"github.com/googleapis/genai-toolbox/internal/server"
	"github.com/googleapis/genai-toolbox/internal/testutils"
)

func TestParseFromYaml(t *testing.T) {
	ctx, err := testutils.ContextWithNewLogger()
	if err != nil {
		t.Fatalf("unexpected error: %s", err)
	}
	tcs := []struct {
		desc string
		in   string
		want server.ToolConfigs
	}{
		{
			desc: "basic example",
			in: `
			tools:
				list-my-instances:
					kind: cloud-sql-list-instances
					description: some description
					source: some-source
			`,
			want: server.ToolConfigs{
				"list-my-instances": Config{
					Name:         "list-my-instances",
					Kind:         "cloud-sql-list-instances",
					Description:  "some description",
					AuthRequired: []string{},
					Source:       "some-source",
				},
			},
		},
	}
	for _, tc := range tcs {
		t.Run(tc.desc, func(t *testing.T) {
			got := struct {
				Tools server.ToolConfigs `yaml:"tools"`
			}{}
			// Parse contents
			err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
			if err != nil {
				t.Fatalf("unable to unmarshal: %s", err)
			}
			if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
				t.Fatalf("incorrect parse: diff %v", diff)
			}
		})
	}
}

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/serverless-spark/serverless-spark-cancel-batch.md:
--------------------------------------------------------------------------------

```markdown
---
title: "serverless-spark-cancel-batch"
type: docs
weight: 2
description: >
  A "serverless-spark-cancel-batch" tool cancels a running Spark batch operation.
aliases:
  - /resources/tools/serverless-spark-cancel-batch
---

## About

The `serverless-spark-cancel-batch` tool cancels a running Spark batch
operation in a Google Cloud Serverless for Apache Spark source. The
cancellation request is asynchronous, so the batch state will not change
immediately after the tool returns; it can take a minute or so for the
cancellation to be reflected.

It's compatible with the following sources:

- [serverless-spark](../../sources/serverless-spark.md)

`serverless-spark-cancel-batch` accepts the following parameters:

- **`operation`** (required): The name of the operation to cancel. For example,
  for `projects/my-project/locations/us-central1/operations/my-operation`, you
  would pass `my-operation`.

The tool inherits the `project` and `location` from the source configuration.

## Example

```yaml
tools:
  cancel_spark_batch:
    kind: serverless-spark-cancel-batch
    source: my-serverless-spark-source
    description: Use this tool to cancel a running serverless spark batch operation.
```

## Response Format

```json
"Cancelled [projects/my-project/regions/us-central1/operations/my-operation]."
```

## Reference

| **field**    | **type** | **required** | **description**                                    |
| ------------ | :------: | :----------: | -------------------------------------------------- |
| kind         |  string  |     true     | Must be "serverless-spark-cancel-batch".           |
| source       |  string  |     true     | Name of the source the tool should use.            |
| description  |  string  |     true     | Description of the tool that is passed to the LLM. |
| authRequired | string[] |    false     | List of auth services required to invoke this tool |

```

--------------------------------------------------------------------------------
/Dockerfile:
--------------------------------------------------------------------------------

```dockerfile
# Copyright 2024 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
FROM --platform=$BUILDPLATFORM golang:1 AS build

# Install Zig for CGO cross-compilation
RUN apt-get update && apt-get install -y xz-utils
RUN curl -fL "https://ziglang.org/download/0.15.2/zig-x86_64-linux-0.15.2.tar.xz" -o zig.tar.xz && \
    mkdir -p /zig && \
    tar -xf zig.tar.xz -C /zig --strip-components=1 && \
    rm zig.tar.xz

WORKDIR /go/src/genai-toolbox
COPY . .

ARG TARGETOS
ARG TARGETARCH
ARG BUILD_TYPE="container.dev"
ARG COMMIT_SHA=""

RUN go get ./...

RUN export ZIG_TARGET="" && \
    case "${TARGETARCH}" in \
      ("amd64") ZIG_TARGET="x86_64-linux-gnu" ;; \
      ("arm64") ZIG_TARGET="aarch64-linux-gnu" ;; \
      (*) echo "Unsupported architecture: ${TARGETARCH}" && exit 1 ;; \
    esac && \
    CGO_ENABLED=1 GOOS=${TARGETOS} GOARCH=${TARGETARCH} \
    CC="/zig/zig cc -target ${ZIG_TARGET}" \
    CXX="/zig/zig c++ -target ${ZIG_TARGET}" \
    go build \
    -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=${BUILD_TYPE} -X github.com/googleapis/genai-toolbox/cmd.commitSha=${COMMIT_SHA}" \
    -o genai-toolbox .

# Final Stage
FROM gcr.io/distroless/cc-debian12:nonroot

WORKDIR /app
COPY --from=build --chown=nonroot /go/src/genai-toolbox/genai-toolbox /toolbox
USER nonroot

LABEL io.modelcontextprotocol.server.name="io.github.googleapis/genai-toolbox"

ENTRYPOINT ["/toolbox"] 

```

--------------------------------------------------------------------------------
/docs/en/resources/tools/looker/looker-get-looks.md:
--------------------------------------------------------------------------------

```markdown
---
title: "looker-get-looks"
type: docs
weight: 1
description: >
  "looker-get-looks" searches for saved Looks in a Looker
  source.
aliases:
- /resources/tools/looker-get-looks
---

## About

The `looker-get-looks` tool searches for a saved Look by
name or description.

It's compatible with the following sources:

- [looker](../../sources/looker.md)

`looker-get-looks` takes four parameters: `title`, `desc`, `limit`,
and `offset`.

Title and description use SQL-style wildcards and are case-insensitive.

Limit and offset are used to page through a larger set of matches and
default to 100 and 0, respectively.
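The SQL-style wildcard matching described above (and illustrated by the `dan%`
and `D_m%` examples in the tool description below) can be sketched in Python.
`like_match` is a hypothetical helper for illustration, not part of the Looker
API or this tool:

```python
import re

def like_match(pattern: str, text: str) -> bool:
    """Case-insensitive SQL LIKE: '%' matches any run of characters, '_' matches one."""
    parts = []
    for ch in pattern:
        if ch == "%":
            parts.append(".*")
        elif ch == "_":
            parts.append(".")
        else:
            parts.append(re.escape(ch))
    return re.fullmatch("".join(parts), text, re.IGNORECASE) is not None

# Matches the examples used in the tool description:
print(like_match("dan%", "danger"))  # True
print(like_match("dan%", "Danzig"))  # True
print(like_match("dan%", "David"))   # False
print(like_match("D_m%", "Damage"))  # True
print(like_match("D_m%", "dump"))    # True
```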

## Example

```yaml
tools:
    get_looks:
        kind: looker-get-looks
        source: looker-source
        description: |
          get_looks Tool

          This tool is used to search for saved looks in a Looker instance.
          String search params use case-insensitive matching. String search
          params can contain % and '_' as SQL LIKE pattern match wildcard
          expressions. example="dan%" will match "danger" and "Danzig" but
          not "David" example="D_m%" will match "Damage" and "dump".

          Most search params can accept "IS NULL" and "NOT NULL" as special
          expressions to match or exclude (respectively) rows where the
          column is null.

          The limit and offset are used to paginate the results.

          The result of the get_looks tool is a list of json objects.
```

## Reference

| **field**   | **type** | **required** | **description**                                    |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind        |  string  |     true     | Must be "looker-get-looks"                         |
| source      |  string  |     true     | Name of the source the tool should use.            |
| description |  string  |     true     | Description of the tool that is passed to the LLM. |

```
Page 2/50FirstPrevNextLast