# Directory Structure

```
├── .env.example
├── .github
│   └── workflows
│       ├── publish-mcp.yml
│       ├── pypi-publish.yaml
│       └── release.yml
├── .gitignore
├── .python-version
├── cliff.toml
├── CONTRIBUTING.md
├── docker-compose-elasticsearch.yml
├── docker-compose-opensearch.yml
├── LICENSE
├── Makefile
├── mcp_client
│   ├── python-sdk-anthropic
│   │   ├── __init__.py
│   │   ├── .gitignore
│   │   ├── client.py
│   │   └── config.py
│   └── spring-ai
│       ├── build.gradle
│       ├── gradle
│       │   └── wrapper
│       │       ├── gradle-wrapper.jar
│       │       └── gradle-wrapper.properties
│       ├── gradle.properties
│       ├── gradlew
│       ├── gradlew.bat
│       ├── README.md
│       ├── settings.gradle
│       └── src
│           ├── main
│           │   ├── java
│           │   │   └── spring
│           │   │       └── ai
│           │   │           └── mcp
│           │   │               └── spring_ai_mcp
│           │   │                   └── Application.java
│           │   └── resources
│           │       ├── application.yml
│           │       └── mcp-servers-config.json
│           └── test
│               └── java
│                   └── spring
│                       └── ai
│                           └── mcp
│                               └── spring_ai_mcp
│                                   └── SpringAiMcpApplicationTests.java
├── pyproject.toml
├── README.md
├── server.json
├── src
│   ├── __init__.py
│   ├── clients
│   │   ├── __init__.py
│   │   ├── base.py
│   │   ├── common
│   │   │   ├── __init__.py
│   │   │   ├── alias.py
│   │   │   ├── client.py
│   │   │   ├── cluster.py
│   │   │   ├── data_stream.py
│   │   │   ├── document.py
│   │   │   ├── general.py
│   │   │   └── index.py
│   │   └── exceptions.py
│   ├── server.py
│   ├── tools
│   │   ├── __init__.py
│   │   ├── alias.py
│   │   ├── cluster.py
│   │   ├── data_stream.py
│   │   ├── document.py
│   │   ├── general.py
│   │   ├── index.py
│   │   └── register.py
│   └── version.py
└── uv.lock
```

# Files

--------------------------------------------------------------------------------
/.python-version:
--------------------------------------------------------------------------------

```
1 | 3.10
2 | 
```

--------------------------------------------------------------------------------
/mcp_client/python-sdk-anthropic/.gitignore:
--------------------------------------------------------------------------------

```
1 | .env
2 | 
```

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------

```
 1 | # IDE
 2 | .idea
 3 | .vscode
 4 | .kiro
 5 | 
 6 | # Python
 7 | .venv
 8 | dist
 9 | __pycache__
10 | *.egg-info
11 | 
12 | # Configuration and Credentials
13 | .env
14 | 
15 | # Spring AI
16 | .gradle/
17 | build
18 | 
19 | # MCP Registry
20 | .mcpregistry*
```

--------------------------------------------------------------------------------
/.env.example:
--------------------------------------------------------------------------------

```
 1 | # Elasticsearch connection settings
 2 | ELASTICSEARCH_HOSTS=https://localhost:9200
 3 | ELASTICSEARCH_USERNAME=elastic
 4 | ELASTICSEARCH_PASSWORD=test123
 5 | ELASTICSEARCH_VERIFY_CERTS=false
 6 | 
 7 | # OpenSearch connection settings
 8 | OPENSEARCH_HOSTS=https://localhost:9200
 9 | OPENSEARCH_USERNAME=admin
10 | OPENSEARCH_PASSWORD=admin
11 | OPENSEARCH_VERIFY_CERTS=false
12 | 
```

--------------------------------------------------------------------------------
/mcp_client/spring-ai/README.md:
--------------------------------------------------------------------------------

```markdown
 1 | ## Start MCP Client
 2 | 
 3 | ```bash
 4 | export OPENAI_API_KEY=<your-openai-api-key>
 5 | export OPENAI_BASE_URL=<your-openai-base-url>
 6 | ./gradlew bootRun
 7 | ```
 8 | 
 9 | ## Ask Questions
10 | 
11 | ```bash
12 | USER: List the indices in the cluster.
13 | ASSISTANT: Here are the indices currently present in the cluster:
14 | 
15 | 1. `.internal.alerts-default.alerts-default-000001`
16 | 2. `.internal.alerts-observability.threshold.alerts-default-000001`
17 | 3. `.internal.alerts-ml.anomaly-detection-health.alerts-default-000001`
18 | 4. `.internal.alerts-observability.metrics.alerts-default-000001`
19 | 5. `.internal.alerts-stack.alerts-default-000001`
20 | 6. `.internal.alerts-security.alerts-default-000001`
21 | 7. `.internal.alerts-observability.slo.alerts-default-000001`
22 | 8. `.internal.alerts-ml.anomaly-detection.alerts-default-000001`
23 | 9. `.internal.alerts-transform.health.alerts-default-000001`
24 | 10. `student`
25 | 11. `.internal.alerts-observability.logs.alerts-default-000001`
26 | 12. `.internal.alerts-observability.uptime.alerts-default-000001`
27 | 13. `.internal.alerts-observability.apm.alerts-default-000001`
28 | 
29 | If you need more information about any specific index, feel free to ask!
30 | 
31 | USER: Get student index's mapping.
32 | ASSISTANT: The mapping for the `student` index is as follows:
33 | 
34 | - **Mappings:**
35 |   - **Properties:**
36 |     - **age:** Type is `long`
37 |     - **major:** Type is `text` with a sub-field `keyword` of type `keyword`, and `ignore_above` set to 256.
38 |     - **name:** Type is `text` with a sub-field `keyword` of type `keyword`, and `ignore_above` set to 256.
39 | 
40 | - **Settings:**
41 |   - **Index:**
42 |     - Routing:
43 |       - Allocation:
44 |         - Include:
45 |           - `_tier_preference`: `data_content`
46 |     - Number of shards: `1`
47 |     - Provided name: `student`
48 |     - Creation date: `1745300793755`
49 |     - Number of replicas: `1`
50 |     - UUID: `RIsbajmcSwKr8DGFjL0sMQ`
51 |     - Version created: `8521000`
52 | 
53 | If you need further details or have any other questions, feel free to ask!
54 | ```
```

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------

```markdown
  1 | 
  2 | <!-- mcp-name: io.github.cr7258/elasticsearch-mcp-server -->
  3 | 
  4 | # Elasticsearch/OpenSearch MCP Server
  5 | 
  6 | [![MseeP.ai Security Assessment Badge](https://mseep.net/pr/cr7258-elasticsearch-mcp-server-badge.png)](https://mseep.ai/app/cr7258-elasticsearch-mcp-server)
  7 | 
  8 | [![Trust Score](https://archestra.ai/mcp-catalog/api/badge/quality/cr7258/elasticsearch-mcp-server)](https://archestra.ai/mcp-catalog/cr7258__elasticsearch-mcp-server)
  9 | 
 10 | [MCP Official Registry](https://registry.modelcontextprotocol.io/v0/servers?search=io.github.cr7258/elasticsearch-mcp-server)
 11 | 
 12 | ## Overview
 13 | 
 14 | A Model Context Protocol (MCP) server implementation that provides interaction with Elasticsearch and OpenSearch. This server enables searching documents, analyzing indices, and managing clusters through a set of tools.
 15 | 
 16 | <a href="https://glama.ai/mcp/servers/b3po3delex"><img width="380" height="200" src="https://glama.ai/mcp/servers/b3po3delex/badge" alt="Elasticsearch MCP Server" /></a>
 17 | 
 18 | ## Demo
 19 | 
 20 | https://github.com/user-attachments/assets/f7409e31-fac4-4321-9c94-b0ff2ea7ff15
 21 | 
 22 | ## Features
 23 | 
 24 | ### General Operations
 25 | 
 26 | - `general_api_request`: Perform a general HTTP API request. Use this tool for any Elasticsearch/OpenSearch API that does not have a dedicated tool.
 27 | 
 28 | ### Index Operations
 29 | 
 30 | - `list_indices`: List all indices.
 31 | - `get_index`: Returns information (mappings, settings, aliases) about one or more indices.
 32 | - `create_index`: Create a new index.
 33 | - `delete_index`: Delete an index.
 34 | - `create_data_stream`: Create a new data stream (requires matching index template).
 35 | - `get_data_stream`: Get information about one or more data streams.
 36 | - `delete_data_stream`: Delete one or more data streams and their backing indices.
 37 | 
 38 | ### Document Operations
 39 | 
 40 | - `search_documents`: Search for documents.
 41 | - `index_document`: Creates or updates a document in the index.
 42 | - `get_document`: Get a document by ID.
 43 | - `delete_document`: Delete a document by ID.
 44 | - `delete_by_query`: Deletes documents matching the provided query.
 45 | 
 46 | ### Cluster Operations
 47 | 
 48 | - `get_cluster_health`: Returns basic information about the health of the cluster.
 49 | - `get_cluster_stats`: Returns high-level overview of cluster statistics.
 50 | 
 51 | ### Alias Operations
 52 | 
 53 | - `list_aliases`: List all aliases.
 54 | - `get_alias`: Get alias information for a specific index.
 55 | - `put_alias`: Create or update an alias for a specific index.
 56 | - `delete_alias`: Delete an alias for a specific index.
 57 | 
 58 | ## Configure Environment Variables
 59 | 
 60 | The MCP server supports the following environment variables for authentication:
 61 | 
 62 | ### Basic Authentication (Username/Password)
 63 | - `ELASTICSEARCH_USERNAME`: Username for basic authentication
 64 | - `ELASTICSEARCH_PASSWORD`: Password for basic authentication
 65 | - `OPENSEARCH_USERNAME`: Username for OpenSearch basic authentication
 66 | - `OPENSEARCH_PASSWORD`: Password for OpenSearch basic authentication
 67 | 
 68 | ### API Key Authentication (Elasticsearch only) - Recommended
 69 | - `ELASTICSEARCH_API_KEY`: API key for [Elasticsearch](https://www.elastic.co/docs/deploy-manage/api-keys/elasticsearch-api-keys) or [Elastic Cloud](https://www.elastic.co/docs/deploy-manage/api-keys/elastic-cloud-api-keys) Authentication.
 70 | 
 71 | ### Other Configuration
 72 | - `ELASTICSEARCH_HOSTS` / `OPENSEARCH_HOSTS`: Comma-separated list of hosts (default: `https://localhost:9200`)
 73 | - `ELASTICSEARCH_VERIFY_CERTS` / `OPENSEARCH_VERIFY_CERTS`: Whether to verify SSL certificates (default: `false`)
 74 | 
 75 | ## Start Elasticsearch/OpenSearch Cluster
 76 | 
 77 | Start the Elasticsearch/OpenSearch cluster using Docker Compose:
 78 | 
 79 | ```bash
 80 | # For Elasticsearch
 81 | docker-compose -f docker-compose-elasticsearch.yml up -d
 82 | 
 83 | # For OpenSearch
 84 | docker-compose -f docker-compose-opensearch.yml up -d
 85 | ```
 86 | 
 87 | The default Elasticsearch username is `elastic` and password is `test123`. The default OpenSearch username is `admin` and password is `admin`.
 88 | 
 89 | You can access Kibana/OpenSearch Dashboards from http://localhost:5601.
 90 | 
 91 | ## Stdio
 92 | 
 93 | ### Option 1: Using uvx
 94 | 
 95 | Using `uvx` will automatically install the package from PyPI, so there is no need to clone the repository locally. Add the following configuration to Claude Desktop's config file `claude_desktop_config.json`.
 96 | 
 97 | ```json
 98 | // For Elasticsearch with username/password
 99 | {
100 |   "mcpServers": {
101 |     "elasticsearch-mcp-server": {
102 |       "command": "uvx",
103 |       "args": [
104 |         "elasticsearch-mcp-server"
105 |       ],
106 |       "env": {
107 |         "ELASTICSEARCH_HOSTS": "https://localhost:9200",
108 |         "ELASTICSEARCH_USERNAME": "elastic",
109 |         "ELASTICSEARCH_PASSWORD": "test123"
110 |       }
111 |     }
112 |   }
113 | }
114 | 
115 | // For Elasticsearch with API key
116 | {
117 |   "mcpServers": {
118 |     "elasticsearch-mcp-server": {
119 |       "command": "uvx",
120 |       "args": [
121 |         "elasticsearch-mcp-server"
122 |       ],
123 |       "env": {
124 |         "ELASTICSEARCH_HOSTS": "https://localhost:9200",
125 |         "ELASTICSEARCH_API_KEY": "<YOUR_ELASTICSEARCH_API_KEY>"
126 |       }
127 |     }
128 |   }
129 | }
130 | 
131 | // For OpenSearch
132 | {
133 |   "mcpServers": {
134 |     "opensearch-mcp-server": {
135 |       "command": "uvx",
136 |       "args": [
137 |         "opensearch-mcp-server"
138 |       ],
139 |       "env": {
140 |         "OPENSEARCH_HOSTS": "https://localhost:9200",
141 |         "OPENSEARCH_USERNAME": "admin",
142 |         "OPENSEARCH_PASSWORD": "admin"
143 |       }
144 |     }
145 |   }
146 | }
147 | ```
148 | 
149 | ### Option 2: Using uv with local development
150 | 
151 | Using `uv` requires cloning the repository locally and specifying the path to the source code. Add the following configuration to Claude Desktop's config file `claude_desktop_config.json`.
152 | 
153 | ```json
154 | // For Elasticsearch with username/password
155 | {
156 |   "mcpServers": {
157 |     "elasticsearch-mcp-server": {
158 |       "command": "uv",
159 |       "args": [
160 |         "--directory",
161 |         "path/to/elasticsearch-mcp-server",
162 |         "run",
163 |         "elasticsearch-mcp-server"
164 |       ],
165 |       "env": {
166 |         "ELASTICSEARCH_HOSTS": "https://localhost:9200",
167 |         "ELASTICSEARCH_USERNAME": "elastic",
168 |         "ELASTICSEARCH_PASSWORD": "test123"
169 |       }
170 |     }
171 |   }
172 | }
173 | 
174 | // For Elasticsearch with API key
175 | {
176 |   "mcpServers": {
177 |     "elasticsearch-mcp-server": {
178 |       "command": "uv",
179 |       "args": [
180 |         "--directory",
181 |         "path/to/elasticsearch-mcp-server",
182 |         "run",
183 |         "elasticsearch-mcp-server"
184 |       ],
185 |       "env": {
186 |         "ELASTICSEARCH_HOSTS": "https://localhost:9200",
187 |         "ELASTICSEARCH_API_KEY": "<YOUR_ELASTICSEARCH_API_KEY>"
188 |       }
189 |     }
190 |   }
191 | }
192 | 
193 | // For OpenSearch
194 | {
195 |   "mcpServers": {
196 |     "opensearch-mcp-server": {
197 |       "command": "uv",
198 |       "args": [
199 |         "--directory",
200 |         "path/to/elasticsearch-mcp-server",
201 |         "run",
202 |         "opensearch-mcp-server"
203 |       ],
204 |       "env": {
205 |         "OPENSEARCH_HOSTS": "https://localhost:9200",
206 |         "OPENSEARCH_USERNAME": "admin",
207 |         "OPENSEARCH_PASSWORD": "admin"
208 |       }
209 |     }
210 |   }
211 | }
212 | ```
213 | 
214 | ## SSE
215 | 
216 | ### Option 1: Using uvx
217 | 
218 | ```bash
219 | # export environment variables (with username/password)
220 | export ELASTICSEARCH_HOSTS="https://localhost:9200"
221 | export ELASTICSEARCH_USERNAME="elastic"
222 | export ELASTICSEARCH_PASSWORD="test123"
223 | 
224 | # OR export environment variables (with API key)
225 | export ELASTICSEARCH_HOSTS="https://localhost:9200"
226 | export ELASTICSEARCH_API_KEY="<YOUR_ELASTICSEARCH_API_KEY>"
227 | 
228 | # By default, the SSE MCP server will serve on http://127.0.0.1:8000/sse
229 | uvx elasticsearch-mcp-server --transport sse
230 | 
231 | # The host, port, and path can be specified using the --host, --port, and --path options
232 | uvx elasticsearch-mcp-server --transport sse --host 0.0.0.0 --port 8000 --path /sse
233 | ```
234 | 
235 | ### Option 2: Using uv
236 | 
237 | ```bash
238 | # By default, the SSE MCP server will serve on http://127.0.0.1:8000/sse
239 | uv run src/server.py elasticsearch-mcp-server --transport sse
240 | 
241 | # The host, port, and path can be specified using the --host, --port, and --path options
242 | uv run src/server.py elasticsearch-mcp-server --transport sse --host 0.0.0.0 --port 8000 --path /sse
243 | ```
244 | 
245 | ## Streamable HTTP
246 | 
247 | ### Option 1: Using uvx
248 | 
249 | ```bash
250 | # export environment variables (with username/password)
251 | export ELASTICSEARCH_HOSTS="https://localhost:9200"
252 | export ELASTICSEARCH_USERNAME="elastic"
253 | export ELASTICSEARCH_PASSWORD="test123"
254 | 
255 | # OR export environment variables (with API key)
256 | export ELASTICSEARCH_HOSTS="https://localhost:9200"
257 | export ELASTICSEARCH_API_KEY="<YOUR_ELASTICSEARCH_API_KEY>"
258 | 
259 | # By default, the Streamable HTTP MCP server will serve on http://127.0.0.1:8000/mcp
260 | uvx elasticsearch-mcp-server --transport streamable-http
261 | 
262 | # The host, port, and path can be specified using the --host, --port, and --path options
263 | uvx elasticsearch-mcp-server --transport streamable-http --host 0.0.0.0 --port 8000 --path /mcp
264 | ```
265 | 
266 | ### Option 2: Using uv
267 | 
268 | ```bash
269 | # By default, the Streamable HTTP MCP server will serve on http://127.0.0.1:8000/mcp
270 | uv run src/server.py elasticsearch-mcp-server --transport streamable-http
271 | 
272 | # The host, port, and path can be specified using the --host, --port, and --path options
273 | uv run src/server.py elasticsearch-mcp-server --transport streamable-http --host 0.0.0.0 --port 8000 --path /mcp
274 | ```
275 | 
276 | ## Compatibility
277 | 
278 | The MCP server is compatible with Elasticsearch 7.x, 8.x, and 9.x. By default, it uses the Elasticsearch 8.x client (without a suffix).
279 | 
280 | | MCP Server | Elasticsearch |
281 | | --- | --- |
282 | | elasticsearch-mcp-server-es7 | Elasticsearch 7.x |
283 | | elasticsearch-mcp-server | Elasticsearch 8.x |
284 | | elasticsearch-mcp-server-es9 | Elasticsearch 9.x |
285 | | opensearch-mcp-server | OpenSearch 1.x, 2.x, 3.x |
286 | 
287 | To use the Elasticsearch 7.x client, run the `elasticsearch-mcp-server-es7` variant. For Elasticsearch 9.x, use `elasticsearch-mcp-server-es9`. For example:
288 | 
289 | ```bash
290 | uvx elasticsearch-mcp-server-es7
291 | ```
292 | 
293 | If you want to run different Elasticsearch variants (e.g., 7.x or 9.x) locally, simply update the `elasticsearch` dependency version in `pyproject.toml`, then start the server with:
294 | 
295 | ```bash
296 | uv run src/server.py elasticsearch-mcp-server
297 | ```
298 | 
299 | ## License
300 | 
301 | This project is licensed under the Apache License Version 2.0 - see the [LICENSE](LICENSE) file for details.
302 | 
```
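
The environment-variable handling documented in the README above (comma-separated hosts, basic auth vs. API key) can be sketched in a few lines. This is a minimal illustration, not the project's actual code: `resolve_es_config` is a hypothetical helper, and giving the API key precedence over username/password is an assumption based on the "Recommended" note.

```python
def resolve_es_config(env: dict) -> dict:
    """Build a client configuration dict from the documented environment variables.

    Hypothetical helper for illustration; the real server's parsing may differ.
    """
    hosts = env.get("ELASTICSEARCH_HOSTS", "https://localhost:9200").split(",")
    config = {
        # Hosts are a comma-separated list; strip stray whitespace around each entry.
        "hosts": [h.strip() for h in hosts],
        # Certificate verification defaults to false per the README.
        "verify_certs": env.get("ELASTICSEARCH_VERIFY_CERTS", "false").lower() == "true",
    }
    api_key = env.get("ELASTICSEARCH_API_KEY")
    if api_key:
        # Assumption: the API key wins when both auth methods are set.
        config["api_key"] = api_key
    elif env.get("ELASTICSEARCH_USERNAME") and env.get("ELASTICSEARCH_PASSWORD"):
        config["basic_auth"] = (env["ELASTICSEARCH_USERNAME"], env["ELASTICSEARCH_PASSWORD"])
    return config
```

For example, `resolve_es_config({"ELASTICSEARCH_HOSTS": "https://es1:9200, https://es2:9200"})` yields a two-host configuration with `verify_certs` set to `False`.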

--------------------------------------------------------------------------------
/CONTRIBUTING.md:
--------------------------------------------------------------------------------

```markdown
 1 | # Contributing to Elasticsearch/OpenSearch MCP Server
 2 | 
 3 | Thank you for your interest in contributing to the Elasticsearch/OpenSearch MCP Server! All kinds of contributions are welcome.
 4 | 
 5 | ## Bug reports
 6 | 
 7 | If you think you've found a bug in the Elasticsearch/OpenSearch MCP Server, we welcome your report. It's very helpful if you can provide steps to reproduce the bug, as it makes it easier to identify and fix the issue.
 8 | 
 9 | ## Feature requests
10 | 
11 | If you find yourself wishing for a feature that doesn't exist in the Elasticsearch/OpenSearch MCP Server, you are probably not alone. Don't hesitate to open an issue describing the feature you would like to see, why you need it, and how it should work.
12 | 
13 | ## Pull requests
14 | 
15 | If you have a fix or a new feature, we welcome your pull requests. Follow these steps:
16 | 
17 | 1. Fork your own copy of the repository to your GitHub account by clicking on
18 |    `Fork` button on [elasticsearch-mcp-server's GitHub repository](https://github.com/cr7258/elasticsearch-mcp-server).
19 | 2. Clone the forked repository on your local setup.
20 | 
21 |     ```bash
22 |     git clone https://github.com/$user/elasticsearch-mcp-server
23 |     ```
24 | 
25 |    Add a remote upstream to track upstream `elasticsearch-mcp-server` repository.
26 | 
27 |     ```bash
28 |     git remote add upstream https://github.com/cr7258/elasticsearch-mcp-server
29 |     ```
30 | 
31 | 3. Create a topic branch.
32 | 
33 |     ```bash
34 |     git checkout -b <branch-name>
35 |     ```
36 | 
37 | 4. Make changes and commit it locally.
38 | 
39 |     ```bash
40 |     git add <modifiedFile>
41 |     git commit
42 |     ```
43 | 
44 | A clear commit message helps reviewers understand the purpose of the submitted PR and can accelerate the code review process. We encourage contributors to use **EXPLICIT** commit messages rather than ambiguous ones. In general, we advocate the following commit message types:
45 | - Features: commit messages start with `feat`. For example: "feat: add user authentication module"
46 | - Bug Fixes: commit messages start with `fix`. For example: "fix: resolve null pointer exception in user service"
47 | - Documentation: commit messages start with `doc`. For example: "doc: update API documentation for user endpoints"
48 | - Performance: commit messages start with `perf`. For example: "perf: improve the performance of user service"
49 | - Refactor: commit messages start with `refactor`. For example: "refactor: refactor user service to improve code readability"
50 | - Test: commit messages start with `test`. For example: "test: add unit test for user service"
51 | - Chore: commit messages start with `chore`. For example: "chore: update dependencies in pom.xml"
52 | - Style: commit messages start with `style`. For example: "style: format the code in user service"
53 | - Revert: commit messages start with `revert`. For example: "revert: revert the changes in user service"
54 | 
55 | 5. Push the local branch to your forked repository.
56 | 
57 |     ```bash
58 |     git push
59 |     ```
60 | 
61 | 6. Create a pull request on GitHub.
62 |    Visit your fork at `https://github.com/$user/elasticsearch-mcp-server` and click
63 |    `Compare & Pull Request` button next to your `<branch-name>`.
64 | 
65 | 
66 | ## Keeping branch in sync with upstream
67 | 
68 | Click the `Sync fork` button on your forked repository to keep it in sync with the upstream repository.
69 | 
70 | If you have already created a branch and want to keep it in sync with the upstream repository, follow the below steps:
71 | 
72 | ```bash
73 | git checkout <branch-name>
74 | git fetch upstream
75 | git rebase upstream/main
76 | ```
77 | 
78 | ## Release
79 | 
80 | ```bash
81 | uv sync
82 | source .venv/bin/activate
83 | # example: make release version=v0.0.6
84 | make release version=<RELEASE_VERSION>
85 | ```
```

--------------------------------------------------------------------------------
/mcp_client/python-sdk-anthropic/__init__.py:
--------------------------------------------------------------------------------

```python
1 | 
```

--------------------------------------------------------------------------------
/src/version.py:
--------------------------------------------------------------------------------

```python
1 | __version__ = "2.0.16"
2 | 
```

--------------------------------------------------------------------------------
/mcp_client/spring-ai/gradle.properties:
--------------------------------------------------------------------------------

```
1 | org.gradle.console = plain
2 | 
```

--------------------------------------------------------------------------------
/mcp_client/spring-ai/settings.gradle:
--------------------------------------------------------------------------------

```
1 | rootProject.name = 'spring-ai-mcp'
```

--------------------------------------------------------------------------------
/src/__init__.py:
--------------------------------------------------------------------------------

```python
1 | """
2 | Search MCP Server package.
3 | """
4 | from src.server import elasticsearch_mcp_server, opensearch_mcp_server, run_search_server
5 | 
6 | __all__ = ['elasticsearch_mcp_server', 'opensearch_mcp_server', 'run_search_server']
7 | 
```

--------------------------------------------------------------------------------
/src/clients/common/__init__.py:
--------------------------------------------------------------------------------

```python
1 | from .index import IndexClient
2 | from .document import DocumentClient
3 | from .cluster import ClusterClient
4 | from .alias import AliasClient
5 | 
6 | __all__ = ['IndexClient', 'DocumentClient', 'ClusterClient', 'AliasClient']
7 | 
```

--------------------------------------------------------------------------------
/mcp_client/spring-ai/src/test/java/spring/ai/mcp/spring_ai_mcp/SpringAiMcpApplicationTests.java:
--------------------------------------------------------------------------------

```java
 1 | package spring.ai.mcp.spring_ai_mcp;
 2 | 
 3 | import org.junit.jupiter.api.Test;
 4 | import org.springframework.boot.test.context.SpringBootTest;
 5 | 
 6 | @SpringBootTest
 7 | class SpringAiMcpApplicationTests {
 8 | 
 9 | 	@Test
10 | 	void contextLoads() {
11 | 	}
12 | 
13 | }
14 | 
```

--------------------------------------------------------------------------------
/mcp_client/spring-ai/gradle/wrapper/gradle-wrapper.properties:
--------------------------------------------------------------------------------

```
1 | distributionBase=GRADLE_USER_HOME
2 | distributionPath=wrapper/dists
3 | distributionUrl=https\://services.gradle.org/distributions/gradle-8.13-bin.zip
4 | networkTimeout=10000
5 | validateDistributionUrl=true
6 | zipStoreBase=GRADLE_USER_HOME
7 | zipStorePath=wrapper/dists
8 | 
```

--------------------------------------------------------------------------------
/mcp_client/spring-ai/src/main/resources/mcp-servers-config.json:
--------------------------------------------------------------------------------

```json
 1 | {
 2 |   "mcpServers": {
 3 |     "elasticsearch-mcp-server": {
 4 |       "command": "uvx",
 5 |       "args": [
 6 |         "elasticsearch-mcp-server"
 7 |       ],
 8 |       "env": {
 9 |         "ELASTICSEARCH_HOSTS": "https://localhost:9200",
10 |         "ELASTICSEARCH_USERNAME": "elastic",
11 |         "ELASTICSEARCH_PASSWORD": "test123"
12 |       }
13 |     }
14 |   }
15 | }
16 | 
```

--------------------------------------------------------------------------------
/mcp_client/spring-ai/src/main/resources/application.yml:
--------------------------------------------------------------------------------

```yaml
 1 | spring:
 2 |   application:
 3 |     name: spring-ai-mcp
 4 |   main:
 5 |     web-application-type: none
 6 |   ai:
 7 |     openai:
 8 |       api-key: ${OPENAI_API_KEY}
 9 |       base-url: ${OPENAI_BASE_URL:https://openrouter.ai/api}
10 |       chat:
11 |         options:
12 |           model: ${CHAT_MODEL:openai/gpt-4o}
13 |     mcp:
14 |       client:
15 |         stdio:
16 |           servers-configuration: classpath:mcp-servers-config.json
```

--------------------------------------------------------------------------------
/src/tools/__init__.py:
--------------------------------------------------------------------------------

```python
 1 | from src.tools.alias import AliasTools
 2 | from src.tools.cluster import ClusterTools
 3 | from src.tools.document import DocumentTools
 4 | from src.tools.general import GeneralTools
 5 | from src.tools.index import IndexTools
 6 | from src.tools.register import ToolsRegister
 7 | 
 8 | __all__ = [
 9 |     'AliasTools',
10 |     'ClusterTools',
11 |     'DocumentTools',
12 |     'GeneralTools',
13 |     'IndexTools',
14 |     'ToolsRegister',
15 | ]
16 | 
```

--------------------------------------------------------------------------------
/src/clients/common/cluster.py:
--------------------------------------------------------------------------------

```python
 1 | from typing import Dict
 2 | 
 3 | from src.clients.base import SearchClientBase
 4 | 
 5 | class ClusterClient(SearchClientBase):
 6 |     def get_cluster_health(self) -> Dict:
 7 |         """Get cluster health information."""
 8 |         return self.client.cluster.health()
 9 |     
10 |     def get_cluster_stats(self) -> Dict:
11 |         """Get cluster statistics."""
12 |         return self.client.cluster.stats()
```

--------------------------------------------------------------------------------
/src/clients/common/general.py:
--------------------------------------------------------------------------------

```python
 1 | from typing import Dict, Optional
 2 | 
 3 | from src.clients.base import SearchClientBase
 4 | 
 5 | class GeneralClient(SearchClientBase):
 6 |     def general_api_request(self, method: str, path: str, params: Optional[Dict] = None, body: Optional[Dict] = None):
 7 |         """Perform a general HTTP API request.
 8 |            Use this tool for any Elasticsearch/OpenSearch API that does not have a dedicated tool.
 9 |         """
10 |         return self.general_client.request(method, path, params, body)
11 | 
```

--------------------------------------------------------------------------------
/src/tools/cluster.py:
--------------------------------------------------------------------------------

```python
 1 | from typing import Dict
 2 | 
 3 | from fastmcp import FastMCP
 4 | 
 5 | class ClusterTools:
 6 |     def __init__(self, search_client):
 7 |         self.search_client = search_client
 8 |     def register_tools(self, mcp: FastMCP):
 9 |         @mcp.tool()
10 |         def get_cluster_health() -> Dict:
11 |             """Returns basic information about the health of the cluster."""
12 |             return self.search_client.get_cluster_health()
13 | 
14 |         @mcp.tool()
15 |         def get_cluster_stats() -> Dict:
16 |             """Returns high-level overview of cluster statistics."""
17 |             return self.search_client.get_cluster_stats()
18 | 
```
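
The registration pattern in `ClusterTools` above (a tools class whose `register_tools` attaches closures to a FastMCP instance via `@mcp.tool()`) can be demonstrated without a running MCP server. The `FakeMCP` class below is a stand-in registry for illustration only, not part of FastMCP:

```python
from typing import Callable, Dict

class FakeMCP:
    """Minimal stand-in for FastMCP: @tool() records each function by name."""
    def __init__(self):
        self.tools: Dict[str, Callable] = {}

    def tool(self):
        def decorator(fn):
            self.tools[fn.__name__] = fn
            return fn
        return decorator

class ClusterTools:
    def __init__(self, search_client):
        self.search_client = search_client

    def register_tools(self, mcp):
        # The closure captures self, so the registered tool can reach the client.
        @mcp.tool()
        def get_cluster_health():
            return self.search_client.get_cluster_health()

mcp = FakeMCP()
ClusterTools(search_client=None).register_tools(mcp)
# "get_cluster_health" is now registered and callable by name via mcp.tools.
```

The same shape is what lets `ToolsRegister` loop over tool classes and call `register_tools(mcp)` on each.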

--------------------------------------------------------------------------------
/.github/workflows/publish-mcp.yml:
--------------------------------------------------------------------------------

```yaml
 1 | name: Publish Python MCP Server
 2 | 
 3 | on:
 4 |   workflow_run:
 5 |     workflows: ["PyPI Publish"]
 6 |     types:
 7 |       - completed
 8 | 
 9 | jobs:
10 |   publish:
11 |     runs-on: ubuntu-latest
12 |     permissions:
13 |       id-token: write
14 |       contents: read
15 |     
16 |     steps:
17 |       - uses: actions/checkout@v4
18 |       - name: Install MCP Publisher
19 |         run: |
20 |           curl -L "https://github.com/modelcontextprotocol/registry/releases/download/v1.0.0/mcp-publisher_1.0.0_$(uname -s | tr '[:upper:]' '[:lower:]')_$(uname -m | sed 's/x86_64/amd64/;s/aarch64/arm64/').tar.gz" | tar xz mcp-publisher
21 |       
22 |       - name: Publish to MCP Registry
23 |         run: |
24 |           ./mcp-publisher login github-oidc
25 |           ./mcp-publisher publish
26 | 
```

--------------------------------------------------------------------------------
/src/clients/common/data_stream.py:
--------------------------------------------------------------------------------

```python
 1 | from typing import Dict, Optional
 2 | from src.clients.base import SearchClientBase
 3 | 
 4 | class DataStreamClient(SearchClientBase):  
 5 |     def create_data_stream(self, name: str) -> Dict:
 6 |         """Create a new data stream."""
 7 |         return self.client.indices.create_data_stream(name=name)
 8 |     
 9 |     def get_data_stream(self, name: Optional[str] = None) -> Dict:
10 |         """Get information about one or more data streams."""
11 |         if name:
12 |             return self.client.indices.get_data_stream(name=name)
13 |         else:
14 |             return self.client.indices.get_data_stream()
15 |     
16 |     def delete_data_stream(self, name: str) -> Dict:
17 |         """Delete one or more data streams."""
18 |         return self.client.indices.delete_data_stream(name=name)
19 | 
```

--------------------------------------------------------------------------------
/pyproject.toml:
--------------------------------------------------------------------------------

```toml
 1 | [project]
 2 | name = "elasticsearch-mcp-server"
 3 | version = "2.0.16"
 4 | description = "MCP Server for interacting with Elasticsearch and OpenSearch"
 5 | readme = "README.md"
 6 | requires-python = ">=3.10"
 7 | dependencies = [
 8 |     "elasticsearch==8.17.2",
 9 |     "opensearch-py==2.8.0",
10 |     "mcp==1.9.2",
11 |     "python-dotenv==1.1.0",
12 |     "fastmcp==2.8.0",
13 |     "pydantic>=2.11.0,<2.12.0",
14 |     "anthropic==0.49.0",
15 |     "tomli==2.2.1",
16 |     "tomli-w==1.2.0",
17 | ]
18 | 
19 | [project.license]
20 | file = "LICENSE"
21 | 
22 | [project.scripts]
23 | elasticsearch-mcp-server = "src.server:elasticsearch_mcp_server"
24 | opensearch-mcp-server = "src.server:opensearch_mcp_server"
25 | 
26 | [tool.hatch.build.targets.wheel]
27 | packages = [
28 |     "src",
29 | ]
30 | 
31 | [build-system]
32 | requires = [
33 |     "hatchling",
34 | ]
35 | build-backend = "hatchling.build"
36 | 
```

--------------------------------------------------------------------------------
/src/clients/common/alias.py:
--------------------------------------------------------------------------------

```python
 1 | from typing import Dict
 2 | 
 3 | from src.clients.base import SearchClientBase
 4 | 
 5 | class AliasClient(SearchClientBase):
 6 |     def list_aliases(self) -> Dict:
 7 |         """Get all aliases."""
 8 |         return self.client.cat.aliases()
 9 |     
10 |     def get_alias(self, index: str) -> Dict:
11 |         """Get aliases for the specified index."""
12 |         return self.client.indices.get_alias(index=index)
13 | 
14 |     def put_alias(self, index: str, name: str, body: Dict) -> Dict:
15 |         """Creates or updates an alias."""
16 |         return self.client.indices.put_alias(index=index, name=name, body=body)
17 |     
18 |     def delete_alias(self, index: str, name: str) -> Dict:
19 |         """Delete an alias for the specified index."""
20 |         return self.client.indices.delete_alias(index=index, name=name)
21 | 
```

--------------------------------------------------------------------------------
/src/clients/common/index.py:
--------------------------------------------------------------------------------

```python
 1 | from typing import Dict, Optional
 2 | 
 3 | from src.clients.base import SearchClientBase
 4 | 
 5 | class IndexClient(SearchClientBase):
 6 |     def list_indices(self) -> Dict:
 7 |         """List all indices."""
 8 |         return self.client.cat.indices()
 9 |     
10 |     def get_index(self, index: str) -> Dict:
11 |         """Returns information (mappings, settings, aliases) about one or more indices."""
12 |         return self.client.indices.get(index=index)
13 |     
14 |     def create_index(self, index: str, body: Optional[Dict] = None) -> Dict:
15 |         """Creates an index with optional settings and mappings."""
16 |         return self.client.indices.create(index=index, body=body)
17 |     
18 |     def delete_index(self, index: str) -> Dict:
19 |         """Delete an index."""
20 |         return self.client.indices.delete(index=index)
21 | 
```

--------------------------------------------------------------------------------
/src/tools/general.py:
--------------------------------------------------------------------------------

```python
 1 | from typing import Dict, Optional
 2 | 
 3 | from fastmcp import FastMCP
 4 | 
 5 | class GeneralTools:
 6 |     def __init__(self, search_client):
 7 |         self.search_client = search_client
 8 |     def register_tools(self, mcp: FastMCP):
 9 |         @mcp.tool()
10 |         def general_api_request(method: str, path: str, params: Optional[Dict] = None, body: Optional[Dict] = None):
11 |             """Perform a general HTTP API request.
12 |                Use this tool for any Elasticsearch/OpenSearch API that does not have a dedicated tool.
13 |             
14 |             Args:
15 |                 method: HTTP method (GET, POST, PUT, DELETE, etc.)
16 |                 path: API endpoint path
17 |                 params: Query parameters
18 |                 body: Request body
19 |             """
20 |             return self.search_client.general_api_request(method, path, params, body)
21 | 
```

--------------------------------------------------------------------------------
/mcp_client/spring-ai/build.gradle:
--------------------------------------------------------------------------------

```
 1 | plugins {
 2 | 	id 'java'
 3 | 	id 'org.springframework.boot' version '3.4.4'
 4 | 	id 'io.spring.dependency-management' version '1.1.7'
 5 | }
 6 | 
 7 | group = 'spring.ai.mcp'
 8 | version = '0.0.1-SNAPSHOT'
 9 | 
10 | java {
11 | 	toolchain {
12 | 		languageVersion = JavaLanguageVersion.of(24)
13 | 	}
14 | }
15 | 
16 | repositories {
17 | 	mavenCentral()
18 | }
19 | 
20 | ext {
21 | 	set('springAiVersion', "1.0.0-M7")
22 | }
23 | 
24 | dependencies {
25 | 	implementation 'org.springframework.boot:spring-boot-starter-web'
26 | 	implementation 'org.springframework.ai:spring-ai-starter-model-openai'
27 | 	implementation 'org.springframework.ai:spring-ai-starter-mcp-client'
28 | 	testImplementation 'org.springframework.boot:spring-boot-starter-test'
29 | 	testRuntimeOnly 'org.junit.platform:junit-platform-launcher'
30 | }
31 | 
32 | dependencyManagement {
33 | 	imports {
34 | 		mavenBom "org.springframework.ai:spring-ai-bom:${springAiVersion}"
35 | 	}
36 | }
37 | 
38 | tasks.named('test') {
39 | 	useJUnitPlatform()
40 | }
41 | 
42 | bootRun {
43 |     standardInput = System.in
44 | }
```

--------------------------------------------------------------------------------
/.github/workflows/release.yml:
--------------------------------------------------------------------------------

```yaml
 1 | name: Release
 2 | 
 3 | on:
 4 |   push:
 5 |     tags:
 6 |       - 'v*'
 7 | 
 8 | jobs:
 9 |   release:
10 |     runs-on: ubuntu-latest
11 |     permissions:
12 |       contents: write
13 |     steps:
14 |       - uses: actions/checkout@v4
15 |         with:
16 |           fetch-depth: 0
17 | 
18 |       - name: Set up Python
19 |         uses: actions/setup-python@v4
20 |         with:
21 |           python-version: '3.x'
22 | 
23 |       - name: Install dependencies
24 |         run: |
25 |           python -m pip install --upgrade pip
26 |           pip install git-cliff
27 | 
28 |       - name: Get version from tag
29 |         id: get_version
30 |         run: echo "VERSION=${GITHUB_REF#refs/tags/v}" >> $GITHUB_ENV
31 | 
32 |       - name: Generate changelog
33 |         run: |
34 |           git-cliff --output CHANGELOG.md --latest
35 | 
36 |       - name: Create Release
37 |         uses: softprops/action-gh-release@v1
38 |         with:
39 |           name: v${{ env.VERSION }}
40 |           body_path: CHANGELOG.md
41 |           draft: false
42 |           prerelease: false
43 | 
```

--------------------------------------------------------------------------------
/src/clients/common/client.py:
--------------------------------------------------------------------------------

```python
 1 | from typing import Dict
 2 | 
 3 | from src.clients.common.alias import AliasClient
 4 | from src.clients.common.cluster import ClusterClient
 5 | from src.clients.common.data_stream import DataStreamClient
 6 | from src.clients.common.document import DocumentClient
 7 | from src.clients.common.general import GeneralClient
 8 | from src.clients.common.index import IndexClient
 9 | 
10 | class SearchClient(IndexClient, DocumentClient, ClusterClient, AliasClient, DataStreamClient, GeneralClient):
11 |     """
12 |     Unified search client that combines all search functionality.
13 |     
14 |     This class uses multiple inheritance to combine all specialized client implementations
15 |     (index, document, cluster, alias, data stream, general) into a single unified client.
16 |     """
17 |     
18 |     def __init__(self, config: Dict, engine_type: str):
19 |         """
20 |         Initialize the search client.
21 |         
22 |         Args:
23 |             config: Configuration dictionary with connection parameters
24 |             engine_type: Type of search engine to use ("elasticsearch" or "opensearch")
25 |         """
26 |         super().__init__(config, engine_type)
27 |         self.logger.info(f"Initialized the {engine_type} client")
28 | 
```

--------------------------------------------------------------------------------
/src/clients/__init__.py:
--------------------------------------------------------------------------------

```python
 1 | import os
 2 | 
 3 | from dotenv import load_dotenv
 4 | 
 5 | from src.clients.common.client import SearchClient
 6 | from src.clients.exceptions import handle_search_exceptions
 7 | 
 8 | def create_search_client(engine_type: str) -> SearchClient:
 9 |     """
10 |     Create a search client for the specified engine type.
11 |     
12 |     Args:
13 |         engine_type: Type of search engine to use ("elasticsearch" or "opensearch")
14 |         
15 |     Returns:
16 |         A search client instance
17 |     """
18 |     # Load configuration from environment variables
19 |     load_dotenv()
20 |     
21 |     # Get configuration from environment variables
22 |     prefix = engine_type.upper()
23 |     hosts_str = os.environ.get(f"{prefix}_HOSTS", "https://localhost:9200")
24 |     hosts = [host.strip() for host in hosts_str.split(",")]
25 |     username = os.environ.get(f"{prefix}_USERNAME")
26 |     password = os.environ.get(f"{prefix}_PASSWORD")
27 |     api_key = os.environ.get(f"{prefix}_API_KEY")
28 |     verify_certs = os.environ.get(f"{prefix}_VERIFY_CERTS", "false").lower() == "true"
29 |     
30 |     config = {
31 |         "hosts": hosts,
32 |         "username": username,
33 |         "password": password,
34 |         "api_key": api_key,
35 |         "verify_certs": verify_certs
36 |     }
37 |     
38 |     return SearchClient(config, engine_type)
39 | 
40 | __all__ = [
41 |     'create_search_client',
42 |     'handle_search_exceptions',
43 |     'SearchClient',
44 | ]
45 | 
```
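
The host splitting and boolean parsing above can be sketched in isolation. This is a minimal illustration with the environment passed in as a plain dict so it is easy to exercise; `load_engine_config` is a hypothetical helper, not part of the package — the real `create_search_client` reads `os.environ` and returns a `SearchClient`.

```python
# Sketch of the prefix-based configuration loading in create_search_client,
# with the environment injected as a dict instead of read from os.environ.
def load_engine_config(engine_type: str, env: dict) -> dict:
    prefix = engine_type.upper()
    hosts_str = env.get(f"{prefix}_HOSTS", "https://localhost:9200")
    return {
        "hosts": [host.strip() for host in hosts_str.split(",")],
        "username": env.get(f"{prefix}_USERNAME"),
        "password": env.get(f"{prefix}_PASSWORD"),
        "api_key": env.get(f"{prefix}_API_KEY"),
        "verify_certs": env.get(f"{prefix}_VERIFY_CERTS", "false").lower() == "true",
    }

config = load_engine_config("opensearch", {
    "OPENSEARCH_HOSTS": "https://node1:9200, https://node2:9200",
    "OPENSEARCH_VERIFY_CERTS": "true",
})
```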

--------------------------------------------------------------------------------
/src/tools/register.py:
--------------------------------------------------------------------------------

```python
 1 | import logging
 2 | from typing import List, Type
 3 | 
 4 | from fastmcp import FastMCP
 5 | 
 6 | from src.clients import SearchClient
 7 | from src.clients.exceptions import with_exception_handling
 8 | 
 9 | class ToolsRegister:
10 |     """Class to handle registration of MCP tools."""
11 |     
12 |     def __init__(self, logger: logging.Logger, search_client: SearchClient, mcp: FastMCP):
13 |         """
14 |         Initialize the tools register.
15 |         
16 |         Args:
17 |             logger: Logger instance
18 |             search_client: Search client instance
19 |             mcp: FastMCP instance
20 |         """
21 |         self.logger = logger
22 |         self.search_client = search_client
23 |         self.mcp = mcp
24 |     
25 |     def register_all_tools(self, tool_classes: List[Type]):
26 |         """
27 |         Register all tools with the MCP server.
28 |         
29 |         Args:
30 |             tool_classes: List of tool classes to register
31 |         """
32 |         for tool_class in tool_classes:
33 |             self.logger.info(f"Registering tools from {tool_class.__name__}")
34 |             tool_instance = tool_class(self.search_client)
35 |             
36 |             # Set logger and client attributes
37 |             tool_instance.logger = self.logger
38 |             tool_instance.search_client = self.search_client
39 |             
40 |             # Register tools with automatic exception handling
41 |             with_exception_handling(tool_instance, self.mcp)
42 | 
```
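
The swap-register-restore dance that `register_all_tools` delegates to `with_exception_handling` (defined in `src/clients/exceptions.py`) can be sketched against a stand-in MCP object; `FakeMCP`, `catch_errors`, and `register_with_handling` below are illustrative names, not part of the package.

```python
import functools

# Illustrative stand-in for FastMCP: records registered tools in a dict.
class FakeMCP:
    def __init__(self):
        self.tools = {}

    def tool(self):
        def decorator(func):
            self.tools[func.__name__] = func
            return func
        return decorator

# Simplified handle_search_exceptions: turn exceptions into an error string.
def catch_errors(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as e:
            return f"Unexpected error in {func.__name__}: {e}"
    return wrapper

# Same pattern as with_exception_handling: swap mcp.tool, register, restore.
def register_with_handling(register_tools, mcp):
    original_tool = mcp.tool

    @functools.wraps(original_tool)
    def wrapped_tool(*args, **kwargs):
        decorator = original_tool(*args, **kwargs)

        def combined(func):
            # Exception handler first, then the real registration decorator.
            return decorator(catch_errors(func))
        return combined

    try:
        mcp.tool = wrapped_tool
        register_tools(mcp)
    finally:
        mcp.tool = original_tool  # later registrations stay unwrapped

mcp = FakeMCP()

def register_tools(mcp):
    @mcp.tool()
    def boom():
        raise ValueError("nope")

register_with_handling(register_tools, mcp)
result = mcp.tools["boom"]()
```

Because the replacement happens inside `try`/`finally`, `mcp.tool` is restored even if a `register_tools` call raises.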

--------------------------------------------------------------------------------
/src/tools/index.py:
--------------------------------------------------------------------------------

```python
 1 | from typing import Dict, Optional, List
 2 | 
 3 | from fastmcp import FastMCP
 4 | 
 5 | class IndexTools:
 6 |     def __init__(self, search_client):
 7 |         self.search_client = search_client
 8 |         
 9 |     def register_tools(self, mcp: FastMCP):
10 |         @mcp.tool()
11 |         def list_indices() -> List[Dict]:
12 |             """List all indices."""
13 |             return self.search_client.list_indices()
14 | 
15 |         @mcp.tool()
16 |         def get_index(index: str) -> Dict:
17 |             """
18 |             Returns information (mappings, settings, aliases) about one or more indices.
19 |             
20 |             Args:
21 |                 index: Name of the index
22 |             """
23 |             return self.search_client.get_index(index=index)
24 | 
25 |         @mcp.tool()
26 |         def create_index(index: str, body: Optional[Dict] = None) -> Dict:
27 |             """
28 |             Create a new index.
29 |             
30 |             Args:
31 |                 index: Name of the index
32 |                 body: Optional index configuration including mappings and settings
33 |             """
34 |             return self.search_client.create_index(index=index, body=body)
35 | 
36 |         @mcp.tool()
37 |         def delete_index(index: str) -> Dict:
38 |             """
39 |             Delete an index.
40 |             
41 |             Args:
42 |                 index: Name of the index
43 |             """
44 |             return self.search_client.delete_index(index=index)
```

--------------------------------------------------------------------------------
/src/tools/alias.py:
--------------------------------------------------------------------------------

```python
 1 | from typing import Dict, List
 2 | 
 3 | from fastmcp import FastMCP
 4 | 
 5 | class AliasTools:
 6 |     def __init__(self, search_client):
 7 |         self.search_client = search_client
 8 |     def register_tools(self, mcp: FastMCP):
 9 |         @mcp.tool()
10 |         def list_aliases() -> List[Dict]:
11 |             """List all aliases."""
12 |             return self.search_client.list_aliases()
13 | 
14 |         @mcp.tool()
15 |         def get_alias(index: str) -> Dict:
16 |             """
17 |             Get alias information for a specific index.
18 | 
19 |             Args:
20 |                 index: Name of the index
21 |             """
22 |             return self.search_client.get_alias(index=index)
23 | 
24 |         @mcp.tool()
25 |         def put_alias(index: str, name: str, body: Dict) -> Dict:
26 |             """
27 |             Create or update an alias for a specific index.
28 | 
29 |             Args:
30 |                 index: Name of the index
31 |                 name: Name of the alias
32 |                 body: Alias configuration
33 |             """
34 |             return self.search_client.put_alias(index=index, name=name, body=body)
35 | 
36 |         @mcp.tool()
37 |         def delete_alias(index: str, name: str) -> Dict:
38 |             """
39 |             Delete an alias for a specific index.
40 | 
41 |             Args:
42 |                 index: Name of the index
43 |                 name: Name of the alias
44 |             """
45 |             return self.search_client.delete_alias(index=index, name=name)
46 |         
47 | 
```

--------------------------------------------------------------------------------
/mcp_client/spring-ai/src/main/java/spring/ai/mcp/spring_ai_mcp/Application.java:
--------------------------------------------------------------------------------

```java
 1 | package spring.ai.mcp.spring_ai_mcp;
 2 | 
 3 | import java.util.List;
 4 | import java.util.Scanner;
 5 | 
 6 | import io.modelcontextprotocol.client.McpSyncClient;
 7 | 
 8 | import org.springframework.ai.chat.client.ChatClient;
 9 | import org.springframework.ai.chat.client.advisor.MessageChatMemoryAdvisor;
10 | import org.springframework.ai.chat.memory.InMemoryChatMemory;
11 | import org.springframework.ai.mcp.SyncMcpToolCallbackProvider;
12 | import org.springframework.boot.CommandLineRunner;
13 | import org.springframework.boot.SpringApplication;
14 | import org.springframework.boot.autoconfigure.SpringBootApplication;
15 | import org.springframework.context.annotation.Bean;
16 | 
17 | @SpringBootApplication
18 | public class Application {
19 | 
20 | 	public static void main(String[] args) {
21 | 		SpringApplication.run(Application.class, args);
22 | 	}
23 | 
24 | 	@Bean
25 | 	public CommandLineRunner chatbot(ChatClient.Builder chatClientBuilder, List<McpSyncClient> mcpSyncClients) {
26 | 
27 | 		return args -> {
28 | 
29 | 			var chatClient = chatClientBuilder
30 | 					.defaultSystem("You are a useful assistant and can query Elasticsearch to answer questions.")
31 | 					.defaultTools(new SyncMcpToolCallbackProvider(mcpSyncClients))
32 | 					.defaultAdvisors(new MessageChatMemoryAdvisor(new InMemoryChatMemory()))
33 | 					.build();
34 | 
35 | 			// Start the chat loop
36 | 			System.out.println("\nI am your AI assistant.\n");
37 | 			try (Scanner scanner = new Scanner(System.in)) {
38 | 				while (true) {
39 | 					System.out.print("\nUSER: ");
40 | 					System.out.println("\nASSISTANT: " +
41 | 							chatClient.prompt(scanner.nextLine()) // Get the user input
42 | 									.call()
43 | 									.content());
44 | 				}
45 | 			}
46 | 		};
47 | 	}
48 | }
```

--------------------------------------------------------------------------------
/src/clients/common/document.py:
--------------------------------------------------------------------------------

```python
 1 | from typing import Dict, Optional
 2 | 
 3 | from src.clients.base import SearchClientBase
 4 | 
 5 | class DocumentClient(SearchClientBase):
 6 |     def search_documents(self, index: str, body: Dict) -> Dict:
 7 |         """Search for documents in the index."""
 8 |         return self.client.search(index=index, body=body)
 9 |     
10 |     def index_document(self, index: str, document: Dict, id: Optional[str] = None) -> Dict:
11 |         """Creates a new document in the index."""
12 |         # Handle parameter name differences between Elasticsearch and OpenSearch
13 |         if self.engine_type == "elasticsearch":
14 |             # For Elasticsearch: index(index, document, id=None, ...)
15 |             if id is not None:
16 |                 return self.client.index(index=index, document=document, id=id)
17 |             else:
18 |                 return self.client.index(index=index, document=document)
19 |         else:
20 |             # For OpenSearch: index(index, body, id=None, ...)
21 |             if id is not None:
22 |                 return self.client.index(index=index, body=document, id=id)
23 |             else:
24 |                 return self.client.index(index=index, body=document)
25 |     
26 |     def get_document(self, index: str, id: str) -> Dict:
27 |         """Get a document by ID."""
28 |         return self.client.get(index=index, id=id)
29 |     
30 |     def delete_document(self, index: str, id: str) -> Dict:
31 |         """Removes a document from the index."""
32 |         return self.client.delete(index=index, id=id)
33 | 
34 |     def delete_by_query(self, index: str, body: Dict) -> Dict:
35 |         """Deletes documents matching the provided query."""
36 |         return self.client.delete_by_query(index=index, body=body)
37 | 
38 | 
```
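
The branching in `index_document` can be exercised without a live cluster. `FakeES` and `FakeOS` below are hypothetical stand-ins that mimic only the differing keyword names (`document=` for the Elasticsearch client, `body=` for opensearch-py); everything else is illustrative.

```python
# Stand-in for the Elasticsearch client: index() takes document=.
class FakeES:
    def index(self, index, document, id=None):
        return {"index": index, "payload": document, "id": id}

# Stand-in for the OpenSearch client: index() takes body=.
class FakeOS:
    def index(self, index, body, id=None):
        return {"index": index, "payload": body, "id": id}

# Same dispatch as DocumentClient.index_document.
def index_document(client, engine_type, index, document, id=None):
    kwargs = {"id": id} if id is not None else {}
    if engine_type == "elasticsearch":
        return client.index(index=index, document=document, **kwargs)
    return client.index(index=index, body=document, **kwargs)

es_result = index_document(FakeES(), "elasticsearch", "logs", {"msg": "hi"}, id="1")
os_result = index_document(FakeOS(), "opensearch", "logs", {"msg": "hi"})
```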

--------------------------------------------------------------------------------
/src/tools/data_stream.py:
--------------------------------------------------------------------------------

```python
 1 | from typing import Dict, Optional
 2 | from fastmcp import FastMCP
 3 | 
 4 | class DataStreamTools:
 5 |     def __init__(self, search_client):
 6 |         self.search_client = search_client
 7 |     
 8 |     def register_tools(self, mcp: FastMCP):
 9 |         """Register data stream tools with the MCP server."""
10 |         
11 |         @mcp.tool()
12 |         def create_data_stream(name: str) -> Dict:
13 |             """Create a new data stream.
14 |             
15 |             This creates a new data stream with the specified name.
16 |             The data stream must have a matching index template before creation.
17 |             
18 |             Args:
19 |                 name: Name of the data stream to create
20 |             """
21 |             return self.search_client.create_data_stream(name=name)
22 |         
23 |         @mcp.tool()
24 |         def get_data_stream(name: Optional[str] = None) -> Dict:
25 |             """Get information about one or more data streams.
26 |             
27 |             Retrieves configuration, mappings, settings, and other information
28 |             about the specified data streams.
29 |             
30 |             Args:
31 |                 name: Name of the data stream(s) to retrieve.
32 |                       Can be a comma-separated list or wildcard pattern.
33 |                       If not provided, retrieves all data streams.
34 |             """
35 |             return self.search_client.get_data_stream(name=name)
36 |         
37 |         @mcp.tool()
38 |         def delete_data_stream(name: str) -> Dict:
39 |             """Delete one or more data streams.
40 |             
41 |             Permanently deletes the specified data streams and all their backing indices.
42 |             
43 |             Args:
44 |                 name: Name of the data stream(s) to delete.
45 |                       Can be a comma-separated list or wildcard pattern.
46 |             """
47 |             return self.search_client.delete_data_stream(name=name)
48 | 
```

--------------------------------------------------------------------------------
/.github/workflows/pypi-publish.yaml:
--------------------------------------------------------------------------------

```yaml
 1 | # This workflow will upload Python Packages using uv when a release is created
 2 | # It builds and publishes multiple packages for different Elasticsearch versions
 3 | 
 4 | name: PyPI Publish
 5 | 
 6 | on:
 7 |   workflow_run:
 8 |     workflows: ["Release"]
 9 |     types:
10 |       - completed
11 | 
12 | env:
13 |   UV_PUBLISH_TOKEN: '${{ secrets.PYPI_API_TOKEN }}'
14 | 
15 | jobs:
16 |   deploy:
17 |     runs-on: ubuntu-latest
18 |     if: ${{ github.event.workflow_run.conclusion == 'success' }}
19 |     strategy:
20 |       matrix:
21 |         variant:
22 |           - name: "elasticsearch-mcp-server-es7"
23 |             elasticsearch_version: "7.13.0"
24 |           - name: "elasticsearch-mcp-server"
25 |             elasticsearch_version: "8.17.2"
26 |           - name: "elasticsearch-mcp-server-es9"
27 |             elasticsearch_version: "9.0.0"
28 |           - name: "opensearch-mcp-server"
29 |             elasticsearch_version: "8.17.2"
30 |     steps:
31 |     - uses: actions/checkout@v4
32 | 
33 |     - name: Set up Python
34 |       uses: actions/setup-python@v2
35 |       with:
36 |         python-version: '3.10.x'
37 | 
38 |     - name: Install dependencies
39 |       run: |
40 |         python -m pip install uv
41 |         uv sync
42 | 
43 |     - name: Modify pyproject.toml for ${{ matrix.variant.name }}
44 |       run: |
45 |         # Update package name
46 |         sed -i 's/^name = .*$/name = "${{ matrix.variant.name }}"/' pyproject.toml
47 | 
48 |         # Update elasticsearch version
49 |         sed -i 's/elasticsearch==.*/elasticsearch==${{ matrix.variant.elasticsearch_version }}",/' pyproject.toml
50 |         
51 |         # Update script name to match package name (only for non-opensearch packages)
52 |         if [[ "${{ matrix.variant.name }}" != "opensearch-mcp-server" ]]; then
53 |           sed -i 's/^elasticsearch-mcp-server = /"${{ matrix.variant.name }}" = /' pyproject.toml
54 |         fi
55 | 
56 |     - name: Build ${{ matrix.variant.name }} package
57 |       run: uv build
58 | 
59 |     - name: Publish ${{ matrix.variant.name }} package
60 |       run: uv publish
61 | 
62 |     - name: Clean dist directory
63 |       run: rm -rf dist/*
64 | 
```
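
The `sed` rewrites in the "Modify pyproject.toml" step can be mirrored in Python for clarity. The snippet below is purely an illustration against a two-line stand-in fragment of pyproject.toml, with example values for the matrix variables; it is not part of the workflow.

```python
import re

# Stand-in fragment of pyproject.toml (name line plus pinned dependency).
pyproject = 'name = "elasticsearch-mcp-server"\n    "elasticsearch==8.17.2",\n'

variant_name = "elasticsearch-mcp-server-es7"  # example matrix.variant.name
es_version = "7.13.0"                          # example elasticsearch_version

# Equivalent of: sed -i 's/^name = .*$/name = "..."/'
pyproject = re.sub(r"^name = .*$", f'name = "{variant_name}"',
                   pyproject, flags=re.MULTILINE)
# Equivalent of: sed -i 's/elasticsearch==.*/elasticsearch==...",/'
# (.* stops at end of line, so the trailing '",' is re-appended here)
pyproject = re.sub(r"elasticsearch==.*", f'elasticsearch=={es_version}",',
                   pyproject)
```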

--------------------------------------------------------------------------------
/server.json:
--------------------------------------------------------------------------------

```json
 1 | {
 2 |   "$schema": "https://static.modelcontextprotocol.io/schemas/2025-07-09/server.schema.json",
 3 |   "name": "io.github.cr7258/elasticsearch-mcp-server",
 4 |   "description": "MCP server for interacting with Elasticsearch",
 5 |   "status": "active",
 6 |   "repository": {
 7 |     "url": "https://github.com/cr7258/elasticsearch-mcp-server",
 8 |     "source": "github"
 9 |   },
10 |   "version": "2.0.16",
11 |   "packages": [
12 |     {
13 |       "registry_type": "pypi",
14 |       "registry_base_url": "https://pypi.org",
15 |       "identifier": "elasticsearch-mcp-server",
16 |       "version": "2.0.16",
17 |       "transport": {
18 |         "type": "stdio"
19 |       },
20 |       "environment_variables": [
21 |         {
22 |           "name": "ELASTICSEARCH_HOSTS",
23 |           "description": "Comma-separated list of Elasticsearch hosts (e.g., https://localhost:9200)",
24 |           "is_required": false,
25 |           "format": "string",
26 |           "is_secret": false,
27 |           "default": "https://localhost:9200"
28 |         },
29 |         {
30 |           "name": "ELASTICSEARCH_API_KEY",
31 |           "description": "API key for Elasticsearch or Elastic Cloud authentication (recommended)",
32 |           "is_required": false,
33 |           "format": "string",
34 |           "is_secret": true
35 |         },
36 |         {
37 |           "name": "ELASTICSEARCH_USERNAME",
38 |           "description": "Username for basic authentication (alternative to API key)",
39 |           "is_required": false,
40 |           "format": "string",
41 |           "is_secret": false
42 |         },
43 |         {
44 |           "name": "ELASTICSEARCH_PASSWORD",
45 |           "description": "Password for basic authentication (used with ELASTICSEARCH_USERNAME)",
46 |           "is_required": false,
47 |           "format": "string",
48 |           "is_secret": true
49 |         },
50 |         {
51 |           "name": "ELASTICSEARCH_VERIFY_CERTS",
52 |           "description": "Whether to verify SSL certificates (true/false)",
53 |           "is_required": false,
54 |           "format": "boolean",
55 |           "is_secret": false,
56 |           "default": "false"
57 |         }
58 |       ]
59 |     }
60 |   ]
61 | }
```

--------------------------------------------------------------------------------
/mcp_client/python-sdk-anthropic/config.py:
--------------------------------------------------------------------------------

```python
 1 | from dotenv import load_dotenv
 2 | import logging
 3 | from os import getenv
 4 | import pydantic_settings as py_set
 5 | 
 6 | load_dotenv()
 7 | 
 8 | class LoggerConfig(py_set.BaseSettings):
 9 |     file: str = "logs/notifications_telegram.log"
10 |     format: str = "[{name}]-[%(levelname)s]-[%(asctime)s]-[%(message)s]"
11 |     to_file: bool = getenv("LOG_TO_FILE", "False").lower() == "true"
12 |     to_terminal: bool = getenv("LOG_TO_TERMINAL", "True").lower() == "true"
13 |     file_level: int = logging.DEBUG
14 |     terminal_level: int = logging.INFO
15 | 
16 | 
17 | class ElasticsearchConfig(py_set.BaseSettings):
18 |     host: str = getenv("ELASTICSEARCH_HOSTS", "")
19 |     port: int = int(getenv("ELASTICSEARCH_PORT", "30930"))
20 |     scroll_size: int = 10_000
21 |     scroll: str = "1m"
22 |     timeout: int = 60
23 | 
24 | 
25 | class AnthropicConfig(py_set.BaseSettings):
26 |     model: str = getenv("MODEL", "claude-3-5-sonnet-20241022")
27 |     max_tokens_message: int = int(getenv("MAX_TOKENS_MESSAGE", "1000"))
28 | 
29 | 
30 | class Config(py_set.BaseSettings):
31 |     logger: LoggerConfig
32 |     elasticsearch: ElasticsearchConfig
33 |     anthropic: AnthropicConfig
34 | 
35 | 
36 | def read_config() -> Config:
37 |     logger_config = LoggerConfig()
38 |     elasticsearch_config = ElasticsearchConfig()
39 |     anthropic_config = AnthropicConfig()
40 |     return Config(
41 |         logger=logger_config,
42 |         elasticsearch=elasticsearch_config,
43 |         anthropic=anthropic_config,
44 |     )
45 | 
46 | 
47 | def get_logger(name: str) -> logging.Logger:
48 |     log_config = LoggerConfig()
49 |     logger = logging.getLogger(name)
50 |     logger.setLevel(logging.DEBUG)
51 |     formatter = logging.Formatter(log_config.format.format(name=name))
52 |     if log_config.to_file:
53 |         file_handler = logging.FileHandler(log_config.file, mode="a")
54 |         file_handler.setLevel(log_config.file_level)
55 |         file_handler.setFormatter(formatter)
56 |         logger.addHandler(file_handler)
57 |     if log_config.to_terminal:
58 |         console_handler = logging.StreamHandler()
59 |         console_handler.setLevel(log_config.terminal_level)
60 |         console_handler.setFormatter(formatter)
61 |         logger.addHandler(console_handler)
62 |     return logger
63 | 
```
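
The formatting in `get_logger` happens in two stages: `str.format` fills the `{name}` placeholder once, up front, and `logging.Formatter` fills the `%(...)s` fields per record. A sketch with a reduced template (the `%(asctime)s` field from `LoggerConfig.format` is dropped here to keep the output deterministic):

```python
import io
import logging

# Stage 1: str.format resolves {name} before the Formatter ever runs.
template = "[{name}]-[%(levelname)s]-[%(message)s]"
formatter = logging.Formatter(template.format(name="demo"))

# Stage 2: the Formatter resolves %(levelname)s / %(message)s per record.
logger = logging.getLogger("demo")
logger.setLevel(logging.DEBUG)
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(formatter)
logger.addHandler(handler)

logger.info("hello")
output = stream.getvalue().strip()
```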

--------------------------------------------------------------------------------
/src/tools/document.py:
--------------------------------------------------------------------------------

```python
 1 | from typing import Dict, Optional
 2 | 
 3 | from fastmcp import FastMCP
 4 | 
 5 | class DocumentTools:
 6 |     def __init__(self, search_client):
 7 |         self.search_client = search_client
 8 |     
 9 |     def register_tools(self, mcp: FastMCP):
10 |         @mcp.tool()
11 |         def search_documents(index: str, body: Dict) -> Dict:
12 |             """
13 |             Search for documents.
14 |             
15 |             Args:
16 |                 index: Name of the index
17 |                 body: Search query
18 |             """
19 |             return self.search_client.search_documents(index=index, body=body)
20 |         
21 |         @mcp.tool()
22 |         def index_document(index: str, document: Dict, id: Optional[str] = None) -> Dict:
23 |             """
24 |             Creates or updates a document in the index.
25 |             
26 |             Args:
27 |                 index: Name of the index
28 |                 document: Document data
29 |                 id: Optional document ID
30 |             """
31 |             return self.search_client.index_document(index=index, id=id, document=document)
32 |         
33 |         @mcp.tool()
34 |         def get_document(index: str, id: str) -> Dict:
35 |             """
36 |             Get a document by ID.
37 |             
38 |             Args:
39 |                 index: Name of the index
40 |                 id: Document ID
41 |             """
42 |             return self.search_client.get_document(index=index, id=id)
43 |         
44 |         @mcp.tool()
45 |         def delete_document(index: str, id: str) -> Dict:
46 |             """
47 |             Delete a document by ID.
48 |             
49 |             Args:
50 |                 index: Name of the index
51 |                 id: Document ID
52 |             """
53 |             return self.search_client.delete_document(index=index, id=id)
54 |         
55 |         @mcp.tool()
56 |         def delete_by_query(index: str, body: Dict) -> Dict:
57 |             """
58 |             Deletes documents matching the provided query.
59 |             
60 |             Args:
61 |                 index: Name of the index
62 |                 body: Query to match documents for deletion
63 |             """
64 |             return self.search_client.delete_by_query(index=index, body=body)
65 | 
```

--------------------------------------------------------------------------------
/src/clients/exceptions.py:
--------------------------------------------------------------------------------

```python
 1 | import functools
 2 | import logging
 3 | from typing import TypeVar, Callable
 4 | 
 5 | from fastmcp import FastMCP
 6 | from mcp.types import TextContent
 7 | 
 8 | T = TypeVar('T')
 9 | 
10 | def handle_search_exceptions(func: Callable[..., T]) -> Callable[..., list[TextContent]]:
11 |     """
12 |     Decorator to handle exceptions in search client operations.
13 |     
14 |     Args:
15 |         func: The function to decorate
16 |         
17 |     Returns:
18 |         Decorated function that handles exceptions
19 |     """
20 |     @functools.wraps(func)
21 |     def wrapper(*args, **kwargs):
22 |         logger = logging.getLogger()   
23 |         try:
24 |             return func(*args, **kwargs)
25 |         except Exception as e:
26 |             logger.error(f"Unexpected error in {func.__name__}: {e}")
27 |             return [TextContent(type="text", text=f"Unexpected error in {func.__name__}: {str(e)}")]
28 |     
29 |     return wrapper
30 | 
31 | def with_exception_handling(tool_instance: object, mcp: FastMCP) -> None:
32 |     """
33 |     Register tools from a tool instance with automatic exception handling applied to all tools.
34 |     
35 |     This function temporarily replaces mcp.tool with a wrapped version that automatically
36 |     applies the handle_search_exceptions decorator to all registered tool methods.
37 |     
38 |     Args:
39 |         tool_instance: The tool instance that has a register_tools method
40 |         mcp: The FastMCP instance used for tool registration
41 |     """
42 |     # Save the original tool method
43 |     original_tool = mcp.tool
44 | 
45 |     @functools.wraps(original_tool)
46 |     def wrapped_tool(*args, **kwargs):
47 |         # Get the original decorator
48 |         decorator = original_tool(*args, **kwargs)
49 | 
50 |         # Return a new decorator that applies both the exception handler and original decorator
51 |         def combined_decorator(func):
52 |             # First apply the exception handling decorator
53 |             wrapped_func = handle_search_exceptions(func)
54 |             # Then apply the original mcp.tool decorator
55 |             return decorator(wrapped_func)
56 | 
57 |         return combined_decorator
58 | 
59 |     try:
60 |         # Temporarily replace mcp.tool with our wrapped version
61 |         mcp.tool = wrapped_tool
62 | 
63 |         # Call the registration method on the tool instance
64 |         tool_instance.register_tools(mcp)
65 |     finally:
66 |         # Restore the original mcp.tool to avoid affecting other code that might use mcp.tool
67 |         # This ensures that our modification is isolated to just this tool registration
68 |         # and prevents multiple nested decorators if register_all_tools is called multiple times
69 |         mcp.tool = original_tool
```
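The behavior of `handle_search_exceptions` can be exercised in isolation. A minimal sketch, with a plain string standing in for `TextContent` (which lives in the external `mcp` package):

```python
import functools

def handle_exceptions(func):
    # Same pattern as handle_search_exceptions above: swallow any
    # exception and return it as text instead of propagating.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as e:
            return f"Unexpected error in {func.__name__}: {e}"
    return wrapper

@handle_exceptions
def divide(a: float, b: float) -> float:
    return a / b

print(divide(6, 2))  # 3.0
print(divide(1, 0))  # Unexpected error in divide: division by zero
```

Because `functools.wraps` copies `__name__` onto the wrapper, the error message names the decorated function rather than `wrapper`.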

--------------------------------------------------------------------------------
/mcp_client/spring-ai/gradlew.bat:
--------------------------------------------------------------------------------

```
 1 | @rem
 2 | @rem Copyright 2015 the original author or authors.
 3 | @rem
 4 | @rem Licensed under the Apache License, Version 2.0 (the "License");
 5 | @rem you may not use this file except in compliance with the License.
 6 | @rem You may obtain a copy of the License at
 7 | @rem
 8 | @rem      https://www.apache.org/licenses/LICENSE-2.0
 9 | @rem
10 | @rem Unless required by applicable law or agreed to in writing, software
11 | @rem distributed under the License is distributed on an "AS IS" BASIS,
12 | @rem WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
13 | @rem See the License for the specific language governing permissions and
14 | @rem limitations under the License.
15 | @rem
16 | @rem SPDX-License-Identifier: Apache-2.0
17 | @rem
18 | 
19 | @if "%DEBUG%"=="" @echo off
20 | @rem ##########################################################################
21 | @rem
22 | @rem  Gradle startup script for Windows
23 | @rem
24 | @rem ##########################################################################
25 | 
26 | @rem Set local scope for the variables with windows NT shell
27 | if "%OS%"=="Windows_NT" setlocal
28 | 
29 | set DIRNAME=%~dp0
30 | if "%DIRNAME%"=="" set DIRNAME=.
31 | @rem This is normally unused
32 | set APP_BASE_NAME=%~n0
33 | set APP_HOME=%DIRNAME%
34 | 
35 | @rem Resolve any "." and ".." in APP_HOME to make it shorter.
36 | for %%i in ("%APP_HOME%") do set APP_HOME=%%~fi
37 | 
38 | @rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
39 | set DEFAULT_JVM_OPTS="-Xmx64m" "-Xms64m"
40 | 
41 | @rem Find java.exe
42 | if defined JAVA_HOME goto findJavaFromJavaHome
43 | 
44 | set JAVA_EXE=java.exe
45 | %JAVA_EXE% -version >NUL 2>&1
46 | if %ERRORLEVEL% equ 0 goto execute
47 | 
48 | echo. 1>&2
49 | echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH. 1>&2
50 | echo. 1>&2
51 | echo Please set the JAVA_HOME variable in your environment to match the 1>&2
52 | echo location of your Java installation. 1>&2
53 | 
54 | goto fail
55 | 
56 | :findJavaFromJavaHome
57 | set JAVA_HOME=%JAVA_HOME:"=%
58 | set JAVA_EXE=%JAVA_HOME%/bin/java.exe
59 | 
60 | if exist "%JAVA_EXE%" goto execute
61 | 
62 | echo. 1>&2
63 | echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME% 1>&2
64 | echo. 1>&2
65 | echo Please set the JAVA_HOME variable in your environment to match the 1>&2
66 | echo location of your Java installation. 1>&2
67 | 
68 | goto fail
69 | 
70 | :execute
71 | @rem Setup the command line
72 | 
73 | set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar
74 | 
75 | 
76 | @rem Execute Gradle
77 | "%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %*
78 | 
79 | :end
80 | @rem End local scope for the variables with windows NT shell
81 | if %ERRORLEVEL% equ 0 goto mainEnd
82 | 
83 | :fail
84 | rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
85 | rem the _cmd.exe /c_ return code!
86 | set EXIT_CODE=%ERRORLEVEL%
87 | if %EXIT_CODE% equ 0 set EXIT_CODE=1
88 | if not ""=="%GRADLE_EXIT_CONSOLE%" exit %EXIT_CODE%
89 | exit /b %EXIT_CODE%
90 | 
91 | :mainEnd
92 | if "%OS%"=="Windows_NT" endlocal
93 | 
94 | :omega
95 | 
```

--------------------------------------------------------------------------------
/docker-compose-opensearch.yml:
--------------------------------------------------------------------------------

```yaml
 1 | services:
 2 |   opensearch-node1: # This is also the hostname of the container within the Docker network (i.e. https://opensearch-node1/)
 3 |     image: opensearchproject/opensearch:2.11.0
 4 |     container_name: opensearch-node1
 5 |     environment:
 6 |       - cluster.name=opensearch-cluster # Name the cluster
 7 |       - node.name=opensearch-node1 # Name the node that will run in this container
 8 |       - discovery.seed_hosts=opensearch-node1,opensearch-node2 # Nodes to look for when discovering the cluster
 9 |       - cluster.initial_cluster_manager_nodes=opensearch-node1,opensearch-node2 # Nodes eligible to serve as cluster manager
10 |       - bootstrap.memory_lock=true # Disable JVM heap memory swapping
11 |       - "OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx512m" # Set min and max JVM heap sizes to at least 50% of system RAM
12 |       - cluster.routing.allocation.disk.watermark.low=2gb
13 |       - cluster.routing.allocation.disk.watermark.high=1gb
14 |       - cluster.routing.allocation.disk.watermark.flood_stage=512mb
15 |     ulimits:
16 |       memlock:
17 |         soft: -1 # Set memlock to unlimited (no soft or hard limit)
18 |         hard: -1
19 |       nofile:
20 |         soft: 65536 # Maximum number of open files for the opensearch user - set to at least 65536
21 |         hard: 65536
22 |     volumes:
23 |       - opensearch-data1:/usr/share/opensearch/data # Creates volume called opensearch-data1 and mounts it to the container
24 |     ports:
25 |       - 9200:9200 # REST API
26 |       - 9600:9600 # Performance Analyzer
27 |     networks:
28 |       - opensearch-net # All of the containers will join the same Docker bridge network
29 |   opensearch-node2:
30 |     image: opensearchproject/opensearch:2.11.0 # This should be the same image used for opensearch-node1 to avoid issues
31 |     container_name: opensearch-node2
32 |     environment:
33 |       - cluster.name=opensearch-cluster
34 |       - node.name=opensearch-node2
35 |       - discovery.seed_hosts=opensearch-node1,opensearch-node2
36 |       - cluster.initial_cluster_manager_nodes=opensearch-node1,opensearch-node2
37 |       - bootstrap.memory_lock=true
38 |       - "OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx512m"
39 |       - cluster.routing.allocation.disk.watermark.low=2gb
40 |       - cluster.routing.allocation.disk.watermark.high=1gb
41 |       - cluster.routing.allocation.disk.watermark.flood_stage=512mb
42 |     ulimits:
43 |       memlock:
44 |         soft: -1
45 |         hard: -1
46 |       nofile:
47 |         soft: 65536
48 |         hard: 65536
49 |     volumes:
50 |       - opensearch-data2:/usr/share/opensearch/data
51 |     networks:
52 |       - opensearch-net
53 |   opensearch-dashboards:
54 |     image: opensearchproject/opensearch-dashboards:2.11.0 # Make sure the version of opensearch-dashboards matches the version of opensearch installed on other nodes
55 |     container_name: opensearch-dashboards
56 |     ports:
57 |       - 5601:5601 # Map host port 5601 to container port 5601
58 |     expose:
59 |       - "5601" # Expose port 5601 for web access to OpenSearch Dashboards
60 |     environment:
61 |       OPENSEARCH_HOSTS: '["https://opensearch-node1:9200","https://opensearch-node2:9200"]' # Define the OpenSearch nodes that OpenSearch Dashboards will query
62 |     networks:
63 |       - opensearch-net
64 | 
65 | volumes:
66 |   opensearch-data1:
67 |   opensearch-data2:
68 | 
69 | networks:
70 |   opensearch-net:
71 |   
```

--------------------------------------------------------------------------------
/cliff.toml:
--------------------------------------------------------------------------------

```toml
  1 | # git-cliff ~ configuration file
  2 | # https://git-cliff.org/docs/configuration
  3 | 
  4 | [changelog]
  5 | # template for the changelog header
  6 | header = """
  7 | # Changelog\n
  8 | """
  9 | # template for the changelog body
 10 | # https://keats.github.io/tera/docs/#introduction
 11 | body = """
 12 | {% if version %}\
 13 |     {% if previous.version %}\
 14 |         ## [{{ version | trim_start_matches(pat="v") }}]($REPO/compare/{{ previous.version }}..{{ version }}) - {{ timestamp | date(format="%Y-%m-%d") }}
 15 |     {% else %}\
 16 |         ## [{{ version | trim_start_matches(pat="v") }}] - {{ timestamp | date(format="%Y-%m-%d") }}
 17 |     {% endif %}\
 18 | {% else %}\
 19 |     ## [unreleased]
 20 | {% endif %}\
 21 | {% for group, commits in commits | group_by(attribute="group") %}
 22 |     ### {{ group | striptags | trim | upper_first }}
 23 |     {% for commit in commits
 24 |     | filter(attribute="scope")
 25 |     | sort(attribute="scope") %}
 26 |         - **({{commit.scope}})**{% if commit.breaking %} [**breaking**]{% endif %} \
 27 |             {{ commit.message }} - ([{{ commit.id | truncate(length=7, end="") }}]($REPO/commit/{{ commit.id }})) - @{{ commit.author.name }}
 28 |     {%- endfor -%}
 29 |     {% raw %}\n{% endraw %}\
 30 |     {%- for commit in commits %}
 31 |         {%- if commit.scope -%}
 32 |         {% else -%}
 33 |             - {% if commit.breaking %} [**breaking**]{% endif %}\
 34 |                 {{ commit.message }} - ([{{ commit.id | truncate(length=7, end="") }}]($REPO/commit/{{ commit.id }})) - @{{ commit.author.name }}
 35 |         {% endif -%}
 36 |     {% endfor -%}
 37 | {% endfor %}\n
 38 | """
 39 | # template for the changelog footer
 40 | footer = """
 41 | <!-- generated by git-cliff -->
 42 | """
 43 | # remove the leading and trailing whitespace from the templates
 44 | trim = true
 45 | # postprocessors
 46 | postprocessors = [
 47 |     { pattern = '\$REPO', replace = "https://github.com/cr7258/elasticsearch-mcp-server" }, # replace repository URL
 48 | ]
 49 | 
 50 | [git]
 51 | # parse the commits based on https://www.conventionalcommits.org
 52 | conventional_commits = true
 53 | # filter out the commits that are not conventional
 54 | filter_unconventional = true
 55 | # process each line of a commit as an individual commit
 56 | split_commits = false
 57 | # regex for preprocessing the commit messages
 58 | commit_preprocessors = [
 59 |     # { pattern = '\((\w+\s)?#([0-9]+)\)', replace = "([#${2}](https://github.com/cr7258/elasticsearch-mcp-server/issues/${2}))"}, # replace issue numbers
 60 | ]
 61 | # regex for parsing and grouping commits
 62 | commit_parsers = [
 63 |   { message = "^feat", group = "<!-- 0 -->⛰️  Features" },
 64 |   { message = "^fix", group = "<!-- 1 -->🐛 Bug Fixes" },
 65 |   { message = "^doc", group = "<!-- 3 -->📚 Documentation" },
 66 |   { message = "^perf", group = "<!-- 4 -->⚡ Performance" },
 67 |   { message = "^refactor\\(clippy\\)", skip = true },
 68 |   { message = "^refactor", group = "<!-- 2 -->🚜 Refactor" },
 69 |   { message = "^style", group = "<!-- 5 -->🎨 Styling" },
 70 |   { message = "^test", group = "<!-- 6 -->🧪 Testing" },
 71 |   { message = "^chore\\(release\\): prepare for", skip = true },
 72 |   { message = "^chore\\(deps.*\\)", skip = true },
 73 |   { message = "^chore\\(pr\\)", skip = true },
 74 |   { message = "^chore\\(pull\\)", skip = true },
 75 |   { message = "^chore\\(npm\\).*yarn\\.lock", skip = true },
 76 |   { message = "^chore|^ci", group = "<!-- 7 -->⚙️ Miscellaneous Tasks" },
 77 |   { body = ".*security", group = "<!-- 8 -->🛡️ Security" },
 78 |   { message = "^revert", group = "<!-- 9 -->◀️ Revert" },
 79 | ]
 80 | 
 81 | # filter out the commits that are not matched by commit parsers
 82 | filter_commits = false
 83 | # sort the tags topologically
 84 | topo_order = false
 85 | # sort the commits inside sections by oldest/newest order
 86 | sort_commits = "oldest"
 87 | # regex for matching git tags
 88 | tag_pattern = "^v[0-9]"
 89 | # regex for skipping tags
 90 | skip_tags = ""
 91 | # regex for ignoring tags
 92 | ignore_tags = ""
 93 | # use tag date instead of commit date
 94 | date_order = true
 95 | # path to git binary
 96 | git_path = "git"
 97 | # whether to use relaxed or strict semver parsing
 98 | relaxed_semver = true
 99 | # only show the changes for the current version
100 | tag_range = true
101 | 
```

--------------------------------------------------------------------------------
/mcp_client/python-sdk-anthropic/client.py:
--------------------------------------------------------------------------------

```python
  1 | """
  2 | Client example copied from https://modelcontextprotocol.io/quickstart/client
  3 | """
  4 | 
  5 | import asyncio
  6 | from typing import Optional
  7 | from contextlib import AsyncExitStack
  8 | 
  9 | from mcp import ClientSession, StdioServerParameters
 10 | from mcp.client.stdio import stdio_client
 11 | 
 12 | from anthropic import Anthropic
 13 | 
 14 | from config import get_logger, read_config
 15 | 
 16 | logger = get_logger(__name__)
 17 | 
 18 | 
 19 | class MCPClient:
 20 |     def __init__(self):
 21 |         self.session: Optional[ClientSession] = None
 22 |         self.exit_stack = AsyncExitStack()
 23 |         self.anthropic = Anthropic()
 24 |         self.config = read_config()
 25 | 
 26 |     async def connect_to_server(self, server_script_path: str):
 27 |         """Connect to an MCP server
 28 | 
 29 |         Args:
 30 |             server_script_path: Path to the server script (.py or .js)
 31 |         """
 32 |         is_python = server_script_path.endswith('.py')
 33 |         is_js = server_script_path.endswith('.js')
 34 |         if not (is_python or is_js):
 35 |             raise ValueError("Server script must be a .py or .js file")
 36 | 
 37 |         command = "python" if is_python else "node"
 38 |         server_params = StdioServerParameters(command=command, args=[server_script_path], env=None)
 39 | 
 40 |         stdio_transport = await self.exit_stack.enter_async_context(stdio_client(server_params))
 41 |         self.stdio, self.write = stdio_transport
 42 |         self.session = await self.exit_stack.enter_async_context(ClientSession(self.stdio, self.write))
 43 | 
 44 |         if self.session is not None:
 45 |             await self.session.initialize()
 46 |             response = await self.session.list_tools()
 47 |             tools = response.tools
 48 |             logger.info(f"\nConnected to server with tools: {', '.join(tool.name for tool in tools)}")
 49 | 
 50 |     async def process_query(self, query: str) -> str:
 51 |         """Process a query using Claude and available tools"""
 52 |         messages = [
 53 |             {
 54 |                 "role": "user",
 55 |                 "content": query
 56 |             }
 57 |         ]
 58 | 
 59 |         response = await self.session.list_tools()
 60 |         available_tools = [{
 61 |             "name": tool.name,
 62 |             "description": tool.description,
 63 |             "input_schema": tool.inputSchema
 64 |         } for tool in response.tools]
 65 | 
 66 |         # Initial Claude API call
 67 |         response = self.anthropic.messages.create(
 68 |             model=self.config.anthropic.model,
 69 |             max_tokens=self.config.anthropic.max_tokens_message,
 70 |             messages=messages,
 71 |             tools=available_tools
 72 |         )
 73 | 
 74 |         # Process response and handle tool calls
 75 |         final_text = []
 76 | 
 77 |         assistant_message_content = []
 78 |         for content in response.content:
 79 |             if content.type == 'text':
 80 |                 final_text.append(content.text)
 81 |                 assistant_message_content.append(content)
 82 |             elif content.type == 'tool_use':
 83 |                 tool_name = content.name
 84 |                 tool_args = content.input
 85 | 
 86 |                 # Execute tool call
 87 |                 result = await self.session.call_tool(tool_name, tool_args)
 88 |                 final_text.append(f"[Calling tool {tool_name} with args {tool_args}]")
 89 | 
 90 |                 assistant_message_content.append(content)
 91 |                 messages.append({
 92 |                     "role": "assistant",
 93 |                     "content": assistant_message_content
 94 |                 })
 95 |                 messages.append({
 96 |                     "role": "user",
 97 |                     "content": [
 98 |                         {
 99 |                             "type": "tool_result",
100 |                             "tool_use_id": content.id,
101 |                             "content": result.content
102 |                         }
103 |                     ]
104 |                 })
105 | 
106 |                 # Get next response from Claude
107 |                 response = self.anthropic.messages.create(
108 |                     model=self.config.anthropic.model,
109 |                     max_tokens=self.config.anthropic.max_tokens_message,
110 |                     messages=messages,
111 |                     tools=available_tools
112 |                 )
113 | 
114 |                 final_text.append(response.content[0].text)
115 | 
116 |         return "\n".join(final_text)
117 | 
118 |     async def chat_loop(self):
119 |         """Run an interactive chat loop"""
120 |         logger.info("\nMCP Client Started!")
121 |         logger.info("Type your queries or 'quit' to exit.")
122 | 
123 |         while True:
124 |             try:
125 |                 query = input("\nQuery: ").strip()
126 | 
127 |                 if query.lower() == 'quit':
128 |                     break
129 | 
130 |                 response = await self.process_query(query)
131 |                 logger.info("\n" + response)
132 | 
133 |             except Exception as e:
134 |                 logger.error(f"\nError: {str(e)}")
135 | 
136 |     async def cleanup(self):
137 |         """Clean up resources"""
138 |         await self.exit_stack.aclose()
139 | 
140 | 
141 | async def main():
142 |     if len(sys.argv) < 2:
143 |         logger.error("Usage: python client.py <path_to_server_script>")
144 |         return
145 | 
146 |     client = MCPClient()
147 |     try:
148 |         await client.connect_to_server(sys.argv[1])
149 |         logger.info(f"Connected to the server: {sys.argv[1]}.")
150 |         await client.chat_loop()
151 |     finally:
152 |         await client.cleanup()
153 |         logger.info(f"Disconnected from the server: {sys.argv[1]}.")
154 | 
155 | 
156 | if __name__ == "__main__":
157 |     import sys
158 |     asyncio.run(main())
159 | 
```
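The tool-result turn that `process_query` appends after each `tool_use` block has a fixed shape under the Anthropic Messages API. A small helper sketch (the helper name and sample IDs are hypothetical):

```python
def tool_result_message(tool_use_id: str, content) -> dict:
    # Shape of the "user" turn that feeds a tool's output back to
    # Claude, mirroring the messages.append(...) call in process_query.
    return {
        "role": "user",
        "content": [
            {
                "type": "tool_result",
                "tool_use_id": tool_use_id,
                "content": content,
            }
        ],
    }

msg = tool_result_message("toolu_abc123", "2 documents found")
print(msg["content"][0]["type"])  # tool_result
```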

--------------------------------------------------------------------------------
/src/server.py:
--------------------------------------------------------------------------------

```python
  1 | import logging
  2 | import sys
  3 | import argparse
  4 | 
  5 | from fastmcp import FastMCP
  6 | 
  7 | from src.clients import create_search_client
  8 | from src.tools.alias import AliasTools
  9 | from src.tools.cluster import ClusterTools
 10 | from src.tools.data_stream import DataStreamTools
 11 | from src.tools.document import DocumentTools
 12 | from src.tools.general import GeneralTools
 13 | from src.tools.index import IndexTools
 14 | from src.tools.register import ToolsRegister
 15 | from src.version import __version__ as VERSION
 16 | 
 17 | class SearchMCPServer:
 18 |     def __init__(self, engine_type):
 19 |         # Set engine type
 20 |         self.engine_type = engine_type
 21 |         self.name = f"{self.engine_type}-mcp-server"
 22 |         self.mcp = FastMCP(self.name)
 23 |         
 24 |         # Configure logging
 25 |         logging.basicConfig(
 26 |             level=logging.INFO,
 27 |             format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
 28 |         )
 29 |         self.logger = logging.getLogger(__name__)
 30 |         self.logger.info(f"Initializing {self.name}, Version: {VERSION}")
 31 |         
 32 |         # Create the corresponding search client
 33 |         self.search_client = create_search_client(self.engine_type)
 34 |         
 35 |         # Initialize tools
 36 |         self._register_tools()
 37 | 
 38 |     def _register_tools(self):
 39 |         """Register all MCP tools."""
 40 |         # Create a tools register
 41 |         register = ToolsRegister(self.logger, self.search_client, self.mcp)
 42 |         
 43 |         # Define all tool classes to register
 44 |         tool_classes = [
 45 |             IndexTools,
 46 |             DocumentTools,
 47 |             ClusterTools,
 48 |             AliasTools,
 49 |             DataStreamTools,
 50 |             GeneralTools,
 51 |         ]        
 52 |         # Register all tools
 53 |         register.register_all_tools(tool_classes)
 54 | 
 55 | 
 56 | def run_search_server(engine_type, transport, host, port, path):
 57 |     """Run search server with specified engine type and transport options.
 58 |     
 59 |     Args:
 60 |         engine_type: Type of search engine to use ("elasticsearch" or "opensearch")
 61 |         transport: Transport protocol to use ("stdio", "streamable-http", or "sse")
 62 |         host: Host to bind to when using HTTP transports
 63 |         port: Port to bind to when using HTTP transports
 64 |         path: URL path prefix for HTTP transports
 65 |     """
 66 |     
 67 |     server = SearchMCPServer(engine_type=engine_type)
 68 |     
 69 |     if transport in ["streamable-http", "sse"]:
 70 |         server.logger.info(f"Starting {server.name} with {transport} transport on {host}:{port}{path}")
 71 |         server.mcp.run(transport=transport, host=host, port=port, path=path)
 72 |     else:
 73 |         server.logger.info(f"Starting {server.name} with {transport} transport")
 74 |         server.mcp.run(transport=transport)
 75 | 
 76 | def parse_server_args():
 77 |     """Parse command line arguments for the MCP server.
 78 |     
 79 |     Returns:
 80 |         Parsed arguments
 81 |     """
 82 |     parser = argparse.ArgumentParser()
 83 |     parser.add_argument(
 84 |         "--transport", "-t",
 85 |         default="stdio",
 86 |         choices=["stdio", "streamable-http", "sse"],
 87 |         help="Transport protocol to use (default: stdio)"
 88 |     )
 89 |     parser.add_argument(
 90 |         "--host", "-H",
 91 |         default="127.0.0.1",
 92 |         help="Host to bind to when using HTTP transports (default: 127.0.0.1)"
 93 |     )
 94 |     parser.add_argument(
 95 |         "--port", "-p",
 96 |         type=int,
 97 |         default=8000,
 98 |         help="Port to bind to when using HTTP transports (default: 8000)"
 99 |     )
100 |     parser.add_argument(
101 |         "--path", "-P",
102 |         help="URL path prefix for HTTP transports (default: /mcp for streamable-http, /sse for sse)"
103 |     )
104 |     
105 |     args = parser.parse_args()
106 |     
107 |     # Set default path based on transport type if not specified
108 |     if args.path is None:
109 |         if args.transport == "sse":
110 |             args.path = "/sse"
111 |         else:
112 |             args.path = "/mcp"
113 |             
114 |     return args
115 | 
116 | def elasticsearch_mcp_server():
117 |     """Entry point for Elasticsearch MCP server."""
118 |     args = parse_server_args()
119 |     
120 |     # Run the server with the specified options
121 |     run_search_server(
122 |         engine_type="elasticsearch",
123 |         transport=args.transport,
124 |         host=args.host,
125 |         port=args.port,
126 |         path=args.path
127 |     )
128 | 
129 | def opensearch_mcp_server():
130 |     """Entry point for OpenSearch MCP server."""
131 |     args = parse_server_args()
132 |     
133 |     # Run the server with the specified options
134 |     run_search_server(
135 |         engine_type="opensearch",
136 |         transport=args.transport,
137 |         host=args.host,
138 |         port=args.port,
139 |         path=args.path
140 |     )
141 | 
142 | if __name__ == "__main__":
143 |     # Require elasticsearch-mcp-server or opensearch-mcp-server as the first argument
144 |     if len(sys.argv) <= 1 or sys.argv[1] not in ["elasticsearch-mcp-server", "opensearch-mcp-server"]:
145 |         print("Error: First argument must be 'elasticsearch-mcp-server' or 'opensearch-mcp-server'")
146 |         sys.exit(1)
147 |         
148 |     # Determine engine type based on the first argument
149 |     engine_type = "elasticsearch"  # Default
150 |     if sys.argv[1] == "opensearch-mcp-server":
151 |         engine_type = "opensearch"
152 |         
153 |     # Remove the first argument so it doesn't interfere with argparse
154 |     sys.argv.pop(1)
155 |     
156 |     # Parse command line arguments
157 |     args = parse_server_args()
158 |     
159 |     # Run the server with the specified options
160 |     run_search_server(
161 |         engine_type=engine_type,
162 |         transport=args.transport,
163 |         host=args.host,
164 |         port=args.port,
165 |         path=args.path
166 |     )
167 | 
```
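The transport-dependent default for `--path` in `parse_server_args` can be checked in isolation; a trimmed-down sketch covering just the `--transport`/`--path` pair:

```python
import argparse

def parse_path_args(argv: list) -> argparse.Namespace:
    # Trimmed-down mirror of parse_server_args: the default --path
    # depends on the chosen transport.
    parser = argparse.ArgumentParser()
    parser.add_argument("--transport", "-t", default="stdio",
                        choices=["stdio", "streamable-http", "sse"])
    parser.add_argument("--path", "-P")
    args = parser.parse_args(argv)
    if args.path is None:
        args.path = "/sse" if args.transport == "sse" else "/mcp"
    return args

print(parse_path_args(["-t", "sse"]).path)              # /sse
print(parse_path_args(["-t", "streamable-http"]).path)  # /mcp
print(parse_path_args([]).path)                         # /mcp
```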

--------------------------------------------------------------------------------
/src/clients/base.py:
--------------------------------------------------------------------------------

```python
  1 | from abc import ABC
  2 | import logging
  3 | import warnings
  4 | from typing import Dict, Optional
  5 | 
  6 | from elasticsearch import Elasticsearch
  7 | import httpx
  8 | from opensearchpy import OpenSearch
  9 | 
 10 | class SearchClientBase(ABC):
 11 |     def __init__(self, config: Dict, engine_type: str):
 12 |         """
 13 |         Initialize the search client.
 14 |         
 15 |         Args:
 16 |             config: Configuration dictionary with connection parameters
 17 |             engine_type: Type of search engine to use ("elasticsearch" or "opensearch")
 18 |         """
 19 |         self.logger = logging.getLogger()
 20 |         self.config = config
 21 |         self.engine_type = engine_type
 22 |         
 23 |         # Extract common configuration
 24 |         hosts = config.get("hosts")
 25 |         username = config.get("username")
 26 |         password = config.get("password")
 27 |         api_key = config.get("api_key")
 28 |         verify_certs = config.get("verify_certs", False)
 29 |         
 30 |         # Disable insecure request warnings if verify_certs is False
 31 |         if not verify_certs:
 32 |             warnings.filterwarnings("ignore", message=".*verify_certs=False is insecure.*")
 33 |             warnings.filterwarnings("ignore", message=".*Unverified HTTPS request is being made to host.*")
 34 |             
 35 |             try:
 36 |                 import urllib3
 37 |                 urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
 38 |             except ImportError:
 39 |                 pass
 40 |         
 41 |         # Initialize client based on engine type
 42 |         if engine_type == "elasticsearch":
 43 |             # Get auth parameters based on elasticsearch package version and authentication method
 44 |             auth_params = self._get_elasticsearch_auth_params(username, password, api_key)
 45 |             
 46 |             self.client = Elasticsearch(
 47 |                 hosts=hosts,
 48 |                 verify_certs=verify_certs,
 49 |                 **auth_params
 50 |             )
 51 |             self.logger.info(f"Elasticsearch client initialized with hosts: {hosts}")
 52 |         elif engine_type == "opensearch":
 53 |             self.client = OpenSearch(
 54 |                 hosts=hosts,
 55 |                 http_auth=(username, password) if username and password else None,
 56 |                 verify_certs=verify_certs
 57 |             )
 58 |             self.logger.info(f"OpenSearch client initialized with hosts: {hosts}")
 59 |         else:
 60 |             raise ValueError(f"Unsupported engine type: {engine_type}")
 61 | 
 62 |         # General REST client
 63 |         base_url = hosts[0] if isinstance(hosts, list) else hosts
 64 |         self.general_client = GeneralRestClient(
 65 |             base_url=base_url,
 66 |             username=username,
 67 |             password=password,
 68 |             api_key=api_key,
 69 |             verify_certs=verify_certs,
 70 |         )
 71 | 
 72 |     def _get_elasticsearch_auth_params(self, username: Optional[str], password: Optional[str], api_key: Optional[str]) -> Dict:
 73 |         """
 74 |         Get authentication parameters for Elasticsearch client based on package version.
 75 |         
 76 |         Args:
 77 |             username: Username for authentication
 78 |             password: Password for authentication
 79 |             api_key: API key for authentication
 80 |             
 81 |         Returns:
 82 |             Dictionary with appropriate auth parameters for the ES version
 83 |         """
 84 |         # API key takes precedence over username/password
 85 |         if api_key:
 86 |             return {"api_key": api_key}
 87 |             
 88 |         if not username or not password:
 89 |             return {}
 90 |             
 91 |         # Check Elasticsearch package version to determine auth parameter name
 92 |         try:
 93 |             from elasticsearch import __version__ as es_version
 94 |             # Convert version tuple to string format
 95 |             version_str = '.'.join(map(str, es_version))
 96 |             self.logger.info(f"Elasticsearch client version: {version_str}")
 97 |             major_version = es_version[0]    
 98 |             if major_version >= 8:
 99 |                 # ES 8+ uses basic_auth
100 |                 return {"basic_auth": (username, password)}
101 |             else:
102 |                 # ES 7 and below use http_auth
103 |                 return {"http_auth": (username, password)}
104 |         except Exception as e:
105 |             self.logger.error(f"Failed to detect Elasticsearch version: {e}")
106 |             # If we can't detect version, try basic_auth first (ES 8+ default)
107 |             return {"basic_auth": (username, password)}
108 | 
109 | class GeneralRestClient:
110 |     def __init__(self, base_url: Optional[str], username: Optional[str], password: Optional[str], api_key: Optional[str], verify_certs: bool):
111 |         self.base_url = base_url.rstrip("/") if base_url else ""
112 |         self.auth = (username, password) if username and password else None
113 |         self.api_key = api_key
114 |         self.verify_certs = verify_certs
115 | 
116 |     def request(self, method, path, params=None, body=None):
117 |         url = f"{self.base_url}/{path.lstrip('/')}"
118 |         headers = {}
119 |         
120 |         # Add API key to Authorization header if provided
121 |         if self.api_key:
122 |             headers["Authorization"] = f"ApiKey {self.api_key}"
123 |         
124 |         with httpx.Client(verify=self.verify_certs) as client:
125 |             resp = client.request(
126 |                 method=method.upper(),
127 |                 url=url,
128 |                 params=params,
129 |                 json=body,
130 |                 auth=self.auth if not self.api_key else None,  # Use basic auth only if no API key
131 |                 headers=headers
132 |             )
133 |             resp.raise_for_status()
134 |             ct = resp.headers.get("content-type", "")
135 |             if ct.startswith("application/json"):
136 |                 return resp.json()
137 |             return resp.text
138 | 
```
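
The precedence rules in `_get_elasticsearch_auth_params` can be exercised in isolation: API key wins over basic credentials, and the basic-auth parameter name is keyed on the client's major version. A minimal sketch of that logic (the standalone `pick_auth_params` helper and the hard-coded version tuples are hypothetical, not part of the client):

```python
from typing import Dict, Optional, Tuple


def pick_auth_params(
    es_version: Tuple[int, ...],
    username: Optional[str],
    password: Optional[str],
    api_key: Optional[str],
) -> Dict:
    """Mirror the precedence used by _get_elasticsearch_auth_params:
    API key first, then basic auth named per the client's major version."""
    if api_key:
        return {"api_key": api_key}
    if not username or not password:
        return {}
    if es_version[0] >= 8:
        # ES 8+ clients take basic_auth
        return {"basic_auth": (username, password)}
    # ES 7 and below take http_auth
    return {"http_auth": (username, password)}


print(pick_auth_params((8, 17, 2), "elastic", "changeme", None))
# → {'basic_auth': ('elastic', 'changeme')}
print(pick_auth_params((7, 17, 0), "elastic", "changeme", None))
# → {'http_auth': ('elastic', 'changeme')}
print(pick_auth_params((8, 17, 2), "elastic", "changeme", "abc123"))
# → {'api_key': 'abc123'}
```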

--------------------------------------------------------------------------------
/docker-compose-elasticsearch.yml:
--------------------------------------------------------------------------------

```yaml
  1 | services:
  2 |   setup:
  3 |     image: docker.elastic.co/elasticsearch/elasticsearch:8.17.2
  4 |     volumes:
  5 |       - certs:/usr/share/elasticsearch/config/certs
  6 |     user: "0"
  7 |     command: >
  8 |       bash -c '
  9 |         if [ x${ELASTICSEARCH_PASSWORD} == x ]; then
 10 |           echo "Set the ELASTICSEARCH_PASSWORD environment variable in the .env file";
 11 |           exit 1;
 12 |         fi;
 13 |         if [ ! -f config/certs/ca.zip ]; then
 14 |           echo "Creating CA";
 15 |           bin/elasticsearch-certutil ca --silent --pem -out config/certs/ca.zip;
 16 |           unzip config/certs/ca.zip -d config/certs;
 17 |         fi;
 18 |         if [ ! -f config/certs/certs.zip ]; then
 19 |           echo "Creating certs";
 20 |           echo -ne \
 21 |           "instances:\n"\
 22 |           "  - name: es01\n"\
 23 |           "    dns:\n"\
 24 |           "      - es01\n"\
 25 |           "      - localhost\n"\
 26 |           "    ip:\n"\
 27 |           "      - 127.0.0.1\n"\
 28 |           "  - name: es02\n"\
 29 |           "    dns:\n"\
 30 |           "      - es02\n"\
 31 |           "      - localhost\n"\
 32 |           "    ip:\n"\
 33 |           "      - 127.0.0.1\n"\
 34 |           "  - name: es03\n"\
 35 |           "    dns:\n"\
 36 |           "      - es03\n"\
 37 |           "      - localhost\n"\
 38 |           "    ip:\n"\
 39 |           "      - 127.0.0.1\n"\
 40 |           > config/certs/instances.yml;
 41 |           bin/elasticsearch-certutil cert --silent --pem -out config/certs/certs.zip --in config/certs/instances.yml --ca-cert config/certs/ca/ca.crt --ca-key config/certs/ca/ca.key;
 42 |           unzip config/certs/certs.zip -d config/certs;
 43 |         fi;
 44 |         echo "Setting file permissions";
 45 |         chown -R root:root config/certs;
 46 |         find . -type d -exec chmod 750 \{\} \;;
 47 |         find . -type f -exec chmod 640 \{\} \;;
 48 |         echo "Waiting for Elasticsearch availability";
 49 |         until curl -s --cacert config/certs/ca/ca.crt https://es01:9200 | grep -q "missing authentication credentials"; do sleep 30; done;
 50 |         echo "Setting kibana_system password";
 51 |         until curl -s -X POST --cacert config/certs/ca/ca.crt -u "elastic:${ELASTICSEARCH_PASSWORD}" -H "Content-Type: application/json" https://es01:9200/_security/user/kibana_system/_password -d "{\"password\":\"kibana123\"}" | grep -q "^{}"; do sleep 10; done;
 52 |         echo "All done!";
 53 |       '
 54 |     healthcheck:
 55 |       test: ["CMD-SHELL", "[ -f config/certs/es01/es01.crt ]"]
 56 |       interval: 1s
 57 |       timeout: 5s
 58 |       retries: 120
 59 | 
 60 |   es01:
 61 |     depends_on:
 62 |       setup:
 63 |         condition: service_healthy
 64 |     image: docker.elastic.co/elasticsearch/elasticsearch:8.17.2
 65 |     volumes:
 66 |       - certs:/usr/share/elasticsearch/config/certs
 67 |       - esdata01:/usr/share/elasticsearch/data
 68 |     ports:
 69 |       - 9200:9200
 70 |     environment:
 71 |       - node.name=es01
 72 |       - cluster.name=es-mcp-cluster
 73 |       - cluster.initial_master_nodes=es01,es02,es03
 74 |       - discovery.seed_hosts=es02,es03
 75 |       - ELASTIC_PASSWORD=${ELASTICSEARCH_PASSWORD}
 76 |       - bootstrap.memory_lock=true
 77 |       - xpack.security.enabled=true
 78 |       - xpack.security.http.ssl.enabled=true
 79 |       - xpack.security.http.ssl.key=certs/es01/es01.key
 80 |       - xpack.security.http.ssl.certificate=certs/es01/es01.crt
 81 |       - xpack.security.http.ssl.certificate_authorities=certs/ca/ca.crt
 82 |       - xpack.security.transport.ssl.enabled=true
 83 |       - xpack.security.transport.ssl.key=certs/es01/es01.key
 84 |       - xpack.security.transport.ssl.certificate=certs/es01/es01.crt
 85 |       - xpack.security.transport.ssl.certificate_authorities=certs/ca/ca.crt
 86 |       - xpack.security.transport.ssl.verification_mode=certificate
 87 |       - xpack.license.self_generated.type=basic
 88 |       - cluster.routing.allocation.disk.watermark.low=2gb
 89 |       - cluster.routing.allocation.disk.watermark.high=1gb
 90 |       - cluster.routing.allocation.disk.watermark.flood_stage=512mb
 91 |     mem_limit: 1073741824
 92 |     ulimits:
 93 |       memlock:
 94 |         soft: -1
 95 |         hard: -1
 96 |     healthcheck:
 97 |       test:
 98 |         [
 99 |           "CMD-SHELL",
100 |           "curl -s --cacert config/certs/ca/ca.crt https://localhost:9200 | grep -q 'missing authentication credentials'",
101 |         ]
102 |       interval: 10s
103 |       timeout: 10s
104 |       retries: 120
105 | 
106 |   es02:
107 |     depends_on:
108 |       - es01
109 |     image: docker.elastic.co/elasticsearch/elasticsearch:8.17.2
110 |     volumes:
111 |       - certs:/usr/share/elasticsearch/config/certs
112 |       - esdata02:/usr/share/elasticsearch/data
113 |     environment:
114 |       - node.name=es02
115 |       - cluster.name=es-mcp-cluster
116 |       - cluster.initial_master_nodes=es01,es02,es03
117 |       - discovery.seed_hosts=es01,es03
118 |       - bootstrap.memory_lock=true
119 |       - xpack.security.enabled=true
120 |       - xpack.security.http.ssl.enabled=true
121 |       - xpack.security.http.ssl.key=certs/es02/es02.key
122 |       - xpack.security.http.ssl.certificate=certs/es02/es02.crt
123 |       - xpack.security.http.ssl.certificate_authorities=certs/ca/ca.crt
124 |       - xpack.security.transport.ssl.enabled=true
125 |       - xpack.security.transport.ssl.key=certs/es02/es02.key
126 |       - xpack.security.transport.ssl.certificate=certs/es02/es02.crt
127 |       - xpack.security.transport.ssl.certificate_authorities=certs/ca/ca.crt
128 |       - xpack.security.transport.ssl.verification_mode=certificate
129 |       - xpack.license.self_generated.type=basic
130 |       - cluster.routing.allocation.disk.watermark.low=2gb
131 |       - cluster.routing.allocation.disk.watermark.high=1gb
132 |       - cluster.routing.allocation.disk.watermark.flood_stage=512mb
133 |     mem_limit: 1073741824
134 |     ulimits:
135 |       memlock:
136 |         soft: -1
137 |         hard: -1
138 |     healthcheck:
139 |       test:
140 |         [
141 |           "CMD-SHELL",
142 |           "curl -s --cacert config/certs/ca/ca.crt https://localhost:9200 | grep -q 'missing authentication credentials'",
143 |         ]
144 |       interval: 10s
145 |       timeout: 10s
146 |       retries: 120
147 | 
148 |   es03:
149 |     depends_on:
150 |       - es02
151 |     image: docker.elastic.co/elasticsearch/elasticsearch:8.17.2
152 |     volumes:
153 |       - certs:/usr/share/elasticsearch/config/certs
154 |       - esdata03:/usr/share/elasticsearch/data
155 |     environment:
156 |       - node.name=es03
157 |       - cluster.name=es-mcp-cluster
158 |       - cluster.initial_master_nodes=es01,es02,es03
159 |       - discovery.seed_hosts=es01,es02
160 |       - bootstrap.memory_lock=true
161 |       - xpack.security.enabled=true
162 |       - xpack.security.http.ssl.enabled=true
163 |       - xpack.security.http.ssl.key=certs/es03/es03.key
164 |       - xpack.security.http.ssl.certificate=certs/es03/es03.crt
165 |       - xpack.security.http.ssl.certificate_authorities=certs/ca/ca.crt
166 |       - xpack.security.transport.ssl.enabled=true
167 |       - xpack.security.transport.ssl.key=certs/es03/es03.key
168 |       - xpack.security.transport.ssl.certificate=certs/es03/es03.crt
169 |       - xpack.security.transport.ssl.certificate_authorities=certs/ca/ca.crt
170 |       - xpack.security.transport.ssl.verification_mode=certificate
171 |       - xpack.license.self_generated.type=basic
172 |       - cluster.routing.allocation.disk.watermark.low=2gb
173 |       - cluster.routing.allocation.disk.watermark.high=1gb
174 |       - cluster.routing.allocation.disk.watermark.flood_stage=512mb
175 |     mem_limit: 1073741824
176 |     ulimits:
177 |       memlock:
178 |         soft: -1
179 |         hard: -1
180 |     healthcheck:
181 |       test:
182 |         [
183 |           "CMD-SHELL",
184 |           "curl -s --cacert config/certs/ca/ca.crt https://localhost:9200 | grep -q 'missing authentication credentials'",
185 |         ]
186 |       interval: 10s
187 |       timeout: 10s
188 |       retries: 120
189 | 
190 |   kibana:
191 |     depends_on:
192 |       es01:
193 |         condition: service_healthy
194 |       es02:
195 |         condition: service_healthy
196 |       es03:
197 |         condition: service_healthy
198 |     image: docker.elastic.co/kibana/kibana:8.17.2
199 |     volumes:
200 |       - certs:/usr/share/kibana/config/certs
201 |       - kibanadata:/usr/share/kibana/data
202 |     ports:
203 |       - 5601:5601
204 |     environment:
205 |       - SERVERNAME=kibana
206 |       - ELASTICSEARCH_HOSTS=https://es01:9200
207 |       - ELASTICSEARCH_USERNAME=kibana_system
208 |       - ELASTICSEARCH_PASSWORD=kibana123
209 |       - ELASTICSEARCH_SSL_CERTIFICATEAUTHORITIES=config/certs/ca/ca.crt
210 |     mem_limit: 1073741824
211 |     healthcheck:
212 |       test:
213 |         [
214 |           "CMD-SHELL",
215 |           "curl -s -I http://localhost:5601 | grep -q 'HTTP/1.1 302 Found'",
216 |         ]
217 |       interval: 10s
218 |       timeout: 10s
219 |       retries: 120
220 | 
221 | volumes:
222 |   certs:
223 |     driver: local
224 |   esdata01:
225 |     driver: local
226 |   esdata02:
227 |     driver: local
228 |   esdata03:
229 |     driver: local
230 |   kibanadata:
231 |     driver: local
232 | 
```
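
The setup container's first step refuses to run without `ELASTICSEARCH_PASSWORD`: the `[ x$VAR == x ]` idiom prefixes both sides with `x` so the test stays well-formed even when the variable is unset or empty. A standalone sketch of that guard (the `require_password` function name is hypothetical):

```shell
#!/bin/sh
# Sketch of the empty-variable guard used by the setup service:
# the "x" prefix keeps the comparison valid when $1 is empty.
require_password() {
  if [ "x$1" = "x" ]; then
    echo "Set the ELASTICSEARCH_PASSWORD environment variable in the .env file"
    return 1
  fi
  echo "password set"
}

require_password ""          # prints the error message, exits nonzero
require_password "changeme"  # prints "password set"
```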