# Directory Structure

```
├── .github
│   └── workflows
│       └── publish.yml
├── .gitignore
├── CHANGELOG.md
├── LICENSE
├── package-lock.json
├── package.json
├── README.md
├── src
│   └── index.ts
└── tsconfig.json
```

# Files

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------

```
node_modules/
build/
*.log
.env*
```

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------

```markdown
# Azure TableStore MCP Server
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)

A TypeScript-based MCP server that enables interaction with Azure Table Storage directly through Cline. This tool allows you to query and manage data in Azure Storage Tables.

<a href="https://glama.ai/mcp/servers/8kah8zukke"><img width="380" height="200" src="https://glama.ai/mcp/servers/8kah8zukke/badge?refresh=1" alt="mcp-azure-tablestorage MCP server" /></a>

## Features

- Query Azure Storage Tables with OData filter support
- Get table schemas to understand data structure
- List all tables in the storage account
- Detailed error handling and response information
- Simple configuration through connection string

## Installation

### Local Development Setup

1. Clone the repository:
```powershell
git clone https://github.com/dkmaker/mcp-azure-tablestorage.git
cd mcp-azure-tablestorage
```

2. Install dependencies:
```powershell
npm install
```

3. Build the server:
```powershell
npm run build
```

### NPM Installation

You can install the package globally via npm:

```bash
npm install -g dkmaker-mcp-server-tablestore
```

Or run it directly with npx:

```bash
npx dkmaker-mcp-server-tablestore
```

Note: When using npx or global installation, you'll still need to configure the AZURE_STORAGE_CONNECTION_STRING environment variable.

### Installing in Cline

To use the Azure TableStore server with Cline, you need to add it to your MCP settings configuration. The configuration file is located at:

Windows: `%APPDATA%\Code\User\globalStorage\saoudrizwan.claude-dev\settings\cline_mcp_settings.json`

Add the following to your configuration:

```json
{
  "mcpServers": {
    "tablestore": {
      "command": "node",
      "args": ["C:/path/to/your/mcp-azure-tablestorage/build/index.js"],
      "env": {
        "AZURE_STORAGE_CONNECTION_STRING": "your_connection_string_here"  // Required: Your Azure Storage connection string
      }
    }
  }
}
```

Replace `C:/path/to/your/mcp-azure-tablestorage` with the actual path where you cloned the repository.
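
If you installed the package globally or run it via npx instead of building locally, an entry along these lines should also work (a sketch; adjust the invocation to your environment, and note the connection string is still required):

```json
{
  "mcpServers": {
    "tablestore": {
      "command": "npx",
      "args": ["-y", "dkmaker-mcp-server-tablestore"],
      "env": {
        "AZURE_STORAGE_CONNECTION_STRING": "your_connection_string_here"
      }
    }
  }
}
```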

## Configuration

The server requires the following environment variable:

- `AZURE_STORAGE_CONNECTION_STRING`: Your Azure Storage account connection string
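
An Azure Storage connection string typically looks like `DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net` (or `UseDevelopmentStorage=true` for the local Azurite emulator). If you want to sanity-check the value before wiring it into Cline, a minimal standalone sketch using the same `@azure/data-tables` client the server depends on could look like this (a hypothetical `verify-connection.ts`, not part of this package):

```typescript
// verify-connection.ts -- hypothetical connection check, not shipped with this package.
import { TableServiceClient } from '@azure/data-tables';

async function main() {
  const connectionString = process.env.AZURE_STORAGE_CONNECTION_STRING;
  if (!connectionString) {
    throw new Error('AZURE_STORAGE_CONNECTION_STRING is not set');
  }

  // Listing tables is a cheap way to confirm the account name and key are valid.
  const serviceClient = TableServiceClient.fromConnectionString(connectionString);
  for await (const table of serviceClient.listTables()) {
    console.log(table.name);
  }
}

main().catch(console.error);
```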

## Usage in Cline

⚠️ **IMPORTANT SAFETY NOTE**: The query_table tool returns a limited subset of results (default: 5 items) to protect the LLM's context window. DO NOT increase this limit unless explicitly confirmed by the user, as larger result sets can overwhelm the context window.

Once installed, you can use the Azure TableStore server through Cline. Here are some examples:

1. Querying a table:
```
Query the Users table where PartitionKey is 'ACTIVE'
```

Cline will use the query_table tool with:
```json
{
  "tableName": "Users",
  "filter": "PartitionKey eq 'ACTIVE'",
  "limit": 5  // Optional: Defaults to 5 items. WARNING: Do not increase without user confirmation
}
```

The response will include:
- Total number of items that match the query (without limit)
- Limited subset of items (default 5) for safe LLM processing
- Applied limit value

For example:
```json
{
  "totalItems": 25,
  "limit": 5,
  "items": [
    // First 5 matching items
  ]
}
```

This design lets the LLM understand the full scope of the data while working with a manageable subset. The default limit of 5 items protects the LLM's context window; increase it only when the user has explicitly confirmed.
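
If you consume the raw tool output yourself (for example from a script or another MCP client), the result arrives as a single text content item containing the JSON above. A minimal parsing sketch, assuming exactly that shape, might look like:

```typescript
// Hypothetical helper for consuming query_table output; QueryTableResult
// mirrors the JSON shape documented above.
interface QueryTableResult {
  totalItems: number;
  limit: number;
  items: Record<string, unknown>[];
}

function parseQueryResult(toolResponseText: string): QueryTableResult {
  const result = JSON.parse(toolResponseText) as QueryTableResult;
  if (result.totalItems > result.items.length) {
    // More rows matched than were returned: refine the OData filter,
    // or ask the user before raising the limit.
    console.warn(`Showing ${result.items.length} of ${result.totalItems} matching items`);
  }
  return result;
}
```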

2. Getting table schema:
```
Show me the schema for the Orders table
```

Cline will use the get_table_schema tool with:
```json
{
  "tableName": "Orders"
}
```

3. Listing tables:
```
List all tables in the storage account
```

Cline will use the list_tables tool with:
```json
{}
```
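
The `list_tables` tool also accepts an optional `prefix` argument; when supplied, only table names starting with that prefix are returned, for example (with a hypothetical `User` prefix):

```json
{
  "prefix": "User"
}
```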

## Project Structure

- `src/index.ts`: Main server implementation with Azure Table Storage interaction logic
- `build/`: Compiled JavaScript output
- `package.json`: Project dependencies and scripts

## Dependencies

- @azure/data-tables: Azure Table Storage client library
- @modelcontextprotocol/sdk: MCP server implementation toolkit

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details. This means you can use, modify, distribute, and sublicense the code freely, provided you include the original copyright notice and license terms.

```

--------------------------------------------------------------------------------
/tsconfig.json:
--------------------------------------------------------------------------------

```json
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "ES2020",
    "moduleResolution": "node",
    "esModuleInterop": true,
    "strict": true,
    "outDir": "build",
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "build"]
}

```

--------------------------------------------------------------------------------
/CHANGELOG.md:
--------------------------------------------------------------------------------

```markdown
# Changelog

All notable changes to this project will be documented in this file.

## [0.2.0] - 2024-03-11

### Added
- Initial release
- Azure Table Storage integration
- Query table functionality with OData filter support
- Table schema analysis capabilities
- Table listing functionality
- MCP server implementation for Azure Storage Tables

```

--------------------------------------------------------------------------------
/package.json:
--------------------------------------------------------------------------------

```json
{
  "name": "dkmaker-mcp-server-tablestore",
  "version": "0.2.0",
  "description": "MCP server for Azure Table Storage interaction",
  "license": "MIT",
  "type": "module",
  "bin": {
    "dkmaker-mcp-server-tablestore": "build/index.js"
  },
  "files": [
    "build",
    "README.md",
    "LICENSE"
  ],
  "publishConfig": {
    "access": "public"
  },
  "scripts": {
    "build": "tsc",
    "start": "node build/index.js"
  },
  "dependencies": {
    "@azure/data-tables": "^13.2.2",
    "@modelcontextprotocol/sdk": "^1.0.4"
  },
  "devDependencies": {
    "@types/node": "^20.11.24",
    "typescript": "^5.3.3"
  },
  "keywords": [
    "mcp",
    "azure",
    "cline",
    "sonnet",
    "assistant",
    "development",
    "typescript"
  ],
  "author": "dkmaker",
  "repository": {
    "type": "git",
    "url": "git+https://github.com/dkmaker/mcp-azure-tablestorage.git"
  },
  "bugs": {
    "url": "https://github.com/dkmaker/mcp-azure-tablestorage/issues"
  },
  "homepage": "https://github.com/dkmaker/mcp-azure-tablestorage#readme",
  "engines": {
    "node": ">=18.0.0"
  }

}

```

--------------------------------------------------------------------------------
/.github/workflows/publish.yml:
--------------------------------------------------------------------------------

```yaml
name: Publish Package

on:
  push:
    branches: [ main ]
  workflow_dispatch:

permissions:
  contents: write
  pull-requests: write

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      
      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18.x'
          registry-url: 'https://registry.npmjs.org'

      - name: Install dependencies
        run: npm ci

      - name: Conventional Changelog Action
        id: changelog
        uses: TriPSs/conventional-changelog-action@v6
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}
          git-message: 'chore(release): {version}'
          preset: 'conventionalcommits'
          tag-prefix: 'v'
          output-file: 'CHANGELOG.md'
          skip-version-file: false
          skip-commit: false

      - name: Update version in package.json
        if: steps.changelog.outputs.skipped == 'false'
        run: |
          npm version ${{ steps.changelog.outputs.version }} --no-git-tag-version

      - name: Build
        run: npm run build

      - name: Create Release
        if: steps.changelog.outputs.skipped == 'false'
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          gh release create v${{ steps.changelog.outputs.version }} \
            --title "Release v${{ steps.changelog.outputs.version }}" \
            --notes "${{ steps.changelog.outputs.clean_changelog }}"

      - name: Publish to NPM
        if: steps.changelog.outputs.skipped == 'false'
        run: npm publish
        env:
          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}

      - name: Push changes
        if: steps.changelog.outputs.skipped == 'false'
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          git config --global user.name 'github-actions[bot]'
          git config --global user.email 'github-actions[bot]@users.noreply.github.com'
          git add package.json CHANGELOG.md
          git commit -m "chore(release): v${{ steps.changelog.outputs.version }}"
          git push "https://[email protected]/$GITHUB_REPOSITORY.git" HEAD:main

```

--------------------------------------------------------------------------------
/src/index.ts:
--------------------------------------------------------------------------------

```typescript
#!/usr/bin/env node
import { Server } from '@modelcontextprotocol/sdk/server/index.js';
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
  McpError,
  ErrorCode,
} from '@modelcontextprotocol/sdk/types.js';
import {
  TableClient,
  TableServiceClient,
} from '@azure/data-tables';

interface QueryTableArgs {
  tableName: string;
  filter?: string;
  select?: string[];
  limit?: number;
}

interface GetSchemaArgs {
  tableName: string;
}

interface ListTablesArgs {
  prefix?: string;
}

class TableStoreServer {
  private server: Server;
  private connectionString: string;

  constructor() {
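    // Fall back to the local storage emulator (Azurite) connection string when
    // AZURE_STORAGE_CONNECTION_STRING is not provided.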
    this.connectionString = process.env.AZURE_STORAGE_CONNECTION_STRING || 'UseDevelopmentStorage=true';

    this.server = new Server(
      {
        name: 'tablestore',
        version: '0.2.0',
      },
      {
        capabilities: {
          tools: {},
        },
      }
    );

    this.setupToolHandlers();
    
    this.server.onerror = (error) => console.error('[MCP Error]', error);
    process.on('SIGINT', async () => {
      await this.server.close();
      process.exit(0);
    });
  }

  private setupToolHandlers() {
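    // Advertise the three tools (query_table, get_table_schema, list_tables)
    // and dispatch incoming tool calls with basic argument validation.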
    this.server.setRequestHandler(ListToolsRequestSchema, async () => ({
      tools: [
        {
          name: 'query_table',
          description: '⚠️ WARNING: This tool returns a limited subset of results (default: 5 items) to protect the LLM\'s context window. DO NOT increase this limit unless explicitly confirmed by the user.\n\n' +
            'Query data from an Azure Storage Table with optional filters.\n\n' +
            'Supported OData Filter Examples:\n' +
            '1. Simple equality:\n' +
            '   filter: "PartitionKey eq \'COURSE\'"\n' +
            '   filter: "email eq \'[email protected]\'"\n\n' +
            '2. Compound conditions:\n' +
            '   filter: "PartitionKey eq \'USER\' and email eq \'[email protected]\'"\n' +
            '   filter: "PartitionKey eq \'COURSE\' and title eq \'GDPR Training\'"\n\n' +
            '3. Numeric comparisons:\n' +
            '   filter: "age gt 25"\n' +
            '   filter: "costPrice le 100"\n\n' +
            '4. Date comparisons (ISO 8601 format):\n' +
            '   filter: "createdDate gt datetime\'2023-01-01T00:00:00Z\'"\n' +
            '   filter: "timestamp lt datetime\'2024-12-31T23:59:59Z\'"\n\n' +
            'Supported Operators:\n' +
            '- eq: Equal\n' +
            '- ne: Not equal\n' +
            '- gt: Greater than\n' +
            '- ge: Greater than or equal\n' +
            '- lt: Less than\n' +
            '- le: Less than or equal\n' +
            '- and: Logical and\n' +
            '- or: Logical or\n' +
            '- not: Logical not',
          inputSchema: {
            type: 'object',
            properties: {
              tableName: {
                type: 'string',
                description: 'Name of the table to query',
              },
              filter: {
                type: 'string',
                description: 'OData filter string. See description for examples.',
              },
              select: {
                type: 'array',
                items: {
                  type: 'string',
                },
                description: 'Array of property names to select. Example: ["email", "username", "createdDate"]',
              },
              limit: {
                type: 'number',
                description: 'Maximum number of items to return in response (default: 5). Note: Full query is still executed to get total count.',
                default: 5
              }
            },
            required: ['tableName'],
          },
        },
        {
          name: 'get_table_schema',
          description: 'Get property names and types from a table',
          inputSchema: {
            type: 'object',
            properties: {
              tableName: {
                type: 'string',
                description: 'Name of the table to analyze',
              },
            },
            required: ['tableName'],
          },
        },
        {
          name: 'list_tables',
          description: 'List all tables in the storage account',
          inputSchema: {
            type: 'object',
            properties: {
              prefix: {
                type: 'string',
                description: 'Optional prefix to filter table names',
              },
            },
          },
        },
      ],
    }));

    this.server.setRequestHandler(CallToolRequestSchema, async (request) => {
      try {
        switch (request.params.name) {
          case 'query_table': {
            const queryArgs = request.params.arguments as Record<string, unknown>;
            if (typeof queryArgs?.tableName !== 'string') {
              throw new McpError(ErrorCode.InvalidParams, 'tableName is required and must be a string');
            }
            return await this.handleQueryTable({
              tableName: queryArgs.tableName,
              filter: typeof queryArgs.filter === 'string' ? queryArgs.filter : undefined,
              select: Array.isArray(queryArgs.select) ? queryArgs.select.map(String) : undefined,
              limit: typeof queryArgs.limit === 'number' ? queryArgs.limit : 5
            });
          }
          case 'get_table_schema': {
            const schemaArgs = request.params.arguments as Record<string, unknown>;
            if (typeof schemaArgs?.tableName !== 'string') {
              throw new McpError(ErrorCode.InvalidParams, 'tableName is required and must be a string');
            }
            return await this.handleGetTableSchema({
              tableName: schemaArgs.tableName
            });
          }
          case 'list_tables': {
            const listArgs = request.params.arguments as ListTablesArgs;
            return await this.handleListTables(listArgs);
          }
          default:
            throw new McpError(
              ErrorCode.MethodNotFound,
              `Unknown tool: ${request.params.name}`
            );
        }
      } catch (error: unknown) {
        if (error instanceof McpError) throw error;
        if (error instanceof Error) {
          throw new McpError(ErrorCode.InternalError, error.message);
        }
        throw new McpError(
          ErrorCode.InternalError,
          'An unexpected error occurred'
        );
      }
    });
  }

  private async handleQueryTable(args: QueryTableArgs) {
    const tableClient = TableClient.fromConnectionString(
      this.connectionString,
      args.tableName
    );

    const queryOptions: { queryOptions?: { filter?: string; select?: string[] } } = {};
    
    if (args.filter) {
      // Pass the OData filter directly to allow for all valid OData operations
      queryOptions.queryOptions = {
        filter: args.filter
      };
    }
    
    if (args.select) {
      if (!queryOptions.queryOptions) {
        queryOptions.queryOptions = {};
      }
      queryOptions.queryOptions.select = args.select;
    }

    const entities = [];
    const iterator = tableClient.listEntities(queryOptions);
    for await (const entity of iterator) {
      entities.push(entity);
    }

    // Apply limit in memory to maintain total count
    const totalItems = entities.length;
    const limit = args.limit || 5;
    const limitedEntities = entities.slice(0, limit);

    return {
      content: [
        {
          type: 'text',
          text: JSON.stringify({
            totalItems,
            limit,
            items: limitedEntities
          }, null, 2),
        },
      ],
    };
  }

  private async handleGetTableSchema(args: GetSchemaArgs) {
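    // Infer a schema by scanning every entity and recording the JavaScript
    // type(s) observed for each property. Note: this reads the entire table.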
    const tableClient = TableClient.fromConnectionString(
      this.connectionString,
      args.tableName
    );

    const propertyMap = new Map<string, Set<string>>();
    const iterator = tableClient.listEntities();
    
    for await (const entity of iterator) {
      Object.entries(entity).forEach(([key, value]) => {
        if (!propertyMap.has(key)) {
          propertyMap.set(key, new Set());
        }
        propertyMap.get(key)?.add(typeof value);
      });
    }

    const schema = Object.fromEntries(
      Array.from(propertyMap.entries()).map(([key, types]) => [
        key,
        Array.from(types),
      ])
    );

    return {
      content: [
        {
          type: 'text',
          text: JSON.stringify(schema, null, 2),
        },
      ],
    };
  }

  private async handleListTables(args: ListTablesArgs) {
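    // Enumerate all tables in the account and, when a prefix is supplied,
    // filter the names client-side.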
    const serviceClient = TableServiceClient.fromConnectionString(this.connectionString);
    const tables = [];
    const iterator = serviceClient.listTables();
    
    for await (const table of iterator) {
      if (table.name && (!args.prefix || table.name.startsWith(args.prefix))) {
        tables.push(table.name);
      }
    }

    return {
      content: [
        {
          type: 'text',
          text: JSON.stringify(tables, null, 2),
        },
      ],
    };
  }

  async run() {
    const transport = new StdioServerTransport();
    await this.server.connect(transport);
    console.error('Table Storage MCP server running on stdio');
  }
}

const server = new TableStoreServer();
server.run().catch(console.error);

```