# Directory Structure

```
├── .github
│   └── ISSUE_TEMPLATE
│       ├── bug_report.md
│       └── feature_request.md
├── .gitignore
├── bun.lockb
├── docs
│   ├── creating-tools.md
│   └── tool-examples.md
├── example.ts
├── LICENSE
├── package.json
├── README.md
├── src
│   ├── main.ts
│   ├── prompts
│   │   └── list-vaults
│   │       └── index.ts
│   ├── resources
│   │   ├── index.ts
│   │   ├── resources.ts
│   │   └── vault
│   │       └── index.ts
│   ├── server.ts
│   ├── tools
│   │   ├── add-tags
│   │   │   └── index.ts
│   │   ├── create-directory
│   │   │   └── index.ts
│   │   ├── create-note
│   │   │   └── index.ts
│   │   ├── delete-note
│   │   │   └── index.ts
│   │   ├── edit-note
│   │   │   └── index.ts
│   │   ├── list-available-vaults
│   │   │   └── index.ts
│   │   ├── manage-tags
│   │   │   └── index.ts
│   │   ├── move-note
│   │   │   └── index.ts
│   │   ├── read-note
│   │   │   └── index.ts
│   │   ├── remove-tags
│   │   │   └── index.ts
│   │   ├── rename-tag
│   │   │   └── index.ts
│   │   └── search-vault
│   │       └── index.ts
│   ├── types.ts
│   └── utils
│       ├── errors.ts
│       ├── files.ts
│       ├── links.ts
│       ├── path.test.ts
│       ├── path.ts
│       ├── prompt-factory.ts
│       ├── responses.ts
│       ├── schema.ts
│       ├── security.ts
│       ├── tags.ts
│       ├── tool-factory.ts
│       └── vault-resolver.ts
└── tsconfig.json
```

# Files

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------

```
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*
.pnpm-debug.log*

# Diagnostic reports (https://nodejs.org/api/report.html)
report.[0-9]*.[0-9]*.[0-9]*.[0-9]*.json

# Runtime data
pids
*.pid
*.seed
*.pid.lock

# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov

# Coverage directory used by tools like istanbul
coverage
*.lcov

# nyc test coverage
.nyc_output

# Grunt intermediate storage (https://gruntjs.com/creating-plugins#storing-task-files)
.grunt

# Bower dependency directory (https://bower.io/)
bower_components

# node-waf configuration
.lock-wscript

# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release

# Dependency directories
node_modules/
jspm_packages/

# Snowpack dependency directory (https://snowpack.dev/)
web_modules/

# TypeScript cache
*.tsbuildinfo

# Optional npm cache directory
.npm

# Optional eslint cache
.eslintcache

# Optional stylelint cache
.stylelintcache

# Microbundle cache
.rpt2_cache/
.rts2_cache_cjs/
.rts2_cache_es/
.rts2_cache_umd/

# Optional REPL history
.node_repl_history

# Output of 'npm pack'
*.tgz

# Yarn Integrity file
.yarn-integrity

# dotenv environment variable files
.env
.env.development.local
.env.test.local
.env.production.local
.env.local

# parcel-bundler cache (https://parceljs.org/)
.cache
.parcel-cache

# Next.js build output
.next
out

# Nuxt.js build / generate output
.nuxt
dist
build

# Gatsby files
.cache/
# Comment in the public line in if your project uses Gatsby and not Next.js
# https://nextjs.org/blog/next-9-1#public-directory-support
# public

# vuepress build output
.vuepress/dist

# vuepress v2.x temp and cache directory
.temp
.cache

# Docusaurus cache and generated files
.docusaurus

# Serverless directories
.serverless/

# FuseBox cache
.fusebox/

# DynamoDB Local files
.dynamodb/

# TernJS port file
.tern-port

# Stores VSCode versions used for testing VSCode extensions
.vscode-test

# yarn v2
.yarn/cache
.yarn/unplugged
.yarn/build-state.yml
.yarn/install-state.gz
.pnp.*

```

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------

```markdown
# Obsidian MCP Server

[![smithery badge](https://smithery.ai/badge/obsidian-mcp)](https://smithery.ai/server/obsidian-mcp)

An [MCP (Model Context Protocol)](https://modelcontextprotocol.io) server that enables AI assistants to interact with Obsidian vaults, providing tools for reading, creating, editing and managing notes and tags.

## Warning!!!

This MCP has read and write access (if you allow it). Please, PLEASE back up your Obsidian vault before using obsidian-mcp to manage your notes. I recommend using git, but any backup method will work. These tools have been tested, but not thoroughly, and this MCP is in active development.

## Features

- Read and search notes in your vault
- Create new notes and directories
- Edit existing notes
- Move and delete notes
- Manage tags (add, remove, rename)
- Search vault contents

## Requirements

- Node.js 20 or higher (it may work on older versions, but I haven't tested them)
- An Obsidian vault

## Install

### Installing Manually

Add to your Claude Desktop configuration:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`

```json
{
    "mcpServers": {
        "obsidian": {
            "command": "npx",
            "args": ["-y", "obsidian-mcp", "/path/to/your/vault", "/path/to/your/vault2"]
        }
    }
}
```

Replace `/path/to/your/vault` with the absolute path to your Obsidian vault. For example:

macOS/Linux:

```json
"/Users/username/Documents/MyVault"
```

Windows:

```json
"C:\\Users\\username\\Documents\\MyVault"
```

Restart Claude for Desktop after saving the configuration. You should see the hammer icon appear, indicating the server is connected.

If you have connection issues, check the logs at:

- macOS: `~/Library/Logs/Claude/mcp*.log`
- Windows: `%APPDATA%\Claude\logs\mcp*.log`


### Installing via Smithery
Warning: I am not affiliated with Smithery. I have not tested it myself and encourage users to install manually if they can.

To install Obsidian for Claude Desktop automatically via [Smithery](https://smithery.ai/server/obsidian-mcp):

```bash
npx -y @smithery/cli install obsidian-mcp --client claude
```

## Development

```bash
# Clone the repository
git clone https://github.com/StevenStavrakis/obsidian-mcp
cd obsidian-mcp

# Install dependencies
npm install

# Build
npm run build
```

Then add to your Claude Desktop configuration:

```json
{
    "mcpServers": {
        "obsidian": {
            "command": "node",
            "args": ["<absolute-path-to-obsidian-mcp>/build/main.js", "/path/to/your/vault", "/path/to/your/vault2"]
        }
    }
}
```

## Available Tools

- `read-note` - Read the contents of a note
- `create-note` - Create a new note
- `edit-note` - Edit an existing note
- `delete-note` - Delete a note
- `move-note` - Move a note to a different location
- `create-directory` - Create a new directory
- `search-vault` - Search notes in the vault
- `add-tags` - Add tags to a note
- `remove-tags` - Remove tags from a note
- `rename-tag` - Rename a tag across all notes
- `manage-tags` - List and organize tags
- `list-available-vaults` - List all available vaults (helps with multi-vault setups)
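
Most tools take the target `vault` name plus tool-specific arguments. For example, a `create-note` call might use arguments like these (illustrative values; see `docs/tool-examples.md` for more):

```json
{
  "vault": "vault1",
  "filename": "meeting-notes.md",
  "folder": "journal/2024",
  "content": "# Meeting Notes\n\nDiscussed the roadmap."
}
```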

## Documentation

Additional documentation can be found in the `docs` directory:

- `creating-tools.md` - Guide for creating new tools
- `tool-examples.md` - Examples of using the available tools

## Security

This server requires access to your Obsidian vault directory. When configuring the server, make sure to:

- Only provide access to your intended vault directory
- Review tool actions before approving them

## Troubleshooting

Common issues:

1. **Server not showing up in Claude Desktop**
   - Verify your configuration file syntax
   - Make sure the vault path is absolute and exists
   - Restart Claude Desktop

2. **Permission errors**
   - Ensure the vault path is readable/writable
   - Check file permissions in your vault

3. **Tool execution failures**
   - Check Claude Desktop logs at:
     - macOS: `~/Library/Logs/Claude/mcp*.log`
     - Windows: `%APPDATA%\Claude\logs\mcp*.log`

## License

MIT

```

--------------------------------------------------------------------------------
/src/resources/index.ts:
--------------------------------------------------------------------------------

```typescript
export * from "./vault";

```

--------------------------------------------------------------------------------
/tsconfig.json:
--------------------------------------------------------------------------------

```json
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "ES2020",
    "moduleResolution": "node",
    "esModuleInterop": true,
    "strict": true,
    "skipLibCheck": true,
    "outDir": "build",
    "rootDir": "src",
    "sourceMap": true,
    "allowJs": true
  },
  "include": ["src/**/*.ts"],
  "exclude": ["node_modules", "build"]
}

```

--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE/feature_request.md:
--------------------------------------------------------------------------------

```markdown
---
name: Feature request
about: Suggest an idea for this project
title: "[FEATURE] "
labels: enhancement
assignees: StevenStavrakis

---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Add any other context or screenshots about the feature request here.

```

--------------------------------------------------------------------------------
/src/tools/list-available-vaults/index.ts:
--------------------------------------------------------------------------------

```typescript
import { createToolResponse } from "../../utils/responses.js";
import { createToolNoArgs } from "../../utils/tool-factory.js";

export const createListAvailableVaultsTool = (vaults: Map<string, string>) => {
  return createToolNoArgs({
    name: "list-available-vaults",
    description: "Lists all available vaults that can be used with other tools",
    handler: async () => {
      const availableVaults = Array.from(vaults.keys());
      
      if (availableVaults.length === 0) {
        return createToolResponse("No vaults are currently available");
      }
      
      const message = [
        "Available vaults:",
        ...availableVaults.map(vault => `  - ${vault}`)
      ].join('\n');
      
      return createToolResponse(message);
    }
  }, vaults);
}

```

--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE/bug_report.md:
--------------------------------------------------------------------------------

```markdown
---
name: Bug report
about: Create a report to help us improve
title: "[BUG] "
labels: bug
assignees: StevenStavrakis

---

**Describe the bug**
A clear and concise description of what the bug is.

**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error

**Expected behavior**
A clear and concise description of what you expected to happen.

**Error logs (if available)**
Instructions on how to access error logs can be found [here](https://modelcontextprotocol.io/docs/tools/debugging)
The MCP instructions are only available for macOS at this time.

**Desktop (please complete the following information):**
 - OS: [e.g. iOS]
 - AI Client: [e.g. Claude]

**Additional context**
Add any other context about the problem here.

```

--------------------------------------------------------------------------------
/src/utils/prompt-factory.ts:
--------------------------------------------------------------------------------

```typescript
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";
import { Prompt } from "../types.js";

const prompts = new Map<string, Prompt>();

/**
 * Register a prompt for use in the MCP server
 */
export function registerPrompt(prompt: Prompt): void {
  if (prompts.has(prompt.name)) {
    throw new McpError(
      ErrorCode.InvalidRequest,
      `Prompt "${prompt.name}" is already registered`
    );
  }
  prompts.set(prompt.name, prompt);
}

/**
 * List all registered prompts
 */
export function listPrompts() {
  return {
    prompts: Array.from(prompts.values()).map(prompt => ({
      name: prompt.name,
      description: prompt.description,
      arguments: prompt.arguments
    }))
  };
}

/**
 * Get a specific prompt by name
 */
export async function getPrompt(name: string, vaults: Map<string, string>, args?: any) {
  const prompt = prompts.get(name);
  if (!prompt) {
    throw new McpError(ErrorCode.MethodNotFound, `Prompt not found: ${name}`);
  }

  try {
    return await prompt.handler(args, vaults);
  } catch (error) {
    if (error instanceof McpError) {
      throw error;
    }
    throw new McpError(
      ErrorCode.InternalError,
      `Failed to execute prompt: ${error instanceof Error ? error.message : String(error)}`
    );
  }
}

```
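
For illustration, a minimal sketch (not part of the repository) of how these helpers might be wired up at startup, assuming a file living in `src/` alongside `main.ts`; the vault path is an example:

```typescript
import { registerPrompt, listPrompts, getPrompt } from "./utils/prompt-factory.js";
import { listVaultsPrompt } from "./prompts/list-vaults/index.js";

async function demo() {
  const vaults = new Map<string, string>([
    ["vault1", "/Users/username/Documents/MyVault"],
  ]);

  // Register once at startup; registering the same name twice throws an McpError.
  registerPrompt(listVaultsPrompt);

  // A prompts/list handler can delegate straight to listPrompts().
  console.log(listPrompts());

  // A prompts/get handler resolves the prompt by name and runs it with the vault map.
  const result = await getPrompt("list-vaults", vaults);
  console.log(result.messages[0].content.text);
}

demo().catch(console.error);
```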

--------------------------------------------------------------------------------
/package.json:
--------------------------------------------------------------------------------

```json
{
  "name": "obsidian-mcp",
  "version": "1.0.6",
  "description": "MCP server for AI assistants to interact with Obsidian vaults",
  "type": "module",
  "main": "build/main.js",
  "bin": {
    "obsidian-mcp": "./build/main.js"
  },
  "files": [
    "build",
    "README.md",
    "LICENSE"
  ],
  "exports": {
    ".": "./build/main.js",
    "./utils/*": "./build/utils/*.js",
    "./resources/*": "./build/resources/*.js"
  },
  "peerDependencies": {
    "@modelcontextprotocol/sdk": "^1.0.4"
  },
  "dependencies": {
    "yaml": "^2.6.1",
    "zod": "^3.22.4",
    "zod-to-json-schema": "^3.24.1"
  },
  "devDependencies": {
    "@modelcontextprotocol/sdk": "^1.0.4",
    "@types/node": "^20.0.0",
    "typescript": "^5.0.0",
    "@types/bun": "latest"
  },
  "scripts": {
    "build": "bun build ./src/main.ts --outdir build --target node && chmod +x build/main.js",
    "start": "bun build/main.js",
    "prepublishOnly": "npm run build",
    "inspect": "bunx @modelcontextprotocol/inspector bun ./build/main.js"
  },
  "keywords": [
    "obsidian",
    "mcp",
    "ai",
    "notes",
    "knowledge-management"
  ],
  "author": "Steven Stavrakis",
  "license": "MIT",
  "repository": {
    "type": "git",
    "url": "https://github.com/StevenStavrakis/obsidian-mcp"
  },
  "engines": {
    "node": ">=16"
  }
}

```

--------------------------------------------------------------------------------
/src/utils/vault-resolver.ts:
--------------------------------------------------------------------------------

```typescript
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";

export interface VaultResolutionResult {
  vaultPath: string;
  vaultName: string;
}

export interface DualVaultResolutionResult {
  source: VaultResolutionResult;
  destination: VaultResolutionResult;
  isCrossVault: boolean;
}

export class VaultResolver {
  private vaults: Map<string, string>;
  constructor(vaults: Map<string, string>) {
    if (!vaults || vaults.size === 0) {
      throw new Error("At least one vault is required");
    }
    this.vaults = vaults;
  }

  /**
   * Resolves a single vault name to its path and validates it exists
   */
  resolveVault(vaultName: string): VaultResolutionResult {
    const vaultPath = this.vaults.get(vaultName);

    if (!vaultPath) {
      throw new McpError(
        ErrorCode.InvalidParams,
        `Unknown vault: ${vaultName}. Available vaults: ${Array.from(this.vaults.keys()).join(', ')}`
      );
    }

    return { vaultPath, vaultName };
  }

  /**
   * Resolves source and destination vaults for operations that work across vaults
   */
  // NOT IN USE

  /*
  resolveDualVaults(sourceVault: string, destinationVault: string): DualVaultResolutionResult {
    const source = this.resolveVault(sourceVault);
    const destination = this.resolveVault(destinationVault);
    const isCrossVault = sourceVault !== destinationVault;

    return {
      source,
      destination,
      isCrossVault
    };
  }
    */

  /**
   * Returns a list of available vault names
   */
  getAvailableVaults(): string[] {
    return Array.from(this.vaults.keys());
  }
}

```
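
A quick sketch of typical `VaultResolver` usage (vault names and paths are illustrative):

```typescript
import { VaultResolver } from "./utils/vault-resolver.js";

const vaults = new Map<string, string>([
  ["vault1", "/Users/username/Documents/MyVault"],
  ["vault2", "/Users/username/Documents/WorkVault"],
]);

const resolver = new VaultResolver(vaults);

// Throws an McpError listing the available vaults if the name is unknown.
const { vaultPath, vaultName } = resolver.resolveVault("vault1");
console.log(vaultName, "->", vaultPath);

console.log(resolver.getAvailableVaults()); // ["vault1", "vault2"]
```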

--------------------------------------------------------------------------------
/src/prompts/list-vaults/index.ts:
--------------------------------------------------------------------------------

```typescript
import { Prompt, PromptResult } from "../../types.js";

/**
 * Generates the system prompt for tool usage
 */
function generateSystemPrompt(): string {
  return `When using tools that require a vault name, use one of the vault names from the "list-vaults" prompt.
For example, when creating a note, you must specify which vault to create it in.

Available tools will help you:
- Create, edit, move, and delete notes
- Search for specific content within vaults
- Manage tags
- Create directories

The search-vault tool is for finding specific content within vaults, not for listing available vaults.
Use the "list-vaults" prompt to see available vaults.
Do not try to directly access vault paths - use the provided tools instead.`;
}

export const listVaultsPrompt: Prompt = {
  name: "list-vaults",
  description: "Show available Obsidian vaults. Use this prompt to discover which vaults you can work with.",
  arguments: [],
  handler: async (_, vaults: Map<string, string>): Promise<PromptResult> => {
    const vaultList = Array.from(vaults.entries())
      .map(([name, path]) => `- ${name}`)
      .join('\n');

    return {
      messages: [
        {
          role: "user",
          content: {
            type: "text",
            text: `The following Obsidian vaults are available:\n${vaultList}\n\nYou can use these vault names when working with tools. For example, to create a note in the first vault, use that vault's name in the create-note tool's arguments.`
          }
        },
        {
          role: "assistant",
          content: {
            type: "text",
            text: `I see the available vaults. I'll use these vault names when working with tools that require a vault parameter. For searching within vault contents, I'll use the search-vault tool with the appropriate vault name.`
          }
        }
      ]
    };
  }
};

```

--------------------------------------------------------------------------------
/src/utils/errors.ts:
--------------------------------------------------------------------------------

```typescript
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";
import { z } from "zod";

/**
 * Wraps common file system errors into McpErrors
 */
export function handleFsError(error: unknown, operation: string): never {
  if (error instanceof McpError) {
    throw error;
  }

  if (error instanceof Error) {
    const nodeError = error as NodeJS.ErrnoException;
    
    switch (nodeError.code) {
      case 'ENOENT':
        throw new McpError(
          ErrorCode.InvalidRequest,
          `File or directory not found: ${nodeError.message}`
        );
      case 'EACCES':
        throw new McpError(
          ErrorCode.InvalidRequest,
          `Permission denied: ${nodeError.message}`
        );
      case 'EEXIST':
        throw new McpError(
          ErrorCode.InvalidRequest,
          `File or directory already exists: ${nodeError.message}`
        );
      case 'ENOSPC':
        throw new McpError(
          ErrorCode.InternalError,
          'Not enough space to write file'
        );
      default:
        throw new McpError(
          ErrorCode.InternalError,
          `Failed to ${operation}: ${nodeError.message}`
        );
    }
  }

  throw new McpError(
    ErrorCode.InternalError,
    `Unexpected error during ${operation}`
  );
}

/**
 * Handles Zod validation errors by converting them to McpErrors
 */
export function handleZodError(error: z.ZodError): never {
  throw new McpError(
    ErrorCode.InvalidRequest,
    `Invalid arguments: ${error.errors.map(e => e.message).join(", ")}`
  );
}

/**
 * Creates a standardized error for when a note already exists
 */
export function createNoteExistsError(path: string): McpError {
  return new McpError(
    ErrorCode.InvalidRequest,
    `A note already exists at: ${path}\n\n` +
    'To prevent accidental modifications, this operation has been cancelled.\n' +
    'If you want to modify an existing note, please explicitly request to edit or replace it.'
  );
}

/**
 * Creates a standardized error for when a note is not found
 */
export function createNoteNotFoundError(path: string): McpError {
  return new McpError(
    ErrorCode.InvalidRequest,
    `Note "${path}" not found in vault`
  );
}

```
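
The pattern used by the tools is to let `McpError`s pass through and funnel everything else into `handleFsError`. A compact sketch of that pattern, assuming the same `src/utils` layout (the helper name is illustrative):

```typescript
import { promises as fs } from "fs";
import { McpError } from "@modelcontextprotocol/sdk/types.js";
import { handleFsError, createNoteNotFoundError } from "./utils/errors.js";
import { fileExists } from "./utils/files.js";

async function readNoteOrThrow(fullPath: string): Promise<string> {
  try {
    if (!await fileExists(fullPath)) {
      throw createNoteNotFoundError(fullPath);
    }
    return await fs.readFile(fullPath, "utf-8");
  } catch (error) {
    // Re-throw MCP errors untouched; translate raw fs errors into McpErrors.
    if (error instanceof McpError) throw error;
    throw handleFsError(error, "read note");
  }
}
```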

--------------------------------------------------------------------------------
/src/types.ts:
--------------------------------------------------------------------------------

```typescript
import { z } from "zod";

// Tool types
export interface Tool<T = any> {
  name: string;
  description: string;
  inputSchema: {
    parse: (args: any) => T;
    jsonSchema: any;
  };
  handler: (args: T) => Promise<{
    content: {
      type: "text";
      text: string;
    }[];
  }>;
}

// Search types
export interface SearchMatch {
  line: number;
  text: string;
}

export interface SearchResult {
  file: string;
  content?: string;
  lineNumber?: number;
  matches?: SearchMatch[];
}

export interface SearchOperationResult {
  results: SearchResult[];
  totalResults?: number;
  totalMatches?: number;
  matchedFiles?: number;
  success?: boolean;
  message?: string;
}

export interface SearchOptions {
  caseSensitive?: boolean;
  wholeWord?: boolean;
  useRegex?: boolean;
  maxResults?: number;
  path?: string;
  searchType?: 'content' | 'filename' | 'both';
}

// Tag types
export interface TagChange {
  tag: string;
  location: string;
}

// Prompt types
export interface Prompt<T = any> {
  name: string;
  description: string;
  arguments: {
    name: string;
    description: string;
    required?: boolean;
  }[];
  handler: (args: T, vaults: Map<string, string>) => Promise<PromptResult>;
}

export interface PromptMessage {
  role: "user" | "assistant";
  content: {
    type: "text";
    text: string;
  };
}

export interface ToolResponse {
  content: {
    type: "text";
    text: string;
  }[];
}

export interface OperationResult {
  success: boolean;
  message: string;
  details?: Record<string, any>;
}

export interface BatchOperationResult {
  success: boolean;
  message: string;
  totalCount: number;
  successCount: number;
  failedItems: Array<{
    item: string;
    error: string;
  }>;
}

export interface FileOperationResult {
  success: boolean;
  message: string;
  operation: 'create' | 'edit' | 'delete' | 'move';
  path: string;
}

export interface TagOperationResult {
  success: boolean;
  message: string;
  totalCount: number;
  successCount: number;
  details: Record<string, {
    changes: TagChange[];
  }>;
  failedItems: Array<{
    item: string;
    error: string;
  }>;
}

export interface PromptResult {
  systemPrompt?: string;
  messages: PromptMessage[];
  _meta?: {
    [key: string]: any;
  };
}

```

--------------------------------------------------------------------------------
/src/tools/create-directory/index.ts:
--------------------------------------------------------------------------------

```typescript
import { z } from "zod";
import { promises as fs } from "fs";
import path from "path";
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";
import { createTool } from "../../utils/tool-factory.js";

// Input validation schema with descriptions
const schema = z.object({
  vault: z.string()
    .min(1, "Vault name cannot be empty")
    .describe("Name of the vault where the directory should be created"),
  path: z.string()
    .min(1, "Directory path cannot be empty")
    .refine(dirPath => !path.isAbsolute(dirPath), 
      "Directory path must be relative to vault root")
    .describe("Path of the directory to create (relative to vault root)"),
  recursive: z.boolean()
    .optional()
    .default(true)
    .describe("Create parent directories if they don't exist")
}).strict();

type CreateDirectoryInput = z.infer<typeof schema>;

// Helper function to create directory
async function createDirectory(
  vaultPath: string,
  dirPath: string,
  recursive: boolean
): Promise<string> {
  const fullPath = path.join(vaultPath, dirPath);
  
  // Validate path is within vault
  const normalizedPath = path.normalize(fullPath);
  if (!normalizedPath.startsWith(path.normalize(vaultPath))) {
    throw new McpError(
      ErrorCode.InvalidRequest,
      "Directory path must be within the vault directory"
    );
  }

  try {
    // Check if directory already exists
    try {
      await fs.access(normalizedPath);
      throw new McpError(
        ErrorCode.InvalidRequest,
        `A directory already exists at: ${normalizedPath}`
      );
    } catch (error: any) {
      if (error.code !== 'ENOENT') {
        throw error;
      }
      // Directory doesn't exist, proceed with creation
      await fs.mkdir(normalizedPath, { recursive });
      return normalizedPath;
    }
  } catch (error: any) {
    if (error instanceof McpError) {
      throw error;
    }
    throw new McpError(
      ErrorCode.InternalError,
      `Failed to create directory: ${error.message}`
    );
  }
}

export function createCreateDirectoryTool(vaults: Map<string, string>) {
  return createTool<CreateDirectoryInput>({
    name: "create-directory",
    description: "Create a new directory in the specified vault",
    schema,
    handler: async (args, vaultPath, _vaultName) => {
      const createdPath = await createDirectory(vaultPath, args.path, args.recursive ?? true);
      return {
        content: [
          {
            type: "text",
            text: `Successfully created directory at: ${createdPath}`
          }
        ]
      };
    }
  }, vaults);
}

```

--------------------------------------------------------------------------------
/src/utils/schema.ts:
--------------------------------------------------------------------------------

```typescript
import { z } from "zod";
import { zodToJsonSchema } from "zod-to-json-schema";
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";

/**
 * Converts a JSON Schema object to a Zod schema
 */
function jsonSchemaToZod(schema: {
  type: string;
  properties: Record<string, any>;
  required?: string[];
}): z.ZodSchema {
  const requiredFields = new Set(schema.required || []);
  const properties: Record<string, z.ZodTypeAny> = {};
  
  for (const [key, value] of Object.entries(schema.properties)) {
    let fieldSchema: z.ZodTypeAny;
    
    switch (value.type) {
      case 'string':
        fieldSchema = value.enum ? z.enum(value.enum) : z.string();
        break;
      case 'number':
        fieldSchema = z.number();
        break;
      case 'boolean':
        fieldSchema = z.boolean();
        break;
      case 'array':
        if (value.items.type === 'string') {
          fieldSchema = z.array(z.string());
        } else {
          fieldSchema = z.array(z.unknown());
        }
        break;
      case 'object':
        if (value.properties) {
          fieldSchema = jsonSchemaToZod(value);
        } else {
          fieldSchema = z.record(z.unknown());
        }
        break;
      default:
        fieldSchema = z.unknown();
    }

    // Add description if present
    if (value.description) {
      fieldSchema = fieldSchema.describe(value.description);
    }

    // Make field optional if it's not required
    properties[key] = requiredFields.has(key) ? fieldSchema : fieldSchema.optional();
  }
  
  return z.object(properties);
}

/**
 * Creates a tool schema handler from an existing JSON Schema
 */
export function createSchemaHandlerFromJson<T = any>(jsonSchema: {
  type: string;
  properties: Record<string, any>;
  required?: string[];
}) {
  const zodSchema = jsonSchemaToZod(jsonSchema);
  return createSchemaHandler(zodSchema);
}

/**
 * Creates a tool schema handler that manages both JSON Schema for MCP and Zod validation
 */
export function createSchemaHandler<T>(schema: z.ZodSchema<T>) {
  return {
    // Convert to JSON Schema for MCP interface
    jsonSchema: (() => {
      const fullSchema = zodToJsonSchema(schema) as {
        type: string;
        properties: Record<string, any>;
        required?: string[];
      };
      return {
        type: fullSchema.type || "object",
        properties: fullSchema.properties || {},
        required: fullSchema.required || []
      };
    })(),
    
    // Validate and parse input
    parse: (input: unknown): T => {
      try {
        return schema.parse(input);
      } catch (error) {
        if (error instanceof z.ZodError) {
          throw new McpError(
            ErrorCode.InvalidParams,
            `Invalid arguments: ${error.errors.map(e => e.message).join(", ")}`
          );
        }
        throw error;
      }
    }
  };
}

```
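
A small sketch of `createSchemaHandler` with a hypothetical schema, showing both halves of the returned handler:

```typescript
import { z } from "zod";
import { createSchemaHandler } from "./utils/schema.js";

const schema = z.object({
  vault: z.string().min(1).describe("Name of the vault"),
  query: z.string().min(1).describe("Text to search for"),
}).strict();

const handler = createSchemaHandler(schema);

// JSON Schema exposed through the MCP tools/list response:
console.log(JSON.stringify(handler.jsonSchema, null, 2));

// parse() validates unknown input and throws an McpError (InvalidParams) on failure:
const args = handler.parse({ vault: "vault1", query: "meeting" });
console.log(args.vault, args.query);
```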

--------------------------------------------------------------------------------
/src/utils/security.ts:
--------------------------------------------------------------------------------

```typescript
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";

// Basic rate limiting for API protection
export class RateLimiter {
  private requests: Map<string, number[]> = new Map();
  private maxRequests: number;
  private timeWindow: number;

  constructor(maxRequests: number = 1000, timeWindow: number = 60000) {
    // 1000 requests per minute for local usage
    this.maxRequests = maxRequests;
    this.timeWindow = timeWindow;
  }

  checkLimit(clientId: string): boolean {
    const now = Date.now();
    const timestamps = this.requests.get(clientId) || [];

    // Remove old timestamps
    const validTimestamps = timestamps.filter(
      (time) => now - time < this.timeWindow
    );

    if (validTimestamps.length >= this.maxRequests) {
      return false;
    }

    validTimestamps.push(now);
    this.requests.set(clientId, validTimestamps);
    return true;
  }
}

// Message size validation to prevent memory issues
const MAX_MESSAGE_SIZE = 5 * 1024 * 1024; // 5MB for local usage

export function validateMessageSize(message: any): void {
  const size = new TextEncoder().encode(JSON.stringify(message)).length;
  if (size > MAX_MESSAGE_SIZE) {
    throw new McpError(
      ErrorCode.InvalidRequest,
      `Message size exceeds limit of ${MAX_MESSAGE_SIZE} bytes`
    );
  }
}

// Connection health monitoring
export class ConnectionMonitor {
  private lastActivity: number = Date.now();
  private healthCheckInterval: NodeJS.Timeout | null = null;
  private heartbeatInterval: NodeJS.Timeout | null = null;
  private readonly timeout: number;
  private readonly gracePeriod: number;
  private readonly heartbeat: number;
  private initialized: boolean = false;

  constructor(
    timeout: number = 300000,
    gracePeriod: number = 60000,
    heartbeat: number = 30000
  ) {
    // 5min timeout, 1min grace period, 30s heartbeat
    this.timeout = timeout;
    this.gracePeriod = gracePeriod;
    this.heartbeat = heartbeat;
  }

  updateActivity() {
    this.lastActivity = Date.now();
  }

  start(onTimeout: () => void) {
    // Start monitoring after grace period
    setTimeout(() => {
      this.initialized = true;

      // Set up heartbeat to keep connection alive
      this.heartbeatInterval = setInterval(() => {
        // The heartbeat itself counts as activity
        this.updateActivity();
      }, this.heartbeat);

      // Set up health check
      this.healthCheckInterval = setInterval(() => {
        const now = Date.now();
        const inactiveTime = now - this.lastActivity;

        if (inactiveTime > this.timeout) {
          onTimeout();
        }
      }, 10000); // Check every 10 seconds
    }, this.gracePeriod);
  }

  stop() {
    if (this.healthCheckInterval) {
      clearInterval(this.healthCheckInterval);
      this.healthCheckInterval = null;
    }

    if (this.heartbeatInterval) {
      clearInterval(this.heartbeatInterval);
      this.heartbeatInterval = null;
    }
  }
}

```
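
These utilities are wired up by `server.ts` (not shown in this section). A rough sketch of how they might guard incoming requests; the client id and shutdown behaviour are illustrative:

```typescript
import { RateLimiter, validateMessageSize, ConnectionMonitor } from "./utils/security.js";

const limiter = new RateLimiter(1000, 60_000); // 1000 requests per minute
const monitor = new ConnectionMonitor();       // 5 min timeout, 1 min grace, 30 s heartbeat

monitor.start(() => {
  console.error("Connection appears idle; shutting down");
  monitor.stop();
});

function guardRequest(clientId: string, message: unknown): void {
  if (!limiter.checkLimit(clientId)) {
    throw new Error(`Rate limit exceeded for ${clientId}`);
  }
  validateMessageSize(message); // throws an McpError if the message exceeds 5 MB
  monitor.updateActivity();
}

guardRequest("local-client", { method: "tools/list" });
```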

--------------------------------------------------------------------------------
/src/resources/vault/index.ts:
--------------------------------------------------------------------------------

```typescript
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";
import { promises as fs } from "fs";

export interface VaultResource {
  uri: string;
  name: string;
  mimeType: string;
  description?: string;
  metadata?: {
    path: string;
    isAccessible: boolean;
  };
}

export interface VaultListResource {
  uri: string;
  name: string;
  mimeType: string;
  description: string;
  metadata?: {
    totalVaults: number;
    vaults: Array<{
      name: string;
      path: string;
      isAccessible: boolean;
    }>;
  };
}

export async function getVaultMetadata(vaultPath: string): Promise<{
  isAccessible: boolean;
}> {
  try {
    await fs.access(vaultPath);
    return {
      isAccessible: true
    };
  } catch {
    return {
      isAccessible: false
    };
  }
}

export async function listVaultResources(vaults: Map<string, string>): Promise<(VaultResource | VaultListResource)[]> {
  const resources: (VaultResource | VaultListResource)[] = [];

  // Add root resource that lists all vaults
  const vaultList: VaultListResource = {
    uri: "obsidian-vault://",
    name: "Available Vaults",
    mimeType: "application/json",
    description: "List of all available Obsidian vaults and their access status",
    metadata: {
      totalVaults: vaults.size,
      vaults: []
    }
  };

  // Process each vault
  for (const [vaultName, vaultPath] of vaults.entries()) {
    try {
      const metadata = await getVaultMetadata(vaultPath);

      // Add to vault list
      vaultList.metadata?.vaults.push({
        name: vaultName,
        path: vaultPath,
        isAccessible: metadata.isAccessible
      });

      // Add individual vault resource
      resources.push({
        uri: `obsidian-vault://${vaultName}`,
        name: vaultName,
        mimeType: "application/json",
        description: `Access information for the ${vaultName} vault`,
        metadata: {
          path: vaultPath,
          isAccessible: metadata.isAccessible
        }
      });
    } catch (error) {
      console.error(`Error processing vault ${vaultName}:`, error);
      // Still add to vault list but mark as inaccessible
      vaultList.metadata?.vaults.push({
        name: vaultName,
        path: vaultPath,
        isAccessible: false
      });
    }
  }

  // Add vault list as first resource
  resources.unshift(vaultList);

  return resources;
}

export async function readVaultResource(
  vaults: Map<string, string>,
  uri: string
): Promise<{ uri: string; mimeType: string; text: string }> {
  const vaultName = uri.replace("obsidian-vault://", "");
  const vaultPath = vaults.get(vaultName);

  if (!vaultPath) {
    throw new McpError(
      ErrorCode.InvalidRequest,
      `Unknown vault: ${vaultName}`
    );
  }

  const metadata = await getVaultMetadata(vaultPath);

  return {
    uri,
    mimeType: "application/json",
    text: JSON.stringify({
      name: vaultName,
      path: vaultPath,
      isAccessible: metadata.isAccessible
    }, null, 2)
  };
}

```
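
A short sketch of listing and reading vault resources (the vault path is an example):

```typescript
import { listVaultResources, readVaultResource } from "./resources/vault/index.js";

async function demo() {
  const vaults = new Map<string, string>([["vault1", "/Users/username/Documents/MyVault"]]);

  // First entry is the "obsidian-vault://" list resource, followed by one entry per vault.
  const resources = await listVaultResources(vaults);
  for (const resource of resources) {
    console.log(resource.uri, "-", resource.name);
  }

  // Read a single vault's access info as JSON text.
  const info = await readVaultResource(vaults, "obsidian-vault://vault1");
  console.log(info.text);
}

demo().catch(console.error);
```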

--------------------------------------------------------------------------------
/src/tools/read-note/index.ts:
--------------------------------------------------------------------------------

```typescript
import { z } from "zod";
import { FileOperationResult } from "../../types.js";
import { promises as fs } from "fs";
import path from "path";
import { McpError } from "@modelcontextprotocol/sdk/types.js";
import { ensureMarkdownExtension, validateVaultPath } from "../../utils/path.js";
import { fileExists } from "../../utils/files.js";
import { createNoteNotFoundError, handleFsError } from "../../utils/errors.js";
import { createToolResponse, formatFileResult } from "../../utils/responses.js";
import { createTool } from "../../utils/tool-factory.js";

// Input validation schema with descriptions
const schema = z.object({
  vault: z.string()
    .min(1, "Vault name cannot be empty")
    .describe("Name of the vault containing the note"),
  filename: z.string()
    .min(1, "Filename cannot be empty")
    .refine(name => !name.includes('/') && !name.includes('\\'), 
      "Filename cannot contain path separators - use the 'folder' parameter for paths instead")
    .describe("Just the note name without any path separators (e.g. 'my-note.md', NOT 'folder/my-note.md')"),
  folder: z.string()
    .optional()
    .refine(folder => !folder || !path.isAbsolute(folder), 
      "Folder must be a relative path")
    .describe("Optional subfolder path relative to vault root")
}).strict();

type ReadNoteInput = z.infer<typeof schema>;

async function readNote(
  vaultPath: string,
  filename: string,
  folder?: string
): Promise<FileOperationResult & { content: string }> {
  const sanitizedFilename = ensureMarkdownExtension(filename);
  const fullPath = folder
    ? path.join(vaultPath, folder, sanitizedFilename)
    : path.join(vaultPath, sanitizedFilename);
  
  // Validate path is within vault
  validateVaultPath(vaultPath, fullPath);

  try {
    // Check if file exists
    if (!await fileExists(fullPath)) {
      throw createNoteNotFoundError(filename);
    }

    // Read the file content
    const content = await fs.readFile(fullPath, "utf-8");

    return {
      success: true,
      message: "Note read successfully",
      path: fullPath,
      operation: 'edit', // Using 'edit' since we don't have a 'read' operation type
      content: content
    };
  } catch (error: unknown) {
    if (error instanceof McpError) {
      throw error;
    }
    throw handleFsError(error, 'read note');
  }
}

export function createReadNoteTool(vaults: Map<string, string>) {
  return createTool<ReadNoteInput>({
    name: "read-note",
    description: `Read the content of an existing note in the vault.

Examples:
- Root note: { "vault": "vault1", "filename": "note.md" }
- Subfolder note: { "vault": "vault1", "filename": "note.md", "folder": "journal/2024" }
- INCORRECT: { "filename": "journal/2024/note.md" } (don't put path in filename)`,
    schema,
    handler: async (args, vaultPath, _vaultName) => {
      const result = await readNote(vaultPath, args.filename, args.folder);
      
      const formattedResult = formatFileResult({
        success: result.success,
        message: result.message,
        path: result.path,
        operation: result.operation
      });
      
      return createToolResponse(
        `${result.content}\n\n${formattedResult}`
      );
    }
  }, vaults);
}

```

--------------------------------------------------------------------------------
/src/tools/move-note/index.ts:
--------------------------------------------------------------------------------

```typescript
import { z } from "zod";
import { promises as fs } from "fs";
import path from "path";
import { McpError } from "@modelcontextprotocol/sdk/types.js";
import { ensureMarkdownExtension, validateVaultPath } from "../../utils/path.js";
import { fileExists, ensureDirectory } from "../../utils/files.js";
import { updateVaultLinks } from "../../utils/links.js";
import { createNoteExistsError, createNoteNotFoundError, handleFsError } from "../../utils/errors.js";
import { createTool } from "../../utils/tool-factory.js";

// Input validation schema with descriptions
const schema = z.object({
  vault: z.string()
    .min(1, "Vault name cannot be empty")
    .describe("Name of the vault containing the note"),
  source: z.string()
    .min(1, "Source path cannot be empty")
    .refine(name => !path.isAbsolute(name), 
      "Source must be a relative path within the vault")
    .describe("Source path of the note relative to vault root (e.g., 'folder/note.md')"),
  destination: z.string()
    .min(1, "Destination path cannot be empty")
    .refine(name => !path.isAbsolute(name), 
      "Destination must be a relative path within the vault")
    .describe("Destination path relative to vault root (e.g., 'new-folder/new-name.md')")
}).strict();

type MoveNoteArgs = z.infer<typeof schema>;

async function moveNote(
  args: MoveNoteArgs,
  vaultPath: string
): Promise<string> {
  // Ensure paths are relative to vault
  const fullSourcePath = path.join(vaultPath, args.source);
  const fullDestPath = path.join(vaultPath, args.destination);

  // Validate paths are within vault
  validateVaultPath(vaultPath, fullSourcePath);
  validateVaultPath(vaultPath, fullDestPath);

  try {
    // Check if source exists
    if (!await fileExists(fullSourcePath)) {
      throw createNoteNotFoundError(args.source);
    }

    // Check if destination already exists
    if (await fileExists(fullDestPath)) {
      throw createNoteExistsError(args.destination);
    }

    // Ensure destination directory exists
    const destDir = path.dirname(fullDestPath);
    await ensureDirectory(destDir);

    // Move the file
    await fs.rename(fullSourcePath, fullDestPath);
    
    // Update links in the vault
    const updatedFiles = await updateVaultLinks(vaultPath, args.source, args.destination);
    
    return `Successfully moved note from "${args.source}" to "${args.destination}"\n` +
           `Updated links in ${updatedFiles} file${updatedFiles === 1 ? '' : 's'}`;
  } catch (error) {
    if (error instanceof McpError) {
      throw error;
    }
    throw handleFsError(error, 'move note');
  }
}

export function createMoveNoteTool(vaults: Map<string, string>) {
  return createTool<MoveNoteArgs>({
    name: "move-note",
    description: "Move/rename a note while preserving links",
    schema,
    handler: async (args, vaultPath, vaultName) => {
      const argsWithExt: MoveNoteArgs = {
        vault: args.vault,
        source: ensureMarkdownExtension(args.source),
        destination: ensureMarkdownExtension(args.destination)
      };
      
      const resultMessage = await moveNote(argsWithExt, vaultPath);
      
      return {
        content: [
          {
            type: "text",
            text: resultMessage
          }
        ]
      };
    }
  }, vaults);
}

```

--------------------------------------------------------------------------------
/src/utils/tool-factory.ts:
--------------------------------------------------------------------------------

```typescript
import { z } from "zod";
import { Tool } from "../types.js";
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";
import { createSchemaHandler } from "./schema.js";
import { VaultResolver } from "./vault-resolver.js";

export interface BaseToolConfig<T> {
  name: string;
  description: string;
  schema?: z.ZodType<any>;
  handler: (
    args: T,
    sourcePath: string,
    sourceVaultName: string,
    destinationPath?: string,
    destinationVaultName?: string,
    isCrossVault?: boolean
  ) => Promise<any>;
}

/**
 * Creates a standardized tool with common error handling and vault validation
 */
export function createTool<T extends { vault: string }>(
  config: BaseToolConfig<T>,
  vaults: Map<string, string>
): Tool {
  const vaultResolver = new VaultResolver(vaults);
  const schemaHandler = config.schema ? createSchemaHandler(config.schema) : undefined;

  return {
    name: config.name,
    description: config.description,
    inputSchema: schemaHandler || createSchemaHandler(z.object({})),
    handler: async (args) => {
      try {
        const validated = schemaHandler ? schemaHandler.parse(args) as T : {} as T;
        const { vaultPath, vaultName } = vaultResolver.resolveVault(validated.vault);
        return await config.handler(validated, vaultPath, vaultName);
      } catch (error) {
        if (error instanceof z.ZodError) {
          throw new McpError(
            ErrorCode.InvalidRequest,
            `Invalid arguments: ${error.errors.map(e => e.message).join(", ")}`
          );
        }
        throw error;
      }
    }
  };
}

/**
 * Creates a tool that requires no arguments
 */
export function createToolNoArgs(
  config: Omit<BaseToolConfig<{}>, "schema">,
  vaults: Map<string, string>
): Tool {
  const vaultResolver = new VaultResolver(vaults);

  return {
    name: config.name,
    description: config.description,
    inputSchema: createSchemaHandler(z.object({})),
    handler: async () => {
      try {
        return await config.handler({}, "", "");
      } catch (error) {
        throw error;
      }
    }
  };
}

/**
 * Creates a standardized tool that operates between two vaults
 */

// NOT IN USE

/*
export function createDualVaultTool<T extends { sourceVault: string; destinationVault: string }>(
  config: BaseToolConfig<T>,
  vaults: Map<string, string>
): Tool {
  const vaultResolver = new VaultResolver(vaults);
  const schemaHandler = createSchemaHandler(config.schema);

  return {
    name: config.name,
    description: config.description,
    inputSchema: schemaHandler,
    handler: async (args) => {
      try {
        const validated = schemaHandler.parse(args) as T;
        const { source, destination, isCrossVault } = vaultResolver.resolveDualVaults(
          validated.sourceVault,
          validated.destinationVault
        );
        return await config.handler(
          validated,
          source.vaultPath,
          source.vaultName,
          destination.vaultPath,
          destination.vaultName,
          isCrossVault
        );
      } catch (error) {
        if (error instanceof z.ZodError) {
          throw new McpError(
            ErrorCode.InvalidRequest,
            `Invalid arguments: ${error.errors.map(e => e.message).join(", ")}`
          );
        }
        throw error;
      }
    }
  };
}
*/

```
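
A minimal sketch of defining a tool with `createTool`; the `note-exists` tool below is hypothetical and not part of the repository, but it follows the same shape as the repository's real tools:

```typescript
import { z } from "zod";
import path from "path";
import { createTool } from "./utils/tool-factory.js";
import { createToolResponse } from "./utils/responses.js";
import { fileExists } from "./utils/files.js";

// Hypothetical schema: every tool created with createTool must include a vault name.
const schema = z.object({
  vault: z.string().min(1).describe("Name of the vault to check"),
  filename: z.string().min(1).describe("Note filename relative to the vault root"),
}).strict();

type NoteExistsInput = z.infer<typeof schema>;

export function createNoteExistsTool(vaults: Map<string, string>) {
  return createTool<NoteExistsInput>({
    name: "note-exists",
    description: "Check whether a note exists in the specified vault",
    schema,
    handler: async (args, vaultPath, vaultName) => {
      // args has already been validated and the vault name resolved to a path.
      const exists = await fileExists(path.join(vaultPath, args.filename));
      return createToolResponse(
        exists
          ? `"${args.filename}" exists in vault "${vaultName}"`
          : `"${args.filename}" was not found in vault "${vaultName}"`
      );
    },
  }, vaults);
}
```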

--------------------------------------------------------------------------------
/src/utils/files.ts:
--------------------------------------------------------------------------------

```typescript
import { promises as fs, Dirent } from "fs";
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";
import { normalizePath, safeJoinPath } from "./path.js";

/**
 * Recursively gets all markdown files in a directory
 */
export async function getAllMarkdownFiles(vaultPath: string, dir = vaultPath): Promise<string[]> {
  // Normalize paths upfront
  const normalizedVaultPath = normalizePath(vaultPath);
  const normalizedDir = normalizePath(dir);

  // Verify directory is within vault
  if (!normalizedDir.startsWith(normalizedVaultPath)) {
    throw new McpError(
      ErrorCode.InvalidRequest,
      `Search directory must be within vault: ${dir}`
    );
  }

  try {
    const files: string[] = [];
    let entries: Dirent[];
    
    try {
      entries = await fs.readdir(normalizedDir, { withFileTypes: true });
    } catch (error) {
      if ((error as any).code === 'ENOENT') {
        throw new McpError(
          ErrorCode.InvalidRequest,
          `Directory not found: ${dir}`
        );
      }
      throw error;
    }

    for (const entry of entries) {
      try {
        // Use safeJoinPath to ensure path safety
        const fullPath = safeJoinPath(normalizedDir, entry.name);
        
        if (entry.isDirectory()) {
          if (!entry.name.startsWith(".")) {
            const subDirFiles = await getAllMarkdownFiles(normalizedVaultPath, fullPath);
            files.push(...subDirFiles);
          }
        } else if (entry.isFile() && entry.name.endsWith(".md")) {
          files.push(fullPath);
        }
      } catch (error) {
        // Log but don't throw - we want to continue processing other files
        if (error instanceof McpError) {
          console.error(`Skipping ${entry.name}:`, error.message);
        } else {
          console.error(`Error processing ${entry.name}:`, error);
        }
      }
    }

    return files;
  } catch (error) {
    if (error instanceof McpError) throw error;
    
    throw new McpError(
      ErrorCode.InternalError,
      `Failed to read directory ${dir}: ${error instanceof Error ? error.message : String(error)}`
    );
  }
}

/**
 * Ensures a directory exists, creating it if necessary
 */
export async function ensureDirectory(dirPath: string): Promise<void> {
  const normalizedPath = normalizePath(dirPath);
  
  try {
    await fs.mkdir(normalizedPath, { recursive: true });
  } catch (error: any) {
    if (error.code !== 'EEXIST') {
      throw new McpError(
        ErrorCode.InternalError,
        `Failed to create directory ${dirPath}: ${error.message}`
      );
    }
  }
}

/**
 * Checks if a file exists
 */
export async function fileExists(filePath: string): Promise<boolean> {
  const normalizedPath = normalizePath(filePath);
  
  try {
    await fs.access(normalizedPath);
    return true;
  } catch {
    return false;
  }
}

/**
 * Safely reads a file's contents
 * Returns undefined if file doesn't exist
 */
export async function safeReadFile(filePath: string): Promise<string | undefined> {
  const normalizedPath = normalizePath(filePath);
  
  try {
    return await fs.readFile(normalizedPath, 'utf-8');
  } catch (error: any) {
    if (error.code === 'ENOENT') {
      return undefined;
    }
    throw new McpError(
      ErrorCode.InternalError,
      `Failed to read file ${filePath}: ${error.message}`
    );
  }
}

```
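
A brief sketch using these helpers to scan a vault (the vault path and tag are illustrative):

```typescript
import { getAllMarkdownFiles, safeReadFile } from "./utils/files.js";

async function demo() {
  const vaultPath = "/Users/username/Documents/MyVault";

  // Recursively collects .md files, skipping dot-directories such as .obsidian.
  const files = await getAllMarkdownFiles(vaultPath);

  let hits = 0;
  for (const file of files) {
    const content = await safeReadFile(file); // undefined if the file has since disappeared
    if (content?.includes("#project")) hits++;
  }
  console.log(`${hits} of ${files.length} notes mention #project`);
}

demo().catch(console.error);
```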

--------------------------------------------------------------------------------
/src/tools/create-note/index.ts:
--------------------------------------------------------------------------------

```typescript
import { z } from "zod";
import { FileOperationResult } from "../../types.js";
import { promises as fs } from "fs";
import path from "path";
import { McpError } from "@modelcontextprotocol/sdk/types.js";
import { ensureMarkdownExtension, validateVaultPath } from "../../utils/path.js";
import { ensureDirectory, fileExists } from "../../utils/files.js";
import { createNoteExistsError, handleFsError } from "../../utils/errors.js";
import { createToolResponse, formatFileResult } from "../../utils/responses.js";
import { createTool } from "../../utils/tool-factory.js";

// Input validation schema with descriptions
const schema = z.object({
  vault: z.string()
    .min(1, "Vault name cannot be empty")
    .describe("Name of the vault to create the note in"),
  filename: z.string()
    .min(1, "Filename cannot be empty")
    .refine(name => !name.includes('/') && !name.includes('\\'), 
      "Filename cannot contain path separators - use the 'folder' parameter for paths instead. Example: use filename:'note.md', folder:'my/path' instead of filename:'my/path/note.md'")
    .describe("Just the note name without any path separators (e.g. 'my-note.md', NOT 'folder/my-note.md'). Will add .md extension if missing"),
  content: z.string()
    .min(1, "Content cannot be empty")
    .describe("Content of the note in markdown format"),
  folder: z.string()
    .optional()
    .refine(folder => !folder || !path.isAbsolute(folder), 
      "Folder must be a relative path")
    .describe("Optional subfolder path relative to vault root (e.g. 'journal/subfolder'). Use this for the path instead of including it in filename")
}).strict();

async function createNote(
  args: z.infer<typeof schema>,
  vaultPath: string,
  _vaultName: string
): Promise<FileOperationResult> {
  const sanitizedFilename = ensureMarkdownExtension(args.filename);

  const notePath = args.folder
    ? path.join(vaultPath, args.folder, sanitizedFilename)
    : path.join(vaultPath, sanitizedFilename);

  // Validate path is within vault
  validateVaultPath(vaultPath, notePath);

  try {
    // Create directory structure if needed
    const noteDir = path.dirname(notePath);
    await ensureDirectory(noteDir);

    // Check if file exists first
    if (await fileExists(notePath)) {
      throw createNoteExistsError(notePath);
    }

    // File doesn't exist, proceed with creation
    await fs.writeFile(notePath, args.content, 'utf8');
    
    return {
      success: true,
      message: "Note created successfully",
      path: notePath,
      operation: 'create'
    };
  } catch (error) {
    if (error instanceof McpError) {
      throw error;
    }
    throw handleFsError(error, 'create note');
  }
}

type CreateNoteArgs = z.infer<typeof schema>;

export function createCreateNoteTool(vaults: Map<string, string>) {
  return createTool<CreateNoteArgs>({
    name: "create-note",
    description: `Create a new note in the specified vault with markdown content.

Examples:
- Root note: { "vault": "vault1", "filename": "note.md" }
- Subfolder note: { "vault": "vault2", "filename": "note.md", "folder": "journal/2024" }
- INCORRECT: { "filename": "journal/2024/note.md" } (don't put path in filename)`,
    schema,
    handler: async (args, vaultPath, vaultName) => {
      const result = await createNote(args, vaultPath, vaultName);
      return createToolResponse(formatFileResult(result));
    }
  }, vaults);
}

```

--------------------------------------------------------------------------------
/src/utils/path.test.ts:
--------------------------------------------------------------------------------

```typescript
import { describe, it } from 'bun:test';
import assert from 'node:assert';
import path from 'path';
import { normalizePath } from './path';
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";

describe('normalizePath', () => {
  describe('Common tests', () => {
    it('should handle relative paths', () => {
      assert.strictEqual(normalizePath('./path/to/file'), path.resolve('./path/to/file'));
      assert.strictEqual(normalizePath('../path/to/file'), path.resolve('../path/to/file'));
    });

    it('should throw error for invalid paths', () => {
      assert.throws(() => normalizePath(''), McpError);
      assert.throws(() => normalizePath(null as any), McpError);
      assert.throws(() => normalizePath(undefined as any), McpError);
      assert.throws(() => normalizePath(123 as any), McpError);
    });
  });

  describe('Windows-specific tests', () => {
    it('should handle Windows drive letters', () => {
      assert.strictEqual(normalizePath('C:\\path\\to\\file'), 'C:/path/to/file');
      assert.strictEqual(normalizePath('D:/path/to/file'), 'D:/path/to/file');
      assert.strictEqual(normalizePath('Z:\\test\\folder'), 'Z:/test/folder');
    });

    it('should allow colons in Windows drive letters', () => {
      assert.strictEqual(normalizePath('C:\\path\\to\\file'), 'C:/path/to/file');
      assert.strictEqual(normalizePath('D:/path/to/file'), 'D:/path/to/file');
      assert.strictEqual(normalizePath('X:\\test\\folder'), 'X:/test/folder');
    });

    it('should reject Windows paths with invalid characters', () => {
      assert.throws(() => normalizePath('C:\\path\\to\\file<'), McpError);
      assert.throws(() => normalizePath('D:/path/to/file>'), McpError);
      assert.throws(() => normalizePath('E:\\test\\folder|'), McpError);
      assert.throws(() => normalizePath('F:/test/folder?'), McpError);
      assert.throws(() => normalizePath('G:\\test\\folder*'), McpError);
    });

    it('should handle UNC paths correctly', () => {
      assert.strictEqual(normalizePath('\\\\server\\share\\path'), '//server/share/path');
      assert.strictEqual(normalizePath('//server/share/path'), '//server/share/path');
      assert.strictEqual(normalizePath('\\\\server\\share\\folder\\file'), '//server/share/folder/file');
    });

    it('should handle network drive paths', () => {
      assert.strictEqual(normalizePath('Z:\\network\\drive'), 'Z:/network/drive');
      assert.strictEqual(normalizePath('Y:/network/drive'), 'Y:/network/drive');
    });

    it('should preserve path separators in UNC paths', () => {
      const result = normalizePath('\\\\server\\share\\path');
      assert.strictEqual(result, '//server/share/path');
      assert.notStrictEqual(result, path.resolve('//server/share/path'));
    });

    it('should preserve drive letters in Windows paths', () => {
      const result = normalizePath('C:\\path\\to\\file');
      assert.strictEqual(result, 'C:/path/to/file');
      assert.notStrictEqual(result, path.resolve('C:/path/to/file'));
    });
  });

  describe('macOS/Unix-specific tests', () => {
    it('should handle absolute paths', () => {
      assert.strictEqual(normalizePath('/path/to/file'), path.resolve('/path/to/file'));
    });

    it('should handle mixed forward/backward slashes', () => {
      assert.strictEqual(normalizePath('path\\to\\file'), 'path/to/file');
    });

    it('should handle paths with colons in filenames', () => {
      assert.strictEqual(normalizePath('/path/to/file:name'), path.resolve('/path/to/file:name'));
    });
  });
});

```

--------------------------------------------------------------------------------
/src/utils/links.ts:
--------------------------------------------------------------------------------

```typescript
import { promises as fs } from "fs";
import path from "path";
import { getAllMarkdownFiles } from "./files.js";

interface LinkUpdateOptions {
  filePath: string;
  oldPath: string;
  newPath?: string;
  isMovedToOtherVault?: boolean;
  isMovedFromOtherVault?: boolean;
  sourceVaultName?: string;
  destVaultName?: string;
}

/**
 * Updates markdown links in a file
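 * (wikilinks "[[Note]]" and markdown links "[text](Note.md)"). For a same-vault rename
 * from "Old Note.md" to "New Note.md", "[[Old Note]]" becomes "[[New Note]]" and
 * "[text](Old Note.md)" becomes "[text](New Note.md)".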
 * @returns true if any links were updated
 */
export async function updateLinksInFile({
  filePath,
  oldPath,
  newPath,
  isMovedToOtherVault,
  isMovedFromOtherVault,
  sourceVaultName,
  destVaultName
}: LinkUpdateOptions): Promise<boolean> {
  const content = await fs.readFile(filePath, "utf-8");
  
  const oldName = path.basename(oldPath, ".md");
  const newName = newPath ? path.basename(newPath, ".md") : null;
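  // Note: oldName/newName are interpolated into the RegExp patterns below verbatim, so
  // note names containing regex metacharacters (e.g. parentheses) may not match exactly.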
  
  let newContent: string;
  
  if (isMovedToOtherVault) {
    // Handle move to another vault - add vault reference
    newContent = content
      .replace(
        new RegExp(`\\[\\[${oldName}(\\|[^\\]]*)?\\]\\]`, "g"),
        `[[${destVaultName}/${oldName}$1]]`
      )
      .replace(
        new RegExp(`\\[([^\\]]*)\\]\\(${oldName}\\.md\\)`, "g"),
        `[$1](${destVaultName}/${oldName}.md)`
      );
  } else if (isMovedFromOtherVault) {
    // Handle move from another vault - add note about original location
    newContent = content
      .replace(
        new RegExp(`\\[\\[${oldName}(\\|[^\\]]*)?\\]\\]`, "g"),
        `[[${newName}$1]] *(moved from ${sourceVaultName})*`
      )
      .replace(
        new RegExp(`\\[([^\\]]*)\\]\\(${oldName}\\.md\\)`, "g"),
        `[$1](${newName}.md) *(moved from ${sourceVaultName})*`
      );
  } else if (!newPath) {
    // Handle deletion - strike through the links
    newContent = content
      .replace(
        new RegExp(`\\[\\[${oldName}(\\|[^\\]]*)?\\]\\]`, "g"),
        `~~[[${oldName}$1]]~~`
      )
      .replace(
        new RegExp(`\\[([^\\]]*)\\]\\(${oldName}\\.md\\)`, "g"),
        `~~[$1](${oldName}.md)~~`
      );
  } else {
    // Handle move/rename within same vault
    newContent = content
      .replace(
        new RegExp(`\\[\\[${oldName}(\\|[^\\]]*)?\\]\\]`, "g"),
        `[[${newName}$1]]`
      )
      .replace(
        new RegExp(`\\[([^\\]]*)\\]\\(${oldName}\\.md\\)`, "g"),
        `[$1](${newName}.md)`
      );
  }

  if (content !== newContent) {
    await fs.writeFile(filePath, newContent, "utf-8");
    return true;
  }
  
  return false;
}

/**
 * Updates all markdown links in the vault after a note is moved or deleted
 * @returns number of files updated
 */
export async function updateVaultLinks(
  vaultPath: string,
  oldPath: string | null | undefined,
  newPath: string | null | undefined,
  sourceVaultName?: string,
  destVaultName?: string
): Promise<number> {
  const files = await getAllMarkdownFiles(vaultPath);
  let updatedFiles = 0;

  // Determine the type of operation
  const isMovedToOtherVault: boolean = Boolean(oldPath !== null && newPath === null && sourceVaultName && destVaultName);
  const isMovedFromOtherVault: boolean = Boolean(oldPath === null && newPath !== null && sourceVaultName && destVaultName);

  for (const file of files) {
    // Skip the target file itself if it's a move operation
    if (newPath && file === path.join(vaultPath, newPath)) continue;
    
    if (await updateLinksInFile({
      filePath: file,
      oldPath: oldPath || "",
      newPath: newPath || undefined,
      isMovedToOtherVault,
      isMovedFromOtherVault,
      sourceVaultName,
      destVaultName
    })) {
      updatedFiles++;
    }
  }

  return updatedFiles;
}

```

--------------------------------------------------------------------------------
/src/resources/resources.ts:
--------------------------------------------------------------------------------

```typescript
import { promises as fs } from "fs";
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";

export interface VaultResource {
  uri: string;
  name: string;
  mimeType: string;
  description?: string;
  metadata?: {
    path: string;
    isAccessible: boolean;
  };
}

export interface VaultListResource {
  uri: string;
  name: string;
  mimeType: string;
  description: string;
  metadata?: {
    totalVaults: number;
    vaults: Array<{
      name: string;
      path: string;
      isAccessible: boolean;
    }>;
  };
}

/**
 * Gets metadata for a vault
 */
export async function getVaultMetadata(vaultPath: string): Promise<{
  isAccessible: boolean;
}> {
  try {
    await fs.access(vaultPath);
    return {
      isAccessible: true
    };
  } catch {
    return {
      isAccessible: false
    };
  }
}

/**
 * Lists vault resources including a root resource that lists all vaults
 */
export async function listVaultResources(vaults: Map<string, string>): Promise<(VaultResource | VaultListResource)[]> {
  const resources: (VaultResource | VaultListResource)[] = [];

  // Add root resource that lists all vaults
  const vaultList: VaultListResource = {
    uri: "obsidian-vault://",
    name: "Available Vaults",
    mimeType: "application/json",
    description: "List of all available Obsidian vaults and their access status",
    metadata: {
      totalVaults: vaults.size,
      vaults: []
    }
  };

  // Process each vault
  for (const [vaultName, vaultPath] of vaults.entries()) {
    try {
      const metadata = await getVaultMetadata(vaultPath);

      // Add to vault list
      vaultList.metadata?.vaults.push({
        name: vaultName,
        path: vaultPath,
        isAccessible: metadata.isAccessible
      });

      // Add individual vault resource
      resources.push({
        uri: `obsidian-vault://${vaultName}`,
        name: vaultName,
        mimeType: "application/json",
        description: `Access information for the ${vaultName} vault`,
        metadata: {
          path: vaultPath,
          isAccessible: metadata.isAccessible
        }
      });
    } catch (error) {
      console.error(`Error processing vault ${vaultName}:`, error);
      // Still add to vault list but mark as inaccessible
      vaultList.metadata?.vaults.push({
        name: vaultName,
        path: vaultPath,
        isAccessible: false
      });
    }
  }

  // Add vault list as first resource
  resources.unshift(vaultList);

  return resources;
}

/**
 * Reads a vault resource by URI
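 *
 * Supported URIs: "obsidian-vault://" (list of all configured vaults) and
 * "obsidian-vault://<vaultName>" (access info for a single vault).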
 */
export async function readVaultResource(
  vaults: Map<string, string>,
  uri: string
): Promise<{ uri: string; mimeType: string; text: string }> {
  // Handle root vault list
  if (uri === 'obsidian-vault://') {
    const vaultList = [];
    for (const [name, path] of vaults.entries()) {
      const metadata = await getVaultMetadata(path);
      vaultList.push({
        name,
        path,
        isAccessible: metadata.isAccessible
      });
    }
    return {
      uri,
      mimeType: "application/json",
      text: JSON.stringify({
        totalVaults: vaults.size,
        vaults: vaultList
      }, null, 2)
    };
  }

  // Handle individual vault resources
  const vaultName = uri.replace("obsidian-vault://", "");
  const vaultPath = vaults.get(vaultName);

  if (!vaultPath) {
    throw new McpError(
      ErrorCode.InvalidRequest,
      `Unknown vault: ${vaultName}`
    );
  }

  const metadata = await getVaultMetadata(vaultPath);

  return {
    uri,
    mimeType: "application/json",
    text: JSON.stringify({
      name: vaultName,
      path: vaultPath,
      isAccessible: metadata.isAccessible
    }, null, 2)
  };
}

```

--------------------------------------------------------------------------------
/src/utils/responses.ts:
--------------------------------------------------------------------------------

```typescript
import {
  ToolResponse,
  OperationResult,
  BatchOperationResult,
  FileOperationResult,
  TagOperationResult,
  SearchOperationResult,
  TagChange,
  SearchResult
} from '../types.js';

/**
 * Creates a standardized tool response
 */
export function createToolResponse(message: string): ToolResponse {
  return {
    content: [{
      type: "text",
      text: message
    }]
  };
}

/**
 * Formats a basic operation result
 */
export function formatOperationResult(result: OperationResult): string {
  const parts: string[] = [];
  
  // Add main message
  parts.push(result.message);
  
  // Add details if present
  if (result.details) {
    parts.push('\nDetails:');
    Object.entries(result.details).forEach(([key, value]) => {
      parts.push(`  ${key}: ${JSON.stringify(value)}`);
    });
  }
  
  return parts.join('\n');
}

/**
 * Formats a batch operation result
 */
export function formatBatchResult(result: BatchOperationResult): string {
  const parts: string[] = [];
  
  // Add summary
  parts.push(result.message);
  parts.push(`\nProcessed ${result.totalCount} items: ${result.successCount} succeeded`);
  
  // Add failures if any
  if (result.failedItems.length > 0) {
    parts.push('\nErrors:');
    result.failedItems.forEach(({ item, error }) => {
      parts.push(`  ${item}: ${error}`);
    });
  }
  
  return parts.join('\n');
}

/**
 * Formats a file operation result
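 *
 * Output shape: "<Created|Modified|Deleted|Moved> file: <path>" followed by the result message.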
 */
export function formatFileResult(result: FileOperationResult): string {
  const operationText = {
    create: 'Created',
    edit: 'Modified',
    delete: 'Deleted',
    move: 'Moved'
  }[result.operation];
  
  return `${operationText} file: ${result.path}\n${result.message}`;
}

/**
 * Formats tag changes for reporting
 */
function formatTagChanges(changes: TagChange[]): string {
  const byLocation = changes.reduce((acc, change) => {
    if (!acc[change.location]) acc[change.location] = new Set();
    acc[change.location].add(change.tag);
    return acc;
  }, {} as Record<string, Set<string>>);
  
  const parts: string[] = [];
  for (const [location, tags] of Object.entries(byLocation)) {
    parts.push(`  ${location}: ${Array.from(tags).join(', ')}`);
  }
  
  return parts.join('\n');
}

/**
 * Formats a tag operation result
 */
export function formatTagResult(result: TagOperationResult): string {
  const parts: string[] = [];
  
  // Add summary
  parts.push(result.message);
  parts.push(`\nProcessed ${result.totalCount} files: ${result.successCount} modified`);
  
  // Add detailed changes
  for (const [filename, fileDetails] of Object.entries(result.details)) {
    if (fileDetails.changes.length > 0) {
      parts.push(`\nChanges in ${filename}:`);
      parts.push(formatTagChanges(fileDetails.changes));
    }
  }
  
  // Add failures if any
  if (result.failedItems.length > 0) {
    parts.push('\nErrors:');
    result.failedItems.forEach(({ item, error }) => {
      parts.push(`  ${item}: ${error}`);
    });
  }
  
  return parts.join('\n');
}

/**
 * Formats search results
 */
export function formatSearchResult(result: SearchOperationResult): string {
  const parts: string[] = [];
  
  // Add summary
  parts.push(
    `Found ${result.totalMatches} match${result.totalMatches === 1 ? '' : 'es'} ` +
    `in ${result.matchedFiles} file${result.matchedFiles === 1 ? '' : 's'}`
  );
  
  if (result.results.length === 0) {
    return 'No matches found.';
  }
  
  // Separate filename and content matches
  const filenameMatches = result.results.filter(r => r.matches?.some(m => m.line === 0));
  const contentMatches = result.results.filter(r => r.matches?.some(m => m.line !== 0));
  
  // Add filename matches if any
  if (filenameMatches.length > 0) {
    parts.push('\nFilename matches:');
    filenameMatches.forEach(result => {
      parts.push(`  ${result.file}`);
    });
  }
  
  // Add content matches if any
  if (contentMatches.length > 0) {
    parts.push('\nContent matches:');
    contentMatches.forEach(result => {
      parts.push(`\nFile: ${result.file}`);
      result.matches
        ?.filter(m => m?.line !== 0) // Skip filename matches
        ?.forEach(m => m && parts.push(`  Line ${m.line}: ${m.text}`));
    });
  }
  
  return parts.join('\n');
}

/**
 * Creates a standardized error response
 */
export function createErrorResponse(error: Error): ToolResponse {
  return createToolResponse(`Error: ${error.message}`);
}

```

--------------------------------------------------------------------------------
/src/tools/delete-note/index.ts:
--------------------------------------------------------------------------------

```typescript
import { z } from "zod";
import { promises as fs } from "fs";
import path from "path";
import { McpError } from "@modelcontextprotocol/sdk/types.js";
import { ensureMarkdownExtension, validateVaultPath } from "../../utils/path.js";
import { fileExists, ensureDirectory } from "../../utils/files.js";
import { updateVaultLinks } from "../../utils/links.js";
import { createNoteNotFoundError, handleFsError } from "../../utils/errors.js";
import { createTool } from "../../utils/tool-factory.js";

// Input validation schema with descriptions
const schema = z.object({
  vault: z.string()
    .min(1, "Vault name cannot be empty")
    .describe("Name of the vault containing the note"),
  path: z.string()
    .min(1, "Path cannot be empty")
    .refine(name => !path.isAbsolute(name), 
      "Path must be relative to vault root")
    .describe("Path of the note relative to vault root (e.g., 'folder/note.md')"),
  reason: z.string()
    .optional()
    .describe("Optional reason for deletion (stored in trash metadata)"),
  permanent: z.boolean()
    .optional()
    .default(false)
    .describe("Whether to permanently delete instead of moving to trash (default: false)")
}).strict();


interface TrashMetadata {
  originalPath: string;
  deletedAt: string;
  reason?: string;
}

async function ensureTrashDirectory(vaultPath: string): Promise<string> {
  const trashPath = path.join(vaultPath, ".trash");
  await ensureDirectory(trashPath);
  return trashPath;
}

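// Moves a note into the vault's .trash directory as "<name>_<timestamp>.md" (ISO timestamp
// with ":" and "." replaced by "-"), prepending trash metadata as YAML frontmatter before
// removing the original file.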
async function moveToTrash(
  vaultPath: string,
  notePath: string,
  reason?: string
): Promise<string> {
  const trashPath = await ensureTrashDirectory(vaultPath);
  const timestamp = new Date().toISOString().replace(/[:.]/g, "-");
  const trashName = `${path.basename(notePath, ".md")}_${timestamp}.md`;
  const trashFilePath = path.join(trashPath, trashName);

  // Create metadata
  const metadata: TrashMetadata = {
    originalPath: notePath,
    deletedAt: new Date().toISOString(),
    reason
  };

  try {
    // Read original content
    const content = await fs.readFile(path.join(vaultPath, notePath), "utf-8");
    
    // Prepend metadata as YAML frontmatter
    const contentWithMetadata = `---
trash_metadata:
  original_path: ${metadata.originalPath}
  deleted_at: ${metadata.deletedAt}${reason ? `\n  reason: ${reason}` : ""}
---

${content}`;

    // Write to trash with metadata
    await fs.writeFile(trashFilePath, contentWithMetadata);
    
    // Delete original file
    await fs.unlink(path.join(vaultPath, notePath));

    return trashName;
  } catch (error) {
    throw handleFsError(error, 'move note to trash');
  }
}

async function deleteNote(
  vaultPath: string,
  notePath: string,
  options: {
    permanent?: boolean;
    reason?: string;
  } = {}
): Promise<string> {
  const fullPath = path.join(vaultPath, notePath);

  // Validate path is within vault
  validateVaultPath(vaultPath, fullPath);

  try {
    // Check if note exists
    if (!await fileExists(fullPath)) {
      throw createNoteNotFoundError(notePath);
    }

    // Update links in other files first
    const updatedFiles = await updateVaultLinks(vaultPath, notePath, null);
    
    if (options.permanent) {
      // Permanently delete the file
      await fs.unlink(fullPath);
      return `Permanently deleted note "${notePath}"\n` +
             `Updated ${updatedFiles} file${updatedFiles === 1 ? '' : 's'} with broken links`;
    } else {
      // Move to trash with metadata
      const trashName = await moveToTrash(vaultPath, notePath, options.reason);
      return `Moved note "${notePath}" to trash as "${trashName}"\n` +
             `Updated ${updatedFiles} file${updatedFiles === 1 ? '' : 's'} with broken links`;
    }
  } catch (error) {
    if (error instanceof McpError) {
      throw error;
    }
    throw handleFsError(error, 'delete note');
  }
}

type DeleteNoteArgs = z.infer<typeof schema>;

export function createDeleteNoteTool(vaults: Map<string, string>) {
  return createTool<DeleteNoteArgs>({
    name: "delete-note",
    description: "Delete a note, moving it to .trash by default or permanently deleting if specified",
    schema,
    handler: async (args, vaultPath, _vaultName) => {
      // Ensure .md extension
      const fullNotePath = ensureMarkdownExtension(args.path);
      
      const resultMessage = await deleteNote(vaultPath, fullNotePath, { 
        reason: args.reason, 
        permanent: args.permanent 
      });
      
      return {
        content: [
          {
            type: "text",
            text: resultMessage
          }
        ]
      };
    }
  }, vaults);
}

```

--------------------------------------------------------------------------------
/src/tools/add-tags/index.ts:
--------------------------------------------------------------------------------

```typescript
import { z } from "zod";
import { TagOperationResult } from "../../types.js";
import { promises as fs } from "fs";
import path from "path";
import { McpError } from "@modelcontextprotocol/sdk/types.js";
import { validateVaultPath } from "../../utils/path.js";
import { fileExists, safeReadFile } from "../../utils/files.js";
import {
  validateTag,
  parseNote,
  stringifyNote,
  addTagsToFrontmatter,
  normalizeTag
} from "../../utils/tags.js";
import { createToolResponse, formatTagResult } from "../../utils/responses.js";
import { createTool } from "../../utils/tool-factory.js";

// Input validation schema with descriptions
const schema = z.object({
  vault: z.string()
    .min(1, "Vault name cannot be empty")
    .describe("Name of the vault containing the notes"),
  files: z.array(z.string())
    .min(1, "At least one file must be specified")
    .refine(
      files => files.every(f => f.endsWith('.md')),
      "All files must have .md extension"
    )
    .describe("Array of note filenames to process (must have .md extension)"),
  tags: z.array(z.string())
    .min(1, "At least one tag must be specified")
    .refine(
      tags => tags.every(validateTag),
      "Invalid tag format. Tags must contain only letters, numbers, and forward slashes for hierarchy."
    )
    .describe("Array of tags to add (e.g., 'status/active', 'project/docs')"),
  location: z.enum(['frontmatter', 'content', 'both'])
    .optional()
    .describe("Where to add tags (default: both)"),
  normalize: z.boolean()
    .optional()
    .describe("Whether to normalize tag format (e.g., ProjectActive -> project-active) (default: true)"),
  position: z.enum(['start', 'end'])
    .optional()
    .describe("Where to add inline tags in content (default: end)")
}).strict();

type AddTagsArgs = z.infer<typeof schema>;

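// Adds the given tags to each file's frontmatter and/or inline content (per `location`)
// and returns a TagOperationResult describing per-file changes and any failures.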
async function addTags(
  vaultPath: string,
  files: string[],
  tags: string[],
  location: 'frontmatter' | 'content' | 'both' = 'both',
  normalize: boolean = true,
  position: 'start' | 'end' = 'end'
): Promise<TagOperationResult> {
  const result: TagOperationResult = {
    success: true,
    message: "Tag addition completed",
    successCount: 0,
    totalCount: files.length,
    failedItems: [],
    details: {}
  };

  for (const filename of files) {
    const fullPath = path.join(vaultPath, filename);
    result.details[filename] = { changes: [] };
    
    try {
      // Validate path is within vault
      validateVaultPath(vaultPath, fullPath);
      
      // Check if file exists
      if (!await fileExists(fullPath)) {
        result.failedItems.push({
          item: filename,
          error: "File not found"
        });
        continue;
      }

      // Read file content
      const content = await safeReadFile(fullPath);
      if (!content) {
        result.failedItems.push({
          item: filename,
          error: "Failed to read file"
        });
        continue;
      }

      // Parse the note
      const parsed = parseNote(content);
      let modified = false;

      // Handle frontmatter tags
      if (location !== 'content') {
        const updatedFrontmatter = addTagsToFrontmatter(
          parsed.frontmatter,
          tags,
          normalize
        );
        
        if (JSON.stringify(parsed.frontmatter) !== JSON.stringify(updatedFrontmatter)) {
          parsed.frontmatter = updatedFrontmatter;
          parsed.hasFrontmatter = true;
          modified = true;
          
          // Record changes
          tags.forEach((tag: string) => {
            result.details[filename].changes.push({
              tag: normalize ? normalizeTag(tag) : tag,
              location: 'frontmatter'
            });
          });
        }
      }

      // Handle inline tags
      if (location !== 'frontmatter') {
        const tagString = tags
          .filter(tag => validateTag(tag))
          .map((tag: string) => `#${normalize ? normalizeTag(tag) : tag}`)
          .join(' ');

        if (tagString) {
          if (position === 'start') {
            parsed.content = tagString + '\n\n' + parsed.content.trim();
          } else {
            parsed.content = parsed.content.trim() + '\n\n' + tagString;
          }
          modified = true;
          
          // Record changes
          tags.forEach((tag: string) => {
            result.details[filename].changes.push({
              tag: normalize ? normalizeTag(tag) : tag,
              location: 'content'
            });
          });
        }
      }

      // Save changes if modified
      if (modified) {
        const updatedContent = stringifyNote(parsed);
        await fs.writeFile(fullPath, updatedContent);
        result.successCount++;
      }
    } catch (error) {
      result.failedItems.push({
        item: filename,
        error: error instanceof Error ? error.message : 'Unknown error'
      });
    }
  }

  // Update success status based on results
  result.success = result.failedItems.length === 0;
  result.message = result.success 
    ? `Successfully added tags to ${result.successCount} files`
    : `Completed with ${result.failedItems.length} errors`;

  return result;
}

export function createAddTagsTool(vaults: Map<string, string>) {
  return createTool<AddTagsArgs>({
    name: "add-tags",
    description: `Add tags to notes in frontmatter and/or content.

Examples:
- Add to both locations: { "files": ["note.md"], "tags": ["status/active"] }
- Add to frontmatter only: { "files": ["note.md"], "tags": ["project/docs"], "location": "frontmatter" }
- Add to start of content: { "files": ["note.md"], "tags": ["type/meeting"], "location": "content", "position": "start" }`,
    schema,
    handler: async (args, vaultPath, _vaultName) => {
      const result = await addTags(
        vaultPath, 
        args.files, 
        args.tags, 
        args.location ?? 'both', 
        args.normalize ?? true, 
        args.position ?? 'end'
      );
      
      return createToolResponse(formatTagResult(result));
    }
  }, vaults);
}

```

--------------------------------------------------------------------------------
/example.ts:
--------------------------------------------------------------------------------

```typescript
// src/types.ts
export interface Tool {
    name: string;
    description: string;
    inputSchema: {
      type: string;
      properties: Record<string, any>;
      required?: string[];
    };
    handler: (args: any) => Promise<{
      content: Array<{
        type: string;
        text: string;
      }>;
    }>;
  }
  
  export interface ToolProvider {
    getTools(): Tool[];
  }
  
  // src/tools/note-tools.ts
  import { z } from "zod";
  import { Tool, ToolProvider } from "../types.js";
  import { promises as fs } from "fs";
  import path from "path";
  
  const CreateNoteSchema = z.object({
    filename: z.string(),
    content: z.string(),
    folder: z.string().optional()
  });
  
  export class NoteTools implements ToolProvider {
    constructor(private vaultPath: string) {}
  
    getTools(): Tool[] {
      return [
        {
          name: "create-note",
          description: "Create a new note in the vault",
          inputSchema: {
            type: "object",
            properties: {
              filename: {
                type: "string",
                description: "Name of the note (with .md extension)"
              },
              content: {
                type: "string",
                description: "Content of the note in markdown format"
              },
              folder: {
                type: "string",
                description: "Optional subfolder path"
              }
            },
            required: ["filename", "content"]
          },
          handler: async (args) => {
            const { filename, content, folder } = CreateNoteSchema.parse(args);
            const notePath = await this.createNote(filename, content, folder);
            return {
              content: [
                {
                  type: "text",
                  text: `Successfully created note: ${notePath}`
                }
              ]
            };
          }
        }
      ];
    }
  
    private async createNote(filename: string, content: string, folder?: string): Promise<string> {
      if (!filename.endsWith(".md")) {
        filename = `${filename}.md`;
      }
  
      const notePath = folder 
        ? path.join(this.vaultPath, folder, filename)
        : path.join(this.vaultPath, filename);
  
      const noteDir = path.dirname(notePath);
      await fs.mkdir(noteDir, { recursive: true });
  
      try {
        await fs.access(notePath);
        throw new Error("Note already exists");
      } catch (error) {
        // fs.access rejects with ENOENT when the note does not exist yet
        if ((error as NodeJS.ErrnoException).code === "ENOENT") {
          await fs.writeFile(notePath, content);
          return notePath;
        }
        throw error;
      }
    }
  }
  
  // src/tools/search-tools.ts
  import { z } from "zod";
  import { Tool, ToolProvider } from "../types.js";
  import { promises as fs } from "fs";
  import path from "path";
  
  const SearchSchema = z.object({
    query: z.string(),
    path: z.string().optional(),
    caseSensitive: z.boolean().optional()
  });
  
  export class SearchTools implements ToolProvider {
    constructor(private vaultPath: string) {}
  
    getTools(): Tool[] {
      return [
        {
          name: "search-vault",
          description: "Search for text across notes",
          inputSchema: {
            type: "object",
            properties: {
              query: {
                type: "string",
                description: "Search query"
              },
              path: {
                type: "string",
                description: "Optional path to limit search scope"
              },
              caseSensitive: {
                type: "boolean",
                description: "Whether to perform case-sensitive search"
              }
            },
            required: ["query"]
          },
          handler: async (args) => {
            const { query, path: searchPath, caseSensitive } = SearchSchema.parse(args);
            const results = await this.searchVault(query, searchPath, caseSensitive);
            return {
              content: [
                {
                  type: "text",
                  text: this.formatSearchResults(results)
                }
              ]
            };
          }
        }
      ];
    }
  
    private async searchVault(query: string, searchPath?: string, caseSensitive = false) {
      // Implementation of searchVault method...
    }
  
    private formatSearchResults(results: any[]) {
      // Implementation of formatSearchResults method...
    }
  }
  
  // src/server.ts
  import { Server } from "@modelcontextprotocol/sdk/server/index.js";
  import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
  import {
    CallToolRequestSchema,
    ListToolsRequestSchema,
  } from "@modelcontextprotocol/sdk/types.js";
  import { Tool, ToolProvider } from "./types.js";
  
  export class ObsidianServer {
    private server: Server;
    private tools: Map<string, Tool> = new Map();
  
    constructor() {
      this.server = new Server(
        {
          name: "obsidian-vault",
          version: "1.0.0"
        },
        {
          capabilities: {
            tools: {}
          }
        }
      );
  
      this.setupHandlers();
    }
  
    registerToolProvider(provider: ToolProvider) {
      for (const tool of provider.getTools()) {
        this.tools.set(tool.name, tool);
      }
    }
  
    private setupHandlers() {
      this.server.setRequestHandler(ListToolsRequestSchema, async () => ({
        tools: Array.from(this.tools.values()).map(tool => ({
          name: tool.name,
          description: tool.description,
          inputSchema: tool.inputSchema
        }))
      }));
  
      this.server.setRequestHandler(CallToolRequestSchema, async (request) => {
        const { name, arguments: args } = request.params;
        const tool = this.tools.get(name);
        
        if (!tool) {
          throw new Error(`Unknown tool: ${name}`);
        }
  
        return tool.handler(args);
      });
    }
  
    async start() {
      const transport = new StdioServerTransport();
      await this.server.connect(transport);
      console.error("Obsidian MCP Server running on stdio");
    }
  }
  
  // src/main.ts
  import { ObsidianServer } from "./server.js";
  import { NoteTools } from "./tools/note-tools.js";
  import { SearchTools } from "./tools/search-tools.js";
  
  async function main() {
    const vaultPath = process.argv[2];
    if (!vaultPath) {
      console.error("Please provide the path to your Obsidian vault");
      process.exit(1);
    }
  
    try {
      const server = new ObsidianServer();
      
      // Register tool providers
      server.registerToolProvider(new NoteTools(vaultPath));
      server.registerToolProvider(new SearchTools(vaultPath));
  
      await server.start();
    } catch (error) {
      console.error("Fatal error:", error);
      process.exit(1);
    }
  }
  
  main().catch((error) => {
    console.error("Unhandled error:", error);
    process.exit(1);
  });
```

--------------------------------------------------------------------------------
/docs/tool-examples.md:
--------------------------------------------------------------------------------

```markdown
# Tool Implementation Examples

This document provides practical examples of common tool implementation patterns and anti-patterns.

## Example 1: File Operation Tool

### ✅ Good Implementation

```typescript
import { z } from "zod";
import { promises as fs } from "fs";
import path from "path";
import { Tool, FileOperationResult } from "../../types.js";
import { validateVaultPath } from "../../utils/path.js";
import { ensureDirectory } from "../../utils/files.js";
import { handleFsError } from "../../utils/errors.js";
import { createToolResponse, formatFileResult } from "../../utils/responses.js";
import { createSchemaHandler } from "../../utils/schema.js";

const schema = z.object({
  path: z.string()
    .min(1, "Path cannot be empty")
    .refine(path => !path.includes('..'), "Path cannot contain '..'")
    .describe("Path to the file relative to vault root"),
  content: z.string()
    .min(1, "Content cannot be empty")
    .describe("File content to write")
}).strict();

const schemaHandler = createSchemaHandler(schema);

async function writeFile(
  vaultPath: string,
  filePath: string,
  content: string
): Promise<FileOperationResult> {
  const fullPath = path.join(vaultPath, filePath);
  validateVaultPath(vaultPath, fullPath);

  try {
    await ensureDirectory(path.dirname(fullPath));
    await fs.writeFile(fullPath, content, 'utf8');
    
    return {
      success: true,
      message: "File written successfully",
      path: fullPath,
      operation: 'create'
    };
  } catch (error) {
    throw handleFsError(error, 'write file');
  }
}

export function createWriteFileTool(vaultPath: string): Tool {
  if (!vaultPath) {
    throw new Error("Vault path is required");
  }

  return {
    name: "write-file",
    description: "Write content to a file in the vault",
    inputSchema: schemaHandler,
    handler: async (args) => {
      const validated = schemaHandler.parse(args);
      const result = await writeFile(vaultPath, validated.path, validated.content);
      return createToolResponse(formatFileResult(result));
    }
  };
}
```
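
With this in place, a call such as `{ "path": "notes/todo.md", "content": "- [ ] review PR" }` (illustrative values) is validated by the schema, written beneath the vault root after `validateVaultPath` and `ensureDirectory` run, and reported through `formatFileResult`.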

### ❌ Bad Implementation

```typescript
// Anti-pattern example
export function createBadWriteFileTool(vaultPath: string): Tool {
  return {
    name: "write-file",
    description: "Writes a file",  // Too vague
    inputSchema: {
      // Missing proper schema handler
      jsonSchema: {
        type: "object",
        properties: {
          path: { type: "string" },
          content: { type: "string" }
        }
      },
      parse: (input: any) => input  // No validation!
    },
    handler: async (args) => {
      try {
        // Missing path validation
        const filePath = path.join(vaultPath, args.path);
        
        // Direct fs operations without proper error handling
        await fs.writeFile(filePath, args.content);
        
        // Poor response formatting
        return createToolResponse("File written");
      } catch (error) {
        // Bad error handling
        return createToolResponse(`Error: ${error}`);
      }
    }
  };
}
```

## Example 2: Search Tool

### ✅ Good Implementation

```typescript
const schema = z.object({
  query: z.string()
    .min(1, "Search query cannot be empty")
    .describe("Text to search for"),
  caseSensitive: z.boolean()
    .optional()
    .describe("Whether to perform case-sensitive search"),
  path: z.string()
    .optional()
    .describe("Optional subfolder to limit search scope")
}).strict();

const schemaHandler = createSchemaHandler(schema);

async function searchFiles(
  vaultPath: string,
  query: string,
  options: SearchOptions
): Promise<SearchOperationResult> {
  try {
    const searchPath = options.path 
      ? path.join(vaultPath, options.path)
      : vaultPath;
    
    validateVaultPath(vaultPath, searchPath);
    
    // Implementation details...
    
    return {
      success: true,
      message: "Search completed",
      results: matches,
      totalMatches: totalCount,
      matchedFiles: fileCount
    };
  } catch (error) {
    throw handleFsError(error, 'search files');
  }
}

export function createSearchTool(vaultPath: string): Tool {
  if (!vaultPath) {
    throw new Error("Vault path is required");
  }

  return {
    name: "search-files",
    description: "Search for text in vault files",
    inputSchema: schemaHandler,
    handler: async (args) => {
      const validated = schemaHandler.parse(args);
      const result = await searchFiles(vaultPath, validated.query, {
        caseSensitive: validated.caseSensitive,
        path: validated.path
      });
      return createToolResponse(formatSearchResult(result));
    }
  };
}
```
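
For example, a call like `{ "query": "project", "caseSensitive": false, "path": "journal" }` (illustrative values) is validated by the schema, scoped to the `journal` subfolder with `validateVaultPath` guarding the resolved path, and its results are formatted with `formatSearchResult`.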

### ❌ Bad Implementation

```typescript
// Anti-pattern example
export function createBadSearchTool(vaultPath: string): Tool {
  return {
    name: "search",
    description: "Searches files",
    inputSchema: {
      jsonSchema: {
        type: "object",
        properties: {
          query: { type: "string" }
        }
      },
      parse: (input: any) => input
    },
    handler: async (args) => {
      // Bad: Recursive search without limits
      async function searchDir(dir: string): Promise<string[]> {
        const results: string[] = [];
        const files = await fs.readdir(dir);
        
        for (const file of files) {
          const fullPath = path.join(dir, file);
          const stat = await fs.stat(fullPath);
          
          if (stat.isDirectory()) {
            results.push(...await searchDir(fullPath));
          } else {
            const content = await fs.readFile(fullPath, 'utf8');
            if (content.includes(args.query)) {
              results.push(fullPath);
            }
          }
        }
        
        return results;
      }
      
      try {
        const matches = await searchDir(vaultPath);
        // Poor response formatting
        return createToolResponse(
          `Found matches in:\n${matches.join('\n')}`
        );
      } catch (error) {
        return createToolResponse(`Search failed: ${error}`);
      }
    }
  };
}
```

## Common Anti-Patterns to Avoid

1. **Poor Error Handling**
```typescript
// ❌ Bad
catch (error) {
  return createToolResponse(`Error: ${error}`);
}

// ✅ Good
catch (error) {
  if (error instanceof McpError) {
    throw error;
  }
  throw handleFsError(error, 'operation name');
}
```

2. **Missing Input Validation**
```typescript
// ❌ Bad
const input = args as { path: string };

// ✅ Good
const validated = schemaHandler.parse(args);
```

3. **Unsafe Path Operations**
```typescript
// ❌ Bad
const fullPath = path.join(vaultPath, args.path);

// ✅ Good
const fullPath = path.join(vaultPath, validated.path);
validateVaultPath(vaultPath, fullPath);
```

4. **Poor Response Formatting**
```typescript
// ❌ Bad
return createToolResponse(JSON.stringify(result));

// ✅ Good
return createToolResponse(formatOperationResult(result));
```

5. **Direct File System Operations**
```typescript
// ❌ Bad
await fs.writeFile(path, content);

// ✅ Good
await ensureDirectory(path.dirname(fullPath));
await fs.writeFile(fullPath, content, 'utf8');
```

Remember:
- Always use utility functions for common operations
- Validate all inputs thoroughly
- Handle errors appropriately
- Format responses consistently
- Follow the established patterns in the codebase (see the consolidated sketch below)
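
For reference, here is a minimal sketch that ties these practices together. It assumes the same helpers used in the examples above (`createSchemaHandler`, `validateVaultPath`, `ensureDirectory`, `handleFsError`, `createToolResponse`, `formatFileResult`); treat it as an illustration and adjust names to match the actual utilities in your codebase.

```typescript
import { z } from "zod";
import { promises as fs } from "fs";
import path from "path";
import { McpError } from "@modelcontextprotocol/sdk/types.js";
import { Tool, FileOperationResult } from "../../types.js";
import { validateVaultPath } from "../../utils/path.js";
import { ensureDirectory } from "../../utils/files.js";
import { handleFsError } from "../../utils/errors.js";
import { createToolResponse, formatFileResult } from "../../utils/responses.js";
import { createSchemaHandler } from "../../utils/schema.js";

// Strict schema with descriptions: validates all inputs up front
const schema = z.object({
  path: z.string()
    .min(1, "Path cannot be empty")
    .refine(p => !p.includes('..'), "Path cannot contain '..'")
    .describe("Path to the file relative to vault root"),
  content: z.string()
    .min(1, "Content cannot be empty")
    .describe("Content to append to the file")
}).strict();

const schemaHandler = createSchemaHandler(schema);

export function createAppendFileTool(vaultPath: string): Tool {
  if (!vaultPath) {
    throw new Error("Vault path is required");
  }

  return {
    name: "append-file",
    description: "Append content to a file in the vault, creating parent folders as needed",
    inputSchema: schemaHandler,
    handler: async (args) => {
      // Validate input and keep the resolved path inside the vault
      const validated = schemaHandler.parse(args);
      const fullPath = path.join(vaultPath, validated.path);
      validateVaultPath(vaultPath, fullPath);

      try {
        // Use utilities for directory handling and format the response consistently
        await ensureDirectory(path.dirname(fullPath));
        await fs.appendFile(fullPath, validated.content, "utf8");

        const result: FileOperationResult = {
          success: true,
          message: "Content appended successfully",
          path: fullPath,
          operation: "edit"
        };
        return createToolResponse(formatFileResult(result));
      } catch (error) {
        // Re-throw MCP errors untouched; wrap filesystem errors with context
        if (error instanceof McpError) throw error;
        throw handleFsError(error, "append file");
      }
    }
  };
}
```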

```

--------------------------------------------------------------------------------
/src/tools/edit-note/index.ts:
--------------------------------------------------------------------------------

```typescript
import { z } from "zod";
import { FileOperationResult } from "../../types.js";
import { promises as fs } from "fs";
import path from "path";
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";
import { ensureMarkdownExtension, validateVaultPath } from "../../utils/path.js";
import { fileExists } from "../../utils/files.js";
import { createNoteNotFoundError, handleFsError } from "../../utils/errors.js";
import { createToolResponse, formatFileResult } from "../../utils/responses.js";
import { createTool } from "../../utils/tool-factory.js";

// Input validation schema with descriptions
// Schema for delete operation
const deleteSchema = z.object({
  vault: z.string()
    .min(1, "Vault name cannot be empty")
    .describe("Name of the vault containing the note"),
  filename: z.string()
    .min(1, "Filename cannot be empty")
    .refine(name => !name.includes('/') && !name.includes('\\'), 
      "Filename cannot contain path separators - use the 'folder' parameter for paths instead")
    .describe("Just the note name without any path separators (e.g. 'my-note.md', NOT 'folder/my-note.md')"),
  folder: z.string()
    .optional()
    .refine(folder => !folder || !path.isAbsolute(folder), 
      "Folder must be a relative path")
    .describe("Optional subfolder path relative to vault root"),
  operation: z.literal('delete')
    .describe("Delete operation"),
  content: z.undefined()
    .describe("Must not provide content for delete operation")
}).strict();

// Schema for non-delete operations
const editSchema = z.object({
  vault: z.string()
    .min(1, "Vault name cannot be empty")
    .describe("Name of the vault containing the note"),
  filename: z.string()
    .min(1, "Filename cannot be empty")
    .refine(name => !name.includes('/') && !name.includes('\\'), 
      "Filename cannot contain path separators - use the 'folder' parameter for paths instead")
    .describe("Just the note name without any path separators (e.g. 'my-note.md', NOT 'folder/my-note.md')"),
  folder: z.string()
    .optional()
    .refine(folder => !folder || !path.isAbsolute(folder), 
      "Folder must be a relative path")
    .describe("Optional subfolder path relative to vault root"),
  operation: z.enum(['append', 'prepend', 'replace'])
    .describe("Type of edit operation - must be one of: 'append', 'prepend', 'replace'")
    .refine(
      (op) => ['append', 'prepend', 'replace'].includes(op),
      {
        message: "Invalid operation. Must be one of: 'append', 'prepend', 'replace'",
        path: ['operation']
      }
    ),
  content: z.string()
    .min(1, "Content cannot be empty for non-delete operations")
    .describe("New content to add/prepend/replace")
}).strict();

// Combined schema using discriminated union
const schema = z.discriminatedUnion('operation', [deleteSchema, editSchema]);

// Types
type EditOperation = 'append' | 'prepend' | 'replace' | 'delete';

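// Applies an append/prepend/replace/delete operation to a note, creating a timestamped
// .backup copy beforehand and restoring from it if the operation fails.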
async function editNote(
  vaultPath: string, 
  filename: string,
  operation: EditOperation,
  content?: string,
  folder?: string
): Promise<FileOperationResult> {
  const sanitizedFilename = ensureMarkdownExtension(filename);
  const fullPath = folder
    ? path.join(vaultPath, folder, sanitizedFilename)
    : path.join(vaultPath, sanitizedFilename);
  
  // Validate path is within vault
  validateVaultPath(vaultPath, fullPath);

  // Create unique backup filename
  const timestamp = Date.now();
  const backupPath = `${fullPath}.${timestamp}.backup`;

  try {
    // For non-delete operations, create backup first
    if (operation !== 'delete' && await fileExists(fullPath)) {
      await fs.copyFile(fullPath, backupPath);
    }

    switch (operation) {
      case 'delete': {
        if (!await fileExists(fullPath)) {
          throw createNoteNotFoundError(filename);
        }
        // For delete, create backup before deleting
        await fs.copyFile(fullPath, backupPath);
        await fs.unlink(fullPath);
        
        // On successful delete, remove backup after a short delay
        // This gives a small window for potential recovery if needed
        setTimeout(async () => {
          try {
            await fs.unlink(backupPath);
          } catch (error: unknown) {
            const errorMessage = error instanceof Error ? error.message : String(error);
            console.error('Failed to cleanup backup file:', errorMessage);
          }
        }, 5000);

        return {
          success: true,
          message: "Note deleted successfully",
          path: fullPath,
          operation: 'delete'
        };
      }
      
      case 'append':
      case 'prepend':
      case 'replace': {
        // Check if file exists for non-delete operations
        if (!await fileExists(fullPath)) {
          throw createNoteNotFoundError(filename);
        }

        try {
          // Read existing content
          const existingContent = await fs.readFile(fullPath, "utf-8");
          
          // Prepare new content based on operation
          let newContent: string;
          if (operation === 'append') {
            newContent = existingContent.trim() + (existingContent.trim() ? '\n\n' : '') + content;
          } else if (operation === 'prepend') {
            newContent = content + (existingContent.trim() ? '\n\n' : '') + existingContent.trim();
          } else {
            // replace
            newContent = content as string;
          }

          // Write the new content
          await fs.writeFile(fullPath, newContent);
          
          // Clean up backup on success
          await fs.unlink(backupPath);

          return {
            success: true,
            message: `Note ${operation}ed successfully`,
            path: fullPath,
            operation: 'edit'
          };
        } catch (error: unknown) {
          // On error, attempt to restore from backup
          if (await fileExists(backupPath)) {
            try {
              await fs.copyFile(backupPath, fullPath);
              await fs.unlink(backupPath);
            } catch (rollbackError: unknown) {
              const errorMessage = error instanceof Error ? error.message : String(error);
              const rollbackErrorMessage = rollbackError instanceof Error ? rollbackError.message : String(rollbackError);
              
              throw new McpError(
                ErrorCode.InternalError,
                `Failed to rollback changes. Original error: ${errorMessage}. Rollback error: ${rollbackErrorMessage}. Backup file preserved at ${backupPath}`
              );
            }
          }
          throw error;
        }
      }
      
      default: {
        const _exhaustiveCheck: never = operation;
        throw new McpError(
          ErrorCode.InvalidParams,
          `Invalid operation: ${operation}`
        );
      }
    }
  } catch (error: unknown) {
    // If we have a backup and haven't handled the error yet, try to restore
    if (await fileExists(backupPath)) {
      try {
        await fs.copyFile(backupPath, fullPath);
        await fs.unlink(backupPath);
      } catch (rollbackError: unknown) {
        const rollbackErrorMessage = rollbackError instanceof Error ? rollbackError.message : String(rollbackError);
        console.error('Failed to cleanup/restore backup during error handling:', rollbackErrorMessage);
      }
    }

    if (error instanceof McpError) {
      throw error;
    }
    throw handleFsError(error, `${operation} note`);
  }
}

type EditNoteArgs = z.infer<typeof schema>;

export function createEditNoteTool(vaults: Map<string, string>) {
  return createTool<EditNoteArgs>({
    name: "edit-note",
    description: `Edit an existing note in the specified vault.

    There is a limited, fixed set of supported operations:
    - append: Appends content to the end of the note
    - prepend: Prepends content to the beginning of the note
    - replace: Replaces the entire content of the note

Examples:
- Root note: { "vault": "vault1", "filename": "note.md", "operation": "append", "content": "new content" }
- Subfolder note: { "vault": "vault2", "filename": "note.md", "folder": "journal/2024", "operation": "append", "content": "new content" }
- INCORRECT: { "filename": "journal/2024/note.md" } (don't put path in filename)`,
    schema,
    handler: async (args, vaultPath, _vaultName) => {
      const result = await editNote(
        vaultPath, 
        args.filename, 
        args.operation, 
        'content' in args ? args.content : undefined, 
        args.folder
      );
      return createToolResponse(formatFileResult(result));
    }
  }, vaults);
}

```

--------------------------------------------------------------------------------
/src/tools/search-vault/index.ts:
--------------------------------------------------------------------------------

```typescript
import { z } from "zod";
import { SearchResult, SearchOperationResult, SearchOptions } from "../../types.js";
import { promises as fs } from "fs";
import path from "path";
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";
import { validateVaultPath, safeJoinPath, normalizePath } from "../../utils/path.js";
import { getAllMarkdownFiles } from "../../utils/files.js";
import { handleFsError } from "../../utils/errors.js";
import { extractTags, normalizeTag, matchesTagPattern } from "../../utils/tags.js";
import { createToolResponse, formatSearchResult } from "../../utils/responses.js";
import { createTool } from "../../utils/tool-factory.js";

// Input validation schema with descriptions
const schema = z.object({
  vault: z.string()
    .min(1, "Vault name cannot be empty")
    .describe("Name of the vault to search in"),
  query: z.string()
    .min(1, "Search query cannot be empty")
    .describe("Search query (required). For text search use the term directly, for tag search use tag: prefix"),
  path: z.string()
    .optional()
    .describe("Optional subfolder path within the vault to limit search scope"),
  caseSensitive: z.boolean()
    .optional()
    .default(false)
    .describe("Whether to perform case-sensitive search (default: false)"),
  searchType: z.enum(['content', 'filename', 'both'])
    .optional()
    .default('content')
    .describe("Type of search to perform (default: content)")
}).strict();

type SearchVaultInput = z.infer<typeof schema>;

// Helper functions
function isTagSearch(query: string): boolean {
  return query.startsWith('tag:');
}

function normalizeTagQuery(query: string): string {
  // Remove 'tag:' prefix
  return normalizeTag(query.slice(4));
}

async function searchFilenames(
  vaultPath: string,
  query: string,
  options: SearchOptions
): Promise<SearchResult[]> {
  try {
    // Use safeJoinPath for path safety
    const searchDir = options.path ? safeJoinPath(vaultPath, options.path) : vaultPath;
    const files = await getAllMarkdownFiles(vaultPath, searchDir);
    const results: SearchResult[] = [];
    const searchQuery = options.caseSensitive ? query : query.toLowerCase();

    for (const file of files) {
      const relativePath = path.relative(vaultPath, file);
      const searchTarget = options.caseSensitive ? relativePath : relativePath.toLowerCase();

      if (searchTarget.includes(searchQuery)) {
        results.push({
          file: relativePath,
          matches: [{
            line: 0, // We use 0 to indicate this is a filename match
            text: `Filename match: ${relativePath}`
          }]
        });
      }
    }

    return results;
  } catch (error) {
    if (error instanceof McpError) throw error;
    throw handleFsError(error, 'search filenames');
  }
}

async function searchContent(
  vaultPath: string,
  query: string,
  options: SearchOptions
): Promise<SearchResult[]> {
  try {
    // Use safeJoinPath for path safety
    const searchDir = options.path ? safeJoinPath(vaultPath, options.path) : vaultPath;
    const files = await getAllMarkdownFiles(vaultPath, searchDir);
    const results: SearchResult[] = [];
    const isTagSearchQuery = isTagSearch(query);
    const normalizedTagQuery = isTagSearchQuery ? normalizeTagQuery(query) : '';

    for (const file of files) {
      try {
        const content = await fs.readFile(file, "utf-8");
        const lines = content.split("\n");
        const matches: SearchResult["matches"] = [];

        if (isTagSearchQuery) {
          // For tag searches, extract all tags from the content
          const fileTags = extractTags(content);

          lines.forEach((line, index) => {
            // Look for tag matches in each line
            const lineTags = extractTags(line);
            const hasMatchingTag = lineTags.some(tag => {
              const normalizedTag = normalizeTag(tag);
              return normalizedTag === normalizedTagQuery || matchesTagPattern(normalizedTagQuery, normalizedTag);
            });

            if (hasMatchingTag) {
              matches.push({
                line: index + 1,
                text: line.trim()
              });
            }
          });
        } else {
          // Regular text search
          const searchQuery = options.caseSensitive ? query : query.toLowerCase();

          lines.forEach((line, index) => {
            const searchLine = options.caseSensitive ? line : line.toLowerCase();
            if (searchLine.includes(searchQuery)) {
              matches.push({
                line: index + 1,
                text: line.trim()
              });
            }
          });
        }

        if (matches.length > 0) {
          results.push({
            file: path.relative(vaultPath, file),
            matches
          });
        }
      } catch (err) {
        console.error(`Error reading file ${file}:`, err);
        // Continue with other files
      }
    }

    return results;
  } catch (error) {
    if (error instanceof McpError) throw error;
    throw handleFsError(error, 'search content');
  }
}

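// Runs filename and/or content searches according to options.searchType and merges the
// results; if one search type fails while the other still returns results, the errors
// are reported as warnings in the result message.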
async function searchVault(
  vaultPath: string,
  query: string,
  options: SearchOptions
): Promise<SearchOperationResult> {
  try {
    // Normalize vault path upfront
    const normalizedVaultPath = normalizePath(vaultPath);
    let results: SearchResult[] = [];
    let errors: string[] = [];

    if (options.searchType === 'filename' || options.searchType === 'both') {
      try {
        const filenameResults = await searchFilenames(normalizedVaultPath, query, options);
        results = results.concat(filenameResults);
      } catch (error) {
        if (error instanceof McpError) {
          errors.push(`Filename search error: ${error.message}`);
        } else {
          errors.push(`Filename search failed: ${error instanceof Error ? error.message : String(error)}`);
        }
      }
    }

    if (options.searchType === 'content' || options.searchType === 'both') {
      try {
        const contentResults = await searchContent(normalizedVaultPath, query, options);
        results = results.concat(contentResults);
      } catch (error) {
        if (error instanceof McpError) {
          errors.push(`Content search error: ${error.message}`);
        } else {
          errors.push(`Content search failed: ${error instanceof Error ? error.message : String(error)}`);
        }
      }
    }

    const totalMatches = results.reduce((sum, result) => sum + (result.matches?.length ?? 0), 0);

    // If we have some results but also errors, we'll return partial results with a warning
    if (results.length > 0 && errors.length > 0) {
      return {
        success: true,
        message: `Search completed with warnings:\n${errors.join('\n')}`,
        results,
        totalMatches,
        matchedFiles: results.length
      };
    }

    // If we have no results and errors, throw an error
    if (results.length === 0 && errors.length > 0) {
      throw new McpError(
        ErrorCode.InternalError,
        `Search failed:\n${errors.join('\n')}`
      );
    }

    return {
      success: true,
      message: "Search completed successfully",
      results,
      totalMatches,
      matchedFiles: results.length
    };
  } catch (error) {
    if (error instanceof McpError) {
      throw error;
    }
    throw handleFsError(error, 'search vault');
  }
}

export const createSearchVaultTool = (vaults: Map<string, string>) => {
  return createTool<SearchVaultInput>({
    name: "search-vault",
    description: `Search for specific content within vault notes (NOT for listing available vaults - use the list-vaults prompt for that).

This tool searches through note contents and filenames for specific text or tags:
- Content search: { "vault": "vault1", "query": "hello world", "searchType": "content" }
- Filename search: { "vault": "vault2", "query": "meeting-notes", "searchType": "filename" }
- Search both: { "vault": "vault1", "query": "project", "searchType": "both" }
- Tag search: { "vault": "vault2", "query": "tag:status/active" }
- Search in subfolder: { "vault": "vault1", "query": "hello", "path": "journal/2024" }

Note: To get a list of available vaults, use the list-vaults prompt instead of this search tool.`,
    schema,
    handler: async (args, vaultPath, _vaultName) => {
      const options: SearchOptions = {
        path: args.path,
        caseSensitive: args.caseSensitive,
        searchType: args.searchType
      };
      const result = await searchVault(vaultPath, args.query, options);
      return createToolResponse(formatSearchResult(result));
    }
  }, vaults);
}

```

--------------------------------------------------------------------------------
/src/server.ts:
--------------------------------------------------------------------------------

```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
  ListResourcesRequestSchema,
  ReadResourceRequestSchema,
  ListPromptsRequestSchema,
  GetPromptRequestSchema,
  McpError,
  ErrorCode
} from "@modelcontextprotocol/sdk/types.js";
import { RateLimiter, ConnectionMonitor, validateMessageSize } from "./utils/security.js";
import { Tool } from "./types.js";
import { z } from "zod";
import path from "path";
import os from 'os';
import fs from 'fs';
import {
  listVaultResources,
  readVaultResource
} from "./resources/resources.js";
import { listPrompts, getPrompt, registerPrompt } from "./utils/prompt-factory.js";
import { listVaultsPrompt } from "./prompts/list-vaults/index.js";

// Utility function to expand home directory
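// e.g. "~/vaults/personal" -> path.join(os.homedir(), "/vaults/personal")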
function expandHome(filepath: string): string {
  if (filepath.startsWith('~/') || filepath === '~') {
    return path.join(os.homedir(), filepath.slice(1));
  }
  return filepath;
}

export class ObsidianServer {
  private server: Server;
  private tools: Map<string, Tool<any>> = new Map();
  private vaults: Map<string, string> = new Map();
  private rateLimiter: RateLimiter;
  private connectionMonitor: ConnectionMonitor;

  constructor(vaultConfigs: { name: string; path: string }[]) {
    if (!vaultConfigs || vaultConfigs.length === 0) {
      throw new McpError(
        ErrorCode.InvalidRequest,
        'No vault configurations provided. At least one valid Obsidian vault is required.'
      );
    }

    // Initialize vaults
    vaultConfigs.forEach(config => {
      const expandedPath = expandHome(config.path);
      const resolvedPath = path.resolve(expandedPath);
      
      // Check if .obsidian directory exists
      const obsidianConfigPath = path.join(resolvedPath, '.obsidian');
      try {
        const stats = fs.statSync(obsidianConfigPath);
        if (!stats.isDirectory()) {
          throw new McpError(
            ErrorCode.InvalidRequest,
            `Invalid Obsidian vault at ${config.path}: .obsidian exists but is not a directory`
          );
        }
      } catch (error) {
        if ((error as NodeJS.ErrnoException).code === 'ENOENT') {
          throw new McpError(
            ErrorCode.InvalidRequest,
            `Invalid Obsidian vault at ${config.path}: Missing .obsidian directory. Please open this folder in Obsidian first to initialize it.`
          );
        }
        throw new McpError(
          ErrorCode.InvalidRequest,
          `Error accessing vault at ${config.path}: ${(error as Error).message}`
        );
      }

      this.vaults.set(config.name, resolvedPath);
    });
    this.server = new Server(
      {
        name: "obsidian-mcp",
        version: "1.0.6"
      },
      {
        capabilities: {
          resources: {},
          tools: {},
          prompts: {}
        }
      }
    );

    // Initialize security features
    this.rateLimiter = new RateLimiter();
    this.connectionMonitor = new ConnectionMonitor();

    // Register prompts
    registerPrompt(listVaultsPrompt);

    this.setupHandlers();

    // Setup connection monitoring with grace period for initialization
    this.connectionMonitor.start(() => {
      this.server.close();
    });
    
    // Update activity during initialization
    this.connectionMonitor.updateActivity();

    // Setup error handler
    this.server.onerror = (error) => {
      console.error("Server error:", error);
    };
  }

  registerTool<T>(tool: Tool<T>) {
    console.error(`Registering tool: ${tool.name}`);
    this.tools.set(tool.name, tool);
    console.error(`Current tools: ${Array.from(this.tools.keys()).join(', ')}`);
  }

  private validateRequest(request: any) {
    try {
      // Validate message size
      validateMessageSize(request);

      // Update connection activity
      this.connectionMonitor.updateActivity();

      // Check rate limit (using method name as client id for basic implementation)
      if (!this.rateLimiter.checkLimit(request.method)) {
        throw new McpError(ErrorCode.InvalidRequest, "Rate limit exceeded");
      }
    } catch (error) {
      console.error("Request validation failed:", error);
      throw error;
    }
  }

  private setupHandlers() {
    // List available prompts
    this.server.setRequestHandler(ListPromptsRequestSchema, async (request) => {
      this.validateRequest(request);
      return listPrompts();
    });

    // Get specific prompt
    this.server.setRequestHandler(GetPromptRequestSchema, async (request) => {
      this.validateRequest(request);
      const { name, arguments: args } = request.params;
      
      if (!name || typeof name !== 'string') {
        throw new McpError(ErrorCode.InvalidParams, "Missing or invalid prompt name");
      }

      const result = await getPrompt(name, this.vaults, args);
      return {
        ...result,
        _meta: {
          promptName: name,
          timestamp: new Date().toISOString()
        }
      };
    });

    // List available tools
    this.server.setRequestHandler(ListToolsRequestSchema, async (request) => {
      this.validateRequest(request);
      return {
        tools: Array.from(this.tools.values()).map(tool => ({
          name: tool.name,
          description: tool.description,
          inputSchema: tool.inputSchema.jsonSchema
        }))
      };
    });

    // List available resources
    this.server.setRequestHandler(ListResourcesRequestSchema, async (request) => {
      this.validateRequest(request);
      const resources = await listVaultResources(this.vaults);
      return {
        resources,
        resourceTemplates: []
      };
    });

    // Read resource content
    this.server.setRequestHandler(ReadResourceRequestSchema, async (request) => {
      this.validateRequest(request);
      const uri = request.params?.uri;
      if (!uri || typeof uri !== 'string') {
        throw new McpError(ErrorCode.InvalidParams, "Missing or invalid URI parameter");
      }

      if (!uri.startsWith('obsidian-vault://')) {
        throw new McpError(ErrorCode.InvalidParams, "Invalid URI format. Only vault resources are supported.");
      }

      return {
        contents: [await readVaultResource(this.vaults, uri)]
      };
    });

    this.server.setRequestHandler(CallToolRequestSchema, async (request, extra) => {
      this.validateRequest(request);
      const params = request.params;
      if (!params || typeof params !== 'object') {
        throw new McpError(ErrorCode.InvalidParams, "Invalid request parameters");
      }
      
      const name = params.name;
      const args = params.arguments;
      
      if (!name || typeof name !== 'string') {
        throw new McpError(ErrorCode.InvalidParams, "Missing or invalid tool name");
      }

      const tool = this.tools.get(name);
      if (!tool) {
        throw new McpError(ErrorCode.MethodNotFound, `Unknown tool: ${name}`);
      }

      try {
        // Validate and transform arguments using tool's schema handler
        const validatedArgs = tool.inputSchema.parse(args);
        
        // Execute tool with validated arguments
        const result = await tool.handler(validatedArgs);
        
        return {
          _meta: {
            toolName: name,
            timestamp: new Date().toISOString(),
            success: true
          },
          content: result.content
        };
      } catch (error: unknown) {
        if (error instanceof z.ZodError) {
          const formattedErrors = error.errors.map(e => {
            const path = e.path.join(".");
            const message = e.message;
            return `${path ? path + ': ' : ''}${message}`;
          }).join("\n");
          
          throw new McpError(
            ErrorCode.InvalidParams,
            `Invalid arguments:\n${formattedErrors}`
          );
        }
        
        // Enhance error reporting
        if (error instanceof McpError) {
          throw error;
        }
        
        // Convert unknown errors to McpError with helpful message
        throw new McpError(
          ErrorCode.InternalError,
          `Tool execution failed: ${error instanceof Error ? error.message : String(error)}`
        );
      }
    });
  }

  async start() {
    const transport = new StdioServerTransport();
    await this.server.connect(transport);
    console.error("Obsidian MCP Server running on stdio");
  }

  async stop() {
    this.connectionMonitor.stop();
    await this.server.close();
    console.error("Obsidian MCP Server stopped");
  }
}

```

--------------------------------------------------------------------------------
/docs/creating-tools.md:
--------------------------------------------------------------------------------

```markdown
# Creating New Tools Guide

This guide explains how to create new tools that integrate seamlessly with the existing codebase while following established patterns and best practices.

## Tool Structure Overview

Every tool follows a consistent structure:

1. Input validation using Zod schemas
2. Core functionality implementation
3. Tool factory function that creates the tool interface
4. Standardized error handling and responses

## Step-by-Step Implementation Guide

### 1. Create the Tool Directory

Create a new directory under `src/tools/` with your tool name:

```bash
src/tools/your-tool-name/
└── index.ts
```

### 2. Define the Input Schema

Start by defining a Zod schema for input validation. Always include descriptions for better documentation:

```typescript
const schema = z.object({
  param1: z.string()
    .min(1, "Parameter cannot be empty")
    .describe("Description of what this parameter does"),
  param2: z.number()
    .min(0)
    .describe("Description of numeric constraints"),
  optionalParam: z.string()
    .optional()
    .describe("Optional parameters should have clear descriptions too")
}).strict();

const schemaHandler = createSchemaHandler(schema);
```

### 3. Implement Core Functionality

Create a private async function that implements the tool's core logic:

```typescript
async function performOperation(
  vaultPath: string,
  param1: string,
  param2: number,
  optionalParam?: string
): Promise<OperationResult> {
  try {
    // Implement core functionality
    // Use utility functions for common operations
    // Handle errors appropriately
    return {
      success: true,
      message: "Operation completed successfully",
      // Include relevant details
    };
  } catch (error) {
    if (error instanceof McpError) {
      throw error;
    }
    throw handleFsError(error, 'operation name');
  }
}
```

### 4. Create the Tool Factory

Export a factory function that creates the tool interface:

```typescript
export function createYourTool(vaultPath: string): Tool {
  if (!vaultPath) {
    throw new Error("Vault path is required");
  }

  return {
    name: "your-tool-name",
    description: `Clear description of what the tool does.

Examples:
- Basic usage: { "param1": "value", "param2": 42 }
- With options: { "param1": "value", "param2": 42, "optionalParam": "extra" }`,
    inputSchema: schemaHandler,
    handler: async (args) => {
      try {
        const validated = schemaHandler.parse(args);
        const result = await performOperation(
          vaultPath,
          validated.param1,
          validated.param2,
          validated.optionalParam
        );
        
        return createToolResponse(formatOperationResult(result));
      } catch (error) {
        if (error instanceof z.ZodError) {
          throw new McpError(
            ErrorCode.InvalidRequest,
            `Invalid arguments: ${error.errors.map(e => e.message).join(", ")}`
          );
        }
        throw error;
      }
    }
  };
}
```

## Best Practices

### Input Validation
✅ DO:
- Use strict schemas with `.strict()`
- Provide clear error messages for validation
- Include descriptions for all parameters
- Validate paths are within vault when relevant
- Use discriminated unions for operations with different requirements
- Keep validation logic JSON Schema-friendly

#### Handling Conditional Validation

When dealing with operations that have different validation requirements, prefer using discriminated unions over complex refinements:

```typescript
// ✅ DO: Use discriminated unions for different operation types
const deleteSchema = z.object({
  operation: z.literal('delete'),
  target: z.string(),
  content: z.undefined()
}).strict();

const editSchema = z.object({
  operation: z.enum(['update', 'append']),
  target: z.string(),
  content: z.string().min(1)
}).strict();

const schema = z.discriminatedUnion('operation', [
  deleteSchema,
  editSchema
]);

// ❌ DON'T: Use complex refinements that don't translate well to JSON Schema
const schema = z.object({
  operation: z.enum(['delete', 'update', 'append']),
  target: z.string(),
  content: z.string().optional()
}).superRefine((data, ctx) => {
  if (data.operation === 'delete') {
    if (data.content !== undefined) {
      ctx.addIssue({
        code: z.ZodIssueCode.custom,
        message: "Content not allowed for delete"
      });
    }
  } else if (!data.content) {
    ctx.addIssue({
      code: z.ZodIssueCode.custom,
      message: "Content required for non-delete"
    });
  }
});
```

#### Schema Design Patterns

When designing schemas:

✅ DO:
- Break down complex schemas into smaller, focused schemas
- Use discriminated unions for operations with different requirements
- Keep validation logic simple and explicit
- Consider how schemas will translate to JSON Schema
- Use literal types for precise operation matching

❌ DON'T:
```typescript
// Don't use complex refinements that access parent data
schema.superRefine((val, ctx) => {
  const parent = ctx.parent; // Unreliable
});

// Don't mix validation concerns
const schema = z.object({
  operation: z.enum(['delete', 'update']),
  content: z.string().superRefine((val, ctx) => {
    // Don't put operation-specific logic here
  })
});

// Don't skip schema validation
const schema = z.object({
  path: z.string() // Missing validation and description
});

// Don't allow unsafe paths
const schema = z.object({
  path: z.string().describe("File path")  // Missing path validation
});
```

### Error Handling
✅ DO:
- Use utility functions for common errors (see the example after this list)
- Convert filesystem errors to McpErrors
- Provide specific error messages
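
For example, the pattern the existing tools follow (a condensed sketch; `notePath` is a placeholder):

```typescript
// ✅ DO: Route failures through the shared error utilities
try {
  return await fs.readFile(notePath, "utf-8"); // notePath is a placeholder
} catch (error) {
  if (error instanceof McpError) {
    throw error; // already a well-formed MCP error
  }
  // handleFsError turns ENOENT, EACCES, etc. into a specific McpError
  throw handleFsError(error, "read note");
}
```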

❌ DON'T:
```typescript
// Don't throw raw errors
catch (error) {
  throw error;
}

// Don't ignore validation errors
handler: async (args) => {
  const result = await performOperation(args.param); // Missing validation
}
```

### Response Formatting
✅ DO:
- Use response utility functions (see the example after this list)
- Return standardized result objects
- Include relevant operation details

❌ DON'T:
```typescript
// Don't return raw strings
return createToolResponse("Done"); // Too vague

// Don't skip using proper response types
return {
  message: "Success" // Missing proper response structure
};
```

### Code Organization
✅ DO:
- Split complex logic into smaller functions
- Use utility functions for common operations
- Keep the tool factory function clean

❌ DON'T:
```typescript
// Don't mix concerns in the handler
handler: async (args) => {
  // Don't put core logic here
  const files = await fs.readdir(path);
  // ... more direct implementation
}

// Don't duplicate utility functions
function isValidPath(path: string) {
  // Don't reimplement existing utilities
}
```

## Schema Conversion Considerations

When creating schemas, remember they need to be converted to JSON Schema for the MCP interface:

### JSON Schema Compatibility

✅ DO:
- Test your schemas with the `createSchemaHandler` utility
- Use standard Zod types that have clear JSON Schema equivalents
- Structure complex validation using composition of simple schemas
- Verify generated JSON Schema matches expected validation rules

❌ DON'T:
- Rely heavily on refinements that don't translate to JSON Schema
- Use complex validation logic that can't be represented in JSON Schema
- Access parent context in nested validations
- Assume all Zod features will work in JSON Schema

### Schema Handler Usage

```typescript
// ✅ DO: Test schema conversion
const schema = z.discriminatedUnion('operation', [
  z.object({
    operation: z.literal('read'),
    path: z.string()
  }),
  z.object({
    operation: z.literal('write'),
    path: z.string(),
    content: z.string()
  })
]);

// Verify schema handler creation succeeds
const schemaHandler = createSchemaHandler(schema);

// ❌ DON'T: Use features that don't convert well
const schema = z.object({
  data: z.any().superRefine((val, ctx) => {
    // Complex custom validation that won't translate
  })
});
```

## Common Utilities

Make use of existing utilities (a usage sketch follows this list):

- `createSchemaHandler`: For input validation
- `handleFsError`: For filesystem error handling
- `createToolResponse`: For formatting responses
- `validateVaultPath`: For path validation
- `ensureDirectory`: For directory operations
- `formatOperationResult`: For standardized results
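
A condensed sketch of how these utilities typically compose inside a handler. This is a sketch only: `validated.path` and the exact `ensureDirectory` signature are assumptions to adapt to your tool.

```typescript
// Sketch only: composing the common utilities inside a tool handler
handler: async (args) => {
  const validated = schemaHandler.parse(args);           // createSchemaHandler(schema) from above
  const fullPath = path.join(vaultPath, validated.path); // validated.path is a placeholder field
  validateVaultPath(vaultPath, fullPath);                // reject paths that escape the vault

  try {
    await ensureDirectory(path.dirname(fullPath));       // assumed usage: ensure the parent folder exists
    // ...perform the actual file operation here...
    return createToolResponse(
      formatOperationResult({ success: true, message: "Operation completed" })
    );
  } catch (error) {
    throw handleFsError(error, "your operation");        // map filesystem errors to McpError
  }
}
```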

## Testing Your Tool

1. Ensure your tool handles edge cases (see the test sketch after this list):
   - Invalid inputs
   - File/directory permissions
   - Non-existent paths
   - Concurrent operations

2. Verify error messages are helpful:
   - Validation errors should guide the user
   - Operation errors should be specific
   - Path-related errors should be clear

3. Check response formatting:
   - Success messages should be informative
   - Error messages should be actionable
   - Operation details should be complete
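
A minimal schema-level test sketch, assuming Bun's built-in test runner; the schema and names are placeholders, so adapt them to your tool and your project's actual test setup:

```typescript
import { describe, expect, it } from "bun:test";
import { z } from "zod";

// Placeholder schema standing in for the one your tool defines
const schema = z.object({
  vault: z.string().min(1, "Vault name cannot be empty"),
  path: z.string().refine(p => p.endsWith(".md"), "Path must end with .md")
}).strict();

describe("your-tool-name input validation", () => {
  it("rejects unknown keys", () => {
    expect(() => schema.parse({ vault: "v", path: "note.md", extra: true })).toThrow();
  });

  it("rejects a non-markdown path", () => {
    expect(schema.safeParse({ vault: "v", path: "note.txt" }).success).toBe(false);
  });

  it("accepts valid input", () => {
    expect(schema.parse({ vault: "v", path: "note.md" })).toEqual({ vault: "v", path: "note.md" });
  });
});
```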

## Integration

After implementing your tool:

1. Export it from `src/tools/index.ts`
2. Register it in `src/server.ts` (see the sketch after this list)
3. Update any relevant documentation
4. Add appropriate error handling utilities if needed
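
A hypothetical wiring sketch: the import paths and `createYourTool` are placeholders, and it assumes your factory takes the vault name → path map like the existing tools under `src/tools/`. If your factory takes a single vault path (as in the earlier example), pass that instead:

```typescript
import { ObsidianServer } from "./server.js";
import { createYourTool } from "./tools/your-tool-name/index.js"; // hypothetical factory

// Example vault configuration; the real entry point may assemble this differently.
const vaultConfigs = [{ name: "vault1", path: "/absolute/path/to/vault1" }];
const server = new ObsidianServer(vaultConfigs);

// Multi-vault tool factories take a vault name -> path map; build one for the sketch.
const vaults = new Map<string, string>();
for (const config of vaultConfigs) {
  vaults.set(config.name, config.path);
}

server.registerTool(createYourTool(vaults));
await server.start();
```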

Remember: Tools should be focused and well-documented, and should follow the established patterns in the codebase. When in doubt, look at existing tools like `create-note` or `edit-note` as references.

```

--------------------------------------------------------------------------------
/src/tools/remove-tags/index.ts:
--------------------------------------------------------------------------------

```typescript
import { z } from "zod";
import { promises as fs } from "fs";
import path from "path";
import { McpError } from "@modelcontextprotocol/sdk/types.js";
import { validateVaultPath } from "../../utils/path.js";
import { fileExists, safeReadFile } from "../../utils/files.js";
import {
  validateTag,
  parseNote,
  stringifyNote,
  removeTagsFromFrontmatter,
  removeInlineTags
} from "../../utils/tags.js";
import { createTool } from "../../utils/tool-factory.js";

// Input validation schema with descriptions
const schema = z.object({
  vault: z.string()
    .min(1, "Vault name cannot be empty")
    .describe("Name of the vault containing the notes"),
  files: z.array(z.string())
    .min(1, "At least one file must be specified")
    .refine(
      files => files.every(f => f.endsWith('.md')),
      "All files must have .md extension"
    )
    .describe("Array of note filenames to process (must have .md extension)"),
  tags: z.array(z.string())
    .min(1, "At least one tag must be specified")
    .refine(
      tags => tags.every(tag => /^[a-zA-Z0-9\/]+$/.test(tag)),
      "Tags must contain only letters, numbers, and forward slashes. Do not include the # symbol. Examples: 'project', 'work/active', 'tasks/2024/q1'"
    )
    .describe("Array of tags to remove (without # symbol). Example: ['project', 'work/active']"),
  options: z.object({
    location: z.enum(['frontmatter', 'content', 'both'])
      .default('both')
      .describe("Where to remove tags from (default: both)"),
    normalize: z.boolean()
      .default(true)
      .describe("Whether to normalize tag format (e.g., ProjectActive -> project-active) (default: true)"),
    preserveChildren: z.boolean()
      .default(false)
      .describe("Whether to preserve child tags when removing parent tags (default: false)"),
    patterns: z.array(z.string())
      .default([])
      .describe("Tag patterns to match for removal (supports * wildcard) (default: [])")
  }).default({
    location: 'both',
    normalize: true,
    preserveChildren: false,
    patterns: []
  })
});

interface RemoveTagsReport {
  success: string[];
  errors: { file: string; error: string }[];
  details: {
    [filename: string]: {
      removedTags: Array<{
        tag: string;
        location: 'frontmatter' | 'content';
        line?: number;
        context?: string;
      }>;
      preservedTags: Array<{
        tag: string;
        location: 'frontmatter' | 'content';
        line?: number;
        context?: string;
      }>;
    };
  };
}

type RemoveTagsInput = z.infer<typeof schema>;

async function removeTags(
  vaultPath: string,
  params: Omit<RemoveTagsInput, 'vault'>
): Promise<RemoveTagsReport> {
  const results: RemoveTagsReport = {
    success: [],
    errors: [],
    details: {}
  };

  for (const filename of params.files) {
    const fullPath = path.join(vaultPath, filename);
    
    try {
      // Validate path is within vault
      validateVaultPath(vaultPath, fullPath);
      
      // Check if file exists
      if (!await fileExists(fullPath)) {
        results.errors.push({
          file: filename,
          error: "File not found"
        });
        continue;
      }

      // Read file content
      const content = await safeReadFile(fullPath);
      if (!content) {
        results.errors.push({
          file: filename,
          error: "Failed to read file"
        });
        continue;
      }

      // Parse the note
      const parsed = parseNote(content);
      let modified = false;
      results.details[filename] = {
        removedTags: [],
        preservedTags: []
      };

      // Handle frontmatter tags
      if (params.options.location !== 'content') {
        const { frontmatter: updatedFrontmatter, report } = removeTagsFromFrontmatter(
          parsed.frontmatter,
          params.tags,
          {
            normalize: params.options.normalize,
            preserveChildren: params.options.preserveChildren,
            patterns: params.options.patterns
          }
        );
        
        results.details[filename].removedTags.push(...report.removed);
        results.details[filename].preservedTags.push(...report.preserved);
        
        if (JSON.stringify(parsed.frontmatter) !== JSON.stringify(updatedFrontmatter)) {
          parsed.frontmatter = updatedFrontmatter;
          modified = true;
        }
      }

      // Handle inline tags
      if (params.options.location !== 'frontmatter') {
        const { content: newContent, report } = removeInlineTags(
          parsed.content,
          params.tags,
          {
            normalize: params.options.normalize,
            preserveChildren: params.options.preserveChildren,
            patterns: params.options.patterns
          }
        );
        
        results.details[filename].removedTags.push(...report.removed);
        results.details[filename].preservedTags.push(...report.preserved);
        
        if (parsed.content !== newContent) {
          parsed.content = newContent;
          modified = true;
        }
      }

      // Save changes if modified
      if (modified) {
        const updatedContent = stringifyNote(parsed);
        await fs.writeFile(fullPath, updatedContent);
        results.success.push(filename);
      }
    } catch (error) {
      results.errors.push({
        file: filename,
        error: error instanceof Error ? error.message : 'Unknown error'
      });
    }
  }

  return results;
}

export function createRemoveTagsTool(vaults: Map<string, string>) {
  return createTool<RemoveTagsInput>({
    name: "remove-tags",
    description: `Remove tags from notes in frontmatter and/or content.

Examples:
- Simple: { "files": ["note.md"], "tags": ["project", "status"] }
- With hierarchy: { "files": ["note.md"], "tags": ["work/active", "priority/high"] }
- With options: { "files": ["note.md"], "tags": ["status"], "options": { "location": "frontmatter" } }
- Pattern matching: { "files": ["note.md"], "tags": ["status"], "options": { "patterns": ["status/*"] } }
- INCORRECT: { "tags": ["#project"] } (don't include # symbol)`,
    schema,
    handler: async (args, vaultPath, _vaultName) => {
      const results = await removeTags(vaultPath, {
        files: args.files,
        tags: args.tags,
        options: args.options
      });
        
      // Format detailed response message
      let message = '';
      
      // Add success summary
      if (results.success.length > 0) {
        message += `Successfully processed tags in: ${results.success.join(', ')}\n\n`;
      }
      
      // Add detailed changes for each file
      for (const [filename, details] of Object.entries(results.details)) {
        if (details.removedTags.length > 0 || details.preservedTags.length > 0) {
          message += `Changes in ${filename}:\n`;
          
          if (details.removedTags.length > 0) {
            message += '  Removed tags:\n';
            const byLocation = details.removedTags.reduce((acc, change) => {
              if (!acc[change.location]) acc[change.location] = new Map();
              const key = change.line ? `${change.location} (line ${change.line})` : change.location;
              const locationMap = acc[change.location];
              if (locationMap) {
                if (!locationMap.has(key)) {
                  locationMap.set(key, new Set());
                }
                const tagSet = locationMap.get(key);
                if (tagSet) {
                  tagSet.add(change.tag);
                }
              }
              return acc;
            }, {} as Record<string, Map<string, Set<string>>>);
            
            for (const [location, locationMap] of Object.entries(byLocation)) {
              for (const [key, tags] of locationMap.entries()) {
                message += `    ${key}: ${Array.from(tags).join(', ')}\n`;
              }
            }
          }
          
          if (details.preservedTags.length > 0) {
            message += '  Preserved tags:\n';
            const byLocation = details.preservedTags.reduce((acc, change) => {
              if (!acc[change.location]) acc[change.location] = new Map();
              const key = change.line ? `${change.location} (line ${change.line})` : change.location;
              const locationMap = acc[change.location];
              if (locationMap) {
                if (!locationMap.has(key)) {
                  locationMap.set(key, new Set());
                }
                const tagSet = locationMap.get(key);
                if (tagSet) {
                  tagSet.add(change.tag);
                }
              }
              return acc;
            }, {} as Record<string, Map<string, Set<string>>>);
            
            for (const [location, locationMap] of Object.entries(byLocation)) {
              for (const [key, tags] of locationMap.entries()) {
                message += `    ${key}: ${Array.from(tags).join(', ')}\n`;
              }
            }
          }
          
          message += '\n';
        }
      }
      
      // Add errors if any
      if (results.errors.length > 0) {
        message += 'Errors:\n';
        results.errors.forEach(error => {
          message += `  ${error.file}: ${error.error}\n`;
        });
      }

      return {
        content: [{
          type: "text",
          text: message.trim()
        }]
      };
    }
  }, vaults);
}

```

--------------------------------------------------------------------------------
/src/tools/manage-tags/index.ts:
--------------------------------------------------------------------------------

```typescript
import { z } from "zod";
import { promises as fs } from "fs";
import path from "path";
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";
import { validateVaultPath } from "../../utils/path.js";
import { fileExists, safeReadFile } from "../../utils/files.js";
import {
  validateTag,
  parseNote,
  stringifyNote,
  addTagsToFrontmatter,
  removeTagsFromFrontmatter,
  removeInlineTags,
  normalizeTag
} from "../../utils/tags.js";
import { createTool } from "../../utils/tool-factory.js";

// Input validation schema
const schema = z.object({
  vault: z.string()
    .min(1, "Vault name cannot be empty")
    .describe("Name of the vault containing the notes"),
  files: z.array(z.string())
    .min(1, "At least one file must be specified")
    .refine(
      files => files.every(f => f.endsWith('.md')),
      "All files must have .md extension"
    ),
  operation: z.enum(['add', 'remove'])
    .describe("Whether to add or remove the specified tags"),
  tags: z.array(z.string())
    .min(1, "At least one tag must be specified")
    .refine(
      tags => tags.every(validateTag),
      "Invalid tag format. Tags must contain only letters, numbers, and forward slashes for hierarchy."
    ),
  options: z.object({
    location: z.enum(['frontmatter', 'content', 'both'])
      .default('frontmatter')
      .describe("Where to add/remove tags"),
    normalize: z.boolean()
      .default(true)
      .describe("Whether to normalize tag format"),
    position: z.enum(['start', 'end'])
      .default('end')
      .describe("Where to add inline tags in content"),
    preserveChildren: z.boolean()
      .default(false)
      .describe("Whether to preserve child tags when removing parent tags"),
    patterns: z.array(z.string())
      .default([])
      .describe("Tag patterns to match for removal (supports * wildcard)")
  }).default({
    location: 'both',
    normalize: true,
    position: 'end',
    preserveChildren: false,
    patterns: []
  })
}).strict();

type ManageTagsInput = z.infer<typeof schema>;

interface OperationParams {
  files: string[];
  operation: 'add' | 'remove';
  tags: string[];
  options: {
    location: 'frontmatter' | 'content' | 'both';
    normalize: boolean;
    position: 'start' | 'end';
    preserveChildren: boolean;
    patterns: string[];
  };
}

interface OperationReport {
  success: string[];
  errors: { file: string; error: string }[];
  details: {
    [filename: string]: {
      removedTags: Array<{
        tag: string;
        location: 'frontmatter' | 'content';
        line?: number;
        context?: string;
      }>;
      preservedTags: Array<{
        tag: string;
        location: 'frontmatter' | 'content';
        line?: number;
        context?: string;
      }>;
    };
  };
}

async function manageTags(
  vaultPath: string,
  operation: ManageTagsInput
): Promise<OperationReport> {
  const results: OperationReport = {
    success: [],
    errors: [],
    details: {}
  };

  for (const filename of operation.files) {
    const fullPath = path.join(vaultPath, filename);
    
    try {
      // Validate path is within vault
      validateVaultPath(vaultPath, fullPath);
      
      // Check if file exists
      if (!await fileExists(fullPath)) {
        results.errors.push({
          file: filename,
          error: "File not found"
        });
        continue;
      }

      // Read file content
      const content = await safeReadFile(fullPath);
      if (!content) {
        results.errors.push({
          file: filename,
          error: "Failed to read file"
        });
        continue;
      }

      // Parse the note
      const parsed = parseNote(content);
      let modified = false;
      results.details[filename] = {
        removedTags: [],
        preservedTags: []
      };

      if (operation.operation === 'add') {
        // Handle frontmatter tags for add operation
        if (operation.options.location !== 'content') {
          const updatedFrontmatter = addTagsToFrontmatter(
            parsed.frontmatter,
            operation.tags,
            operation.options.normalize
          );
          
          if (JSON.stringify(parsed.frontmatter) !== JSON.stringify(updatedFrontmatter)) {
            parsed.frontmatter = updatedFrontmatter;
            parsed.hasFrontmatter = true;
            modified = true;
          }
        }

        // Handle inline tags for add operation
        if (operation.options.location !== 'frontmatter') {
          const tagString = operation.tags
            .filter(tag => validateTag(tag))
            .map(tag => `#${operation.options.normalize ? normalizeTag(tag) : tag}`)
            .join(' ');

          if (tagString) {
            if (operation.options.position === 'start') {
              parsed.content = tagString + '\n\n' + parsed.content.trim();
            } else {
              parsed.content = parsed.content.trim() + '\n\n' + tagString;
            }
            modified = true;
          }
        }
      } else {
        // Handle frontmatter tags for remove operation
        if (operation.options.location !== 'content') {
          const { frontmatter: updatedFrontmatter, report } = removeTagsFromFrontmatter(
            parsed.frontmatter,
            operation.tags,
            {
              normalize: operation.options.normalize,
              preserveChildren: operation.options.preserveChildren,
              patterns: operation.options.patterns
            }
          );
          
          results.details[filename].removedTags.push(...report.removed);
          results.details[filename].preservedTags.push(...report.preserved);
          
          if (JSON.stringify(parsed.frontmatter) !== JSON.stringify(updatedFrontmatter)) {
            parsed.frontmatter = updatedFrontmatter;
            modified = true;
          }
        }

        // Handle inline tags for remove operation
        if (operation.options.location !== 'frontmatter') {
          const { content: newContent, report } = removeInlineTags(
            parsed.content,
            operation.tags,
            {
              normalize: operation.options.normalize,
              preserveChildren: operation.options.preserveChildren,
              patterns: operation.options.patterns
            }
          );
          
          results.details[filename].removedTags.push(...report.removed);
          results.details[filename].preservedTags.push(...report.preserved);
          
          if (parsed.content !== newContent) {
            parsed.content = newContent;
            modified = true;
          }
        }
      }

      // Save changes if modified
      if (modified) {
        const updatedContent = stringifyNote(parsed);
        await fs.writeFile(fullPath, updatedContent);
        results.success.push(filename);
      }
    } catch (error) {
      results.errors.push({
        file: filename,
        error: error instanceof Error ? error.message : 'Unknown error'
      });
    }
  }

  return results;
}

export function createManageTagsTool(vaults: Map<string, string>) {
  return createTool<ManageTagsInput>({
    name: "manage-tags",
    description: `Add or remove tags from notes, supporting both frontmatter and inline tags.

Examples:
- Add tags: { "vault": "vault1", "files": ["note.md"], "operation": "add", "tags": ["project", "status/active"] }
- Remove tags: { "vault": "vault1", "files": ["note.md"], "operation": "remove", "tags": ["project"] }
- With options: { "vault": "vault1", "files": ["note.md"], "operation": "add", "tags": ["status"], "options": { "location": "frontmatter" } }
- Pattern matching: { "vault": "vault1", "files": ["note.md"], "operation": "remove", "tags": ["status"], "options": { "patterns": ["status/*"] } }
- INCORRECT: { "tags": ["#project"] } (don't include # symbol)`,
    schema,
    handler: async (args, vaultPath, _vaultName) => {
      const results = await manageTags(vaultPath, args);
        
      // Format detailed response message
      let message = '';

      // Add success summary
      if (results.success.length > 0) {
        message += `Successfully processed tags in: ${results.success.join(', ')}\n\n`;
      }

      // Add detailed changes for each file
      for (const [filename, details] of Object.entries(results.details)) {
        if (details.removedTags.length > 0 || details.preservedTags.length > 0) {
          message += `Changes in ${filename}:\n`;

          if (details.removedTags.length > 0) {
            message += '  Removed tags:\n';
            details.removedTags.forEach(change => {
              message += `    - ${change.tag} (${change.location}`;
              if (change.line) {
                message += `, line ${change.line}`;
              }
              message += ')\n';
            });
          }

          if (details.preservedTags.length > 0) {
            message += '  Preserved tags:\n';
            details.preservedTags.forEach(change => {
              message += `    - ${change.tag} (${change.location}`;
              if (change.line) {
                message += `, line ${change.line}`;
              }
              message += ')\n';
            });
          }

          message += '\n';
        }
      }

      // Add errors if any
      if (results.errors.length > 0) {
        message += 'Errors:\n';
        results.errors.forEach(error => {
          message += `  ${error.file}: ${error.error}\n`;
        });
      }

      return {
        content: [{
          type: "text",
          text: message.trim()
        }]
      };
    }
  }, vaults);
}

```

--------------------------------------------------------------------------------
/src/utils/tags.ts:
--------------------------------------------------------------------------------

```typescript
import { parse as parseYaml, stringify as stringifyYaml } from 'yaml';
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";

interface ParsedNote {
  frontmatter: Record<string, any>;
  content: string;
  hasFrontmatter: boolean;
}

interface TagChange {
  tag: string;
  location: 'frontmatter' | 'content';
  line?: number;
  context?: string;
}

interface TagRemovalReport {
  removedTags: TagChange[];
  preservedTags: TagChange[];
  errors: string[];
}

/**
 * Checks if parentTag is a parent of childTag in a hierarchical structure (e.g. 'work' is a parent of 'work/active')
 */
export function isParentTag(parentTag: string, childTag: string): boolean {
  return childTag.startsWith(parentTag + '/');
}

/**
 * Matches a tag against a pattern
 * Supports * wildcard and hierarchical matching
 */
export function matchesTagPattern(pattern: string, tag: string): boolean {
  // Convert glob pattern to regex
  const regexPattern = pattern
    .replace(/\*/g, '.*')
    .replace(/\//g, '\\/');
  return new RegExp(`^${regexPattern}$`).test(tag);
}

/**
 * Gets all related tags (parent/child) for a given tag
 */
export function getRelatedTags(tag: string, allTags: string[]): {
  parents: string[];
  children: string[];
} {
  const parents: string[] = [];
  const children: string[] = [];
  
  const parts = tag.split('/');
  let current = '';
  
  // Find parents
  for (let i = 0; i < parts.length - 1; i++) {
    current = current ? `${current}/${parts[i]}` : parts[i];
    parents.push(current);
  }
  
  // Find children
  allTags.forEach(otherTag => {
    if (isParentTag(tag, otherTag)) {
      children.push(otherTag);
    }
  });
  
  return { parents, children };
}

/**
 * Validates a tag format
 * Allows: #tag, tag, tag/subtag, project/active
 * Disallows: empty strings, spaces, special characters except '/'
 */
export function validateTag(tag: string): boolean {
  // Remove leading # if present
  tag = tag.replace(/^#/, '');
  
  // Check if tag is empty
  if (!tag) return false;
  
  // Basic tag format validation
  const TAG_REGEX = /^[a-zA-Z0-9]+(\/[a-zA-Z0-9]+)*$/;
  return TAG_REGEX.test(tag);
}

/**
 * Normalizes a tag to a consistent format
 * Example: ProjectActive -> project-active
 */
export function normalizeTag(tag: string, normalize = true): string {
  // Remove leading # if present
  tag = tag.replace(/^#/, '');
  
  if (!normalize) return tag;
  
  // Convert camelCase/PascalCase to kebab-case
  return tag
    .split('/')
    .map(part => 
      part
        .replace(/([a-z0-9])([A-Z])/g, '$1-$2')
        .toLowerCase()
    )
    .join('/');
}

/**
 * Parses a note's content into frontmatter and body
 */
export function parseNote(content: string): ParsedNote {
  const frontmatterRegex = /^---\n([\s\S]*?)\n---\n([\s\S]*)$/;
  const match = content.match(frontmatterRegex);

  if (!match) {
    return {
      frontmatter: {},
      content: content,
      hasFrontmatter: false
    };
  }

  try {
    const frontmatter = parseYaml(match[1]);
    return {
      frontmatter: frontmatter || {},
      content: match[2],
      hasFrontmatter: true
    };
  } catch (error) {
    throw new McpError(
      ErrorCode.InvalidParams,
      'Invalid frontmatter YAML format'
    );
  }
}

/**
 * Combines frontmatter and content back into a note
 */
export function stringifyNote(parsed: ParsedNote): string {
  if (!parsed.hasFrontmatter || Object.keys(parsed.frontmatter).length === 0) {
    return parsed.content;
  }

  const frontmatterStr = stringifyYaml(parsed.frontmatter).trim();
  return `---\n${frontmatterStr}\n---\n\n${parsed.content.trim()}`;
}

/**
 * Extracts all tags from a note's content
 */
export function extractTags(content: string): string[] {
  const tags = new Set<string>();
  
  // Match hashtags that aren't inside code blocks or HTML comments
  const TAG_PATTERN = /(?<!`)#[a-zA-Z0-9][a-zA-Z0-9/]*(?!`)/g;
  
  // Split content into lines
  const lines = content.split('\n');
  let inCodeBlock = false;
  let inHtmlComment = false;
  
  for (const line of lines) {
    // Check for code block boundaries
    if (line.trim().startsWith('```')) {
      inCodeBlock = !inCodeBlock;
      continue;
    }
    
    // Check for HTML comment boundaries
    if (line.includes('<!--')) inHtmlComment = true;
    if (line.includes('-->')) inHtmlComment = false;
    
    // Skip if we're in a code block or HTML comment
    if (inCodeBlock || inHtmlComment) continue;
    
    // Extract tags from the line
    const matches = line.match(TAG_PATTERN);
    if (matches) {
      matches.forEach(tag => tags.add(tag.slice(1))); // Remove # prefix
    }
  }
  
  return Array.from(tags);
}

/**
 * Safely adds tags to frontmatter
 */
export function addTagsToFrontmatter(
  frontmatter: Record<string, any>,
  newTags: string[],
  normalize = true
): Record<string, any> {
  const updatedFrontmatter = { ...frontmatter };
  const existingTags = new Set(
    Array.isArray(frontmatter.tags) ? frontmatter.tags : []
  );
  
  for (const tag of newTags) {
    if (!validateTag(tag)) {
      throw new McpError(
        ErrorCode.InvalidParams,
        `Invalid tag format: ${tag}`
      );
    }
    existingTags.add(normalizeTag(tag, normalize));
  }
  
  updatedFrontmatter.tags = Array.from(existingTags).sort();
  return updatedFrontmatter;
}

/**
 * Safely removes tags from frontmatter with detailed reporting
 */
export function removeTagsFromFrontmatter(
  frontmatter: Record<string, any>,
  tagsToRemove: string[],
  options: {
    normalize?: boolean;
    preserveChildren?: boolean;
    patterns?: string[];
  } = {}
): {
  frontmatter: Record<string, any>;
  report: {
    removed: TagChange[];
    preserved: TagChange[];
  };
} {
  const {
    normalize = true,
    preserveChildren = false,
    patterns = []
  } = options;

  const updatedFrontmatter = { ...frontmatter };
  const existingTags = Array.isArray(frontmatter.tags) ? frontmatter.tags : [];
  const removed: TagChange[] = [];
  const preserved: TagChange[] = [];

  const newTags = existingTags.filter(tag => {
    const normalizedTag = normalizeTag(tag, normalize);
    
    // Check if tag should be removed
    const shouldRemove = tagsToRemove.some(removeTag => {
      // Direct match
      if (normalizeTag(removeTag, normalize) === normalizedTag) return true;
      
      // Pattern match
      if (patterns.some(pattern => matchesTagPattern(pattern, normalizedTag))) {
        return true;
      }
      
      // Hierarchical match (if not preserving children): also remove child tags of the target
      if (!preserveChildren && isParentTag(removeTag, normalizedTag)) {
        return true;
      }
      
      return false;
    });

    if (shouldRemove) {
      removed.push({
        tag: normalizedTag,
        location: 'frontmatter'
      });
      return false;
    } else {
      preserved.push({
        tag: normalizedTag,
        location: 'frontmatter'
      });
      return true;
    }
  });

  updatedFrontmatter.tags = newTags.sort();
  return {
    frontmatter: updatedFrontmatter,
    report: { removed, preserved }
  };
}

/**
 * Removes inline tags from content with detailed reporting
 */
export function removeInlineTags(
  content: string,
  tagsToRemove: string[],
  options: {
    normalize?: boolean;
    preserveChildren?: boolean;
    patterns?: string[];
  } = {}
): {
  content: string;
  report: {
    removed: TagChange[];
    preserved: TagChange[];
  };
} {
  const {
    normalize = true,
    preserveChildren = false,
    patterns = []
  } = options;

  const removed: TagChange[] = [];
  const preserved: TagChange[] = [];
  
  // Process content line by line to track context
  const lines = content.split('\n');
  let inCodeBlock = false;
  let inHtmlComment = false;
  let modifiedLines = lines.map((line, lineNum) => {
    // Track code blocks and comments
    if (line.trim().startsWith('```')) {
      inCodeBlock = !inCodeBlock;
      return line;
    }
    if (line.includes('<!--')) inHtmlComment = true;
    if (line.includes('-->')) inHtmlComment = false;
    if (inCodeBlock || inHtmlComment) {
      // Preserve tags in code blocks and comments
      const tags = line.match(/(?<!`)#[a-zA-Z0-9][a-zA-Z0-9/]*(?!`)/g) || [];
      tags.forEach(tag => {
        preserved.push({
          tag: tag.slice(1),
          location: 'content',
          line: lineNum + 1,
          context: line.trim()
        });
      });
      return line;
    }

    // Process tags in regular content
    return line.replace(
      /(?<!`)#[a-zA-Z0-9][a-zA-Z0-9/]*(?!`)/g,
      (match) => {
        const tag = match.slice(1); // Remove # prefix
        const normalizedTag = normalizeTag(tag, normalize);
        
        const shouldRemove = tagsToRemove.some(removeTag => {
          // Direct match
          if (normalizeTag(removeTag, normalize) === normalizedTag) return true;
          
          // Pattern match
          if (patterns.some(pattern => matchesTagPattern(pattern, normalizedTag))) {
            return true;
          }
          
          // Hierarchical match (if not preserving children)
          if (!preserveChildren && isParentTag(removeTag, normalizedTag)) {
            return true;
          }
          
          return false;
        });

        if (shouldRemove) {
          removed.push({
            tag: normalizedTag,
            location: 'content',
            line: lineNum + 1,
            context: line.trim()
          });
          return '';
        } else {
          preserved.push({
            tag: normalizedTag,
            location: 'content',
            line: lineNum + 1,
            context: line.trim()
          });
          return match;
        }
      }
    );
  });

  // Clean up empty lines created by tag removal
  modifiedLines = modifiedLines.reduce((acc: string[], line: string) => {
    if (line.trim() === '') {
      if (acc[acc.length - 1]?.trim() === '') {
        return acc;
      }
    }
    acc.push(line);
    return acc;
  }, []);

  return {
    content: modifiedLines.join('\n'),
    report: { removed, preserved }
  };
}

```

--------------------------------------------------------------------------------
/src/tools/rename-tag/index.ts:
--------------------------------------------------------------------------------

```typescript
import { z } from "zod";
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";
import { promises as fs } from "fs";
import path from "path";
import {
  validateTag,
  normalizeTag,
  parseNote,
  stringifyNote
} from "../../utils/tags.js";
import {
  getAllMarkdownFiles,
  safeReadFile,
  fileExists
} from "../../utils/files.js";
import { createTool } from "../../utils/tool-factory.js";

// Input validation schema with descriptions
const schema = z.object({
  vault: z.string()
    .min(1, "Vault name cannot be empty")
    .describe("Name of the vault containing the tags"),
  oldTag: z.string()
    .min(1, "Old tag must not be empty")
    .refine(
      tag => /^[a-zA-Z0-9\/]+$/.test(tag),
      "Tags must contain only letters, numbers, and forward slashes. Do not include the # symbol. Examples: 'project', 'work/active', 'tasks/2024/q1'"
    )
    .describe("The tag to rename (without #). Example: 'project' or 'work/active'"),
  newTag: z.string()
    .min(1, "New tag must not be empty")
    .refine(
      tag => /^[a-zA-Z0-9\/]+$/.test(tag),
      "Tags must contain only letters, numbers, and forward slashes. Do not include the # symbol. Examples: 'project', 'work/active', 'tasks/2024/q1'"
    )
    .describe("The new tag name (without #). Example: 'projects' or 'work/current'"),
  createBackup: z.boolean()
    .default(true)
    .describe("Whether to create a backup before making changes (default: true)"),
  normalize: z.boolean()
    .default(true)
    .describe("Whether to normalize tag names (e.g., ProjectActive -> project-active) (default: true)"),
  batchSize: z.number()
    .min(1)
    .max(100)
    .default(50)
    .describe("Number of files to process in each batch (1-100) (default: 50)")
}).strict();

// Types
type RenameTagInput = z.infer<typeof schema>;

interface TagReplacement {
  oldTag: string;
  newTag: string;
}

interface TagChangeReport {
  filePath: string;
  oldTags: string[];
  newTags: string[];
  location: 'frontmatter' | 'content';
  line?: number;
}

interface RenameTagReport {
  successful: TagChangeReport[];
  failed: {
    filePath: string;
    error: string;
  }[];
  timestamp: string;
  backupCreated?: string;
}

/**
 * Creates a backup of the vault
 */
async function createVaultBackup(vaultPath: string): Promise<string> {
  const timestamp = new Date().toISOString().replace(/[:.]/g, '-');
  const backupDir = path.join(vaultPath, '.backup');
  const backupPath = path.join(backupDir, `vault-backup-${timestamp}`);

  await fs.mkdir(backupDir, { recursive: true });
  
  // Copy all markdown files to backup
  const files = await getAllMarkdownFiles(vaultPath);
  for (const file of files) {
    const relativePath = path.relative(vaultPath, file);
    const backupFile = path.join(backupPath, relativePath);
    await fs.mkdir(path.dirname(backupFile), { recursive: true });
    await fs.copyFile(file, backupFile);
  }

  return backupPath;
}

/**
 * Updates tags in frontmatter
 */
function updateFrontmatterTags(
  frontmatter: Record<string, any>,
  replacements: TagReplacement[],
  normalize: boolean
): {
  frontmatter: Record<string, any>;
  changes: { oldTag: string; newTag: string }[];
} {
  const changes: { oldTag: string; newTag: string }[] = [];
  const updatedFrontmatter = { ...frontmatter };
  
  if (!Array.isArray(frontmatter.tags)) {
    return { frontmatter: updatedFrontmatter, changes };
  }

  const updatedTags = frontmatter.tags.map(tag => {
    const normalizedTag = normalizeTag(tag, normalize);
    
    for (const { oldTag, newTag } of replacements) {
      const normalizedOldTag = normalizeTag(oldTag, normalize);
      
      if (normalizedTag === normalizedOldTag || 
          normalizedTag.startsWith(normalizedOldTag + '/')) {
        const updatedTag = normalizedTag.replace(
          new RegExp(`^${normalizedOldTag}`),
          normalizeTag(newTag, normalize)
        );
        changes.push({ oldTag: normalizedTag, newTag: updatedTag });
        return updatedTag;
      }
    }
    
    return normalizedTag;
  });

  updatedFrontmatter.tags = Array.from(new Set(updatedTags)).sort();
  return { frontmatter: updatedFrontmatter, changes };
}

/**
 * Updates inline tags in content
 */
function updateInlineTags(
  content: string,
  replacements: TagReplacement[],
  normalize: boolean
): {
  content: string;
  changes: { oldTag: string; newTag: string; line: number }[];
} {
  const changes: { oldTag: string; newTag: string; line: number }[] = [];
  const lines = content.split('\n');
  let inCodeBlock = false;
  let inHtmlComment = false;

  const updatedLines = lines.map((line, lineNum) => {
    // Handle code blocks and comments
    if (line.trim().startsWith('```')) {
      inCodeBlock = !inCodeBlock;
      return line;
    }
    if (line.includes('<!--')) inHtmlComment = true;
    if (line.includes('-->')) inHtmlComment = false;
    if (inCodeBlock || inHtmlComment) return line;

    // Update tags in regular content
    return line.replace(
      /(?<!`)#[a-zA-Z0-9][a-zA-Z0-9/]*(?!`)/g,
      (match) => {
        const tag = match.slice(1);
        const normalizedTag = normalizeTag(tag, normalize);

        for (const { oldTag, newTag } of replacements) {
          const normalizedOldTag = normalizeTag(oldTag, normalize);
          
          if (normalizedTag === normalizedOldTag ||
              normalizedTag.startsWith(normalizedOldTag + '/')) {
            const updatedTag = normalizedTag.replace(
              new RegExp(`^${normalizedOldTag}`),
              normalizeTag(newTag, normalize)
            );
            changes.push({
              oldTag: normalizedTag,
              newTag: updatedTag,
              line: lineNum + 1
            });
            return `#${updatedTag}`;
          }
        }

        return match;
      }
    );
  });

  return {
    content: updatedLines.join('\n'),
    changes
  };
}

/**
 * Updates saved searches and filters
 */
async function updateSavedSearches(
  vaultPath: string,
  replacements: TagReplacement[],
  normalize: boolean
): Promise<void> {
  const searchConfigPath = path.join(vaultPath, '.obsidian', 'search.json');
  
  if (!await fileExists(searchConfigPath)) return;

  try {
    const searchConfig = JSON.parse(
      await fs.readFile(searchConfigPath, 'utf-8')
    );

    let modified = false;
    
    // Update saved searches
    if (Array.isArray(searchConfig.savedSearches)) {
      searchConfig.savedSearches = searchConfig.savedSearches.map(
        (search: any) => {
          if (typeof search.query !== 'string') return search;

          let updatedQuery = search.query;
          for (const { oldTag, newTag } of replacements) {
            const normalizedOldTag = normalizeTag(oldTag, normalize);
            const normalizedNewTag = normalizeTag(newTag, normalize);
            
            // Update tag queries
            updatedQuery = updatedQuery.replace(
              new RegExp(`tag:${normalizedOldTag}(/\\S*)?`, 'g'),
              `tag:${normalizedNewTag}$1`
            );
            
            // Update raw tag references
            updatedQuery = updatedQuery.replace(
              new RegExp(`#${normalizedOldTag}(/\\S*)?`, 'g'),
              `#${normalizedNewTag}$1`
            );
          }

          if (updatedQuery !== search.query) {
            modified = true;
            return { ...search, query: updatedQuery };
          }
          return search;
        }
      );
    }

    if (modified) {
      await fs.writeFile(
        searchConfigPath,
        JSON.stringify(searchConfig, null, 2)
      );
    }
  } catch (error) {
    console.error('Error updating saved searches:', error);
    // Continue with other operations
  }
}

/**
 * Processes files in batches to handle large vaults
 */
async function processBatch(
  files: string[],
  start: number,
  batchSize: number,
  replacements: TagReplacement[],
  normalize: boolean
): Promise<{
  successful: TagChangeReport[];
  failed: { filePath: string; error: string }[];
}> {
  const batch = files.slice(start, start + batchSize);
  const successful: TagChangeReport[] = [];
  const failed: { filePath: string; error: string }[] = [];

  await Promise.all(
    batch.map(async (filePath) => {
      try {
        const content = await safeReadFile(filePath);
        if (!content) {
          failed.push({
            filePath,
            error: 'File not found or cannot be read'
          });
          return;
        }

        const parsed = parseNote(content);
        
        // Update frontmatter tags
        const { frontmatter: updatedFrontmatter, changes: frontmatterChanges } =
          updateFrontmatterTags(parsed.frontmatter, replacements, normalize);
        
        // Update inline tags
        const { content: updatedContent, changes: contentChanges } =
          updateInlineTags(parsed.content, replacements, normalize);

        // Only write file if changes were made
        if (frontmatterChanges.length > 0 || contentChanges.length > 0) {
          const updatedNote = stringifyNote({
            ...parsed,
            frontmatter: updatedFrontmatter,
            content: updatedContent
          });
          
          await fs.writeFile(filePath, updatedNote, 'utf-8');

          // Record changes
          if (frontmatterChanges.length > 0) {
            successful.push({
              filePath,
              oldTags: frontmatterChanges.map(c => c.oldTag),
              newTags: frontmatterChanges.map(c => c.newTag),
              location: 'frontmatter'
            });
          }

          if (contentChanges.length > 0) {
            successful.push({
              filePath,
              oldTags: contentChanges.map(c => c.oldTag),
              newTags: contentChanges.map(c => c.newTag),
              location: 'content',
              line: contentChanges[0].line
            });
          }
        }
      } catch (error) {
        failed.push({
          filePath,
          error: error instanceof Error ? error.message : String(error)
        });
      }
    })
  );

  return { successful, failed };
}
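// Illustrative: with 120 files and a batchSize of 50, renameTag below calls
// processBatch with start = 0, 50 and 100, covering slices of 50, 50 and 20 files.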

/**
 * Renames tags throughout the vault while preserving hierarchies
 */
async function renameTag(
  vaultPath: string,
  params: Omit<RenameTagInput, 'vault'>
): Promise<RenameTagReport> {
  try {
    // Validate tags (though Zod schema already handles this)
    if (!validateTag(params.oldTag) || !validateTag(params.newTag)) {
      throw new McpError(
        ErrorCode.InvalidParams,
        'Invalid tag format'
      );
    }

    // Create backup if requested
    let backupPath: string | undefined;
    if (params.createBackup) {
      backupPath = await createVaultBackup(vaultPath);
    }

    // Get all markdown files
    const files = await getAllMarkdownFiles(vaultPath);
    
    // Process files in batches
    const successful: TagChangeReport[] = [];
    const failed: { filePath: string; error: string }[] = [];
    
    for (let i = 0; i < files.length; i += params.batchSize) {
      const { successful: batchSuccessful, failed: batchFailed } =
        await processBatch(
          files,
          i,
          params.batchSize,
          [{ oldTag: params.oldTag, newTag: params.newTag }],
          params.normalize
        );
      
      successful.push(...batchSuccessful);
      failed.push(...batchFailed);
    }

    // Update saved searches
    await updateSavedSearches(
      vaultPath,
      [{ oldTag: params.oldTag, newTag: params.newTag }],
      params.normalize
    );

    return {
      successful,
      failed,
      timestamp: new Date().toISOString(),
      backupCreated: backupPath
    };
  } catch (error) {
    // Ensure errors are properly propagated
    if (error instanceof McpError) {
      throw error;
    }
    throw new McpError(
      ErrorCode.InternalError,
      error instanceof Error ? error.message : 'Unknown error during tag renaming'
    );
  }
}

export function createRenameTagTool(vaults: Map<string, string>) {
  return createTool<RenameTagInput>({
    name: 'rename-tag',
    description: `Safely renames tags throughout the vault while preserving hierarchies.

Examples:
- Simple rename: { "oldTag": "project", "newTag": "projects" }
- Rename with hierarchy: { "oldTag": "work/active", "newTag": "projects/current" }
- With options: { "oldTag": "status", "newTag": "state", "normalize": true, "createBackup": true }
- INCORRECT: { "oldTag": "#project" } (don't include # symbol)`,
    schema,
    handler: async (args, vaultPath, _vaultName) => {
      const results = await renameTag(vaultPath, {
        oldTag: args.oldTag,
        newTag: args.newTag,
        createBackup: args.createBackup ?? true,
        normalize: args.normalize ?? true,
        batchSize: args.batchSize ?? 50
      });
      
      // Format response message
      let message = '';
      
      // Add backup info if created
      if (results.backupCreated) {
        message += `Created backup at: ${results.backupCreated}\n\n`;
      }
      
      // Add success summary
      if (results.successful.length > 0) {
        message += `Successfully renamed tags in ${results.successful.length} locations:\n\n`;
        
        // Group changes by file
        const changesByFile = results.successful.reduce((acc, change) => {
          if (!acc[change.filePath]) {
            acc[change.filePath] = [];
          }
          acc[change.filePath].push(change);
          return acc;
        }, {} as Record<string, typeof results.successful>);
        
        // Report changes for each file
        for (const [file, changes] of Object.entries(changesByFile)) {
          message += `${file}:\n`;
          changes.forEach(change => {
            const location = change.line 
              ? `${change.location} (line ${change.line})`
              : change.location;
            message += `  ${location}: ${change.oldTags.join(', ')} -> ${change.newTags.join(', ')}\n`;
          });
          message += '\n';
        }
      }
      
      // Add errors if any
      if (results.failed.length > 0) {
        message += 'Errors:\n';
        results.failed.forEach(error => {
          message += `  ${error.filePath}: ${error.error}\n`;
        });
      }

      return {
        content: [{
          type: 'text',
          text: message.trim()
        }]
      };
    }
  }, vaults);
}

```

--------------------------------------------------------------------------------
/src/utils/path.ts:
--------------------------------------------------------------------------------

```typescript
import path from "path";
import fs from "fs/promises";
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";
import os from "os";
import { exec as execCallback } from "child_process";
import { promisify } from "util";

// Promisify exec for cleaner async/await usage
const exec = promisify(execCallback);

/**
 * Checks if a path contains any problematic characters or patterns
 * @param vaultPath - The path to validate
 * @returns Error message if invalid, null if valid
 */
export function checkPathCharacters(vaultPath: string): string | null {
  // Platform-specific path length limits
  const maxPathLength = process.platform === 'win32' ? 260 : 4096;
  if (vaultPath.length > maxPathLength) {
    return `Path exceeds maximum length (${maxPathLength} characters)`;
  }

  // Check component length (individual parts between separators)
  const components = vaultPath.split(/[\/\\]/);
  const maxComponentLength = 255; // Per-component limit is 255 on both Windows and common Unix filesystems
  const longComponent = components.find(c => c.length > maxComponentLength);
  if (longComponent) {
    return `Directory/file name too long: "${longComponent.slice(0, 50)}..."`;
  }

  // Check for root-only paths
  if (process.platform === 'win32') {
    if (/^[A-Za-z]:\\?$/.test(vaultPath)) {
      return 'Cannot use drive root directory';
    }
  } else {
    if (vaultPath === '/') {
      return 'Cannot use filesystem root directory';
    }
  }

  // Check for relative path components
  if (components.includes('..') || components.includes('.')) {
    return 'Path cannot contain relative components (. or ..)';
  }

  // Check for non-printable characters
  if (/[\x00-\x1F\x7F]/.test(vaultPath)) {
    return 'Contains non-printable characters';
  }

  // Platform-specific checks
  if (process.platform === 'win32') {
    // Windows-specific checks
    const winReservedNames = /^(con|prn|aux|nul|com[1-9]|lpt[1-9])(\..*)?$/i;
    const pathParts = vaultPath.split(/[\/\\]/);
    if (pathParts.some(part => winReservedNames.test(part))) {
      return 'Contains Windows reserved names (CON, PRN, etc.)';
    }

    // Windows invalid characters (allowing : for drive letters)
    // First check if this is a Windows path with a drive letter
    if (/^[A-Za-z]:[\/\\]/.test(vaultPath)) {
      // Skip the drive letter part and check the rest of the path
      const pathWithoutDrive = vaultPath.slice(2);
      const components = pathWithoutDrive.split(/[\/\\]/);
      for (const part of components) {
        if (/[<>:"|?*]/.test(part)) {
          return 'Contains characters not allowed on Windows (<>:"|?*)';
        }
      }
    } else {
      // No drive letter, check all components normally
      const components = vaultPath.split(/[\/\\]/);
      for (const part of components) {
        if (/[<>:"|?*]/.test(part)) {
          return 'Contains characters not allowed on Windows (<>:"|?*)';
        }
      }
    }

    // Windows device paths
    if (/^\\\\.\\/.test(vaultPath)) {
      return 'Device paths are not allowed';
    }
  } else {
    // Unix-specific checks
    const unixInvalidChars = /[\x00]/;  // Only check for null character
    const pathComponents = vaultPath.split('/');
    for (const component of pathComponents) {
      if (unixInvalidChars.test(component)) {
        return 'Contains invalid characters for Unix paths';
      }
    }
  }

  // Check for Unicode replacement character
  if (vaultPath.includes('\uFFFD')) {
    return 'Contains invalid Unicode characters';
  }

  // Check for leading/trailing whitespace
  if (vaultPath !== vaultPath.trim()) {
    return 'Contains leading or trailing whitespace';
  }

  // Check for consecutive separators
  if (/[\/\\]{2,}/.test(vaultPath)) {
    return 'Contains consecutive path separators';
  }

  return null;
}
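// Illustrative examples (non-Windows):
//   checkPathCharacters('/home/user/vaults/work') -> null (valid)
//   checkPathCharacters('/home/user//vaults')     -> 'Contains consecutive path separators'
//   checkPathCharacters('/home/user/../vaults')   -> 'Path cannot contain relative components (. or ..)'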

/**
 * Checks if a path is on a local filesystem
 * @param vaultPath - The path to check
 * @returns Error message if invalid, null if valid
 */
export async function checkLocalPath(vaultPath: string): Promise<string | null> {
  try {
    // Get real path (resolves symlinks)
    const realPath = await fs.realpath(vaultPath);
    
    // Check if path changed significantly after resolving symlinks
    if (path.dirname(realPath) !== path.dirname(vaultPath)) {
      return 'Path contains symlinks that point outside the parent directory';
    }

    // Check for network paths
    if (process.platform === 'win32') {
      // Windows UNC paths and mapped drives
      if (realPath.startsWith('\\\\') || /^[a-zA-Z]:\\$/.test(realPath.slice(0, 3))) {
        // Check Windows drive type
        const drive = realPath[0].toUpperCase();
        
        // Helper functions for drive type checking
        async function checkWithWmic() {
          const cmd = `wmic logicaldisk where "DeviceID='${drive}:'" get DriveType /value`;
          return await exec(cmd, { timeout: 5000 });
        }

        async function checkWithPowershell() {
          const cmd = `powershell -Command "(Get-WmiObject -Class Win32_LogicalDisk | Where-Object { $_.DeviceID -eq '${drive}:' }).DriveType"`;
          const { stdout, stderr } = await exec(cmd, { timeout: 5000 });
          return { stdout: `DriveType=${stdout.trim()}`, stderr };
        }
        
        try {
          let result: { stdout: string; stderr: string };
          try {
            result = await checkWithWmic();
          } catch (wmicError) {
            // Fallback to PowerShell if WMIC fails
            result = await checkWithPowershell();
          }

          const { stdout, stderr } = result;

          if (stderr) {
            console.error(`Warning: Drive type check produced errors:`, stderr);
          }

          // DriveType: 2 = Removable, 3 = Local, 4 = Network, 5 = CD-ROM, 6 = RAM disk
          const match = stdout.match(/DriveType=(\d+)/);
          const driveType = match ? match[1] : '0';
          
          // Consider removable drives and unknown types as potentially network-based
          if (driveType === '0' || driveType === '2' || driveType === '4') {
            return 'Network, removable, or unknown drive type is not supported';
          }
        } catch (error: unknown) {
          if ((error as Error & { code?: string }).code === 'ETIMEDOUT') {
            return 'Network, removable, or unknown drive type is not supported';
          }
          console.error(`Error checking drive type:`, error);
          // Fail safe: treat any errors as potential network drives
          return 'Unable to verify if drive is local';
        }
      }
    } else {
      // Unix network mounts (common mount points)
      const networkPaths = ['/net/', '/mnt/', '/media/', '/Volumes/'];
      if (networkPaths.some(prefix => realPath.startsWith(prefix))) {
        // Check the mount type with df to detect network filesystems
        const cmd = `df -P "${realPath}" | tail -n 1`;
        try {
          const { stdout, stderr } = await exec(cmd, { timeout: 5000 })
            .catch((error: Error & { code?: string }) => {
              if (error.code === 'ETIMEDOUT') {
                // Timeout often indicates a network mount
                return { stdout: 'network', stderr: '' };
              }
              throw error;
            });

          if (stderr) {
            console.error(`Warning: Mount type check produced errors:`, stderr);
          }

          // Check for common network filesystem indicators
          const isNetwork = stdout.match(/^(nfs|cifs|smb|afp|ftp|ssh|davfs)/i) ||
                          stdout.includes(':') ||
                          stdout.includes('//') ||
                          stdout.includes('type fuse.') ||
                          stdout.includes('network');

          if (isNetwork) {
            return 'Network or remote filesystem is not supported';
          }
        } catch (error: unknown) {
          console.error(`Error checking mount type:`, error);
          // Fail safe: treat any errors as potential network mounts
          return 'Unable to verify if filesystem is local';
        }
      }
    }

    return null;
  } catch (error) {
    if ((error as NodeJS.ErrnoException).code === 'ELOOP') {
      return 'Contains circular symlinks';
    }
    return null; // Other errors will be caught by the main validation
  }
}
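// Illustrative: a vault under /Volumes/ whose mount df reports as "server:/export"
// (NFS) or "//server/share" (SMB) is rejected as a network filesystem, while a
// local "/dev/..." device passes the check.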

/**
 * Checks if a path contains any suspicious patterns
 * @param vaultPath - The path to check
 * @returns Error message if suspicious, null if valid
 */
export async function checkSuspiciousPath(vaultPath: string): Promise<string | null> {
  // Check for hidden directories (except .obsidian)
  if (vaultPath.split(path.sep).some(part => 
    part.startsWith('.') && part !== '.obsidian')) {
    return 'Contains hidden directories';
  }

  // Check for system directories
  const systemDirs = [
    '/bin', '/sbin', '/usr/bin', '/usr/sbin',
    '/etc', '/var', '/tmp', '/dev', '/sys',
    'C:\\Windows', 'C:\\Program Files', 'C:\\System32',
    'C:\\Users\\All Users', 'C:\\ProgramData'
  ];
  if (systemDirs.some(dir => vaultPath.toLowerCase().startsWith(dir.toLowerCase()))) {
    return 'Points to a system directory';
  }

  // Check for home directory root (too broad access)
  if (vaultPath === os.homedir()) {
    return 'Points to home directory root';
  }

  // Check for path length
  if (vaultPath.length > 255) {
    return 'Path is too long (maximum 255 characters)';
  }

  // Check for problematic characters
  const charIssue = checkPathCharacters(vaultPath);
  if (charIssue) {
    return charIssue;
  }

  return null;
}
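// Illustrative examples (resolved values):
//   checkSuspiciousPath('/home/user/.config/vault') -> 'Contains hidden directories'
//   checkSuspiciousPath('/etc/vault')               -> 'Points to a system directory'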

/**
 * Normalizes and resolves a path consistently
 * @param inputPath - The path to normalize
 * @returns The normalized and resolved absolute path
 * @throws {McpError} If the input path is empty or invalid
 */
export function normalizePath(inputPath: string): string {
  if (!inputPath || typeof inputPath !== "string") {
    throw new McpError(
      ErrorCode.InvalidRequest,
      `Invalid path: ${inputPath}`
    );
  }

  try {
    // Handle Windows paths
    let normalized = inputPath;

    // Only validate filename portion for invalid Windows characters, allowing : for drive letters
    const filename = normalized.split(/[\\/]/).pop() || '';
    if (/[<>"|?*]/.test(filename) || (/:/.test(filename) && !/^[A-Za-z]:$/.test(filename))) {
      throw new McpError(
        ErrorCode.InvalidRequest,
        `Filename contains invalid characters: ${filename}`
      );
    }
    
    // Preserve UNC paths
    if (normalized.startsWith('\\\\')) {
      // Convert to forward slashes but preserve exactly two leading slashes
      normalized = '//' + normalized.slice(2).replace(/\\/g, '/');
      return normalized;
    }

    // Handle Windows drive letters
    if (/^[a-zA-Z]:[\\/]/.test(normalized)) {
      // Normalize path while preserving drive letter
      normalized = path.normalize(normalized);
      // Convert to forward slashes for consistency
      normalized = normalized.replace(/\\/g, '/');
      return normalized;
    }

    // Only restrict critical system directories
    const restrictedDirs = [
      'C:\\Windows',
      'C:\\Program Files',
      'C:\\Program Files (x86)',
      'C:\\ProgramData'
    ];
    if (restrictedDirs.some(dir => normalized.toLowerCase().startsWith(dir.toLowerCase()))) {
      throw new McpError(
        ErrorCode.InvalidRequest,
        `Path points to restricted system directory: ${normalized}`
      );
    }

    // Handle relative paths
    if (normalized.startsWith('./') || normalized.startsWith('../')) {
      normalized = path.normalize(normalized);
      return path.resolve(normalized);
    }

    // Default normalization for other paths
    normalized = normalized.replace(/\\/g, '/');
    if (normalized.startsWith('./') || normalized.startsWith('../')) {
      return path.resolve(normalized);
    }
    return normalized;
  } catch (error) {
    throw new McpError(
      ErrorCode.InvalidRequest,
      `Failed to normalize path: ${error instanceof Error ? error.message : String(error)}`
    );
  }
}
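// Illustrative examples:
//   normalizePath('C:\\Users\\me\\Vault')     -> 'C:/Users/me/Vault'    (drive letter kept, forward slashes)
//   normalizePath('\\\\server\\share\\vault') -> '//server/share/vault' (UNC prefix preserved)
//   normalizePath('./notes')                  -> resolved to an absolute path under the current working directory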

/**
 * Checks if a target path is safely contained within a base path
 * @param basePath - The base directory path
 * @param targetPath - The target path to check
 * @returns True if target is within base path, false otherwise
 */
export async function checkPathSafety(basePath: string, targetPath: string): Promise<boolean> {
  const resolvedPath = normalizePath(targetPath);
  const resolvedBasePath = normalizePath(basePath);

  try {
    // Check real path for symlinks
    const realPath = await fs.realpath(resolvedPath);
    const normalizedReal = normalizePath(realPath);
    
    // Check if real path is within base path
    if (!normalizedReal.startsWith(resolvedBasePath)) {
      return false;
    }

    // Check if original path is within base path
    return resolvedPath.startsWith(resolvedBasePath);
  } catch (error) {
    // For new files that don't exist yet, verify parent directory
    const parentDir = path.dirname(resolvedPath);
    try {
      const realParentPath = await fs.realpath(parentDir);
      const normalizedParent = normalizePath(realParentPath);
      return normalizedParent.startsWith(resolvedBasePath);
    } catch {
      return false;
    }
  }
}
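// Illustrative: await checkPathSafety('/home/u/vault', '/home/u/vault/notes/new.md')
// resolves to true even if new.md does not exist yet, as long as the existing
// notes/ directory is inside the vault and is not a symlink escaping it.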

/**
 * Ensures a path has .md extension and is valid
 * @param filePath - The file path to check
 * @returns The path with .md extension
 * @throws {McpError} If the path is invalid
 */
export function ensureMarkdownExtension(filePath: string): string {
  const normalized = normalizePath(filePath);
  return normalized.endsWith('.md') ? normalized : `${normalized}.md`;
}
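// Illustrative: ensureMarkdownExtension('notes/idea')    -> 'notes/idea.md'
//               ensureMarkdownExtension('notes/idea.md') -> 'notes/idea.md' (unchanged)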

/**
 * Validates that a path is within the vault directory
 * @param vaultPath - The vault directory path
 * @param targetPath - The target path to validate
 * @throws {McpError} If path is outside vault or invalid
 */
export function validateVaultPath(vaultPath: string, targetPath: string): void {
  // checkPathSafety is async and cannot be awaited here, so perform a
  // synchronous containment check on the normalized paths instead
  const resolvedTarget = normalizePath(targetPath);
  const resolvedVault = normalizePath(vaultPath);

  if (resolvedTarget !== resolvedVault && !isParentPath(resolvedVault, resolvedTarget)) {
    throw new McpError(
      ErrorCode.InvalidRequest,
      `Path must be within the vault directory. Path: ${targetPath}, Vault: ${vaultPath}`
    );
  }
}

/**
 * Safely joins paths and ensures result is within vault
 * @param vaultPath - The vault directory path
 * @param segments - Path segments to join
 * @returns The joined and validated path
 * @throws {McpError} If resulting path would be outside vault
 */
export function safeJoinPath(vaultPath: string, ...segments: string[]): string {
  const joined = path.join(vaultPath, ...segments);
  const resolved = normalizePath(joined);
  
  validateVaultPath(vaultPath, resolved);
  
  return resolved;
}
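// Illustrative: safeJoinPath('/home/u/vault', 'notes', 'idea.md') returns
// '/home/u/vault/notes/idea.md', while safeJoinPath('/home/u/vault', '../escape.md')
// is rejected by validateVaultPath per the contract documented above.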

/**
 * Sanitizes a vault name to be filesystem-safe
 * @param name - The raw vault name
 * @returns The sanitized vault name
 */
export function sanitizeVaultName(name: string): string {
  return name
    .toLowerCase()
    // Replace spaces and special characters with hyphens
    .replace(/[^a-z0-9]+/g, '-')
    // Remove leading/trailing hyphens
    .replace(/^-+|-+$/g, '')
    // Ensure name isn't empty
    || 'unnamed-vault';
}
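// Illustrative: sanitizeVaultName('My Work Vault!') -> 'my-work-vault'
//               sanitizeVaultName('***')            -> 'unnamed-vault'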

/**
 * Checks if one path is a parent of another
 * @param parent - The potential parent path
 * @param child - The potential child path
 * @returns True if parent contains child, false otherwise
 */
export function isParentPath(parent: string, child: string): boolean {
  const relativePath = path.relative(parent, child);
  return !relativePath.startsWith('..') && !path.isAbsolute(relativePath);
}
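// Illustrative: isParentPath('/vaults', '/vaults/work')   -> true
//               isParentPath('/vaults', '/vaults-backup') -> false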

/**
 * Checks if paths overlap or are duplicates
 * @param paths - Array of paths to check
 * @throws {McpError} If paths overlap or are duplicates
 */
export function checkPathOverlap(paths: string[]): void {
  // First normalize all paths to handle . and .. and symlinks
  const normalizedPaths = paths.map(p => {
    // Remove trailing slashes and normalize separators
    return path.normalize(p).replace(/[\/\\]+$/, '');
  });

  // Check for exact duplicates using normalized paths
  const uniquePaths = new Set<string>();
  normalizedPaths.forEach((normalizedPath, index) => {
    if (uniquePaths.has(normalizedPath)) {
      throw new McpError(
        ErrorCode.InvalidRequest,
        `Duplicate vault path provided:\n` +
        `  Original paths:\n` +
        `    1: ${paths[index]}\n` +
        `    2: ${paths[normalizedPaths.indexOf(normalizedPath)]}\n` +
        `  Both resolve to: ${normalizedPath}`
      );
    }
    uniquePaths.add(normalizedPath);
  });

  // Then check for overlapping paths using normalized paths
  for (let i = 0; i < normalizedPaths.length; i++) {
    for (let j = i + 1; j < normalizedPaths.length; j++) {
      if (isParentPath(normalizedPaths[i], normalizedPaths[j]) || 
          isParentPath(normalizedPaths[j], normalizedPaths[i])) {
        throw new McpError(
          ErrorCode.InvalidRequest,
          `Vault paths cannot overlap:\n` +
          `  Path 1: ${paths[i]}\n` +
          `  Path 2: ${paths[j]}\n` +
          `  (One vault directory cannot be inside another)\n` +
          `  Normalized paths:\n` +
          `    1: ${normalizedPaths[i]}\n` +
          `    2: ${normalizedPaths[j]}`
        );
      }
    }
  }
}
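// Illustrative: checkPathOverlap(['/home/u/vaults', '/home/u/vaults/work']) throws
// because one vault would be nested inside the other, while
// checkPathOverlap(['/home/u/work', '/home/u/personal']) passes without error.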

```

--------------------------------------------------------------------------------
/src/main.ts:
--------------------------------------------------------------------------------

```typescript
#!/usr/bin/env node
import { ObsidianServer } from "./server.js";
import { createCreateNoteTool } from "./tools/create-note/index.js";
import { createListAvailableVaultsTool } from "./tools/list-available-vaults/index.js";
import { createEditNoteTool } from "./tools/edit-note/index.js";
import { createSearchVaultTool } from "./tools/search-vault/index.js";
import { createMoveNoteTool } from "./tools/move-note/index.js";
import { createCreateDirectoryTool } from "./tools/create-directory/index.js";
import { createDeleteNoteTool } from "./tools/delete-note/index.js";
import { createAddTagsTool } from "./tools/add-tags/index.js";
import { createRemoveTagsTool } from "./tools/remove-tags/index.js";
import { createRenameTagTool } from "./tools/rename-tag/index.js";
import { createReadNoteTool } from "./tools/read-note/index.js";
import { listVaultsPrompt } from "./prompts/list-vaults/index.js";
import { registerPrompt } from "./utils/prompt-factory.js";
import path from "path";
import os from "os";
import { promises as fs, constants as fsConstants } from "fs";
import { McpError, ErrorCode } from "@modelcontextprotocol/sdk/types.js";
import { 
  checkPathCharacters, 
  checkLocalPath, 
  checkSuspiciousPath,
  sanitizeVaultName,
  checkPathOverlap 
} from "./utils/path.js";

interface VaultConfig {
  name: string;
  path: string;
}

async function main() {
  // Constants
  const MAX_VAULTS = 10; // Reasonable limit to prevent resource issues

  const vaultArgs = process.argv.slice(2);
  if (vaultArgs.length === 0) {
    const helpMessage = `
Obsidian MCP Server - Multi-vault Support

Usage: obsidian-mcp <vault1_path> [vault2_path ...]

Requirements:
- Paths must point to valid Obsidian vaults (containing .obsidian directory)
- Vaults must be initialized in Obsidian at least once
- Paths must have read and write permissions
- Paths cannot overlap (one vault cannot be inside another)
- Each vault must be a separate directory
- Maximum ${MAX_VAULTS} vaults can be connected at once

Security restrictions:
- Must be on a local filesystem (no network drives or mounts)
- Cannot point to system directories
- Hidden directories not allowed (except .obsidian)
- Cannot use the home directory root
- Cannot use symlinks that point outside their directory
- All paths must be dedicated vault directories

Note: If a path is not recognized as a vault, open it in Obsidian first to 
initialize it properly. This creates the required .obsidian configuration directory.

Recommended locations:
- ~/Documents/Obsidian/[vault-name]     # Recommended for most users
- ~/Notes/[vault-name]                  # Alternative location
- ~/Obsidian/[vault-name]              # Alternative location

Not supported:
- Network drives (//server/share)
- Network mounts (/net, /mnt, /media)
- System directories (/tmp, C:\\Windows)
- Hidden directories (except .obsidian)

Vault names are automatically generated from the last part of each path:
- Spaces and special characters are converted to hyphens
- Names are made lowercase for consistency
- Numbers are appended to resolve duplicates (e.g., 'work-vault-1')

Examples:
  # Valid paths:
  obsidian-mcp ~/Documents/Obsidian/Work ~/Documents/Obsidian/Personal
  → Creates vaults named 'work' and 'personal'

  obsidian-mcp ~/Notes/Work ~/Notes/Archive
  → Creates vaults named 'work' and 'archive'

  # Invalid paths:
  obsidian-mcp ~/Vaults ~/Vaults/Work     # ❌ Paths overlap
  obsidian-mcp ~/Work ~/Work              # ❌ Duplicate paths
  obsidian-mcp ~/                         # ❌ Home directory root
  obsidian-mcp /tmp/vault                 # ❌ System directory
  obsidian-mcp ~/.config/vault            # ❌ Hidden directory
  obsidian-mcp //server/share/vault       # ❌ Network path
  obsidian-mcp /mnt/network/vault         # ❌ Network mount
  obsidian-mcp ~/symlink-to-vault         # ❌ External symlink
`;

    // Log help message to stderr for user reference
    console.error(helpMessage);

    // Write MCP error to stdout
    process.stdout.write(JSON.stringify({
      jsonrpc: "2.0",
      error: {
        code: ErrorCode.InvalidRequest,
        message: "No vault paths provided. Please provide at least one valid Obsidian vault path."
      },
      id: null
    }));

    process.exit(1);
  }

  // Validate and normalize vault paths
  const normalizedPaths = await Promise.all(vaultArgs.map(async (vaultPath, index) => {
    try {
      // Expand home directory if needed
      const expandedPath = vaultPath.startsWith('~') ? 
        path.join(os.homedir(), vaultPath.slice(1)) : 
        vaultPath;
      
      // Normalize and convert to absolute path
      const normalizedPath = path.normalize(expandedPath)
        .replace(/[\/\\]+$/, ''); // Remove trailing slashes
      const absolutePath = path.resolve(normalizedPath);

      // Validate path is absolute and safe
      if (!path.isAbsolute(absolutePath)) {
        const errorMessage = `Vault path must be absolute: ${vaultPath}`;
        console.error(`Error: ${errorMessage}`);
        
        process.stdout.write(JSON.stringify({
          jsonrpc: "2.0",
          error: {
            code: ErrorCode.InvalidRequest,
            message: errorMessage
          },
          id: null
        }));
        
        process.exit(1);
      }

      // Check for suspicious paths and local filesystem
      const [suspiciousReason, localPathIssue] = await Promise.all([
        checkSuspiciousPath(absolutePath),
        checkLocalPath(absolutePath)
      ]);

      if (localPathIssue) {
        const errorMessage = `Invalid vault path (${localPathIssue}): ${vaultPath}\n` +
          `For reliability and security reasons, vault paths must:\n` +
          `- Be on a local filesystem\n` +
          `- Not use network drives or mounts\n` +
          `- Not contain symlinks that point outside their directory`;
        
        console.error(`Error: ${errorMessage}`);
        
        process.stdout.write(JSON.stringify({
          jsonrpc: "2.0",
          error: {
            code: ErrorCode.InvalidRequest,
            message: errorMessage
          },
          id: null
        }));
        
        process.exit(1);
      }

      if (suspiciousReason) {
        const errorMessage = `Invalid vault path (${suspiciousReason}): ${vaultPath}\n` +
          `For security reasons, vault paths cannot:\n` +
          `- Point to system directories\n` +
          `- Use hidden directories (except .obsidian)\n` +
          `- Point to the home directory root\n` +
          `Please choose a dedicated directory for your vault`;
        
        console.error(`Error: ${errorMessage}`);
        
        process.stdout.write(JSON.stringify({
          jsonrpc: "2.0",
          error: {
            code: ErrorCode.InvalidRequest,
            message: errorMessage
          },
          id: null
        }));
        
        process.exit(1);
      }

      try {
        // Check if path exists and is a directory
        const stats = await fs.stat(absolutePath);
        if (!stats.isDirectory()) {
          const errorMessage = `Vault path must be a directory: ${vaultPath}`;
          console.error(`Error: ${errorMessage}`);
          
          process.stdout.write(JSON.stringify({
            jsonrpc: "2.0",
            error: {
              code: ErrorCode.InvalidRequest,
              message: errorMessage
            },
            id: null
          }));
          
          process.exit(1);
        }

        // Check if path is readable and writable
        await fs.access(absolutePath, fsConstants.R_OK | fsConstants.W_OK);

        // Check if this is a valid Obsidian vault
        const obsidianConfigPath = path.join(absolutePath, '.obsidian');
        const obsidianAppConfigPath = path.join(obsidianConfigPath, 'app.json');
        
        try {
          // Check .obsidian directory
          const configStats = await fs.stat(obsidianConfigPath);
          if (!configStats.isDirectory()) {
            const errorMessage = `Invalid Obsidian vault configuration in ${vaultPath}\n` +
              `The .obsidian folder exists but is not a directory\n` +
              `Try removing it and reopening the vault in Obsidian`;
            
            console.error(`Error: ${errorMessage}`);
            
            process.stdout.write(JSON.stringify({
              jsonrpc: "2.0",
              error: {
                code: ErrorCode.InvalidRequest,
                message: errorMessage
              },
              id: null
            }));
            
            process.exit(1);
          }

          // Check app.json to verify it's properly initialized
          await fs.access(obsidianAppConfigPath, fsConstants.R_OK);
          
        } catch (error) {
          if ((error as NodeJS.ErrnoException).code === 'ENOENT') {
            const errorMessage = `Not a valid Obsidian vault (${vaultPath})\n` +
              `Missing or incomplete .obsidian configuration\n\n` +
              `To fix this:\n` +
              `1. Open Obsidian\n` +
              `2. Click "Open folder as vault"\n` +
              `3. Select the directory: ${absolutePath}\n` +
              `4. Wait for Obsidian to initialize the vault\n` +
              `5. Try running this command again`;
            
            console.error(`Error: ${errorMessage}`);
            
            process.stdout.write(JSON.stringify({
              jsonrpc: "2.0",
              error: {
                code: ErrorCode.InvalidRequest,
                message: errorMessage
              },
              id: null
            }));
          } else {
            const errorMessage = `Error checking Obsidian configuration in ${vaultPath}: ${error instanceof Error ? error.message : String(error)}`;
            console.error(`Error: ${errorMessage}`);
            
            process.stdout.write(JSON.stringify({
              jsonrpc: "2.0",
              error: {
                code: ErrorCode.InternalError,
                message: errorMessage
              },
              id: null
            }));
          }
          process.exit(1);
        }

        return absolutePath;
      } catch (error) {
        let errorMessage: string;
        if ((error as NodeJS.ErrnoException).code === 'ENOENT') {
          errorMessage = `Vault directory does not exist: ${vaultPath}`;
        } else if ((error as NodeJS.ErrnoException).code === 'EACCES') {
          errorMessage = `No permission to access vault directory: ${vaultPath}`;
        } else {
          errorMessage = `Error accessing vault path ${vaultPath}: ${error instanceof Error ? error.message : String(error)}`;
        }
        
        console.error(`Error: ${errorMessage}`);
        
        process.stdout.write(JSON.stringify({
          jsonrpc: "2.0",
          error: {
            code: ErrorCode.InvalidRequest,
            message: errorMessage
          },
          id: null
        }));
        
        process.exit(1);
      }
    } catch (error) {
      const errorMessage = `Error processing vault path ${vaultPath}: ${error instanceof Error ? error.message : String(error)}`;
      console.error(`Error: ${errorMessage}`);
      
      process.stdout.write(JSON.stringify({
        jsonrpc: "2.0",
        error: {
          code: ErrorCode.InternalError,
          message: errorMessage
        },
        id: null
      }));
      
      process.exit(1);
    }
  }));

  // Validate number of vaults
  if (vaultArgs.length > MAX_VAULTS) {
    const errorMessage = `Too many vaults specified (${vaultArgs.length})\n` +
      `Maximum number of vaults allowed: ${MAX_VAULTS}\n` +
      `This limit helps prevent performance issues and resource exhaustion`;
    
    console.error(`Error: ${errorMessage}`);
    
    process.stdout.write(JSON.stringify({
      jsonrpc: "2.0",
      error: {
        code: ErrorCode.InvalidRequest,
        message: errorMessage
      },
      id: null
    }));
    
    process.exit(1);
  }

  console.error(`Validating ${vaultArgs.length} vault path${vaultArgs.length > 1 ? 's' : ''}...`);

  // Check if we have any valid paths
  if (normalizedPaths.length === 0) {
    const errorMessage = `No valid vault paths provided\n` +
      `Make sure at least one path points to a valid Obsidian vault`;
    
    console.error(`\nError: ${errorMessage}`);
    
    process.stdout.write(JSON.stringify({
      jsonrpc: "2.0",
      error: {
        code: ErrorCode.InvalidRequest,
        message: errorMessage
      },
      id: null
    }));
    
    process.exit(1);
  } else if (normalizedPaths.length < vaultArgs.length) {
    console.error(`\nWarning: Only ${normalizedPaths.length} out of ${vaultArgs.length} paths were valid`);
    console.error("Some vaults will not be available");
  }

  try {
    // Check for overlapping vault paths
    checkPathOverlap(normalizedPaths);
  } catch (error) {
    const errorMessage = error instanceof McpError ? error.message : String(error);
    console.error(`Error: ${errorMessage}`);
    
    process.stdout.write(JSON.stringify({
      jsonrpc: "2.0",
      error: {
        code: ErrorCode.InvalidRequest,
        message: errorMessage
      },
      id: null
    }));
    
    process.exit(1);
  }

  // Create vault configurations with human-friendly names
  console.error("\nInitializing vaults...");
  const vaults: VaultConfig[] = normalizedPaths.map(vaultPath => {
    // Get the last directory name from the path as the vault name
    const rawName = path.basename(vaultPath);
    const vaultName = sanitizeVaultName(rawName);
    
    // Log the vault name mapping for user reference
    console.error(`Vault "${rawName}" registered as "${vaultName}"`);
    
    return {
      name: vaultName,
      path: vaultPath
    };
  });

  // Ensure vault names are unique by appending numbers if needed
  const uniqueVaults: VaultConfig[] = [];
  const usedNames = new Set<string>();

  vaults.forEach(vault => {
    let uniqueName = vault.name;
    let counter = 1;
    
    // If name is already used, find a unique variant
    if (usedNames.has(uniqueName)) {
      console.error(`Note: Found duplicate vault name "${uniqueName}"`);
      while (usedNames.has(uniqueName)) {
        uniqueName = `${vault.name}-${counter}`;
        counter++;
      }
      console.error(`  → Using "${uniqueName}" instead`);
    }
    
    usedNames.add(uniqueName);
    uniqueVaults.push({
      name: uniqueName,
      path: vault.path
    });
  });
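  // Illustrative: three vaults whose directories are all named "Work" are
  // registered as "work", "work-1" and "work-2".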

  // Log final vault configuration to stderr
  console.error("\nSuccessfully configured vaults:");
  uniqueVaults.forEach(vault => {
    console.error(`- ${vault.name}`);
    console.error(`  Path: ${vault.path}`);
  });
  console.error(`\nTotal vaults: ${uniqueVaults.length}`);
  console.error(""); // Empty line for readability

  try {
    if (uniqueVaults.length === 0) {
      throw new McpError(
        ErrorCode.InvalidRequest,
        'No valid Obsidian vaults provided. Please provide at least one valid vault path.\n\n' +
        'Example usage:\n' +
        '  obsidian-mcp ~/Documents/Obsidian/MyVault\n\n' +
        'The vault directory must:\n' +
        '- Exist and be accessible\n' +
        '- Contain a .obsidian directory (initialize by opening in Obsidian first)\n' +
        '- Have read/write permissions'
      );
    }

    console.error(`Starting Obsidian MCP Server with ${uniqueVaults.length} vault${uniqueVaults.length > 1 ? 's' : ''}...`);
    
    const server = new ObsidianServer(uniqueVaults);
    console.error("Server initialized successfully");

    // Handle graceful shutdown
    let isShuttingDown = false;
    async function shutdown(signal: string) {
      if (isShuttingDown) return;
      isShuttingDown = true;

      console.error(`\nReceived ${signal}, shutting down...`);
      try {
        await server.stop();
        console.error("Server stopped cleanly");
        process.exit(0);
      } catch (error) {
        console.error("Error during shutdown:", error);
        process.exit(1);
      }
    }

    // Register signal handlers
    process.on('SIGINT', () => shutdown('SIGINT')); // Ctrl+C
    process.on('SIGTERM', () => shutdown('SIGTERM')); // Kill command

    // Create vaults Map from unique vaults
    const vaultsMap = new Map(uniqueVaults.map(v => [v.name, v.path]));

    // Register tools with unique vault names
    const tools = [
      createCreateNoteTool(vaultsMap),
      createListAvailableVaultsTool(vaultsMap),
      createEditNoteTool(vaultsMap),
      createSearchVaultTool(vaultsMap),
      createMoveNoteTool(vaultsMap),
      createCreateDirectoryTool(vaultsMap),
      createDeleteNoteTool(vaultsMap),
      createAddTagsTool(vaultsMap),
      createRemoveTagsTool(vaultsMap),
      createRenameTagTool(vaultsMap),
      createReadNoteTool(vaultsMap)
    ];

    for (const tool of tools) {
      try {
        server.registerTool(tool);
      } catch (error) {
        console.error(`Error registering tool ${tool.name}:`, error);
        throw error;
      }
    }

    // All prompts are registered in the server constructor
    console.error("All tools registered successfully");
    console.error("Server starting...\n");

    // Start the server without logging to stdout
    await server.start();
  } catch (error) {
    // Format error for MCP protocol
    const mcpError = error instanceof McpError ? error : new McpError(
      ErrorCode.InternalError,
      error instanceof Error ? error.message : String(error)
    );

    // Write error in MCP protocol format to stdout
    process.stdout.write(JSON.stringify({
      jsonrpc: "2.0",
      error: {
        code: mcpError.code,
        message: mcpError.message
      },
      id: null
    }));

    // Log details to stderr for debugging
    console.error("\nFatal error starting server:");
    console.error(mcpError.message);
    if (error instanceof Error && error.stack) {
      console.error("\nStack trace:");
      console.error(error.stack.split('\n').slice(1).join('\n'));
    }
    
    process.exit(1);
  }
}

main().catch((error) => {
  console.error("Unhandled error:", error);
  process.exit(1);
});

```