# Directory Structure

```
├── .gitignore
├── CODE_OF_CONDUCT.md
├── Dockerfile
├── LICENSE
├── package.json
├── README.md
├── smithery.yaml
├── src
│   ├── index.ts
│   ├── toolDescriptions.ts
│   ├── types.ts
│   └── utils.ts
├── test scripts
│   └── test_client.js
├── test_list_all_backups.js
└── tsconfig.json
```

# Files

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------

```
# Dependencies
node_modules/
npm-debug.log
yarn-debug.log
yarn-error.log
package-lock.json
yarn.lock

# Build output
dist/
build/
*.tsbuildinfo

# Environment variables
.env
.env.local
.env.development.local
.env.test.local
.env.production.local

# IDE and editor files
.idea/
.vscode/
*.swp
*.swo
.DS_Store
Thumbs.db

# Backup files
.code_backups/
.code_emergency_backups/

# Logs
logs/
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Test coverage
coverage/
.nyc_output/

# Temporary files
tmp/
temp/

```

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------

```markdown
[![MseeP.ai Security Assessment Badge](https://mseep.net/pr/hexitex-mcp-backup-server-badge.png)](https://mseep.ai/app/hexitex-mcp-backup-server)

# MCP Backup Server
[![smithery badge](https://smithery.ai/badge/@hexitex/MCP-Backup-Server)](https://smithery.ai/server/@hexitex/MCP-Backup-Server)

A specialized MCP server that provides backup and restoration capabilities for AI agents and code editing tools. Tested in both Cursor and Windsurf editors.

Repository: [https://github.com/hexitex/MCP-Backup-Server](https://github.com/hexitex/MCP-Backup-Server)

## Why Use This (Not Git)

This system serves a different purpose than Git:

**Pros:**
- Creates instant, targeted backups with agent context
- Simpler than Git for single-operation safety
- Preserves thought process and intent in backups
- No commit messages or branching required
- Better for AI agents making critical changes
- Works without repository initialization
- Faster for emergency "save points" during edits

**Cons:**
- Not for long-term version tracking 
- Limited collaboration features
- No merging or conflict resolution
- No distributed backup capabilities
- Not a replacement for proper version control
- Stores complete file copies rather than diffs

**When to use:** Before risky edits, folder restructuring, or when you need quick safety backups with context.

**When to use Git instead:** For proper version history, collaboration, and project management.

## Features
- Preserves agent context and reasoning
- Creates targeted, minimal backups
- Supports file and folder operations
- Maintains version history
- Provides restore safety
- Uses pattern filtering
- Tracks operations
- Allows cancellation

## Setup

### Installing via Smithery

To install Backup Server for Claude Desktop automatically via [Smithery](https://smithery.ai/server/@hexitex/MCP-Backup-Server):

```bash
npx -y @smithery/cli install @hexitex/MCP-Backup-Server --client claude
```

### Installing Manually
```bash
# Install dependencies
npm install

# Build TypeScript files
npm run build

# Start the backup server
npm start
```

## Config

Env:
- `BACKUP_DIR`: Backup directory (./.code_backups)
- `EMERGENCY_BACKUP_DIR`: Emergency backups (./.code_emergency_backups)
- `MAX_VERSIONS`: Version limit (10)

Configure in editor:

Windsurf MCP config:
```json
{
  "mcpServers": {
    "backup": {
      "command": "node",
      "args": ["./dist/index.js"],
      "env": {
        "BACKUP_DIR": "./.code_backups",
        "EMERGENCY_BACKUP_DIR": "./.code_emergency_backups",
        "MAX_VERSIONS": "20"
      }
    }
  }
}
```

Cursor: Create `.cursor/mcp.json` with similar config.
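
For reference, a `.cursor/mcp.json` mirroring the Windsurf settings above might look like this (paths assume the server was built into `./dist`):

```json
{
  "mcpServers": {
    "backup": {
      "command": "node",
      "args": ["./dist/index.js"],
      "env": {
        "BACKUP_DIR": "./.code_backups",
        "EMERGENCY_BACKUP_DIR": "./.code_emergency_backups",
        "MAX_VERSIONS": "20"
      }
    }
  }
}
```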

## Tools

### File Operations
- `backup_create`: Create backup with context
- `backup_list`: List available backups
- `backup_restore`: Restore with safety backup

### Folder Operations  
- `backup_folder_create`: Backup with pattern filtering
- `backup_folder_list`: List folder backups
- `backup_folder_restore`: Restore folder structure

### Management
- `backup_list_all`: List all backups
- `mcp_cancel`: Cancel operations

## When to Use Backups

Only create backups when truly needed:

1. **Before Refactoring**: When changing important code
2. **Before Removing Folders**: When reorganizing project structure
3. **Multiple Related Changes**: When updating several connected files
4. **Resuming Major Work**: When continuing significant changes
5. **Before Restores**: Create safety backup before restoring

Keep backups minimal and purposeful. Document why each backup is needed.

## Rules for Copy-Paste

```
Always try to use the backup MCP server for operations that require a backup, listing backups and restoring backups.
Only backup before critical code changes, folder removal, changes to multiple related files, resuming major work, or restoring files.
Keep backups minimal and focused only on files being changed.
Always provide clear context for why a backup is being created.
Use pattern filters to exclude irrelevant files from folder backups.
Use relative file paths when creating backups.
Create emergency backups before restore operations.
Clean up old backups to maintain system efficiency.
Backup tools: backup_create, backup_list, backup_restore, backup_folder_create, backup_folder_list, backup_folder_restore, backup_list_all, mcp_cancel.
```

## For Human Users

Use simple commands like these. Early in a session you may need to mention the MCP backup tool explicitly:

```
# Back up an important file
"Back up my core file before refactoring"

# Back up a folder before changes
"Create backup of the API folder before restructuring"

# Find previous backups
"Show me my recent backups"

# Restore a previous version
"Restore my core file from this morning"
```

## Agent Examples

### Quick Backups
```json
// Before project changes
{
  "name": "mcp0_backup_folder_create",
  "parameters": {
    "folder_path": "./src",
    "include_pattern": "*.{js,ts}",
    "exclude_pattern": "{node_modules,dist,test}/**",
    "agent_context": "Start auth changes"
  }
}

// Before core fix
{
  "name": "mcp0_backup_create",
  "parameters": {
    "file_path": "./src/core.js",
    "agent_context": "Fix validation"
  }
}
```

### Resume Session
```json
// View recent work
{
  "name": "mcp0_backup_list_all",
  "parameters": {
    "include_pattern": "src/**/*.js"
  }
}

// Get last version
{
  "name": "mcp0_backup_restore",
  "parameters": {
    "file_path": "./src/core.js",
    "timestamp": "20250310-055950-000",
    "create_emergency_backup": true
  }
}
```
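
The timestamps in these calls follow the `YYYYMMDD-HHMMSS-mmm` format documented by the restore tools. A small sketch for turning one into a `Date` (a hypothetical helper for illustration, not part of the server's API):

```javascript
// Parse a backup timestamp of the form YYYYMMDD-HHMMSS-mmm into a Date.
// Hypothetical helper, not part of the server; shown only to clarify the format.
function parseBackupTimestamp(ts) {
  const m = /^(\d{4})(\d{2})(\d{2})-(\d{2})(\d{2})(\d{2})-(\d{3})$/.exec(ts);
  if (!m) throw new Error(`Invalid backup timestamp: ${ts}`);
  // Skip m[0] (the full match); the seven capture groups are the date parts.
  const [, y, mo, d, h, mi, s, ms] = m.map(Number);
  return new Date(y, mo - 1, d, h, mi, s, ms);
}

const when = parseBackupTimestamp('20250310-055950-000');
console.log(when.getFullYear(), when.getMonth() + 1, when.getDate()); // 2025 3 10
```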

### Core Changes
```json
// Critical update
{
  "name": "mcp0_backup_create",
  "parameters": {
    "file_path": "./src/core.js",
    "agent_context": "Add validation"
  }
}

// Module update
{
  "name": "mcp0_backup_folder_create",
  "parameters": {
    "folder_path": "./src/api",
    "include_pattern": "*.js",
    "exclude_pattern": "test/**",
    "agent_context": "Refactor modules"
  }
}
```

### Restore Points
```json
// Check versions
{
  "name": "mcp0_backup_list",
  "parameters": {
    "file_path": "./src/core.js"
  }
}

{
  "name": "mcp0_backup_folder_list",
  "parameters": {
    "folder_path": "./src/api"
  }
}

// File restore
{
  "name": "mcp0_backup_restore",
  "parameters": {
    "file_path": "./src/core.js",
    "timestamp": "20250310-055950-000",
    "create_emergency_backup": true
  }
}

// Folder restore
{
  "name": "mcp0_backup_folder_restore",
  "parameters": {
    "folder_path": "./src/api",
    "timestamp": "20250310-055950-000",
    "create_emergency_backup": true
  }
}
```

### Manage
```json
// List recent
{
  "name": "mcp0_backup_list_all",
  "parameters": {
    "include_pattern": "src/**/*.js"
  }
}

// Stop backup
{
  "name": "mcp0_mcp_cancel",
  "parameters": {
    "operationId": "backup_1234"
  }
}
```

## License
MIT

```

--------------------------------------------------------------------------------
/CODE_OF_CONDUCT.md:
--------------------------------------------------------------------------------

```markdown
# Contributor Covenant Code of Conduct

## Our Pledge

We as members, contributors, and leaders pledge to make participation in our
community a harassment-free experience for everyone, regardless of age, body
size, visible or invisible disability, ethnicity, sex characteristics, gender
identity and expression, level of experience, education, socio-economic status,
nationality, personal appearance, race, religion, or sexual identity
and orientation.

We pledge to act and interact in ways that contribute to an open, welcoming,
diverse, inclusive, and healthy community.

## Our Standards

Examples of behavior that contributes to a positive environment for our
community include:

* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes,
  and learning from the experience
* Focusing on what is best not just for us as individuals, but for the
  overall community

Examples of unacceptable behavior include:

* The use of sexualized language or imagery, and sexual attention or
  advances of any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email
  address, without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a
  professional setting

## Enforcement Responsibilities

Community leaders are responsible for clarifying and enforcing our standards of
acceptable behavior and will take appropriate and fair corrective action in
response to any behavior that they deem inappropriate, threatening, offensive,
or harmful.

Community leaders have the right and responsibility to remove, edit, or reject
comments, commits, code, wiki edits, issues, and other contributions that are
not aligned to this Code of Conduct, and will communicate reasons for moderation
decisions when appropriate.

## Scope

This Code of Conduct applies within all community spaces, and also applies when
an individual is officially representing the community in public spaces.
Examples of representing our community include using an official e-mail address,
posting via an official social media account, or acting as an appointed
representative at an online or offline event.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement at
.
All complaints will be reviewed and investigated promptly and fairly.

All community leaders are obligated to respect the privacy and security of the
reporter of any incident.

## Enforcement Guidelines

Community leaders will follow these Community Impact Guidelines in determining
the consequences for any action they deem in violation of this Code of Conduct:

### 1. Correction

**Community Impact**: Use of inappropriate language or other behavior deemed
unprofessional or unwelcome in the community.

**Consequence**: A private, written warning from community leaders, providing
clarity around the nature of the violation and an explanation of why the
behavior was inappropriate. A public apology may be requested.

### 2. Warning

**Community Impact**: A violation through a single incident or series
of actions.

**Consequence**: A warning with consequences for continued behavior. No
interaction with the people involved, including unsolicited interaction with
those enforcing the Code of Conduct, for a specified period of time. This
includes avoiding interactions in community spaces as well as external channels
like social media. Violating these terms may lead to a temporary or
permanent ban.

### 3. Temporary Ban

**Community Impact**: A serious violation of community standards, including
sustained inappropriate behavior.

**Consequence**: A temporary ban from any sort of interaction or public
communication with the community for a specified period of time. No public or
private interaction with the people involved, including unsolicited interaction
with those enforcing the Code of Conduct, is allowed during this period.
Violating these terms may lead to a permanent ban.

### 4. Permanent Ban

**Community Impact**: Demonstrating a pattern of violation of community
standards, including sustained inappropriate behavior,  harassment of an
individual, or aggression toward or disparagement of classes of individuals.

**Consequence**: A permanent ban from any sort of public interaction within
the community.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage],
version 2.0, available at
https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.

Community Impact Guidelines were inspired by [Mozilla's code of conduct
enforcement ladder](https://github.com/mozilla/diversity).

[homepage]: https://www.contributor-covenant.org

For answers to common questions about this code of conduct, see the FAQ at
https://www.contributor-covenant.org/faq. Translations are available at
https://www.contributor-covenant.org/translations.

```

--------------------------------------------------------------------------------
/tsconfig.json:
--------------------------------------------------------------------------------

```json
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "declaration": true,
    "outDir": "./dist",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "**/*.test.ts"]
}

```

--------------------------------------------------------------------------------
/Dockerfile:
--------------------------------------------------------------------------------

```dockerfile
# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile
# syntax=docker/dockerfile:1

# Builder stage: install dependencies and build TypeScript
FROM node:lts-alpine AS builder
WORKDIR /app

# Install dependencies and build
COPY package.json tsconfig.json ./
COPY src ./src
RUN npm install --ignore-scripts && npm run build

# Final stage: runtime image
FROM node:lts-alpine AS runner
WORKDIR /app

# Copy built artifacts and dependencies
COPY --from=builder /app/dist ./dist
COPY --from=builder /app/node_modules ./node_modules

# Default command to start the MCP server
CMD ["node", "dist/index.js"]

```

--------------------------------------------------------------------------------
/src/types.ts:
--------------------------------------------------------------------------------

```typescript
// Define interfaces for backup operations
export interface BackupMetadata {
  original_path: string;
  original_filename: string;
  timestamp: string;
  created_at: string;
  backup_path: string;
  relative_path: string;
  agent_context?: string; // Optional field for agent conversational context
}

export interface BackupFolderMetadata {
  original_path: string;
  original_foldername: string;
  timestamp: string;
  backup_path: string;
  include_pattern: string | null;
  exclude_pattern: string | null;
  agent_context?: string; // Optional field for agent conversational context
}

export interface BackupResult {
  success?: boolean;
  timestamp?: string;
  original_path?: string;
  original_filename?: string;
  original_foldername?: string;
  backup_path?: string;
  operation_id?: string;
  error?: string;
}

export interface Operation {
  id: string;
  type: string;
  progress: number;
  cancelled: boolean;
  status: string;
}

```

--------------------------------------------------------------------------------
/package.json:
--------------------------------------------------------------------------------

```json
{
  "name": "@modelcontextprotocol/server-backup",
  "version": "1.0.0",
  "description": "MCP server for file backup and restoration",
  "license": "MIT",
  "type": "module",
  "bin": {
    "@modelcontextprotocol/server-backup": "dist/index.js",
    "mcp-server-backup": "dist/index.js"
  },
  "main": "dist/index.js",
  "files": [
    "dist"
  ],
  "engines": {
    "node": ">=16"
  },
  "scripts": {
    "build": "tsc && shx chmod +x dist/index.js",
    "prepare": "npm run build",
    "start": "node dist/index.js",
    "watch": "tsc --watch",
    "test": "node \"test scripts/test_client.js\""
  },
  "keywords": [
    "mcp",
    "backup",
    "modelcontextprotocol"
  ],
  "author": "Rob MCGlade",
  "dependencies": {
    "@modelcontextprotocol/sdk": "0.5.0",
    "@types/minimatch": "^5.1.2",
    "minimatch": "^10.0.1",
    "zod-to-json-schema": "^3.24.3"
  },
  "devDependencies": {
    "@types/node": "^22",
    "shx": "^0.3.4",
    "typescript": "^5.3.3"
  }
}

```

--------------------------------------------------------------------------------
/smithery.yaml:
--------------------------------------------------------------------------------

```yaml
# Smithery configuration file: https://smithery.ai/docs/config#smitheryyaml

startCommand:
  type: stdio
  configSchema:
    # JSON Schema defining the configuration options for the MCP.
    type: object
    properties:
      backupDir:
        type: string
        default: ./.code_backups
        description: Directory to store regular backups
      emergencyBackupDir:
        type: string
        default: ./.code_emergency_backups
        description: Directory to store emergency backups
      maxVersions:
        type: number
        default: 10
        description: Maximum number of backup versions to keep per file/folder
  commandFunction:
    # A JS function that produces the CLI command based on the given config to start the MCP on stdio.
    |-
    (config) => ({ command: 'node', args: ['dist/index.js'], env: { BACKUP_DIR: config.backupDir, EMERGENCY_BACKUP_DIR: config.emergencyBackupDir, MAX_VERSIONS: String(config.maxVersions) } })
  exampleConfig:
    backupDir: ./.code_backups
    emergencyBackupDir: ./.code_emergency_backups
    maxVersions: 20

```
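
The `commandFunction` above is plain JavaScript, so you can evaluate it against the `exampleConfig` to see the command Smithery would launch. A sketch with the function copied verbatim from the YAML:

```javascript
// The commandFunction from smithery.yaml, copied as-is.
const commandFunction = (config) => ({
  command: 'node',
  args: ['dist/index.js'],
  env: {
    BACKUP_DIR: config.backupDir,
    EMERGENCY_BACKUP_DIR: config.emergencyBackupDir,
    MAX_VERSIONS: String(config.maxVersions),
  },
});

// Feed it the exampleConfig values; note maxVersions is coerced to a string,
// since environment variables are always strings.
const result = commandFunction({
  backupDir: './.code_backups',
  emergencyBackupDir: './.code_emergency_backups',
  maxVersions: 20,
});
console.log(result.command, result.env.MAX_VERSIONS); // node "20"
```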

--------------------------------------------------------------------------------
/test_list_all_backups.js:
--------------------------------------------------------------------------------

```javascript
import { spawn } from 'child_process';

// Create a request to the MCP server
const request = {
  jsonrpc: '2.0',
  method: 'tools/call',
  params: {
    name: 'backup_list_all',
    arguments: {
      include_emergency: true
    }
  },
  id: 1
};

// Spawn the MCP server process
const mcp = spawn('node', ['dist/index.js'], {
  stdio: ['pipe', 'pipe', 'pipe']
});

// Send the request to the MCP server
mcp.stdin.write(JSON.stringify(request) + '\n');

// Collect the response from the MCP server
let responseData = '';
mcp.stdout.on('data', (data) => {
  responseData += data.toString();
});

// Handle errors
mcp.stderr.on('data', (data) => {
  console.error(`stderr: ${data}`);
});

// Process the response when the MCP server exits
mcp.on('close', (code) => {
  console.log(`MCP server exited with code ${code}`);
  
  if (responseData) {
    try {
      const response = JSON.parse(responseData);
      console.log('Response from MCP server:');
      console.log(JSON.stringify(response, null, 2));
    } catch (error) {
      console.error('Error parsing response:', error);
      console.log('Raw response:', responseData);
    }
  }
});

```

--------------------------------------------------------------------------------
/src/utils.ts:
--------------------------------------------------------------------------------

```typescript
import { promises as fsPromises } from 'fs';
import { Operation } from './types.js';

// Check if operation was cancelled and return appropriate response if it was
export function checkOperationCancelled(
  operationId: string | null, 
  operations: Map<string, Operation>,
  cleanupFn?: () => void
): { isCancelled: boolean; response?: any } {
  if (operationId && operations.get(operationId)?.cancelled) {
    console.error(`Operation was cancelled`);
    
    // Run cleanup function if provided
    if (cleanupFn) {
      cleanupFn();
    }
    
    return {
      isCancelled: true,
      response: {
        content: [{ type: "text", text: "Operation cancelled" }],
        isError: true
      }
    };
  }
  
  return { isCancelled: false };
}

// Format response with JSON content
export function formatJsonResponse(data: any): any {
  return {
    content: [{ 
      type: "text", 
      text: JSON.stringify(data, null, 2)
    }]
  };
}

// Format error response
export function formatErrorResponse(error: any, operationId: string | null = null): any {
  return {
    content: [{ 
      type: "text", 
      text: JSON.stringify({ 
        error: String(error),
        operationId
      }) 
    }]
  };
}

// Validate required parameters
export function validateRequiredParams(params: Record<string, any>, requiredParams: string[]): void {
  for (const param of requiredParams) {
    if (!params[param]) {
      throw new Error(`Invalid params: ${param} is required`);
    }
  }
}

// Check if file exists and is a file
export async function validateFileExists(filePath: string): Promise<void> {
  let stats;
  try {
    stats = await fsPromises.stat(filePath);
  } catch {
    throw new Error(`File not found: ${filePath}`);
  }
  if (!stats.isFile()) {
    throw new Error(`Not a file: ${filePath}`);
  }
}

// Check if folder exists and is a directory
export async function validateFolderExists(folderPath: string): Promise<void> {
  let stats;
  try {
    stats = await fsPromises.stat(folderPath);
  } catch {
    throw new Error(`Folder not found: ${folderPath}`);
  }
  if (!stats.isDirectory()) {
    throw new Error(`Not a directory: ${folderPath}`);
  }
}

// Ensure directory exists
export async function ensureDirectoryExists(dirPath: string): Promise<void> {
  await fsPromises.mkdir(dirPath, { recursive: true });
}

// Check if path exists
export async function exists(path: string): Promise<boolean> {
  try {
    await fsPromises.stat(path);
    return true;
  } catch {
    return false;
  }
}

```
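
A quick sketch of how `validateRequiredParams` behaves when an argument is missing, with the helper inlined so the example is self-contained:

```javascript
// Inlined copy of validateRequiredParams from src/utils.ts, for illustration.
function validateRequiredParams(params, requiredParams) {
  for (const param of requiredParams) {
    if (!params[param]) {
      throw new Error(`Invalid params: ${param} is required`);
    }
  }
}

// file_path is present, timestamp is not, so the check throws on timestamp.
let message = null;
try {
  validateRequiredParams({ file_path: './src/core.js' }, ['file_path', 'timestamp']);
} catch (err) {
  message = err.message;
}
console.log(message); // "Invalid params: timestamp is required"
```

Note that the truthiness check treats any falsy value (`false`, `0`, `""`) as missing, which matters if a boolean flag such as `create_emergency_backup` were ever routed through it.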

--------------------------------------------------------------------------------
/src/toolDescriptions.ts:
--------------------------------------------------------------------------------

```typescript
import { z } from "zod";
import { zodToJsonSchema } from "zod-to-json-schema";

// Schema definitions
export const BackupCreateSchema = z.object({
  file_path: z.string().describe('Absolute path to the file to backup. This file must exist and be accessible.'),
  agent_context: z.string().optional().describe('Optional agent conversational context to store with the backup metadata. Agents should include the last relevant user instruction or context that explains why this backup is being created.')
});

export const BackupFolderCreateSchema = z.object({
  folder_path: z.string().describe('Absolute path to the folder to backup. This folder must exist and be accessible.'),
  include_pattern: z.string().optional().describe('Optional glob pattern to include specific files (e.g., "*.js")'),
  exclude_pattern: z.string().optional().describe('Optional glob pattern to exclude specific files (e.g., "node_modules/**")'),
  agent_context: z.string().optional().describe('Optional agent conversational context to store with the backup metadata. Agents should include the last relevant user instruction or context that explains why this backup is being created.')
});

export const BackupListSchema = z.object({
  file_path: z.string().describe('Absolute path to the file whose backups you want to list.')
});

export const BackupFolderListSchema = z.object({
  folder_path: z.string().describe('Absolute path to the folder whose backups you want to list.')
});

export const BackupRestoreSchema = z.object({
  file_path: z.string().describe('Absolute path to the file to restore.'),
  timestamp: z.string().describe('Timestamp of the backup version to restore (format: YYYYMMDD-HHMMSS-mmm).'),
  create_emergency_backup: z.boolean().optional().default(true).describe('Whether to create an emergency backup of the current file before restoring.')
});

export const BackupFolderRestoreSchema = z.object({
  folder_path: z.string().describe('Absolute path to the folder to restore.'),
  timestamp: z.string().describe('Timestamp of the backup version to restore (format: YYYYMMDD-HHMMSS-mmm).'),
  create_emergency_backup: z.boolean().optional().default(true).describe('Whether to create an emergency backup of the current folder before restoring.')
});

export const CancelSchema = z.object({
  operationId: z.string().describe('ID of the operation to cancel.')
});

// New schema for listing all backups
export const ListAllBackupsSchema = z.object({
  include_pattern: z.string().optional().describe('Optional glob pattern to filter backup files (e.g., "*.js")'),
  exclude_pattern: z.string().optional().describe('Optional glob pattern to exclude backup files (e.g., "node_modules/**")'),
  include_emergency: z.boolean().optional().default(true).describe('Whether to include emergency backups in the results.')
});

// Interface for tool description
export interface ToolDescription {
  name: string;
  description: string;
  usage: string;
  inputSchema: any;
}

// Define tool descriptions with detailed usage instructions
export const toolDescriptions: Record<string, ToolDescription> = {
  backup_create: {
    name: "backup_create",
    description: "Create a backup of a file before making big changes. The backup includes timestamp information and maintains the original directory structure.",
    usage: `Creates a timestamped backup of the specified file.

Parameters:
- file_path: Absolute path to the file to backup
- agent_context: (Optional) Include the last relevant user instruction or context

Best Practices for File Backups:
- Always prefer backing up individual files rather than entire folders when possible
- When modifying a single file, use backup_create instead of backup_folder_create
- For code edits, back up only the specific file being modified
- Agents should always include the user's last instruction as context
- Context should be concise and explain why the backup is being created
- Avoid including sensitive information in the context
- Keep context to 1-2 sentences that capture the purpose of the modification

Example contexts:
"Updating database connection string in config file"
"Fixing bug in login function that was causing authentication failures"

Returns:
- timestamp: Timestamp of the created backup
- backup_path: Path where the backup was stored
- agent_context: The context that was stored with the backup`,
    inputSchema: zodToJsonSchema(BackupCreateSchema)
  },
  backup_list: {
    name: "backup_list",
    description: "List all available backups for a file, including their timestamps and locations. Useful for finding specific versions to restore.",
    usage: `# Backup List Tool

Lists all available backup versions for a specified file.

## When to Use
- Before restoring a file to check available versions
- To audit the backup history of a file
- To verify backup creation was successful
- To check timestamps of previous backups
- To list backups instead of using CLI commands or file system operations

## Best Practices
1. Use absolute paths for reliable file lookup
2. Check the timestamps to identify the right version
3. Review backup locations to ensure proper storage
4. Always use this tool rather than CLI commands to list backups
5. Agents should use this tool instead of file system operations to check backup history
6. Always provide the original file path (not the backup path) to this tool
7. The tool will automatically search in the configured backup location (BACKUP_DIR)

## Important Note for Agents
- Always try the backup_list and backup_folder_list tools first to check for backups
- Only use command-line tools (like Get-ChildItem, ls, dir) as a fallback if:
  1. The MCP tools return an empty array or no response
  2. You need to verify if backups exist in a specific location
  3. The MCP server is not responding or unavailable
- The MCP backup tools provide the correct context and metadata for backups
- When using CLI as fallback, check the configured backup directory (typically ./.code_backups or as specified in environment)
- Never attempt to directly access or manipulate files in the backup directory
- Always use the original file path when listing backups, not paths within the backup directory

## Example Usage
\`\`\`json
{
  "name": "backup_list",
  "arguments": {
    "file_path": "C:/path/to/important/file.js"
  }
}
\`\`\`

## Response Format
Returns an array of backup metadata:
- timestamp: When the backup was created
- backup_path: Where to find the backup
- created_at: ISO timestamp of creation
- agent_context: The context that was stored with the backup (if provided)`,
    inputSchema: zodToJsonSchema(BackupListSchema)
  },
  backup_restore: {
    name: "backup_restore",
    description: "Restore a file from a previous backup using its timestamp. Use this to revert changes or recover previous versions.",
    usage: `# Backup Restore Tool

Restores a file to a previous version using a specific backup timestamp.

## When to Use
- To revert unwanted changes
- To recover from failed modifications
- When comparing different versions of a file
- After unsuccessful code changes

## Best Practices
1. List available backups first to get the correct timestamp
2. Create a new backup before restoring (backup of current state)
3. Verify file permissions before restoration
4. Use absolute paths for reliable restoration

## Example Usage
\`\`\`json
{
  "name": "backup_restore",
  "arguments": {
    "file_path": "C:/path/to/important/file.js",
    "timestamp": "20250309-120000-123"
  }
}
\`\`\`

## Response Format
Confirms restoration with:
- restored_path: Path to the restored file
- timestamp: Backup version used`,
    inputSchema: zodToJsonSchema(BackupRestoreSchema)
  },
  backup_folder_create: {
    name: "backup_folder_create",
    description: "Create a backup of a folder before making structural changes. The backup includes timestamp information and maintains the original directory structure.",
    usage: `Creates a timestamped backup of the specified folder.

Parameters:
- folder_path: Absolute path to the folder to backup
- include_pattern: (Optional) Glob pattern to include specific files
- exclude_pattern: (Optional) Glob pattern to exclude specific files
- agent_context: (Optional) Include the last relevant user instruction or context

When to Use Folder vs. File Backup:
- Use file backup (backup_create) for single file changes
- Use folder backup (backup_folder_create) ONLY when:
  1. Multiple files in a folder need to be modified together
  2. You're making structural changes to a directory (adding/removing multiple files)
  3. You need to preserve relationships between multiple files

Best Practices for Folder Backups:
- Only backup the specific folder you're modifying, not parent directories
- When removing a subfolder, backup just that subfolder, not the entire parent structure
- For structural changes, backup the smallest unit of the structure being changed
- For project-wide backups at the start of a session, ask the user first
- Agents should always include the user's last instruction as context
- Context should be concise and explain why the backup is being created
- Avoid including sensitive information in the context
- Keep context to 1-2 sentences that capture the purpose of the modification

Example contexts:
"Refactoring authentication module to use JWT tokens"
"Backing up subfolder before removal as requested by user"

Returns:
- timestamp: Timestamp of the created backup
- backup_path: Path where the backup was stored
- agent_context: The context that was stored with the backup
- versions_kept: Number of backup versions maintained`,
    inputSchema: zodToJsonSchema(BackupFolderCreateSchema)
  },
  backup_folder_list: {
    name: "backup_folder_list",
    description: "List all available backups for a folder, including their timestamps and locations. Useful for finding specific versions to restore.",
    usage: `# Backup Folder List Tool

Lists all available backup versions for a specified folder.

## When to Use
- Before restoring a folder to check available versions
- To audit the backup history of a folder
- To verify that a folder backup was created successfully
- To check timestamps of previous folder backups
- To list folder backups without resorting to CLI commands or file system operations

## Best Practices
1. Use absolute paths for reliable folder lookup
2. Check the timestamps to identify the right version
3. Review backup locations to ensure proper storage
4. Always use this tool rather than CLI commands to list backups
5. Agents should use this tool instead of file system operations to check backup history
6. Always provide the original folder path (not the backup path) to this tool
7. The tool will automatically search in the configured backup location (BACKUP_DIR)
8. Only backup folders that you are working on or removing, not the whole directory structure

## Important Note for Agents
- Always try the backup_list and backup_folder_list tools first to check for backups
- Only use command-line tools (like Get-ChildItem, ls, dir) as a fallback if:
  1. The MCP tools return an empty array or no response
  2. You need to verify if backups exist in a specific location
  3. The MCP server is not responding or unavailable
- The MCP backup tools provide the correct context and metadata for backups
- When using CLI as fallback, check the configured backup directory (typically ~/.code_backups or as specified in environment)
- Never attempt to directly access or manipulate files in the backup directory
- Always use the original folder path when listing backups, not paths within the backup directory
- Create a project folder backup at the start of a resumed session
- Create a folder backup before making structural changes to a folder, especially when removing child folders

## Example Usage
\`\`\`json
{
  "name": "backup_folder_list",
  "arguments": {
    "folder_path": "C:/path/to/important/folder"
  }
}
\`\`\`

## Response Format
Returns an array of backup metadata:
- timestamp: When the backup was created
- backup_path: Where to find the backup
- created_at: ISO timestamp of creation
- agent_context: The context that was stored with the backup (if provided)`,
    inputSchema: zodToJsonSchema(BackupFolderListSchema)
  },
  backup_folder_restore: {
    name: "backup_folder_restore",
    description: "Restore a folder from a previous backup using its timestamp. Use this to revert changes or recover previous versions.",
    usage: `# Backup Folder Restore Tool

Restores a folder to a previous version using a specific backup timestamp.

## When to Use
- To revert unwanted changes
- To recover from failed modifications
- When comparing different versions of a folder
- After unsuccessful code changes

## Best Practices
1. List available backups first to get the correct timestamp
2. Create a new backup before restoring (backup of current state)
3. Verify folder permissions before restoration
4. Use absolute paths for reliable restoration

## Example Usage
\`\`\`json
{
  "name": "backup_folder_restore",
  "arguments": {
    "folder_path": "C:/path/to/important/folder",
    "timestamp": "20250309-120000-123"
  }
}
\`\`\`

## Response Format
Confirms restoration with:
- restored_path: Path to the restored folder
- timestamp: Backup version used`,
    inputSchema: zodToJsonSchema(BackupFolderRestoreSchema)
  },
  backup_list_all: {
    name: "backup_list_all",
    description: "List all backup files in both the main backup directory and emergency backup directory.",
    usage: `# List All Backups Tool

Lists all backup files in both the main backup directory and emergency backup directory.

## When to Use
- To get a comprehensive view of all backups across both directories
- To audit all backup files in the system
- To find specific backups using include/exclude patterns
- To check for emergency backups created during restore operations

## Best Practices
1. Use include/exclude patterns to filter results when looking for specific files
2. Set include_emergency to false if you only want to see regular backups
3. Review both directories to ensure proper backup management

## Example Usage
\`\`\`json
{
  "name": "backup_list_all",
  "arguments": {
    "include_pattern": "*.js",
    "exclude_pattern": "node_modules/**",
    "include_emergency": true
  }
}
\`\`\`

## Response Format
Returns an object with two arrays:
- main_backups: Array of backups in the main backup directory
- emergency_backups: Array of backups in the emergency backup directory (if include_emergency is true)

Each backup entry contains:
- path: Full path to the backup file
- type: "file" or "folder" backup
- size: Size of the backup in bytes
- created_at: Creation timestamp
- original_path: Original path of the backed up file/folder (if available from metadata)`,
    inputSchema: zodToJsonSchema(ListAllBackupsSchema)
  },
  mcp_cancel: {
    name: "mcp_cancel",
    description: "Cancel an ongoing backup or restore operation. Use this to stop long-running operations safely.",
    usage: `# Operation Cancel Tool

Cancels an in-progress backup or restore operation.

## When to Use
- To stop a long-running backup
- When the wrong file was selected
- If an operation appears stuck
- To free up system resources

## Best Practices
1. Keep track of operation IDs from responses
2. Check operation status before cancelling
3. Verify the operation was actually cancelled

## Example Usage
\`\`\`json
{
  "name": "mcp_cancel",
  "arguments": {
    "operationId": "abc123-xyz789"
  }
}
\`\`\`

## Response Format
Confirms cancellation with:
- operationId: ID of cancelled operation
- status: Final operation status`,
    inputSchema: zodToJsonSchema(CancelSchema)
  }
};

```
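
The descriptions above are served over MCP's JSON-RPC interface; each tool is invoked with a `tools/call` request whose `params` carry the tool name and its arguments. A minimal sketch of that envelope (the `buildToolCall` helper and the `id` value are illustrative, not part of the server's API):

```javascript
// Build a JSON-RPC 2.0 envelope for an MCP tools/call request.
// buildToolCall is an illustrative helper, not part of this server's API.
function buildToolCall(name, args, id) {
  return {
    jsonrpc: "2.0",
    method: "tools/call",
    params: { name, arguments: args },
    id,
  };
}

const request = buildToolCall(
  "backup_restore",
  { file_path: "C:/path/to/important/file.js", timestamp: "20250309-120000-123" },
  "1"
);

// Requests travel over stdio as newline-delimited JSON.
const wire = JSON.stringify(request) + "\n";
console.log(wire.trim());
```

The test client below builds exactly this shape of request before writing it to the server's stdin.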

--------------------------------------------------------------------------------
/test scripts/test_client.js:
--------------------------------------------------------------------------------

```javascript
import fs from 'fs';
import path from 'path';
import { spawn } from 'child_process';
import { fileURLToPath } from 'url';

// Get current directory
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);

// Create a test file
const testDir = path.join(__dirname, 'test_files');
const testFile = path.join(testDir, 'test_file.txt');

// Create a test folder structure
const testFolderStructure = path.join(testDir, 'test_folder_structure');
const testSubFolder1 = path.join(testFolderStructure, 'subfolder1');
const testSubFolder2 = path.join(testFolderStructure, 'subfolder2');
const testFileInFolder1 = path.join(testSubFolder1, 'file1.txt');
const testFileInFolder2 = path.join(testSubFolder2, 'file2.txt');

// Ensure test directory exists
if (!fs.existsSync(testDir)) {
  fs.mkdirSync(testDir, { recursive: true });
}

// Create or update test file with content
fs.writeFileSync(testFile, `This is a test file created at ${new Date().toISOString()}`);
console.log(`Created test file at: ${testFile}`);

// Create test folder structure
if (!fs.existsSync(testFolderStructure)) {
  fs.mkdirSync(testFolderStructure, { recursive: true });
}
if (!fs.existsSync(testSubFolder1)) {
  fs.mkdirSync(testSubFolder1, { recursive: true });
}
if (!fs.existsSync(testSubFolder2)) {
  fs.mkdirSync(testSubFolder2, { recursive: true });
}

// Create test files in subfolders
fs.writeFileSync(testFileInFolder1, `This is a test file in subfolder1 created at ${new Date().toISOString()}`);
fs.writeFileSync(testFileInFolder2, `This is a test file in subfolder2 created at ${new Date().toISOString()}`);
console.log(`Created test folder structure at: ${testFolderStructure}`);

// Start the server in a separate process
const server = spawn('node', ['dist/index.js'], {
  stdio: ['pipe', 'pipe', 'inherit'],
  env: {
    ...process.env,
    BACKUP_DIR: path.join(__dirname, 'test_backups'),
    MAX_VERSIONS: '3'
  }
});

// Function to send a JSON-RPC request and get the response
function sendRequest(request) {
  return new Promise((resolve, reject) => {
    console.log(`Sending request: ${JSON.stringify(request)}`);
    
    // Fail the request if no matching response arrives in time
    const timer = setTimeout(() => {
      server.stdout.removeListener('data', responseHandler);
      reject(new Error('Request timed out'));
    }, 10000);
    
    // Set up response handler
    const responseHandler = (data) => {
      const lines = data.toString().split('\n');
      
      for (const line of lines) {
        if (!line.trim()) continue;
        
        try {
          const response = JSON.parse(line);
          
          // If this is a response to our request
          if (response.id === request.id) {
            clearTimeout(timer);
            server.stdout.removeListener('data', responseHandler);
            resolve(response);
            return;
          }
        } catch (error) {
          console.error(`Error parsing response: ${line}`);
        }
      }
    };
    
    server.stdout.on('data', responseHandler);
    
    // Send the request
    server.stdin.write(JSON.stringify(request) + '\n');
  });
}

// Run tests
async function runTests() {
  try {
    // Wait for server to start
    await new Promise(resolve => setTimeout(resolve, 1000));
    
    // Test 1: List available tools
    console.log('\n=== Test 1: List Tools ===');
    const toolsResponse = await sendRequest({
      jsonrpc: '2.0',
      method: 'tools/list',
      params: {},
      id: Date.now().toString()
    });
    console.log('Available tools:', JSON.stringify(toolsResponse.result, null, 2));
    
    // Test 2: Create backup with agent context
    console.log('\n=== Test 2: Create Backup with Agent Context ===');
    const createResult = await sendRequest({
      jsonrpc: '2.0',
      method: 'tools/call',
      params: { 
        name: 'backup_create', 
        arguments: { 
          file_path: testFile,
          agent_context: "This is a sample agent context for file backup. It could contain the last part of a conversation or other metadata."
        }
      },
      id: Date.now().toString()
    });
    console.log('Backup created:', JSON.stringify(createResult.result, null, 2));
    
    // Test 3: List backups
    console.log('\n=== Test 3: List Backups ===');
    const listResult = await sendRequest({
      jsonrpc: '2.0',
      method: 'tools/call',
      params: { 
        name: 'backup_list', 
        arguments: { file_path: testFile }
      },
      id: Date.now().toString()
    });
    console.log('Backups list:', JSON.stringify(listResult.result, null, 2));
    
    // Test 4: Create another backup with different agent context
    console.log('\n=== Test 4: Create Another Backup with Different Agent Context ===');
    const createResult2 = await sendRequest({
      jsonrpc: '2.0',
      method: 'tools/call',
      params: { 
        name: 'backup_create', 
        arguments: { 
          file_path: testFile,
          agent_context: "This is a different agent context for the second backup. We can see how multiple backups store different context information."
        }
      },
      id: Date.now().toString()
    });
    console.log('Second backup created:', JSON.stringify(createResult2.result, null, 2));
    
    // Test 5: List backups again
    console.log('\n=== Test 5: List Backups Again ===');
    const listResult2 = await sendRequest({
      jsonrpc: '2.0',
      method: 'tools/call',
      params: { 
        name: 'backup_list', 
        arguments: { file_path: testFile }
      },
      id: Date.now().toString()
    });
    console.log('Updated backups list:', JSON.stringify(listResult2.result, null, 2));

    // Parse the content field from the response
    let backups = [];
    if (listResult2.result && listResult2.result.content && listResult2.result.content.length > 0) {
      try {
        backups = JSON.parse(listResult2.result.content[0].text);
      } catch (err) {
        console.error('Error parsing backups list:', err);
      }
    }
    
    // Test 6: Restore the first backup
    if (backups && backups.length > 0) {
      console.log('\n=== Test 6: Restore Backup ===');
      const timestamp = backups[0].timestamp;
      const restoreResult = await sendRequest({
        jsonrpc: '2.0',
        method: 'tools/call',
        params: {
          name: 'backup_restore', 
          arguments: {
            file_path: testFile,
            timestamp: timestamp
          }
        },
        id: Date.now().toString()
      });
      console.log('Restore result:', JSON.stringify(restoreResult.result, null, 2));
    } else {
      console.log('No backups found to restore');
    }
    
    // Test 7: Get Tool Documentation
    console.log('\n=== Test 7: Get Tool Documentation ===');
    const describeResponse = await sendRequest({
      jsonrpc: '2.0',
      method: 'tools/describe',
      params: {
        name: 'backup_create'
      },
      id: Date.now().toString()
    });
    console.log(`Tool documentation: ${JSON.stringify(describeResponse, null, 2)}`);
    
    // Test 8: Create folder backup with agent context
    console.log('\n=== Test 8: Create Folder Backup with Agent Context ===');
    const folderCreateResult = await sendRequest({
      jsonrpc: '2.0',
      method: 'tools/call',
      params: { 
        name: 'backup_folder_create', 
        arguments: { 
          folder_path: testFolderStructure,
          include_pattern: "*.txt",
          agent_context: "This is a sample agent context for folder backup. It demonstrates storing context with folder backups."
        }
      },
      id: Date.now().toString()
    });
    console.log('Folder backup created:', JSON.stringify(folderCreateResult.result, null, 2));
    
    // Test 9: List folder backups
    console.log('\n=== Test 9: List Folder Backups ===');
    const folderListResult = await sendRequest({
      jsonrpc: '2.0',
      method: 'tools/call',
      params: { 
        name: 'backup_folder_list', 
        arguments: { folder_path: testFolderStructure }
      },
      id: Date.now().toString()
    });
    console.log('Folder backups list:', JSON.stringify(folderListResult.result, null, 2));
    
    // Test 10: Create another folder backup
    console.log('\n=== Test 10: Create Another Folder Backup ===');
    const folderCreateResult2 = await sendRequest({
      jsonrpc: '2.0',
      method: 'tools/call',
      params: { 
        name: 'backup_folder_create', 
        arguments: { folder_path: testFolderStructure }
      },
      id: Date.now().toString()
    });
    console.log('Second folder backup created:', JSON.stringify(folderCreateResult2.result, null, 2));
    
    // Test 11: List folder backups again
    console.log('\n=== Test 11: List Folder Backups Again ===');
    const folderListResult2 = await sendRequest({
      jsonrpc: '2.0',
      method: 'tools/call',
      params: { 
        name: 'backup_folder_list', 
        arguments: { folder_path: testFolderStructure }
      },
      id: Date.now().toString()
    });
    console.log('Updated folder backups list:', JSON.stringify(folderListResult2.result, null, 2));

    // Parse the content field from the response
    let folderBackups = [];
    if (folderListResult2.result && folderListResult2.result.content && folderListResult2.result.content.length > 0) {
      try {
        folderBackups = JSON.parse(folderListResult2.result.content[0].text);
      } catch (err) {
        console.error('Error parsing folder backups list:', err);
      }
    }
    
    // Test 12: Restore the first folder backup
    if (folderBackups && folderBackups.length > 0) {
      console.log('\n=== Test 12: Restore Folder Backup ===');
      const timestamp = folderBackups[0].timestamp;
      
      // Modify a file in the folder to verify restoration
      fs.writeFileSync(testFileInFolder1, `This file was modified before restore at ${new Date().toISOString()}`);
      console.log(`Modified test file before restore: ${testFileInFolder1}`);
      
      const folderRestoreResult = await sendRequest({
        jsonrpc: '2.0',
        method: 'tools/call',
        params: {
          name: 'backup_folder_restore', 
          arguments: {
            folder_path: testFolderStructure,
            timestamp: timestamp
          }
        },
        id: Date.now().toString()
      });
      console.log('Folder restore result:', JSON.stringify(folderRestoreResult.result, null, 2));
      
      // Verify the file was restored
      const restoredContent = fs.readFileSync(testFileInFolder1, 'utf8');
      console.log(`Restored file content: ${restoredContent}`);
    } else {
      console.log('No folder backups found to restore');
    }
    
    // Test 13: Restore with emergency backup creation
    if (folderBackups && folderBackups.length > 0) {
      console.log('\n=== Test 13: Restore with Emergency Backup ===');
      const timestamp = folderBackups[0].timestamp;
      
      // Modify a file in the folder to verify restoration and emergency backup
      fs.writeFileSync(testFileInFolder1, `This file was modified before emergency backup restore at ${new Date().toISOString()}`);
      console.log(`Modified test file before emergency backup restore: ${testFileInFolder1}`);
      
      const emergencyRestoreResult = await sendRequest({
        jsonrpc: '2.0',
        method: 'tools/call',
        params: {
          name: 'backup_folder_restore', 
          arguments: {
            folder_path: testFolderStructure,
            timestamp: timestamp,
            create_emergency_backup: true
          }
        },
        id: Date.now().toString()
      });
      console.log('Folder restore with emergency backup result:', JSON.stringify(emergencyRestoreResult.result, null, 2));
    }
    
    // Test 14: List all backups including emergency backups
    console.log('\n=== Test 14: List All Backups Including Emergency Backups ===');
    const listAllResult = await sendRequest({
      jsonrpc: '2.0',
      method: 'tools/call',
      params: { 
        name: 'backup_list_all', 
        arguments: { include_emergency: true }
      },
      id: Date.now().toString()
    });
    console.log('All backups list:', JSON.stringify(listAllResult.result, null, 2));
    
    // Test 15: Verify emergency backups have metadata
    console.log('\n=== Test 15: Verify Emergency Backups Have Metadata ===');
    let emergencyBackups = [];
    if (listAllResult.result && listAllResult.result.content && listAllResult.result.content.length > 0) {
      try {
        const allBackups = JSON.parse(listAllResult.result.content[0].text);
        emergencyBackups = allBackups.emergency_backups || [];
        console.log(`Found ${emergencyBackups.length} emergency backups with metadata`);
        
        // Check if we have emergency backups with metadata
        if (emergencyBackups.length > 0) {
          console.log('Emergency backups with metadata found:', JSON.stringify(emergencyBackups, null, 2));
        } else {
          console.log('No emergency backups with metadata found. This may indicate an issue with emergency backup metadata creation.');
        }
      } catch (err) {
        console.error('Error parsing all backups list:', err);
      }
    }
    
    // Test 16: File restore with emergency backup
    console.log('\n=== Test 16: File Restore with Emergency Backup ===');
    // Modify test file
    fs.writeFileSync(testFile, `This file was modified before emergency backup restore at ${new Date().toISOString()}`);
    console.log(`Modified test file before emergency backup restore: ${testFile}`);
    
    // Get the latest backup timestamp
    const latestFileBackups = await sendRequest({
      jsonrpc: '2.0',
      method: 'tools/call',
      params: { 
        name: 'backup_list', 
        arguments: { file_path: testFile }
      },
      id: Date.now().toString()
    });
    
    let fileBackups = [];
    if (latestFileBackups.result && latestFileBackups.result.content && latestFileBackups.result.content.length > 0) {
      try {
        fileBackups = JSON.parse(latestFileBackups.result.content[0].text);
      } catch (err) {
        console.error('Error parsing file backups list:', err);
      }
    }
    
    if (fileBackups && fileBackups.length > 0) {
      const fileTimestamp = fileBackups[0].timestamp;
      
      const fileEmergencyRestoreResult = await sendRequest({
        jsonrpc: '2.0',
        method: 'tools/call',
        params: {
          name: 'backup_restore', 
          arguments: {
            file_path: testFile,
            timestamp: fileTimestamp,
            create_emergency_backup: true
          }
        },
        id: Date.now().toString()
      });
      console.log('File restore with emergency backup result:', JSON.stringify(fileEmergencyRestoreResult.result, null, 2));
      
      // List all backups again to verify the new emergency backup
      const finalListAllResult = await sendRequest({
        jsonrpc: '2.0',
        method: 'tools/call',
        params: { 
          name: 'backup_list_all', 
          arguments: { include_emergency: true }
        },
        id: Date.now().toString()
      });
      
      // Check for new emergency backups
      let finalEmergencyBackups = [];
      if (finalListAllResult.result && finalListAllResult.result.content && finalListAllResult.result.content.length > 0) {
        try {
          const finalAllBackups = JSON.parse(finalListAllResult.result.content[0].text);
          finalEmergencyBackups = finalAllBackups.emergency_backups || [];
          console.log(`Found ${finalEmergencyBackups.length} emergency backups with metadata after file restore`);
          
          // Check if we have more emergency backups than before
          if (finalEmergencyBackups.length > emergencyBackups.length) {
            console.log('New emergency backup with metadata created successfully!');
          } else {
            console.log('No new emergency backup metadata found. This may indicate an issue with file emergency backup metadata creation.');
          }
        } catch (err) {
          console.error('Error parsing final all backups list:', err);
        }
      }
    } else {
      console.log('No file backups found to restore');
    }
    
    console.log('\nAll tests completed successfully!');
  } catch (error) {
    console.error('Test failed:', error);
  } finally {
    // Clean up
    server.stdin.end();
    process.exit(0);
  }
}

// Run the tests
runTests();

```
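
The `sendRequest` helper above relies on newline-delimited JSON framing: a single stdout chunk may carry several messages, and responses are matched to pending requests by `id`. That pattern can be isolated into two small functions (the names `parseMessages` and `matchResponse` are illustrative, not part of the test client):

```javascript
// Split a raw stdio chunk into parsed JSON-RPC messages, skipping
// blank lines and fragments that fail to parse.
function parseMessages(chunk) {
  const messages = [];
  for (const line of chunk.toString().split("\n")) {
    if (!line.trim()) continue;
    try {
      messages.push(JSON.parse(line));
    } catch {
      // Ignore partial or malformed lines.
    }
  }
  return messages;
}

// Find the response whose id matches a pending request, if present.
function matchResponse(messages, requestId) {
  return messages.find((m) => m.id === requestId) ?? null;
}

const chunk = '{"jsonrpc":"2.0","id":"42","result":{}}\n\nnot json\n';
const msgs = parseMessages(chunk);
console.log(msgs.length, matchResponse(msgs, "42") !== null);
```

Parsing line by line is what makes the test client robust when a progress log and a response arrive in the same chunk.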

--------------------------------------------------------------------------------
/src/index.ts:
--------------------------------------------------------------------------------

```typescript
#!/usr/bin/env node

import fs from 'fs';
import path from 'path';
import crypto from 'crypto';
import { promises as fsPromises } from 'fs';
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { 
  CallToolRequestSchema, 
  ListToolsRequestSchema,
  ToolSchema 
} from "@modelcontextprotocol/sdk/types.js";
import { z } from "zod";
import os from 'os';
import { minimatch } from 'minimatch';
import { 
  BackupCreateSchema, 
  BackupListSchema, 
  BackupRestoreSchema,
  BackupFolderCreateSchema,
  BackupFolderListSchema,
  BackupFolderRestoreSchema,
  ListAllBackupsSchema,
  CancelSchema,
  toolDescriptions 
} from './toolDescriptions.js';
import { 
  BackupMetadata, 
  BackupFolderMetadata, 
  BackupResult, 
  Operation 
} from './types.js';
import { 
  checkOperationCancelled, 
  formatJsonResponse, 
  formatErrorResponse, 
  validateRequiredParams,
  validateFileExists,
  validateFolderExists,
  exists
} from './utils.js';

// Type for tool input
const ToolInputSchema = ToolSchema.shape.inputSchema;
type ToolInput = z.infer<typeof ToolInputSchema>;

// Create a local ensureDirectoryExists function to avoid conflict with the imported one
async function ensureBackupDirectoryExists(dirPath: string): Promise<void> {
  try {
    await fsPromises.mkdir(dirPath, { recursive: true });
  } catch (error) {
    console.error(`Error creating directory ${dirPath}:`, error);
    throw error;
  }
}

// Constants
const SERVER_VERSION = '1.0.0';
const SERVER_NAME = 'backup-mcp-server';
const BACKUP_DIR = process.env.BACKUP_DIR || path.join(os.homedir(), '.code_backups');
const MAX_VERSIONS = parseInt(process.env.MAX_VERSIONS || '10', 10);
const EMERGENCY_BACKUP_DIR = process.env.EMERGENCY_BACKUP_DIR || path.join(os.homedir(), '.code_emergency_backups');

// Normalize backup directory paths for Windows
const BACKUP_DIR_NORMALIZED = path.normalize(BACKUP_DIR);
const EMERGENCY_BACKUP_DIR_NORMALIZED = path.normalize(EMERGENCY_BACKUP_DIR);

// Track current operation
let currentOperationId: string | null = null;

// Map to track operations
const operations = new Map<string, Operation>();

// Report progress for an operation
function reportProgress(operationId: string, progress: number): void {
  // Only report progress if operationId is valid
  if (operationId) {
    console.error(`Operation ${operationId} progress: ${progress}%`);
  }
}

// Update operation progress safely
function updateOperationProgress(operationId: string, progress: number): void {
  const operation = operations.get(operationId);
  if (operation) {
    operation.progress = progress;
  }
}

// Helper function to report progress
function logProgress(progress: number): void {
  if (currentOperationId) {
    updateOperationProgress(currentOperationId, progress);
    reportProgress(currentOperationId, progress);
  }
}

// Generate a backup folder name
function getBackupFolderName(folderPath: string, timestamp: string): string {
  const folderName = path.basename(folderPath);
  return `${folderName}.${timestamp}`;
}

// Create a new operation
function createOperation(type: string, params: any): Operation {
  const id = crypto.randomUUID();
  const operation: Operation = {
    id,
    type,
    progress: 0,
    cancelled: false,
    status: 'running'
  };
  operations.set(id, operation);
  return operation;
}

// Cancel operation
function cancelOperation(operationId: string): boolean {
  const operation = operations.get(operationId);
  if (operation) {
    operation.cancelled = true;
    return true;
  }
  return false;
}

// Create MCP server
const server = new Server(
  {
    name: SERVER_NAME,
    version: SERVER_VERSION,
  },
  {
    capabilities: {
      tools: {},
    },
  },
);

// Initialize server methods if not already initialized
if (!(server as any).methods) {
  (server as any).methods = {};
}

// Define tools
server.setRequestHandler(ListToolsRequestSchema, async () => {
  return {
    tools: Object.values(toolDescriptions).map(tool => ({
      name: tool.name,
      description: tool.description,
      inputSchema: tool.inputSchema as ToolInput,
    }))
  };
});

// Custom schema for tool documentation requests
const DescribeToolRequestSchema = z.object({
  jsonrpc: z.literal('2.0'),
  method: z.literal('tools/describe'),
  params: z.object({
    name: z.string().describe('Name of the tool to describe')
  }),
  id: z.union([z.string(), z.number()])
});

// Implement tool documentation
server.setRequestHandler(DescribeToolRequestSchema, async (request) => {
  const { name } = request.params;
  const toolInfo = toolDescriptions[name];
  
  if (!toolInfo) {
    throw new Error(`Tool '${name}' not found`);
  }

  return {
    content: [{
      type: "text",
      text: toolInfo.usage
    }]
  };
});

// Implement tool handlers
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  // Use the module-level currentOperationId (declared above) instead of
  // shadowing it with a local, so logProgress() can report for this operation.
  
  try {
    const { name, arguments: toolInput } = request.params;
    console.error(`Received request for ${name} with params:`, toolInput);
    
    // Create a unique operation ID for tracking progress
    currentOperationId = createOperation(name, toolInput).id;
    
    switch (name) {
      case "backup_create": {
        const params = toolInput as z.infer<typeof BackupCreateSchema>;
        console.error('Received request for backup_create with params:', params);
        
        // Validate required parameters
        validateRequiredParams(params, ['file_path']);
        
        const filePath = path.normalize(params.file_path);
        
        // Check if file exists
        await validateFileExists(filePath);
        
        // Generate timestamp for the backup
        const timestamp = generateTimestamp();
        
        // Create backup directory
        const backupDir = getBackupDir(filePath);
        await ensureBackupDirectoryExists(backupDir);
        
        // Create backup filename
        const backupFilename = getBackupFilename(filePath, timestamp);
        const backupPath = path.join(backupDir, backupFilename);
        
        // Report progress
        logProgress(10);
        
        // Check if operation was cancelled
        const cancelCheck = checkOperationCancelled(
          currentOperationId, 
          operations,
          () => {}
        );
        if (cancelCheck.isCancelled) return cancelCheck.response;
        
        // Copy the file
        await fsPromises.copyFile(filePath, backupPath);
        
        // Report progress
        logProgress(70);
        
        // Check if operation was cancelled
        const cancelCheck2 = checkOperationCancelled(
          currentOperationId, 
          operations,
          () => {
            // Clean up the partial backup
            if (fs.existsSync(backupPath)) {
              fs.unlinkSync(backupPath);
            }
          }
        );
        if (cancelCheck2.isCancelled) return cancelCheck2.response;
        
        // Create and save metadata
        const metadata = createBackupMetadata(filePath, timestamp, backupPath, params.agent_context);
        const metadataPath = getBackupMetadataFilename(backupPath);
        saveBackupMetadata(metadataPath, metadata);
        
        // Report progress
        logProgress(90);
        
        // Clean up old backups
        const versionsKept = cleanupOldBackups(filePath);
        
        // Report completion
        logProgress(100);
        
        // Return result with versionsKept
        return formatJsonResponse({
          ...metadata,
          versions_kept: versionsKept
        });
      }
      
      case "backup_list": {
        const params = toolInput as z.infer<typeof BackupListSchema>;
        console.error('Received request for backup_list with params:', params);
        
        // Validate required parameters
        validateRequiredParams(params, ['file_path']);
        
        const filePath = path.normalize(params.file_path);
        
        // Report initial progress
        logProgress(0);
        
        // Check if file exists
        await validateFileExists(filePath);
        
        // Report progress
        logProgress(30);
        
        const backups = findBackupsByFilePath(filePath);
        
        // Report progress
        logProgress(70);
        
        // Sort backups by timestamp (newest first)
        backups.sort((a, b) => {
          return b.timestamp.localeCompare(a.timestamp);
        });
        
        // Report completion
        logProgress(100);
        
        return formatJsonResponse(backups);
      }
      
      case "backup_restore": {
        const params = toolInput as z.infer<typeof BackupRestoreSchema>;
        console.error('Received request for backup_restore with params:', params);
        
        // Validate required parameters
        validateRequiredParams(params, ['file_path', 'timestamp']);
        
        const filePath = path.normalize(params.file_path);
        const timestamp = params.timestamp;
        
        // Find the backup
        const backup = await findBackupByTimestamp(filePath, timestamp);
        
        if (!backup) {
          throw new Error(`Backup with timestamp ${timestamp} not found for ${filePath}`);
        }
        
        // Report progress
        logProgress(20);
        
        // Check if operation was cancelled
        const cancelCheck = checkOperationCancelled(
          currentOperationId, 
          operations,
          () => {}
        );
        if (cancelCheck.isCancelled) return cancelCheck.response;
        
        // Ensure the target directory exists
        const targetDir = path.dirname(filePath);
        await ensureBackupDirectoryExists(targetDir);
        
        // Report progress
        logProgress(50);
        
        // Check if operation was cancelled
        const cancelCheck2 = checkOperationCancelled(
          currentOperationId, 
          operations,
          () => {}
        );
        if (cancelCheck2.isCancelled) return cancelCheck2.response;
        
        // Create emergency backup if requested; pass `false` to restoreBackup
        // below so it does not create a second, duplicate emergency backup
        if (params.create_emergency_backup) {
          const emergencyBackupPath = await createEmergencyBackup(filePath);
          if (emergencyBackupPath) {
            console.error(`Created emergency backup at ${emergencyBackupPath}`);
          }
        }
        
        // Copy the backup file to the original location
        await restoreBackup(filePath, timestamp, false);
        
        // Report completion
        logProgress(100);
        
        // Return result
        return formatJsonResponse({
          restored_path: filePath,
          timestamp: timestamp
        });
      }
      
      case "backup_folder_create": {
        const params = toolInput as z.infer<typeof BackupFolderCreateSchema>;
        console.error('Received request for backup_folder_create with params:', params);
        
        // Validate required parameters
        validateRequiredParams(params, ['folder_path']);
        
        const folderPath = path.normalize(params.folder_path);
        
        // Check if folder exists
        await validateFolderExists(folderPath);
        
        // Generate timestamp for the backup
        const timestamp = generateTimestamp();
        
        // Create backup directory
        const backupDir = getBackupDir(folderPath);
        await ensureBackupDirectoryExists(backupDir);
        
        // Create backup folder name
        const backupFolderName = getBackupFolderName(folderPath, timestamp);
        const backupFolderPath = path.join(backupDir, backupFolderName);
        
        // Report progress
        logProgress(10);
        
        // Check if operation was cancelled
        const cancelCheck = checkOperationCancelled(
          currentOperationId, 
          operations,
          () => {}
        );
        if (cancelCheck.isCancelled) return cancelCheck.response;
        
        // Copy the folder
        await copyFolderContents(folderPath, backupFolderPath, params.include_pattern, params.exclude_pattern);
        
        // Report progress
        logProgress(70);
        
        // Check if operation was cancelled
        const cancelCheck2 = checkOperationCancelled(
          currentOperationId, 
          operations,
          () => {
            // Clean up the partial backup
            if (fs.existsSync(backupFolderPath)) {
              // fs.rmdirSync with `recursive` is deprecated; use fs.rmSync
              fs.rmSync(backupFolderPath, { recursive: true, force: true });
            }
          }
        );
        if (cancelCheck2.isCancelled) return cancelCheck2.response;
        
        // Create and save metadata
        const metadata = createBackupMetadata(folderPath, timestamp, backupFolderPath, params.agent_context);
        const metadataPath = `${backupFolderPath}.meta.json`;
        saveBackupMetadata(metadataPath, metadata);
        
        // Report progress
        logProgress(90);
        
        // Clean up old backups
        const versionsKept = cleanupOldBackups(folderPath);
        
        // Report completion
        logProgress(100);
        
        // Return result with versionsKept
        return formatJsonResponse({
          ...metadata,
          versions_kept: versionsKept
        });
      }
      
      case "backup_folder_list": {
        const params = toolInput as z.infer<typeof BackupFolderListSchema>;
        console.error('Received request for backup_folder_list with params:', params);
        
        // Validate required parameters
        validateRequiredParams(params, ['folder_path']);
        
        const folderPath = path.normalize(params.folder_path);
        
        // Report initial progress
        logProgress(0);
        
        // Check if folder exists
        await validateFolderExists(folderPath);
        
        // Report progress
        logProgress(30);
        
        const backups = findBackupsByFolderPath(folderPath);
        
        // Report progress
        logProgress(70);
        
        // Sort backups by timestamp (newest first)
        backups.sort((a, b) => {
          return b.timestamp.localeCompare(a.timestamp);
        });
        
        // Report completion
        logProgress(100);
        
        return formatJsonResponse(backups);
      }
      
      case "backup_folder_restore": {
        const params = toolInput as z.infer<typeof BackupFolderRestoreSchema>;
        console.error('Received request for backup_folder_restore with params:', params);
        
        // Validate required parameters
        validateRequiredParams(params, ['folder_path', 'timestamp']);
        
        const { folder_path, timestamp, create_emergency_backup = true } = params;
        const folderPath = path.normalize(folder_path);
        
        // Check if folder exists
        await validateFolderExists(folderPath);
        
        // Report initial progress
        logProgress(0);
        
        try {
          // Find the backup
          const backups = findBackupsByFolderPath(folderPath);
          const backup = backups.find(b => b.timestamp === timestamp);
          
          if (!backup) {
            throw new Error(`Backup with timestamp ${timestamp} not found for ${folderPath}`);
          }
          
          // Report progress
          logProgress(10);
          
          // Create emergency backup if requested
          let emergencyBackupPath: string | null = null;
          if (create_emergency_backup) {
            emergencyBackupPath = await createEmergencyFolderBackup(folderPath);
          }
          
          // Check if backup path exists
          if (!backup.backup_path || !fs.existsSync(backup.backup_path)) {
            throw new Error(`Backup folder not found: ${backup.backup_path}`);
          }
          
          // Check if operation was cancelled
          const cancelCheck = checkOperationCancelled(
            currentOperationId, 
            operations,
            () => {}
          );
          if (cancelCheck.isCancelled) return cancelCheck.response;
          
          // Copy the backup folder to the original location
          await copyFolderContents(backup.backup_path, folderPath);
          
          // Report completion
          logProgress(100);
          
          return formatJsonResponse({
            restored_path: folderPath,
            timestamp: timestamp,
            emergency_backup_path: emergencyBackupPath
          });
        } catch (error) {
          // Update operation status on error
          const operation = operations.get(currentOperationId);
          if (operation) {
            operation.status = 'error';
          }
          
          throw error;
        }
      }
      
      case "backup_list_all": {
        const params = toolInput as z.infer<typeof ListAllBackupsSchema>;
        console.error('Received request for backup_list_all with params:', params);
        
        // Extract parameters
        const includePattern = params.include_pattern;
        const excludePattern = params.exclude_pattern;
        const includeEmergency = params.include_emergency !== false; // Default to true if not specified
        
        // Create operation for tracking
        const operation = operations.get(currentOperationId);
        if (operation) {
          operation.status = 'running';
        }
        
        // Report initial progress
        logProgress(0);
        
        try {
          // Initialize results object
          const results: {
            main_backups: Array<{
              path: string;
              type: string;
              size: number;
              created_at: string;
              original_path: string | null;
            }>;
            emergency_backups: Array<{
              path: string;
              type: string;
              size: number;
              created_at: string;
              original_path: string | null;
            }>;
          } = {
            main_backups: [],
            emergency_backups: []
          };
          
          // Function to scan a directory and get all backup files
          async function scanBackupDirectory(directory: string, isEmergency: boolean = false) {
            if (!fs.existsSync(directory)) {
              return [];
            }
            
            // Get all files and folders in the directory recursively
            const getAllFiles = async (dir: string, fileList: any[] = []) => {
              const files = await fsPromises.readdir(dir, { withFileTypes: true });
              
              for (const file of files) {
                const filePath = path.join(dir, file.name);
                
                // Check if operation was cancelled
                if (currentOperationId && operations.get(currentOperationId)?.cancelled) {
                  throw new Error('Operation cancelled');
                }
                
                // Apply include/exclude patterns if specified; the include
                // pattern only gates files, since directories must still be
                // traversed to reach matching files inside them
                if (includePattern && !file.isDirectory() && !minimatch(filePath, includePattern)) {
                  continue;
                }
                
                if (excludePattern && minimatch(filePath, excludePattern)) {
                  continue;
                }
                
                if (file.isDirectory()) {
                  fileList = await getAllFiles(filePath, fileList);
                } else {
                  // Check if this is a backup file (has timestamp format in name)
                  const isBackupFile = /\.\d{8}-\d{6}-\d{3}$/.test(file.name);
                  const isMetadataFile = file.name.endsWith('.meta.json');
                  
                  if (isBackupFile || isMetadataFile) {
                    try {
                      const stats = await fsPromises.stat(filePath);
                      
                      // Try to get original path from metadata if this is a backup file
                      let originalPath = null;
                      let backupType = 'unknown';
                      
                      if (isBackupFile) {
                        // Look for corresponding metadata file
                        const metadataPath = `${filePath}.meta.json`;
                        if (await exists(metadataPath)) {
                          try {
                            const metadataContent = await fsPromises.readFile(metadataPath, 'utf8');
                            const metadata = JSON.parse(metadataContent);
                            originalPath = metadata.original_path;
                          } catch (err) {
                            console.error(`Error reading metadata for ${filePath}:`, err);
                          }
                        }
                      } else if (isMetadataFile) {
                        try {
                          const metadataContent = await fsPromises.readFile(filePath, 'utf8');
                          const metadata = JSON.parse(metadataContent);
                          originalPath = metadata.original_path;
                        } catch (err) {
                          console.error(`Error reading metadata file ${filePath}:`, err);
                        }
                      }
                      
                      // Add to appropriate list (always a file at this point;
                      // directories are handled by the recursive branch above)
                      const result = {
                        path: filePath,
                        type: 'file',
                        size: stats.size,
                        created_at: stats.birthtime.toISOString(),
                        original_path: originalPath
                      };
                      
                      if (isEmergency) {
                        results.emergency_backups.push(result);
                      } else {
                        results.main_backups.push(result);
                      }
                      
                      // Update progress periodically
                      if (results.main_backups.length % 10 === 0 || results.emergency_backups.length % 10 === 0) {
                        // Calculate progress based on number of files found
                        const totalFiles = results.main_backups.length + results.emergency_backups.length;
                        // Cap progress at 90% until we're completely done
                        const progress = Math.min(90, Math.floor(totalFiles / 10) * 5);
                        logProgress(progress);
                      }
                    } catch (err) {
                      console.error(`Error processing file ${filePath}:`, err);
                    }
                  }
                }
              }
              
              return fileList;
            };
            
            await getAllFiles(directory);
          }
          
          // Scan main backup directory
          await scanBackupDirectory(BACKUP_DIR_NORMALIZED);
          
          // Report progress after scanning main directory
          logProgress(50);
          
          // Scan emergency backup directory if requested
          if (includeEmergency) {
            console.error('Scanning emergency backup directory:', EMERGENCY_BACKUP_DIR_NORMALIZED);
            if (!fs.existsSync(EMERGENCY_BACKUP_DIR_NORMALIZED)) {
              console.error('Emergency backup directory does not exist, creating it');
              await fsPromises.mkdir(EMERGENCY_BACKUP_DIR_NORMALIZED, { recursive: true });
            }
            await scanBackupDirectory(EMERGENCY_BACKUP_DIR_NORMALIZED, true);
          }
          
          // Report completion
          logProgress(100);
          
          return formatJsonResponse(results);
        } catch (error) {
          // Update operation status on error
          const operation = operations.get(currentOperationId);
          if (operation) {
            operation.status = 'error';
          }
          
          throw error;
        }
      }
        
      case "mcp_cancel": {
        const params = toolInput as z.infer<typeof CancelSchema>;
        console.error('Received request for mcp_cancel with params:', params);
        
        // Validate required parameters
        validateRequiredParams(params, ['operationId']);
        
        const { operationId } = params;
        const cancelled = cancelOperation(operationId);
        
        if (!cancelled) {
          return formatJsonResponse({
            success: false,
            error: `Operation ${operationId} not found or already completed`
          });
        }
        
        return formatJsonResponse({
          success: true,
          operationId,
          status: 'cancelled'
        });
      }
      
      default:
        throw new Error(`Unknown tool: ${name}`);
    }
  } catch (error) {
    console.error('Error handling request:', error);
    return formatErrorResponse(error, currentOperationId);
  }
});

// Utility functions
function generateOperationId(): string {
  return crypto.randomUUID();
}

function generateTimestamp(): string {
  const now = new Date();
  const year = now.getFullYear();
  const month = String(now.getMonth() + 1).padStart(2, '0');
  const day = String(now.getDate()).padStart(2, '0');
  const hours = String(now.getHours()).padStart(2, '0');
  const minutes = String(now.getMinutes()).padStart(2, '0');
  const seconds = String(now.getSeconds()).padStart(2, '0');
  const milliseconds = String(now.getMilliseconds()).padStart(3, '0');
  
  return `${year}${month}${day}-${hours}${minutes}${seconds}-${milliseconds}`;
}
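
// Example output (illustrative, for a hypothetical date 2025-01-05 09:07:03.042):
//   generateTimestamp() → "20250105-090703-042"
// Because the format is fixed-width, localeCompare on two timestamps orders
// them chronologically; note the format is not parseable by `new Date()`.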

function getBackupDir(filePath: string): string {
  // Create a directory structure that mirrors the original file's path
  const normalizedPath = path.normalize(filePath);
  const parsedPath = path.parse(normalizedPath);
  
  // Remove drive letter (on Windows) and create backup path
  let relativePath = parsedPath.dir.replace(/^[a-zA-Z]:/, '');
  
  // Ensure the path is safe by removing leading slashes
  relativePath = relativePath.replace(/^[/\\]+/, '');
  
  // Create the backup directory path
  return path.join(BACKUP_DIR_NORMALIZED, relativePath);
}

function getBackupFilename(filePath: string, timestamp: string): string {
  const parsedPath = path.parse(filePath);
  return `${parsedPath.name}${parsedPath.ext}.${timestamp}`;
}
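
// Example (illustrative):
//   getBackupFilename('src/app.ts', '20250105-090703-042')
//   → 'app.ts.20250105-090703-042'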

function getBackupMetadataFilename(backupFilePath: string): string {
  return `${backupFilePath}.meta.json`;
}

function createBackupMetadata(filePath: string, timestamp: string, backupPath: string, agentContext?: string): BackupMetadata {
  return {
    original_path: filePath,
    original_filename: path.basename(filePath),
    timestamp: timestamp,
    created_at: new Date().toISOString(),
    backup_path: backupPath,
    relative_path: path.relative(process.cwd(), backupPath),
    agent_context: agentContext
  };
}

function saveBackupMetadata(metadataPath: string, metadata: BackupMetadata): void {
  fs.writeFileSync(metadataPath, JSON.stringify(metadata, null, 2));
}

function readBackupMetadata(metadataPath: string): BackupMetadata | BackupFolderMetadata | null {
  try {
    const data = fs.readFileSync(metadataPath, 'utf8');
    return JSON.parse(data);
  } catch (err) {
    console.error(`Error reading metadata: ${err}`);
    return null;
  }
}

function isFolderMetadata(metadata: any): metadata is BackupFolderMetadata {
  // Check if this is a folder metadata by examining the backup_path
  // Folder backups have a directory structure, while file backups have a file
  return metadata && 
    metadata.original_path && 
    metadata.backup_path && 
    !metadata.backup_path.endsWith('.meta.json') &&
    fs.existsSync(metadata.backup_path) && 
    fs.statSync(metadata.backup_path).isDirectory();
}

// Helper function to check if a path is a parent of another path
function isParentPath(parentPath: string, childPath: string): boolean {
  const normalizedParent = path.normalize(parentPath).toLowerCase() + path.sep;
  const normalizedChild = path.normalize(childPath).toLowerCase() + path.sep;
  return normalizedChild.startsWith(normalizedParent);
}
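
// Illustrative behaviour (comparison is case-insensitive):
//   isParentPath('/proj', '/proj/src/a.ts')  → true
//   isParentPath('/proj/src', '/proj')       → false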

// Helper function to recursively search for backup metadata files
function findAllBackupMetadataFiles(directory: string): string[] {
  if (!fs.existsSync(directory)) {
    return [];
  }

  let results: string[] = [];
  const items = fs.readdirSync(directory);

  for (const item of items) {
    const itemPath = path.join(directory, item);
    const stats = fs.statSync(itemPath);

    if (stats.isDirectory()) {
      // Recursively search subdirectories
      results = results.concat(findAllBackupMetadataFiles(itemPath));
    } else if (item.endsWith('.meta.json')) {
      // Add metadata files to results
      results.push(itemPath);
    }
  }

  return results;
}

function findBackupsByFilePath(filePath: string): BackupMetadata[] {
  const backupDir = getBackupDir(filePath);
  const backups: BackupMetadata[] = [];
  
  // Start at the root of the backup directory to find all possible backups
  const rootBackupDir = BACKUP_DIR_NORMALIZED;
  
  // Find all metadata files recursively
  const metadataFiles = findAllBackupMetadataFiles(rootBackupDir);
  
  // Process each metadata file
  for (const metadataPath of metadataFiles) {
    const metadata = readBackupMetadata(metadataPath);
    
    // Check if this backup is for the requested file (exact match)
    if (metadata && metadata.original_path === filePath && !isFolderMetadata(metadata)) {
      backups.push(metadata);
    }
  }
  
  // Sort backups by timestamp, newest first. The fixed-width
  // YYYYMMDD-HHMMSS-mmm timestamps sort chronologically as plain strings;
  // they are not parseable by `new Date()`, which would yield NaN here.
  backups.sort((a, b) => {
    return b.timestamp.localeCompare(a.timestamp);
  });
  
  return backups;
}

function findBackupsByFolderPath(folderPath: string): BackupFolderMetadata[] {
  const backups: BackupFolderMetadata[] = [];
  
  // Start at the root of the backup directory to find all possible backups
  const rootBackupDir = BACKUP_DIR_NORMALIZED;
  
  // Find all metadata files recursively
  const metadataFiles = findAllBackupMetadataFiles(rootBackupDir);
  
  // Process each metadata file
  for (const metadataPath of metadataFiles) {
    try {
      const metadata = readBackupMetadata(metadataPath);
      
      // Check if this backup is for the requested folder (exact match) or any subfolder
      if (metadata && isFolderMetadata(metadata)) {
        // Include if it's an exact match or if the original path is a parent of the requested path
        // or if the requested path is a parent of the original path
        if (metadata.original_path === folderPath || 
            isParentPath(metadata.original_path, folderPath) || 
            isParentPath(folderPath, metadata.original_path)) {
          backups.push(metadata);
        }
      }
    } catch (error) {
      console.error(`Error processing metadata file ${metadataPath}:`, error);
      // Continue processing other metadata files
    }
  }
  
  // Sort backups by timestamp (newest first)
  backups.sort((a, b) => {
    return b.timestamp.localeCompare(a.timestamp);
  });
  
  return backups;
}

async function findBackupByTimestamp(filePath: string, timestamp: string): Promise<BackupMetadata | null> {
  const backupDir = getBackupDir(filePath);
  const backupFilename = getBackupFilename(filePath, timestamp);
  const backupPath = path.join(backupDir, backupFilename);
  const metadataPath = `${backupPath}.meta.json`;
  
  if (fs.existsSync(metadataPath)) {
    const metadata = readBackupMetadata(metadataPath);
    if (metadata && !isFolderMetadata(metadata)) {
      return metadata;
    }
  }
  
  return null;
}

async function findFolderBackupByTimestamp(folderPath: string, timestamp: string): Promise<BackupFolderMetadata | null> {
  const backupDir = getBackupDir(folderPath);
  const backupFolderName = getBackupFolderName(folderPath, timestamp);
  const backupPath = path.join(backupDir, backupFolderName);
  const metadataPath = `${backupPath}.meta.json`;
  
  if (fs.existsSync(metadataPath)) {
    const metadata = readBackupMetadata(metadataPath);
    if (metadata && isFolderMetadata(metadata)) {
      return metadata;
    }
  }
  
  return null;
}

async function listFolderBackups(folderPath: string): Promise<BackupFolderMetadata[]> {
  return findBackupsByFolderPath(folderPath);
}

function cleanupOldBackups(filePath: string): number {
  // Get all backups for this file
  const backups = findBackupsByFilePath(filePath);
  
  // If we have more than MAX_VERSIONS, remove the oldest ones
  if (backups.length > MAX_VERSIONS) {
    // Sort backups by timestamp, oldest first (lexicographic order is
    // chronological for the fixed-width timestamp format; `new Date()`
    // cannot parse it)
    backups.sort((a, b) => {
      return a.timestamp.localeCompare(b.timestamp);
    });
    
    // Remove oldest backups along with their metadata files
    const backupsToRemove = backups.slice(0, backups.length - MAX_VERSIONS);
    for (const backup of backupsToRemove) {
      try {
        fs.unlinkSync(backup.backup_path);
        // Delete the companion metadata file too; otherwise the backup
        // would still show up in findBackupsByFilePath
        const metadataPath = getBackupMetadataFilename(backup.backup_path);
        if (fs.existsSync(metadataPath)) {
          fs.unlinkSync(metadataPath);
        }
        // Log to stderr: stdout is reserved for the stdio MCP transport
        console.error(`Removed old backup: ${backup.backup_path}`);
      } catch (error) {
        console.error(`Error removing old backup: ${backup.backup_path}`, error);
      }
    }
    
    return MAX_VERSIONS;
  }
  
  return backups.length;
}

// Copy folder recursively
async function copyFolderRecursive(sourcePath: string, targetPath: string, includePattern?: string, excludePattern?: string): Promise<void> {
  // Create target folder if it doesn't exist
  if (!fs.existsSync(targetPath)) {
    await fsPromises.mkdir(targetPath, { recursive: true });
  }
  
  // Read source directory
  const entries = fs.readdirSync(sourcePath, { withFileTypes: true });
  
  // Process each entry
  for (const entry of entries) {
    const srcPath = path.join(sourcePath, entry.name);
    const destPath = path.join(targetPath, entry.name);
    
    // Skip excluded files/folders
    if (excludePattern && minimatch(entry.name, excludePattern)) {
      continue;
    }
    
    // Apply the include pattern to files only; directories must still be
    // traversed so matching files inside them are copied
    if (includePattern && !entry.isDirectory() && !minimatch(entry.name, includePattern)) {
      continue;
    }
    
    if (entry.isDirectory()) {
      // Recursively copy subdirectories
      await copyFolderRecursive(srcPath, destPath, includePattern, excludePattern);
    } else {
      // Copy files
      await fsPromises.copyFile(srcPath, destPath);
    }
  }
}

// Copy folder contents helper function
async function copyFolderContents(sourcePath: string, targetPath: string, includePattern?: string, excludePattern?: string): Promise<void> {
  if (!sourcePath || !targetPath) {
    throw new Error('Source and target paths are required');
  }
  
  // Ensure target directory exists
  await fsPromises.mkdir(targetPath, { recursive: true });
  
  // Copy folder contents
  await copyFolderRecursive(sourcePath, targetPath, includePattern, excludePattern);
}

// Ensure emergency backup directory exists
async function ensureEmergencyBackupDir(): Promise<void> {
  if (!fs.existsSync(EMERGENCY_BACKUP_DIR_NORMALIZED)) {
    await fsPromises.mkdir(EMERGENCY_BACKUP_DIR_NORMALIZED, { recursive: true });
  }
}

// Create emergency backup of a file before restoration
async function createEmergencyBackup(filePath: string): Promise<string | null> {
  try {
    if (!fs.existsSync(filePath)) {
      console.error(`File not found for emergency backup: ${filePath}`);
      return null;
    }
    
    await ensureEmergencyBackupDir();
    const timestamp = generateTimestamp();
    const fileName = path.basename(filePath);
    
    // Create a directory structure that mirrors the original file's path
    const normalizedPath = path.normalize(filePath);
    const parsedPath = path.parse(normalizedPath);
    
    // Remove drive letter (on Windows) and create backup path
    let relativePath = parsedPath.dir.replace(/^[a-zA-Z]:/, '');
    
    // Ensure the path is safe by removing leading slashes
    relativePath = relativePath.replace(/^[/\\]+/, '');
    
    // Create the emergency backup directory path
    const emergencyBackupDir = path.join(EMERGENCY_BACKUP_DIR_NORMALIZED, relativePath);
    
    // Ensure the directory structure exists
    await fsPromises.mkdir(emergencyBackupDir, { recursive: true });
    
    // Create the emergency backup file path
    const backupPath = path.join(emergencyBackupDir, `${parsedPath.name}${parsedPath.ext}.emergency.${timestamp}`);
    
    // Copy file to emergency backup location
    await fsPromises.copyFile(filePath, backupPath);
    
    // Create metadata next to the emergency backup file so that
    // backup_list_all can associate it with the backup
    const metadata = createBackupMetadata(filePath, timestamp, backupPath, "Emergency backup created before restoration");
    const metadataPath = getBackupMetadataFilename(backupPath);
    await fsPromises.writeFile(metadataPath, JSON.stringify(metadata, null, 2));
    
    return backupPath;
  } catch (error) {
    console.error('Error creating emergency backup:', error);
    return null;
  }
}

// Create emergency backup of a folder before restoration
async function createEmergencyFolderBackup(folderPath: string): Promise<string | null> {
  try {
    if (!fs.existsSync(folderPath)) {
      console.error(`Folder not found for emergency backup: ${folderPath}`);
      return null;
    }
    
    await ensureEmergencyBackupDir();
    const timestamp = generateTimestamp();
    
    // Create a directory structure that mirrors the original folder's path
    const normalizedPath = path.normalize(folderPath);
    const parsedPath = path.parse(normalizedPath);
    
    // Remove drive letter (on Windows) and create backup path
    let relativePath = parsedPath.dir.replace(/^[a-zA-Z]:/, '');
    
    // Ensure the path is safe by removing leading slashes
    relativePath = relativePath.replace(/^[/\\]+/, '');
    
    // Create the emergency backup directory path
    const emergencyBackupDir = path.join(EMERGENCY_BACKUP_DIR_NORMALIZED, relativePath);
    
    // Ensure the directory structure exists
    await fsPromises.mkdir(emergencyBackupDir, { recursive: true });
    
    // Create the emergency backup folder path
    const backupPath = path.join(emergencyBackupDir, `${parsedPath.name}.emergency.${timestamp}`);
    
    // Copy folder to emergency backup location
    await copyFolderContents(folderPath, backupPath);
    
    // Create metadata file for the emergency backup
    const metadata = {
      original_path: folderPath,
      original_filename: path.basename(folderPath),
      timestamp: timestamp,
      created_at: new Date().toISOString(),
      backup_path: backupPath,
      relative_path: path.relative(process.cwd(), backupPath),
      agent_context: "Emergency backup created before restoration"
    };
    // Store metadata next to the emergency backup folder so listing tools can find it
    const metadataPath = `${backupPath}.meta.json`;
    await fsPromises.writeFile(metadataPath, JSON.stringify(metadata, null, 2));
    
    return backupPath;
  } catch (error) {
    console.error('Error creating emergency folder backup:', error);
    return null;
  }
}

// Report progress and status for a tracked operation
async function mcp_backup_status(params: { operationId: string }): Promise<{ progress: number, status: string }> {
  const { operationId } = params;
  
  if (!operationId) {
    return { progress: 0, status: 'error' };
  }
  
  // Check if operation exists
  if (operations.has(operationId)) {
    const operation = operations.get(operationId);
    if (operation) {
      return {
        progress: operation.progress,
        status: operation.cancelled ? 'cancelled' : operation.progress >= 100 ? 'completed' : 'in_progress'
      };
    }
  }
  
  return { progress: 0, status: 'not_found' };
}

// Restore backup function
async function restoreBackup(filePath: string, timestamp: string, createEmergencyBackupFlag: boolean = false): Promise<void> {
  // Find the backup
  const backups = findBackupsByFilePath(filePath);
  const backup = backups.find(b => b.timestamp === timestamp);
  
  if (!backup) {
    throw new Error(`Backup with timestamp ${timestamp} not found for ${filePath}`);
  }
  
  // Create emergency backup if requested
  if (createEmergencyBackupFlag) {
    const emergencyBackupPath = await createEmergencyBackup(filePath);
    if (emergencyBackupPath) {
      // Log to stderr: stdout is reserved for the stdio MCP transport
      console.error(`Created emergency backup at: ${emergencyBackupPath}`);
    }
  }
  
  // Get backup path
  const backupPath = backup.backup_path;
  
  // Check if backup exists
  if (!backupPath || !fs.existsSync(backupPath)) {
    throw new Error(`Backup file not found: ${backupPath}`);
  }
  
  // Ensure the directory for the original file exists; the original file
  // itself may have been deleted, and restoring it should still succeed
  await fsPromises.mkdir(path.dirname(filePath), { recursive: true });
  
  // Restore backup by copying it to original location
  await fsPromises.copyFile(backupPath, filePath);
}

// Start the server with stdio transport
const transport = new StdioServerTransport();
server.connect(transport).catch((error: Error) => {
  console.error("Fatal error running server:", error);
  process.exit(1);
});

```