# Directory Structure
```
├── .clinerules
├── .github
│   └── workflows
│       └── npm-publish.yml
├── .gitignore
├── bun.lock
├── LICENSE
├── memory-bank
│   ├── activeContext.md
│   ├── productContext.md
│   ├── progress.md
│   ├── projectbrief.md
│   ├── systemPatterns.md
│   └── techContext.md
├── package-lock.json
├── package.json
├── readme.md
└── src
    ├── handlers
    │   ├── advanced-operations.js
    │   ├── branch-operations.js
    │   ├── commit-operations.js
    │   ├── common.js
    │   ├── config-operations.js
    │   ├── directory-operations.js
    │   ├── index.js
    │   ├── other-operations.js
    │   ├── remote-operations.js
    │   ├── stash-operations.js
    │   └── tag-operations.js
    ├── index.js
    ├── server.js
    └── utils
        └── git.js
```
# Files
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
```
# Dependency directories
node_modules/
jspm_packages/
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*
# Runtime data
pids
*.pid
*.seed
*.pid.lock
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
*.lcov
# nyc test coverage
.nyc_output
# Grunt intermediate storage (https://gruntjs.com/creating-plugins#storing-task-files)
.grunt
# Bower dependency directory (https://bower.io/)
bower_components
# node-waf configuration
.lock-wscript
# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release
# TypeScript cache
*.tsbuildinfo
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
*.tgz
# Yarn Integrity file
.yarn-integrity
# dotenv environment variables file
.env
.env.test
# parcel-bundler cache (https://parceljs.org/)
.cache
# Temporary folders
tmp/
temp/
# IDE folders
.idea/
.vscode/
*.swp
*.swo
```
--------------------------------------------------------------------------------
/.clinerules:
--------------------------------------------------------------------------------
```
# Git Commands MCP Project Rules

## Package Version Management

1. **Increment Version Before Pushing**:

   - Always increment the version number in `package.json` before pushing to the repository
   - This is critical because pushes to the repository trigger an npm package release through the CI/CD pipeline
   - The current version format is semantic versioning (major.minor.patch)

2. **Version Update Workflow**:

   - Check the current version in `package.json`
   - Increment the appropriate segment based on the changes:
     - Patch (0.1.x): Bug fixes and minor changes
     - Minor (0.x.0): New features, backward compatible
     - Major (x.0.0): Breaking changes
   - Stage and commit the version change separately
   - Sample commit message: "Bump version to X.Y.Z for npm release"

## Repository Configuration

1. **Git Remote Setup**:

   - The repository uses SSH for authentication: `git@github.com:bsreeram08/git-commands-mcp.git`
   - SSH keys must be properly configured for push access

2. **Branch Structure**:

   - Main development happens on the `master` branch
   - Use feature branches for new development

## CI/CD Pipeline

1. **GitHub Actions**:

   - The repository has an npm-publish workflow in `.github/workflows/npm-publish.yml`
   - This workflow triggers on pushes to the repository
   - It builds and publishes the package to the npm registry automatically

2. **Release Checklist**:

   - Update the version in `package.json`
   - Ensure all changes are committed
   - Push to the repository
   - Verify the GitHub Actions workflow completes successfully
   - Check the npm registry for the updated package

## Development Patterns

1. **Tool Handler Registration**:

   - When adding new Git handlers, ensure they are added to both:
     - `this.handlersMap` for functional registration
     - `this.toolsList` for exposure through the MCP interface
   - Update the appropriate handler category in `this.handlerCategories`

2. **Code Organization**:

   - Handlers are organized in `src/handlers/index.js`
   - Server setup is in `src/server.js`
   - The main entry point is `src/index.js`

3. **Naming Conventions**:
   - Git tool handlers follow the pattern `git_[action]_[resource]`
   - Handler implementations follow the pattern `handleGit[Action][Resource]`
```
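The version-update workflow above can be sketched as a small shell helper. `bump_patch` is a hypothetical name used here for illustration; in practice `npm version patch` does the same job and also creates the version commit for you.

```shell
# Increment the patch segment of a semver string (e.g. 0.1.4 -> 0.1.5).
# bump_patch is a hypothetical helper, not part of this project.
bump_patch() {
  echo "$1" | awk -F. '{ printf "%d.%d.%d\n", $1, $2, $3 + 1 }'
}

# Typical release flow (sketch):
#   new_version=$(bump_patch "$(node -p "require('./package.json').version")")
#   git commit -am "Bump version to $new_version for npm release"
bump_patch "0.1.4" # -> 0.1.5
```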
--------------------------------------------------------------------------------
/readme.md:
--------------------------------------------------------------------------------
```markdown
# MCP Git Repo Browser (Node.js)
A Node.js implementation of a Git repository browser using the Model Context Protocol (MCP).
[GitHub Repository](https://github.com/bsreeram08/git-commands-mcp) · [NPM Package](https://www.npmjs.com/package/git-commands-mcp)
## Installation
### NPM (Recommended)
```bash
npm install -g git-commands-mcp
```
### Manual Installation
```bash
git clone https://github.com/bsreeram08/git-commands-mcp.git
cd git-commands-mcp
npm install
```
## Configuration
Add this to your MCP settings configuration file:
```json
{
  "mcpServers": {
    "git-commands-mcp": {
      "command": "git-commands-mcp"
    }
  }
}
```
For manual installation, use:
```json
{
  "mcpServers": {
    "git-commands-mcp": {
      "command": "node",
      "args": ["/path/to/git-commands-mcp/src/index.js"]
    }
  }
}
```
## Features
The server provides the following tools:
### Basic Repository Operations
1. `git_directory_structure`: Returns a tree-like representation of a repository's directory structure

   - Input: Repository URL
   - Output: ASCII tree representation of the repository structure

2. `git_read_files`: Reads and returns the contents of specified files in a repository

   - Input: Repository URL and list of file paths
   - Output: Dictionary mapping file paths to their contents

3. `git_search_code`: Searches for patterns in repository code

   - Input: Repository URL, search pattern, optional file patterns, case sensitivity, and context lines
   - Output: JSON with search results including matching lines and context

### Branch Operations

4. `git_branch_diff`: Compares two branches and shows the files changed between them

   - Input: Repository URL, source branch, target branch, and optional show_patch flag
   - Output: JSON with commit count and diff summary

### Commit Operations

5. `git_commit_history`: Gets commit history for a branch with optional filtering

   - Input: Repository URL, branch name, max count, author filter, since date, until date, and message grep
   - Output: JSON with commit details

6. `git_commits_details`: Gets detailed information about commits, including full messages and diffs

   - Input: Repository URL, branch name, max count, include_diff flag, author filter, since date, until date, and message grep
   - Output: JSON with detailed commit information

7. `git_local_changes`: Gets uncommitted changes in the working directory

   - Input: Local repository path
   - Output: JSON with status information and diffs
## Project Structure
```
git-commands-mcp/
├── src/
│   ├── index.js          # Entry point
│   ├── server.js         # Main server implementation
│   ├── handlers/         # Tool handlers
│   │   └── index.js      # Tool implementation functions
│   └── utils/            # Utility functions
│       └── git.js        # Git-related helper functions
├── package.json
└── readme.md
```
## Implementation Details
- Uses Node.js native modules (crypto, path, os) for core functionality
- Leverages fs-extra for enhanced file operations
- Uses simple-git for Git repository operations
- Implements clean error handling and resource cleanup
- Creates deterministic temporary directories based on repository URL hashes
- Reuses cloned repositories when possible for efficiency
- Modular code structure for better maintainability
## Requirements
- Node.js 14.x or higher
- Git installed on the system
## Usage
If installed globally via npm:
```bash
git-commands-mcp
```
If installed manually:
```bash
node src/index.js
```
The server runs on stdio, making it compatible with MCP clients.
## CI/CD
This project uses GitHub Actions for continuous integration and deployment:
### Automatic NPM Publishing
The repository is configured with a GitHub Actions workflow that automatically publishes the package to npm when changes are pushed to the master branch.
#### Setting up NPM_AUTOMATION_TOKEN
To enable automatic publishing, you need to add an npm Automation token as a GitHub secret (this works even with accounts that have 2FA enabled):
1. Generate an npm Automation token:

   - Log in to your npm account on [npmjs.com](https://www.npmjs.com/)
   - Go to your profile settings
   - Select "Access Tokens"
   - Click "Generate New Token"
   - Select the "Automation" token type
   - Set the appropriate permissions (needs "Read and write" for packages)
   - Copy the generated token

2. Add the token to your GitHub repository:

   - Go to your GitHub repository
   - Navigate to "Settings" > "Secrets and variables" > "Actions"
   - Click "New repository secret"
   - Name: `NPM_AUTOMATION_TOKEN`
   - Value: Paste your npm Automation token
   - Click "Add secret"
Once configured, any push to the master branch will trigger the workflow to publish the package to npm.
## License
MIT License - see the [LICENSE](LICENSE) file for details.
## Links
- [GitHub Repository](https://github.com/bsreeram08/git-commands-mcp)
- [NPM Package](https://www.npmjs.com/package/git-commands-mcp)
- [Report Issues](https://github.com/bsreeram08/git-commands-mcp/issues)
```
--------------------------------------------------------------------------------
/src/index.js:
--------------------------------------------------------------------------------
```javascript
#!/usr/bin/env node
import { GitRepoBrowserServer } from "./server.js";
const server = new GitRepoBrowserServer();
server.run().catch(console.error);
```
--------------------------------------------------------------------------------
/src/handlers/common.js:
--------------------------------------------------------------------------------
```javascript
import path from "path";
import fs from "fs-extra";
import { simpleGit } from "simple-git";
import { exec } from "child_process";
import { promisify } from "util";
import { cloneRepo, getDirectoryTree } from "../utils/git.js";
const execPromise = promisify(exec);
export { path, fs, simpleGit, execPromise, cloneRepo, getDirectoryTree };
```
--------------------------------------------------------------------------------
/.github/workflows/npm-publish.yml:
--------------------------------------------------------------------------------
```yaml
name: NPM Publish

on:
  push:
    branches:
      - master

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: "16.x"
          registry-url: "https://registry.npmjs.org"

      - name: Install dependencies
        run: npm ci

      - name: Publish to npm
        # Use --access=public if it's a scoped package (@username/package-name)
        run: npm publish --provenance
        env:
          # Use an Automation token to avoid 2FA issues
          NODE_AUTH_TOKEN: ${{ secrets.NPM_AUTOMATION_TOKEN }}
```
--------------------------------------------------------------------------------
/memory-bank/projectbrief.md:
--------------------------------------------------------------------------------
```markdown
# Project Brief: Git Commands MCP Server
## Core Goal
To provide a Model Context Protocol (MCP) server that exposes common and advanced Git commands as tools. This allows users (or AI agents) to interact with Git repositories programmatically through the MCP interface.
## Key Features
- Expose a range of Git operations (cloning, committing, branching, merging, diffing, etc.) as distinct MCP tools.
- Operate on both remote repositories (via URL) and local repositories (via path).
- Return structured information from Git commands.
## Current Scope
The server currently implements a variety of Git commands using the `simple-git` library, which wraps the local Git executable.
## Project Status
Actively developed. A recent Pull Request proposes adding Docker and Smithery configuration for deployment, but there are concerns about compatibility due to the server's reliance on local Git execution via `simple-git`.
```
--------------------------------------------------------------------------------
/package.json:
--------------------------------------------------------------------------------
```json
{
  "name": "git-commands-mcp",
  "version": "0.1.4",
  "description": "A Node.js implementation of a Git repository browser using the Model Context Protocol (MCP)",
  "main": "src/index.js",
  "type": "module",
  "bin": {
    "git-commands-mcp": "src/index.js"
  },
  "scripts": {
    "start": "node src/index.js",
    "prepublishOnly": "npm ci && echo \"No tests specified - publishing anyway\""
  },
  "author": "sreeram balamurugan",
  "license": "MIT",
  "repository": {
    "type": "git",
    "url": "git+https://github.com/bsreeram08/git-commands-mcp.git"
  },
  "bugs": {
    "url": "https://github.com/bsreeram08/git-commands-mcp/issues"
  },
  "homepage": "https://github.com/bsreeram08/git-commands-mcp#readme",
  "keywords": [
    "git",
    "mcp",
    "model-context-protocol",
    "repository",
    "browser"
  ],
  "dependencies": {
    "@modelcontextprotocol/sdk": "1.5.0",
    "@surfai/docs-mcp": "^1.0.1",
    "fs-extra": "^11.3.0",
    "simple-git": "^3.27.0"
  },
  "engines": {
    "node": ">=14.0.0"
  }
}
```
--------------------------------------------------------------------------------
/src/handlers/config-operations.js:
--------------------------------------------------------------------------------
```javascript
import { simpleGit } from "./common.js";
/**
 * Configures git settings for the repository
 * @param {Object} args - Handler arguments
 * @param {string} args.repo_path - Path to the local repository
 * @param {string} [args.scope="local"] - Configuration scope (local, global, system)
 * @param {string} args.key - Configuration key
 * @param {string} args.value - Configuration value
 * @returns {Object} - Configuration result
 */
export async function handleGitConfig({
  repo_path,
  scope = "local",
  key,
  value,
}) {
  try {
    const git = simpleGit(repo_path);

    // Set the configuration
    await git.addConfig(key, value, false, scope);

    return {
      content: [
        {
          type: "text",
          text: JSON.stringify(
            {
              success: true,
              message: `Set ${scope} config ${key}=${value}`,
              key: key,
              value: value,
              scope: scope,
            },
            null,
            2
          ),
        },
      ],
    };
  } catch (error) {
    return {
      content: [
        {
          type: "text",
          text: JSON.stringify(
            { error: `Failed to set git config: ${error.message}` },
            null,
            2
          ),
        },
      ],
      isError: true,
    };
  }
}
```
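Every handler in this project returns the same MCP-style result shape: a `content` array holding a single text part whose `text` is JSON, plus an optional `isError` flag. A minimal sketch of client-side unwrapping follows; `parseHandlerResult` is a hypothetical helper, not part of this package.

```javascript
// Sketch: unwrapping the result shape these handlers return.
// parseHandlerResult is hypothetical; the { content, isError } shape
// is taken from the handlers above.
function parseHandlerResult(result) {
  const payload = JSON.parse(result.content[0].text);
  if (result.isError) {
    throw new Error(payload.error);
  }
  return payload;
}

// Mimics a successful handleGitConfig response:
const ok = {
  content: [
    {
      type: "text",
      text: JSON.stringify({ success: true, key: "user.name", value: "ci-bot" }),
    },
  ],
};

console.log(parseHandlerResult(ok).key); // -> user.name
```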
--------------------------------------------------------------------------------
/src/handlers/tag-operations.js:
--------------------------------------------------------------------------------
```javascript
import { simpleGit } from "./common.js";
/**
 * Creates a tag
 * @param {Object} args - Handler arguments
 * @param {string} args.repo_path - Path to the local repository
 * @param {string} args.tag_name - Name of the tag
 * @param {string} [args.message=""] - Tag message (for annotated tags)
 * @param {boolean} [args.annotated=true] - Whether to create an annotated tag
 * @returns {Object} - Tag creation result
 */
export async function handleGitCreateTag({
  repo_path,
  tag_name,
  message = "",
  annotated = true,
}) {
  try {
    const git = simpleGit(repo_path);

    if (annotated) {
      await git.addAnnotatedTag(tag_name, message);
    } else {
      await git.addTag(tag_name);
    }

    return {
      content: [
        {
          type: "text",
          text: JSON.stringify(
            {
              success: true,
              message: `Created ${annotated ? "annotated " : ""}tag: ${tag_name}`,
              tag: tag_name,
            },
            null,
            2
          ),
        },
      ],
    };
  } catch (error) {
    return {
      content: [
        {
          type: "text",
          text: JSON.stringify(
            { error: `Failed to create tag: ${error.message}` },
            null,
            2
          ),
        },
      ],
      isError: true,
    };
  }
}
```
--------------------------------------------------------------------------------
/memory-bank/techContext.md:
--------------------------------------------------------------------------------
```markdown
# Tech Context: Git Commands MCP Server
## Core Technologies
- **Language:** Node.js (JavaScript)
- **Package Manager:** Likely npm (presence of `package.json`, `package-lock.json`) or potentially Bun (presence of `bun.lock`). Requires clarification if both are used or if one is primary.
- **Core Library:** `simple-git` - A Node.js wrapper for the Git command-line interface. This is the primary mechanism for interacting with Git.
- **MCP Framework:** Relies on an underlying MCP server implementation (details likely in `src/server.js` or dependencies) to handle MCP communication.
## Development Setup
- Requires Node.js runtime installed.
- Requires Git executable installed and accessible in the system PATH.
- Dependencies are managed via `package.json` (and potentially `bun.lock`). Installation likely via `npm install` or `bun install`.
- Server is typically run using a command like `node src/index.js` or a script defined in `package.json`.
## Technical Constraints
- **Dependency on Local Git:** The use of `simple-git` inherently ties the server's execution environment to having a functional Git installation.
- **Filesystem Access Requirement:** Tools operating on `repo_path` require direct access to the host filesystem, which is problematic in isolated environments like Docker containers.
- **Authentication:** Handling authentication for remote private repositories relies on the Git configuration (e.g., SSH keys, credential helpers) available in the execution environment. This is difficult to replicate securely and consistently within a generic container.
## Tool Usage Patterns
- MCP tools are defined with JSON schemas for input validation.
- Handlers parse inputs and construct `simple-git` commands.
- Error handling wraps potential exceptions from `simple-git`.
```
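As noted above, each tool declares a JSON schema for input validation. A hypothetical sketch of what the `git_config` tool's schema might look like, with field names inferred from the handler in `src/handlers/config-operations.js` rather than copied from `src/server.js`:

```json
{
  "name": "git_config",
  "inputSchema": {
    "type": "object",
    "properties": {
      "repo_path": { "type": "string", "description": "Path to the local repository" },
      "scope": { "type": "string", "enum": ["local", "global", "system"], "default": "local" },
      "key": { "type": "string", "description": "Configuration key" },
      "value": { "type": "string", "description": "Configuration value" }
    },
    "required": ["repo_path", "key", "value"]
  }
}
```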
--------------------------------------------------------------------------------
/src/handlers/index.js:
--------------------------------------------------------------------------------
```javascript
// Import handlers from individual files
import {
  handleGitDirectoryStructure,
  handleGitReadFiles,
  handleGitSearchCode,
  handleGitLocalChanges,
} from "./directory-operations.js";
import {
  handleGitCommitHistory,
  handleGitCommitsDetails,
  handleGitCommit,
  handleGitTrack,
} from "./commit-operations.js";
import {
  handleGitBranchDiff,
  handleGitCheckoutBranch,
  handleGitDeleteBranch,
  handleGitMergeBranch,
} from "./branch-operations.js";
import {
  handleGitPush,
  handleGitPull,
  handleGitRemote,
} from "./remote-operations.js";
import { handleGitStash } from "./stash-operations.js";
import { handleGitCreateTag } from "./tag-operations.js";
import { handleGitRebase, handleGitReset } from "./advanced-operations.js";
import { handleGitConfig } from "./config-operations.js";
import {
  handleGitArchive,
  handleGitAttributes,
  handleGitBlame,
  handleGitClean,
  handleGitHooks,
  handleGitLFS,
  handleGitLFSFetch,
  handleGitRevert,
} from "./other-operations.js";

// Re-export all handlers
export {
  // Directory operations
  handleGitDirectoryStructure,
  handleGitReadFiles,
  handleGitSearchCode,
  handleGitLocalChanges,
  // Commit operations
  handleGitCommitHistory,
  handleGitCommitsDetails,
  handleGitCommit,
  handleGitTrack,
  // Branch operations
  handleGitBranchDiff,
  handleGitCheckoutBranch,
  handleGitDeleteBranch,
  handleGitMergeBranch,
  // Remote operations
  handleGitPush,
  handleGitPull,
  handleGitRemote,
  // Stash operations
  handleGitStash,
  // Tag operations
  handleGitCreateTag,
  // Advanced operations
  handleGitRebase,
  handleGitReset,
  // Config operations
  handleGitConfig,
  // Other operations
  handleGitArchive,
  handleGitAttributes,
  handleGitBlame,
  handleGitClean,
  handleGitHooks,
  handleGitLFS,
  handleGitLFSFetch,
  handleGitRevert,
};
```
--------------------------------------------------------------------------------
/src/utils/git.js:
--------------------------------------------------------------------------------
```javascript
import { simpleGit } from "simple-git";
import fs from "fs-extra";
import path from "path";
import os from "os";
import crypto from "crypto";
/**
 * Clones a Git repository or reuses an existing clone
 * @param {string} repoUrl - The URL of the Git repository to clone
 * @returns {Promise<string>} - Path to the cloned repository
 */
export async function cloneRepo(repoUrl) {
  // Create a deterministic directory name based on the repo URL
  const repoHash = crypto
    .createHash("sha256")
    .update(repoUrl)
    .digest("hex")
    .slice(0, 12);
  const tempDir = path.join(os.tmpdir(), `github_tools_${repoHash}`);

  // Check if the directory exists and is a valid git repo
  if (await fs.pathExists(tempDir)) {
    try {
      const git = simpleGit(tempDir);
      const remotes = await git.getRemotes(true);
      if (remotes.length > 0 && remotes[0].refs.fetch === repoUrl) {
        // Pull the latest changes
        await git.pull();
        return tempDir;
      }
    } catch (error) {
      // If there's any error with the existing repo, clean it up
      await fs.remove(tempDir);
    }
  }

  // Create the directory and clone the repository
  await fs.ensureDir(tempDir);
  try {
    await simpleGit().clone(repoUrl, tempDir);
    return tempDir;
  } catch (error) {
    // Clean up on error
    await fs.remove(tempDir);
    throw new Error(`Failed to clone repository: ${error.message}`);
  }
}

/**
 * Generates a tree representation of a directory structure
 * @param {string} dirPath - Path to the directory
 * @param {string} prefix - Prefix for the current line (used for recursion)
 * @returns {Promise<string>} - ASCII tree representation of the directory
 */
export async function getDirectoryTree(dirPath, prefix = "") {
  let output = "";
  const entries = await fs.readdir(dirPath);
  entries.sort();

  for (let i = 0; i < entries.length; i++) {
    const entry = entries[i];
    // Skip .git (and other .git* entries such as .github and .gitignore)
    if (entry.startsWith(".git")) continue;

    const isLast = i === entries.length - 1;
    const currentPrefix = isLast ? "└── " : "├── ";
    const nextPrefix = isLast ? "    " : "│   ";
    const entryPath = path.join(dirPath, entry);

    output += prefix + currentPrefix + entry + "\n";

    const stats = await fs.stat(entryPath);
    if (stats.isDirectory()) {
      output += await getDirectoryTree(entryPath, prefix + nextPrefix);
    }
  }
  return output;
}
```
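The deterministic temp-directory naming used by `cloneRepo` can be isolated into a small sketch: the repository URL is hashed so the same URL always maps to the same directory, which is what lets later calls reuse an existing clone. `tempDirFor` is a hypothetical helper name extracted from the code above for illustration.

```javascript
// Sketch of cloneRepo's deterministic temp-directory naming.
// tempDirFor is a hypothetical helper, not exported by src/utils/git.js.
import crypto from "crypto";
import path from "path";
import os from "os";

function tempDirFor(repoUrl) {
  // First 12 hex characters of the SHA-256 of the URL
  const repoHash = crypto
    .createHash("sha256")
    .update(repoUrl)
    .digest("hex")
    .slice(0, 12);
  return path.join(os.tmpdir(), `github_tools_${repoHash}`);
}

// Same URL -> same directory, so an existing clone can be reused.
console.log(tempDirFor("https://github.com/bsreeram08/git-commands-mcp.git"));
```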
--------------------------------------------------------------------------------
/memory-bank/activeContext.md:
--------------------------------------------------------------------------------
```markdown
# Active Context: Git Commands MCP Server (2025-05-02)
## Current Focus
Completed a full code review of the `git-commands-mcp` server (`src/` directory) to understand its implementation details before evaluating the Smithery deployment PR. The review confirms the initial concern regarding containerization compatibility.
## Recent Changes
- Memory Bank initialized with core documentation files.
- Reviewed all source files in `src/`: `index.js`, `server.js`, `utils/git.js`, and all handler files in `src/handlers/`.
## Next Steps
1. **Update Memory Bank:** Refine `productContext.md`, `systemPatterns.md`, and `progress.md` based on the detailed code review findings.
2. **Present Findings:** Communicate the detailed analysis of `repo_url` vs. `repo_path` tool implementation and the resulting container incompatibility to the user.
3. **Discuss Smithery PR:** Re-engage on the Smithery PR, specifically asking for the `Dockerfile` and config file contents, now with the full context of the server's limitations.
4. **Evaluate Options:** Discuss potential paths forward regarding the Smithery deployment (e.g., deploying a limited subset of tools, modifying the server, or concluding it's unsuitable for this deployment model).
## Active Decisions & Considerations
- **Confirmation of Incompatibility:** The code review confirms that tools using `repo_path` directly interact with the local filesystem path provided, making them incompatible with standard container isolation.
- **`repo_url` Tool Feasibility:** Tools using `repo_url` operate on temporary clones within the server's environment. These _could_ work in a container if prerequisites (Git, network, permissions, auth) are met, but auth remains a major hurdle.
- **Deployment Scope:** Any potential Smithery deployment would likely be limited to the `repo_url`-based tools, significantly reducing the server's advertised functionality.
## Key Learnings/Insights
- **Explicit Design Distinction:** The codebase clearly separates `repo_url` tools (using `cloneRepo` for temporary local copies) from `repo_path` tools (using `simpleGit` or `fs` directly on the provided path).
- **Filesystem Dependency:** Many tools, including hooks and attributes management, rely on direct `fs` access within the target `repo_path`, further cementing the local execution dependency.
- **`execPromise` Usage:** Some tools (`git_search_code`, `git_lfs`, `git_lfs_fetch`) use direct command execution (`execPromise`), adding another layer of dependency on the execution environment's PATH and installed tools (like `git lfs`).
```
--------------------------------------------------------------------------------
/memory-bank/productContext.md:
--------------------------------------------------------------------------------
```markdown
# Product Context: Git Commands MCP Server
## Problem Solved
Interacting with Git repositories often requires manual command-line usage or complex scripting. This MCP server aims to simplify Git interactions by providing a standardized, programmatic interface via MCP tools. This is particularly useful for:
- **AI Agents:** Enabling AI agents like Cline to perform Git operations as part of development tasks.
- **Automation:** Facilitating automated workflows that involve Git (e.g., CI/CD-like tasks, repository management).
- **Integration:** Providing a consistent way to integrate Git functionality into other applications or services that understand MCP.
## How It Should Work
- Users (or agents) invoke specific Git tools provided by the server (e.g., `git_directory_structure`, `git_commit`, `git_branch_diff`).
- The server receives the request, validates parameters, and routes to the appropriate handler.
- **`repo_url`-based Tools:** For tools operating on remote repositories (identified by `repo_url` parameter), the server uses a utility (`cloneRepo` in `src/utils/git.js`) to clone the repository into a temporary directory within the server's own execution environment. Subsequent operations for that request (e.g., reading files, getting history, diffing branches) are performed on this temporary local clone using `simple-git`, `fs`, or direct `git` commands (`execPromise`). This _might_ work in a container if prerequisites (Git installed, network, permissions, auth) are met.
- **`repo_path`-based Tools:** For tools operating on local repositories (identified by `repo_path` parameter), the server initializes `simple-git` directly with the provided path or uses `fs`/`execPromise` to interact with files/commands within that path. This requires the server process to have direct read/write access to the specified filesystem path. **This mode is fundamentally incompatible with standard containerized deployment (like Docker/Smithery) due to filesystem isolation.**
- The server processes the output from the underlying Git operation (via `simple-git`, `fs`, or `execPromise`) and returns a structured JSON response to the caller.
## User Experience Goals
- **Simplicity:** Abstract away the complexities of Git command-line syntax.
- **Reliability:** Execute Git commands accurately and handle errors gracefully.
- **Discoverability:** Clearly define available tools and their parameters through the MCP schema.
- **Flexibility:** Support a wide range of common Git operations for both local and remote workflows.
## Target Users
- AI Development Agents (like Cline)
- Developers building automation scripts
- Platform engineers integrating Git operations
```
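The `repo_url` / `repo_path` split described above amounts to a simple routing decision at the top of each handler. A minimal sketch follows; `resolveWorkingDir` and its return shape are hypothetical, not actual code from `src/`:

```javascript
// Hypothetical sketch of the repo_url vs repo_path routing described above.
// repo_url tools operate on a temporary clone inside the server's own
// environment; repo_path tools use the caller-supplied path directly and
// therefore need host filesystem access.
function resolveWorkingDir({ repo_url, repo_path }) {
  if (repo_path) {
    // Direct filesystem access required — incompatible with containers
    return { dir: repo_path, mode: "local-path" };
  }
  if (repo_url) {
    // Would go through cloneRepo() into os.tmpdir() in the real server
    return { dir: `<tmp clone of ${repo_url}>`, mode: "temp-clone" };
  }
  throw new Error("Either repo_url or repo_path is required");
}

console.log(resolveWorkingDir({ repo_path: "/work/my-repo" }).mode); // -> local-path
```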
--------------------------------------------------------------------------------
/memory-bank/progress.md:
--------------------------------------------------------------------------------
```markdown
# Progress & Status: Git Commands MCP Server (2025-05-02)
## What Works
- The MCP server successfully exposes a wide range of Git commands as tools, defined in `src/server.js`.
- **`repo_url`-based Tools:** These tools (e.g., `git_directory_structure`, `git_read_files`, `git_commit_history`) function by cloning the remote repo into a temporary directory within the server's environment (`os.tmpdir()`) and operating on that clone. This works reliably when the server runs locally with network access and appropriate credentials (if needed).
- **`repo_path`-based Tools:** These tools (e.g., `git_commit`, `git_push`, `git_local_changes`, `git_checkout_branch`) function correctly _only when the server process has direct filesystem access_ to the specified `repo_path`.
## What's Left to Build / Current Tasks
- **Evaluate Smithery Deployment PR:** Analyze the feasibility of the proposed Docker/Smithery deployment in light of the confirmed incompatibility of `repo_path`-based tools with containerization. This requires reviewing the PR's `Dockerfile` and Smithery config file.
- **Address Container Compatibility:** Decide how to handle the incompatibility issue. Options include:
  - Deploying only the `repo_url`-based tools.
  - Modifying the server architecture (significant effort).
  - Rejecting the containerized deployment approach for this server.
## Current Status
- **Code Review Complete:** Full review of `src/` directory completed.
- **Memory Bank Updated:** Core memory bank files created and refined based on code review.
- **Blocked:** Further action on the Smithery PR is blocked pending review of its specific files (`Dockerfile`, config) and a decision on how to handle the `repo_path` tool incompatibility.
## Known Issues
- **Fundamental Container Incompatibility (`repo_path` tools):** Tools requiring `repo_path` cannot function in a standard isolated container (like Docker/Smithery) because the container lacks access to the user-specified host filesystem paths.
- **Container Prerequisites (`repo_url` tools):** For `repo_url` tools to work in a container, the container needs:
  - Git installed.
  - Network access.
  - Write permissions to its temporary directory.
  - A mechanism to handle authentication for private repositories (a major challenge).
- **Dependency on Local Tools:** Some handlers rely on `git lfs` being installed locally.
## Evolution of Decisions
- The initial design leveraging `simple-git` and direct filesystem access (`repo_path`) is effective for local use but unsuitable for standard containerized deployment.
- The `cloneRepo` utility for `repo_url` tools provides a potential (but limited) path for containerization, focusing only on remote repository interactions.
- The Smithery PR necessitates a decision on whether to adapt the server, limit its deployed scope, or abandon containerization for this specific MCP.
```
--------------------------------------------------------------------------------
/src/handlers/advanced-operations.js:
--------------------------------------------------------------------------------
```javascript
import { simpleGit } from "./common.js";
/**
 * Handles git rebase operations
 * @param {Object} args - Handler arguments
 * @param {string} args.repo_path - Path to the local repository
 * @param {string} args.onto - Branch or commit to rebase onto
 * @param {boolean} [args.interactive=false] - Whether to perform an interactive rebase
 * @returns {Object} - Rebase result
 */
export async function handleGitRebase({
  repo_path,
  onto,
  interactive = false,
}) {
  try {
    // Interactive rebase needs an editor session, which simple-git
    // cannot drive through this API, so it is rejected up front
    if (interactive) {
      return {
        content: [
          {
            type: "text",
            text: JSON.stringify(
              { error: "Interactive rebase not supported through API" },
              null,
              2
            ),
          },
        ],
        isError: true,
      };
    }

    const git = simpleGit(repo_path);
    const rebaseResult = await git.rebase([onto]);

    return {
      content: [
        {
          type: "text",
          text: JSON.stringify(
            {
              success: true,
              message: `Rebased onto ${onto}`,
              result: rebaseResult,
            },
            null,
            2
          ),
        },
      ],
    };
  } catch (error) {
    return {
      content: [
        {
          type: "text",
          text: JSON.stringify(
            {
              error: `Failed to rebase: ${error.message}`,
              conflicts: error.git ? error.git.conflicts : null,
            },
            null,
            2
          ),
        },
      ],
      isError: true,
    };
  }
}

/**
 * Resets the repository to a specified commit or state
 * @param {Object} args - Handler arguments
 * @param {string} args.repo_path - Path to the local repository
 * @param {string} [args.mode="mixed"] - Reset mode (soft, mixed, hard)
 * @param {string} [args.to="HEAD"] - Commit or reference to reset to
 * @returns {Object} - Reset result
 */
export async function handleGitReset({
  repo_path,
  mode = "mixed",
  to = "HEAD",
}) {
  try {
    const git = simpleGit(repo_path);

    // Validate the reset mode
    if (!["soft", "mixed", "hard"].includes(mode)) {
      return {
        content: [
          {
            type: "text",
            text: JSON.stringify(
              {
                error: `Invalid reset mode: ${mode}. Use 'soft', 'mixed', or 'hard'.`,
              },
              null,
              2
            ),
          },
        ],
        isError: true,
      };
    }

    // Perform the reset
    await git.reset([`--${mode}`, to]);

    return {
      content: [
        {
          type: "text",
          text: JSON.stringify(
            {
              success: true,
              message: `Reset (${mode}) to ${to}`,
              mode: mode,
              target: to,
            },
            null,
            2
          ),
        },
      ],
    };
  } catch (error) {
    return {
      content: [
        {
          type: "text",
          text: JSON.stringify(
            { error: `Failed to reset repository: ${error.message}` },
            null,
            2
          ),
        },
      ],
      isError: true,
    };
  }
}
```
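The mode validation and argument construction in `handleGitReset` can be isolated as a pure helper. This is an illustrative sketch only; `buildResetArgs` is a hypothetical name, not part of the actual module.

```javascript
// Illustrative sketch: the validation + argument construction from
// handleGitReset, extracted as a pure function. `buildResetArgs` is a
// hypothetical helper, not exported by the real module.
function buildResetArgs(mode = "mixed", to = "HEAD") {
  if (!["soft", "mixed", "hard"].includes(mode)) {
    throw new Error(`Invalid reset mode: ${mode}. Use 'soft', 'mixed', or 'hard'.`);
  }
  // The real handler passes this array straight to git.reset(...)
  return [`--${mode}`, to];
}

// e.g. buildResetArgs("hard", "HEAD~1") → ["--hard", "HEAD~1"]
```

Separating argument construction from the `simple-git` call makes the validation path unit-testable without a repository on disk.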
--------------------------------------------------------------------------------
/src/handlers/stash-operations.js:
--------------------------------------------------------------------------------
```javascript
import { simpleGit } from "./common.js";
/**
 * Creates, applies, lists, or drops a stash
 * @param {Object} params - Tool parameters
 * @param {string} params.repo_path - Path to the local repository
 * @param {string} params.action - Stash action (save, pop, apply, list, drop)
 * @param {string} params.message - Stash message (for save action)
 * @param {number} params.index - Stash index (for pop, apply, drop actions)
 * @returns {Object} - Stash operation result
 */
export async function handleGitStash({
repo_path,
action = "save",
message = "",
index = 0,
}) {
try {
const git = simpleGit(repo_path);
let result;
switch (action) {
case "save":
result = await git.stash(["save", message]);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: "Changes stashed successfully",
stash_message: message,
},
null,
2
),
},
],
};
case "pop":
result = await git.stash(["pop", index.toString()]);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Applied and dropped stash@{${index}}`,
},
null,
2
),
},
],
};
case "apply":
result = await git.stash(["apply", index.toString()]);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Applied stash@{${index}}`,
},
null,
2
),
},
],
};
case "list":
result = await git.stash(["list"]);
// Parse the stash list
const stashList = result
.trim()
.split("\n")
.filter((line) => line.trim() !== "")
.map((line) => {
const match = line.match(/stash@\{(\d+)\}: (.*)/);
if (match) {
return {
index: parseInt(match[1]),
description: match[2],
};
}
return null;
})
.filter((item) => item !== null);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
stashes: stashList,
},
null,
2
),
},
],
};
case "drop":
result = await git.stash(["drop", index.toString()]);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Dropped stash@{${index}}`,
},
null,
2
),
},
],
};
default:
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Unknown stash action: ${action}` },
null,
2
),
},
],
isError: true,
};
}
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to perform stash operation: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
```
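The `list` branch of `handleGitStash` parses `git stash list` output with a regex. The same logic as a standalone pure function (under the hypothetical name `parseStashList`, which the module does not actually export):

```javascript
// Illustrative sketch: the stash-list parsing from handleGitStash as a
// pure function. Input is raw `git stash list` output, one entry per line,
// in the form "stash@{N}: description".
function parseStashList(raw) {
  return raw
    .trim()
    .split("\n")
    .filter((line) => line.trim() !== "")
    .map((line) => {
      const match = line.match(/stash@\{(\d+)\}: (.*)/);
      return match
        ? { index: parseInt(match[1], 10), description: match[2] }
        : null;
    })
    .filter((item) => item !== null);
}
```

Lines that do not match the `stash@{N}:` shape are dropped rather than producing partial entries.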
--------------------------------------------------------------------------------
/memory-bank/systemPatterns.md:
--------------------------------------------------------------------------------
```markdown
# System Patterns: Git Commands MCP Server
## Core Architecture
The server follows a standard MCP server pattern:
1. **Initialization:** Sets up an MCP server instance (`src/server.js`).
2. **Tool Registration:** Defines and registers available Git tools with their input schemas (`src/handlers/index.js`). Each tool corresponds to a specific Git operation.
3. **Request Handling:** Listens for incoming MCP requests. When a tool is invoked, the server routes the request to the appropriate handler function.
4. **Git Interaction:** Handler functions utilize the `simple-git` library (`src/utils/git.js`) to interact with the Git command-line executable.
5. **Response Formatting:** Handler functions process the output from `simple-git` (or handle errors) and return a structured JSON response conforming to the MCP standard.
## Key Design Patterns
- **Handler Mapping:** A map (`this.handlersMap` in `src/handlers/index.js`) associates tool names (e.g., `git_clone`) with their corresponding implementation functions (e.g., `handleGitClone`).
- **Tool Listing:** A separate list (`this.toolsList` in `src/handlers/index.js`) defines the tools exposed via the MCP interface, including their schemas. This ensures separation between internal implementation and external interface definition.
- **Categorization:** Handlers are grouped into categories (`this.handlerCategories` in `src/handlers/index.js`) purely for internal code organization; the grouping does not affect the external MCP interface.
- **Wrapper Library:** Abstraction of direct Git command execution through the `simple-git` library. This simplifies handler logic but introduces a dependency on the local Git environment.
## Critical Implementation Paths
- **`repo_path`-based Operations:** Tools accepting a `repo_path` parameter (e.g., `git_commit`, `git_push`, `git_local_changes`, `git_checkout_branch`) initialize `simpleGit` directly with this path or use `fs`/`execPromise` within this path. This requires the server process to have direct read/write access to the specified local filesystem path. **This path is incompatible with standard container isolation.**
- **`repo_url`-based Operations:** Tools accepting a `repo_url` parameter (e.g., `git_directory_structure`, `git_read_files`, `git_commit_history`) use the `cloneRepo` utility (`src/utils/git.js`). This clones the remote repo into a temporary directory within the server's execution environment (`os.tmpdir()`) and performs operations on that temporary clone. **This path _might_ be adaptable to containerization if prerequisites are met.**
- **Direct Command Execution:** Some tools (`git_search_code`, `git_lfs`, `git_lfs_fetch`) use `execPromise` to run `git` or `git lfs` commands directly, relying on these being available in the server environment's PATH.
## Dependencies
- **Local Git Installation:** `simple-git` and direct `git` commands require a functional Git executable available in the system's PATH where the server runs.
- **Node.js `fs` Module:** Used for direct file operations in some handlers (e.g., `handleGitHooks`, `handleGitAttributes`, reading files from temporary clones).
- **Node.js `os` Module:** Used by `cloneRepo` to determine the temporary directory location.
- **Node.js `crypto` Module:** Used by `cloneRepo` to generate deterministic temporary directory names.
- **Filesystem Access (`repo_path` tools):** Require direct read/write access to the user-specified local repository paths.
- **Filesystem Access (`repo_url` tools):** Require write access to the server's temporary directory (`os.tmpdir()`).
- **Network Access (`repo_url` tools):** Require network connectivity to clone remote Git repositories.
- **Authentication (`repo_url` tools):** Cloning private remote repositories requires credentials (e.g., SSH keys, HTTPS tokens) to be configured and accessible within the server's execution environment. This is a major challenge for containerized deployments.
- **Optional Tools:** `git lfs` commands require the `git-lfs` extension to be installed in the server's environment.
```
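The handler-mapping and tool-listing patterns described above can be sketched as follows. This is a simplified illustration: the real map and list live in `src/handlers/index.js`, and the stub handler body and `routeRequest` helper here are hypothetical.

```javascript
// Simplified sketch of the handler map / tool list separation described in
// systemPatterns.md. Handler bodies are stubs; routeRequest is illustrative.
const handlersMap = {
  git_clone: async (params) => ({
    content: [{ type: "text", text: `cloned ${params.repo_url}` }],
  }),
};

// The tool list defines the external MCP interface, including input schemas,
// independently of the internal handler implementations.
const toolsList = [
  {
    name: "git_clone",
    inputSchema: {
      type: "object",
      properties: { repo_url: { type: "string" } },
      required: ["repo_url"],
    },
  },
];

// Request handling: look up the invoked tool and dispatch to its handler.
function routeRequest(toolName, params) {
  const handler = handlersMap[toolName];
  if (!handler) throw new Error(`Unknown tool: ${toolName}`);
  return handler(params);
}
```

Keeping `toolsList` separate from `handlersMap` lets the interface definition evolve (descriptions, schemas) without touching handler logic.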
--------------------------------------------------------------------------------
/src/handlers/branch-operations.js:
--------------------------------------------------------------------------------
```javascript
import { simpleGit, cloneRepo } from "./common.js";
/**
* Handles the git_branch_diff tool request
* @param {Object} params - Tool parameters
* @param {string} params.repo_url - Repository URL
* @param {string} params.source_branch - Source branch name
* @param {string} params.target_branch - Target branch name
* @param {boolean} params.show_patch - Whether to include diff patches
* @returns {Object} - Tool response
*/
export async function handleGitBranchDiff({
repo_url,
source_branch,
target_branch,
show_patch = false,
}) {
try {
const repoPath = await cloneRepo(repo_url);
const git = simpleGit(repoPath);
// Make sure both branches exist locally
const branches = await git.branch();
if (!branches.all.includes(source_branch)) {
await git.fetch("origin", source_branch);
await git.checkout(source_branch);
}
if (!branches.all.includes(target_branch)) {
await git.fetch("origin", target_branch);
}
// Get the diff between branches
const diffOptions = ["--name-status"];
if (show_patch) {
diffOptions.push("--patch");
}
const diff = await git.diff([
...diffOptions,
`${target_branch}...${source_branch}`,
]);
// Get commit range information
const logSummary = await git.log({
from: target_branch,
to: source_branch,
});
const result = {
commits_count: logSummary.total,
diff_summary: diff,
};
return {
content: [
{
type: "text",
text: JSON.stringify(result, null, 2),
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to get branch diff: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
/**
 * Checks out a branch, optionally creating it first
 * @param {Object} params - Tool parameters
 * @param {string} params.repo_path - Path to the local repository
 * @param {string} params.branch_name - Name of the branch
 * @param {string} params.start_point - Starting point for a new branch (optional)
 * @param {boolean} params.create - Whether to create the branch before checking it out
 * @returns {Object} - Branch checkout result
 */
export async function handleGitCheckoutBranch({
repo_path,
branch_name,
start_point = null,
create = false,
}) {
try {
const git = simpleGit(repo_path);
if (create) {
// Create and checkout a new branch
if (start_point) {
await git.checkoutBranch(branch_name, start_point);
} else {
await git.checkoutLocalBranch(branch_name);
}
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Created and checked out new branch: ${branch_name}`,
branch: branch_name,
},
null,
2
),
},
],
};
} else {
// Just checkout an existing branch
await git.checkout(branch_name);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Checked out branch: ${branch_name}`,
branch: branch_name,
},
null,
2
),
},
],
};
}
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to checkout branch: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
/**
 * Deletes a branch
 * @param {Object} params - Tool parameters
 * @param {string} params.repo_path - Path to the local repository
 * @param {string} params.branch_name - Name of the branch to delete
 * @param {boolean} params.force - Whether to force deletion
 * @returns {Object} - Branch deletion result
 */
export async function handleGitDeleteBranch({
repo_path,
branch_name,
force = false,
}) {
try {
const git = simpleGit(repo_path);
// Get current branch to prevent deleting the active branch
const currentBranch = await git.branch();
if (currentBranch.current === branch_name) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Cannot delete the currently checked out branch" },
null,
2
),
},
],
isError: true,
};
}
// Delete the branch
if (force) {
await git.deleteLocalBranch(branch_name, true);
} else {
await git.deleteLocalBranch(branch_name);
}
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Deleted branch: ${branch_name}`,
branch: branch_name,
},
null,
2
),
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to delete branch: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
/**
 * Merges a source branch into the current branch
 * @param {Object} params - Tool parameters
 * @param {string} params.repo_path - Path to the local repository
 * @param {string} params.source_branch - Branch to merge from
 * @param {string} params.target_branch - Branch to merge into (optional, uses current branch if not provided)
 * @param {boolean} params.no_fast_forward - Whether to create a merge commit even if fast-forward is possible
 * @returns {Object} - Merge result
 */
export async function handleGitMergeBranch({
repo_path,
source_branch,
target_branch = null,
no_fast_forward = false,
}) {
try {
const git = simpleGit(repo_path);
// If target branch is specified, checkout to it first
if (target_branch) {
await git.checkout(target_branch);
}
// Perform the merge
let mergeOptions = [];
if (no_fast_forward) {
mergeOptions = ["--no-ff"];
}
const mergeResult = await git.merge([...mergeOptions, source_branch]);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
result: mergeResult,
message: `Merged ${source_branch} into ${
target_branch || "current branch"
}`,
},
null,
2
),
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{
error: `Failed to merge branches: ${error.message}`,
conflicts: error.git ? error.git.conflicts : null,
},
null,
2
),
},
],
isError: true,
};
}
}
```
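The option handling in `handleGitMergeBranch` reduces to a small pure function. An illustrative sketch (`buildMergeArgs` is a hypothetical name, not part of the module):

```javascript
// Illustrative sketch: the merge argument construction from
// handleGitMergeBranch, extracted as a pure helper. The real handler
// spreads this array into git.merge(...).
function buildMergeArgs(sourceBranch, noFastForward = false) {
  const options = noFastForward ? ["--no-ff"] : [];
  return [...options, sourceBranch];
}

// e.g. buildMergeArgs("feature", true) → ["--no-ff", "feature"]
```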
--------------------------------------------------------------------------------
/src/handlers/directory-operations.js:
--------------------------------------------------------------------------------
```javascript
import {
execPromise,
cloneRepo,
getDirectoryTree,
simpleGit,
path,
fs,
} from "./common.js";
/**
* Handles the git_directory_structure tool request
* @param {Object} params - Tool parameters
* @param {string} params.repo_url - Repository URL
* @returns {Object} - Tool response
*/
export async function handleGitDirectoryStructure({ repo_url }) {
try {
const repoPath = await cloneRepo(repo_url);
const tree = await getDirectoryTree(repoPath);
return {
content: [
{
type: "text",
text: tree,
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: `Error: ${error.message}`,
},
],
isError: true,
};
}
}
/**
* Handles the git_read_files tool request
* @param {Object} params - Tool parameters
* @param {string} params.repo_url - Repository URL
* @param {string[]} params.file_paths - File paths to read
* @returns {Object} - Tool response
*/
export async function handleGitReadFiles({ repo_url, file_paths }) {
try {
const repoPath = await cloneRepo(repo_url);
const results = {};
for (const filePath of file_paths) {
const fullPath = path.join(repoPath, filePath);
try {
if (await fs.pathExists(fullPath)) {
results[filePath] = await fs.readFile(fullPath, "utf8");
} else {
results[filePath] = "Error: File not found";
}
} catch (error) {
results[filePath] = `Error reading file: ${error.message}`;
}
}
return {
content: [
{
type: "text",
text: JSON.stringify(results, null, 2),
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to process repository: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
/**
* Handles the git_search_code tool request
* @param {Object} params - Tool parameters
* @param {string} params.repo_url - Repository URL
* @param {string} params.pattern - Search pattern (regex or string)
* @param {string[]} params.file_patterns - Optional file patterns to filter (e.g., "*.js")
* @param {boolean} params.case_sensitive - Whether the search is case sensitive
* @param {number} params.context_lines - Number of context lines to include
* @returns {Object} - Tool response
*/
export async function handleGitSearchCode({
repo_url,
pattern,
file_patterns = [],
case_sensitive = false,
context_lines = 2,
}) {
try {
const repoPath = await cloneRepo(repo_url);
// Build the grep command
let grepCommand = `cd "${repoPath}" && git grep`;
// Add options
if (!case_sensitive) {
grepCommand += " -i";
}
// Add context lines
grepCommand += ` -n -C${context_lines}`;
// Add pattern (escape quotes in the pattern)
const escapedPattern = pattern.replace(/"/g, '\\"');
grepCommand += ` "${escapedPattern}"`;
// Add file patterns if provided
if (file_patterns && file_patterns.length > 0) {
grepCommand += ` -- ${file_patterns.join(" ")}`;
}
// Execute the command
const { stdout, stderr } = await execPromise(grepCommand);
if (stderr) {
console.error(`Search error: ${stderr}`);
}
// Process the results
const results = [];
if (stdout) {
// Split by file sections (git grep output format)
const fileMatches = stdout.split(/^(?=\S[^:]*:)/m);
for (const fileMatch of fileMatches) {
if (!fileMatch.trim()) continue;
// Extract file name and matches
const lines = fileMatch.split("\n");
const firstLine = lines[0];
const fileNameMatch = firstLine.match(/^([^:]+):/);
if (fileNameMatch) {
const fileName = fileNameMatch[1];
const matches = [];
// Process each line
let currentMatch = null;
let contextLines = [];
for (let i = 0; i < lines.length; i++) {
const line = lines[i];
// Skip empty lines
if (!line.trim()) continue;
// Check if this is a line number indicator
const lineNumberMatch = line.match(/^([^-][^:]+):(\d+):(.*)/);
if (lineNumberMatch) {
// If we have a previous match, add it to the results
if (currentMatch) {
currentMatch.context_after = contextLines;
matches.push(currentMatch);
contextLines = [];
}
// Start a new match
currentMatch = {
file: fileName,
line_number: parseInt(lineNumberMatch[2]),
content: lineNumberMatch[3],
context_before: contextLines,
context_after: [],
};
contextLines = [];
} else {
// This is a context line
const contextMatch = line.match(/^([^:]+)-(\d+)-(.*)/);
if (contextMatch) {
contextLines.push({
line_number: parseInt(contextMatch[2]),
content: contextMatch[3],
});
}
}
}
// Add the last match if there is one
if (currentMatch) {
currentMatch.context_after = contextLines;
matches.push(currentMatch);
}
if (matches.length > 0) {
results.push({
file: fileName,
matches: matches,
});
}
}
}
}
return {
content: [
{
type: "text",
text: JSON.stringify(
{
pattern: pattern,
case_sensitive: case_sensitive,
context_lines: context_lines,
file_patterns: file_patterns,
results: results,
total_matches: results.reduce(
(sum, file) => sum + file.matches.length,
0
),
total_files: results.length,
},
null,
2
),
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to search repository: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
/**
* Handles the git_local_changes tool request
* @param {Object} params - Tool parameters
* @param {string} params.repo_path - Local repository path
* @returns {Object} - Tool response
*/
export async function handleGitLocalChanges({ repo_path }) {
try {
// Use the provided local repo path
const git = simpleGit(repo_path);
// Get status information
const status = await git.status();
// Get detailed diff for modified files
let diffs = {};
for (const file of status.modified) {
diffs[file] = await git.diff([file]);
}
return {
content: [
{
type: "text",
text: JSON.stringify(
{
branch: status.current,
staged_files: status.staged,
modified_files: status.modified,
new_files: status.not_added,
deleted_files: status.deleted,
conflicted_files: status.conflicted,
diffs: diffs,
},
null,
2
),
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to get local changes: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
```
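The shell command assembled by `handleGitSearchCode` can be built by a pure helper, which makes the quoting and flag logic testable without running `git grep`. This is a sketch only; `buildGrepCommand` is a hypothetical name.

```javascript
// Illustrative sketch: the `git grep` command construction from
// handleGitSearchCode as a pure helper (hypothetical, not exported).
function buildGrepCommand(
  repoPath,
  pattern,
  { filePatterns = [], caseSensitive = false, contextLines = 2 } = {}
) {
  let cmd = `cd "${repoPath}" && git grep`;
  if (!caseSensitive) cmd += " -i"; // case-insensitive by default
  cmd += ` -n -C${contextLines}`; // line numbers + context lines
  cmd += ` "${pattern.replace(/"/g, '\\"')}"`; // escape quotes in the pattern
  if (filePatterns.length > 0) cmd += ` -- ${filePatterns.join(" ")}`;
  return cmd;
}
```

As in the handler itself, the pattern is only quote-escaped, not fully shell-escaped, so patterns containing backticks or `$` would still be interpreted by the shell.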
--------------------------------------------------------------------------------
/src/handlers/commit-operations.js:
--------------------------------------------------------------------------------
```javascript
import { path, fs, simpleGit, cloneRepo } from "./common.js";
/**
* Handles the git_commit_history tool request
* @param {Object} params - Tool parameters
* @param {string} params.repo_url - Repository URL
* @param {string} params.branch - Branch name
* @param {number} params.max_count - Maximum number of commits
* @param {string} params.author - Author filter
* @param {string} params.since - Date filter (after)
* @param {string} params.until - Date filter (before)
* @param {string} params.grep - Message content filter
* @returns {Object} - Tool response
*/
export async function handleGitCommitHistory({
repo_url,
branch = "main",
max_count = 10,
author,
since,
until,
grep,
}) {
try {
const repoPath = await cloneRepo(repo_url);
const git = simpleGit(repoPath);
// Prepare log options
const logOptions = {
maxCount: max_count,
};
if (author) {
logOptions["--author"] = author;
}
if (since) {
logOptions["--since"] = since;
}
if (until) {
logOptions["--until"] = until;
}
if (grep) {
logOptions["--grep"] = grep;
}
// Make sure branch exists locally
const branches = await git.branch();
if (!branches.all.includes(branch)) {
await git.fetch("origin", branch);
}
// Get commit history
const log = await git.log(logOptions, branch);
// Format the commits
const commits = log.all.map((commit) => ({
hash: commit.hash,
author: commit.author_name,
email: commit.author_email,
date: commit.date,
message: commit.message,
body: commit.body || "",
}));
return {
content: [
{
type: "text",
text: JSON.stringify({ commits }, null, 2),
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to get commit history: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
/**
* Handles the git_commits_details tool request
* @param {Object} params - Tool parameters
* @param {string} params.repo_url - Repository URL
* @param {string} params.branch - Branch name
* @param {number} params.max_count - Maximum number of commits
* @param {boolean} params.include_diff - Whether to include diffs
* @param {string} params.author - Author filter
* @param {string} params.since - Date filter (after)
* @param {string} params.until - Date filter (before)
* @param {string} params.grep - Message content filter
* @returns {Object} - Tool response
*/
export async function handleGitCommitsDetails({
repo_url,
branch = "main",
max_count = 10,
include_diff = false,
author,
since,
until,
grep,
}) {
try {
const repoPath = await cloneRepo(repo_url);
const git = simpleGit(repoPath);
// Ensure branch exists locally
const branches = await git.branch();
if (!branches.all.includes(branch)) {
await git.fetch("origin", branch);
}
// Prepare log options with full details
const logOptions = {
maxCount: max_count,
"--format": "fuller", // Get more detailed commit info
};
if (author) {
logOptions["--author"] = author;
}
if (since) {
logOptions["--since"] = since;
}
if (until) {
logOptions["--until"] = until;
}
if (grep) {
logOptions["--grep"] = grep;
}
// Get commit history with full details
const log = await git.log(logOptions, branch);
// Enhance with additional details
const commitsDetails = [];
for (const commit of log.all) {
const commitDetails = {
hash: commit.hash,
author: commit.author_name,
author_email: commit.author_email,
committer: commit.committer_name,
committer_email: commit.committer_email,
date: commit.date,
message: commit.message,
body: commit.body || "",
refs: commit.refs,
};
// Get the commit diff if requested
if (include_diff) {
if (commit.parents && commit.parents.length > 0) {
// For normal commits with parents
const diff = await git.diff([`${commit.hash}^..${commit.hash}`]);
commitDetails.diff = diff;
} else {
// For initial commits with no parents
const diff = await git.diff([
"4b825dc642cb6eb9a060e54bf8d69288fbee4904",
commit.hash,
]);
commitDetails.diff = diff;
}
// Get list of changed files
const showResult = await git.show([
"--name-status",
"--oneline",
commit.hash,
]);
// Parse the changed files from the result
const fileLines = showResult
.split("\n")
.slice(1) // Skip the first line (commit summary)
.filter(Boolean); // Remove empty lines
commitDetails.changed_files = fileLines
  .map((line) => {
    // Status letter optionally followed by a rename/copy score (e.g. R100)
    const match = line.match(/^([AMDTRC])\d*\s+(.+)$/);
    if (match) {
      return {
        status: match[1],
        file: match[2],
      };
    }
    return null;
  })
  .filter(Boolean);
}
commitsDetails.push(commitDetails);
}
return {
content: [
{
type: "text",
text: JSON.stringify(
{
commits: commitsDetails,
},
null,
2
),
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to get commit details: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
/**
 * Creates a commit with the specified message
 * @param {Object} params - Tool parameters
 * @param {string} params.repo_path - Path to the local repository
 * @param {string} params.message - Commit message
 * @returns {Object} - Commit result
 */
export async function handleGitCommit({ repo_path, message }) {
try {
const git = simpleGit(repo_path);
// Create the commit (only commit what's in the staging area)
const commitResult = await git.commit(message);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
commit_hash: commitResult.commit,
commit_message: message,
summary: commitResult.summary,
},
null,
2
),
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to create commit: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
/**
 * Tracks (stages) specific files or all files
 * @param {Object} params - Tool parameters
 * @param {string} params.repo_path - Path to the local repository
 * @param {string[]} params.files - File paths to track/stage (use ["."] for all files)
 * @returns {Object} - Tracking result
 */
export async function handleGitTrack({ repo_path, files = ["."] }) {
try {
const git = simpleGit(repo_path);
// Add the specified files to the staging area
await git.add(files);
// Get status to show what files were tracked
const status = await git.status();
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Tracked ${
files.length === 1 && files[0] === "."
? "all files"
: files.length + " files"
}`,
staged: status.staged,
not_staged: status.not_added,
modified: status.modified,
},
null,
2
),
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to track files: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
```
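The `--name-status` parsing in `handleGitCommitsDetails` can be expressed as a standalone pure function. A sketch under the hypothetical name `parseNameStatus` (not part of the module), tolerating rename/copy similarity scores such as `R100`:

```javascript
// Illustrative sketch: parsing `git show --name-status --oneline` output
// into { status, file } entries, as done in handleGitCommitsDetails.
function parseNameStatus(showOutput) {
  return showOutput
    .split("\n")
    .slice(1) // skip the first line (commit summary)
    .filter(Boolean) // remove empty lines
    .map((line) => {
      // Status letter optionally followed by a rename/copy score (e.g. R100);
      // for renames, the "file" field holds both paths tab-separated.
      const match = line.match(/^([AMDTRC])\d*\s+(.+)$/);
      return match ? { status: match[1], file: match[2] } : null;
    })
    .filter(Boolean);
}
```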
--------------------------------------------------------------------------------
/src/handlers/remote-operations.js:
--------------------------------------------------------------------------------
```javascript
import { simpleGit } from "./common.js";
/**
 * Pushes changes to a remote repository
 * @param {Object} params - Tool parameters
 * @param {string} params.repo_path - Path to the local repository
 * @param {string} params.remote - Remote name (default: origin)
 * @param {string} params.branch - Branch to push (default: current branch)
 * @param {boolean} params.force - Whether to force push
 * @returns {Object} - Push result
 */
export async function handleGitPush({
repo_path,
remote = "origin",
branch = null,
force = false,
}) {
try {
const git = simpleGit(repo_path);
// If no branch specified, get the current branch
if (!branch) {
const branchInfo = await git.branch();
branch = branchInfo.current;
}
// Perform the push
let pushOptions = [];
if (force) {
pushOptions.push("--force");
}
const pushResult = await git.push(remote, branch, pushOptions);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
result: pushResult,
message: `Pushed ${branch} to ${remote}`,
},
null,
2
),
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to push changes: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
/**
 * Pulls changes from a remote repository
 * @param {Object} params - Tool parameters
 * @param {string} params.repo_path - Path to the local repository
 * @param {string} params.remote - Remote name (default: origin)
 * @param {string} params.branch - Branch to pull (default: current branch)
 * @param {boolean} params.rebase - Whether to rebase instead of merge
 * @returns {Object} - Pull result
 */
export async function handleGitPull({
repo_path,
remote = "origin",
branch = null,
rebase = false,
}) {
try {
const git = simpleGit(repo_path);
// If no branch specified, use current branch
if (!branch) {
const branchInfo = await git.branch();
branch = branchInfo.current;
}
// Set up pull options
const pullOptions = {};
if (rebase) {
pullOptions["--rebase"] = null;
}
// Perform the pull
const pullResult = await git.pull(remote, branch, pullOptions);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
result: pullResult,
message: `Pulled from ${remote}/${branch}`,
},
null,
2
),
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{
error: `Failed to pull changes: ${error.message}`,
conflicts: error.git ? error.git.conflicts : null,
},
null,
2
),
},
],
isError: true,
};
}
}
/**
 * Manages Git remotes
 * @param {Object} params - Tool parameters
 * @param {string} params.repo_path - Path to the local repository
 * @param {string} params.action - Remote action (list, add, remove, set-url, prune, get-url, rename, show)
 * @param {string} params.name - Remote name
 * @param {string} params.url - Remote URL (for add and set-url)
 * @param {string} params.new_name - New remote name (for rename)
 * @param {boolean} params.push_url - Whether to use the push URL instead of the fetch URL (for set-url and get-url)
 * @returns {Object} - Operation result
 */
export async function handleGitRemote({
repo_path,
action,
name = "",
url = "",
new_name = "",
push_url = false,
}) {
try {
const git = simpleGit(repo_path);
switch (action) {
case "list":
// Get all remotes with their URLs
const remotes = await git.remote(["-v"]);
// Parse the output
const remotesList = [];
const lines = remotes.trim().split("\n");
for (const line of lines) {
const match = line.match(/^([^\s]+)\s+([^\s]+)\s+\(([^)]+)\)$/);
if (match) {
const remoteName = match[1];
const remoteUrl = match[2];
const purpose = match[3];
// Check if this remote is already in our list
const existingRemote = remotesList.find(
(r) => r.name === remoteName
);
if (existingRemote) {
if (purpose === "fetch") {
existingRemote.fetch_url = remoteUrl;
} else if (purpose === "push") {
existingRemote.push_url = remoteUrl;
}
} else {
const remote = { name: remoteName };
if (purpose === "fetch") {
remote.fetch_url = remoteUrl;
} else if (purpose === "push") {
remote.push_url = remoteUrl;
}
remotesList.push(remote);
}
}
}
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
remotes: remotesList,
},
null,
2
),
},
],
};
case "add":
if (!name) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Remote name is required for add action" },
null,
2
),
},
],
isError: true,
};
}
if (!url) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Remote URL is required for add action" },
null,
2
),
},
],
isError: true,
};
}
// Add the remote
await git.remote(["add", name, url]);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Added remote '${name}' with URL '${url}'`,
name: name,
url: url,
},
null,
2
),
},
],
};
case "remove":
if (!name) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Remote name is required for remove action" },
null,
2
),
},
],
isError: true,
};
}
// Remove the remote
await git.remote(["remove", name]);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Removed remote '${name}'`,
name: name,
},
null,
2
),
},
],
};
case "set-url":
if (!name) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Remote name is required for set-url action" },
null,
2
),
},
],
isError: true,
};
}
if (!url) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Remote URL is required for set-url action" },
null,
2
),
},
],
isError: true,
};
}
// Set the remote URL (fetch or push)
const args = ["set-url"];
if (push_url) {
args.push("--push");
}
args.push(name, url);
await git.remote(args);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Updated ${
push_url ? "push" : "fetch"
} URL for remote '${name}' to '${url}'`,
name: name,
url: url,
type: push_url ? "push" : "fetch",
},
null,
2
),
},
],
};
case "get-url":
if (!name) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Remote name is required for get-url action" },
null,
2
),
},
],
isError: true,
};
}
// Get the remote URL(s)
const getUrlArgs = ["get-url"];
if (push_url) {
getUrlArgs.push("--push");
}
getUrlArgs.push(name);
const remoteUrl = await git.remote(getUrlArgs);
const urls = remoteUrl.trim().split("\n");
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
name: name,
urls: urls,
type: push_url ? "push" : "fetch",
},
null,
2
),
},
],
};
case "rename":
if (!name) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Remote name is required for rename action" },
null,
2
),
},
],
isError: true,
};
}
if (!new_name) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "New remote name is required for rename action" },
null,
2
),
},
],
isError: true,
};
}
// Rename the remote
await git.remote(["rename", name, new_name]);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Renamed remote '${name}' to '${new_name}'`,
old_name: name,
new_name: new_name,
},
null,
2
),
},
],
};
case "prune":
if (!name) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Remote name is required for prune action" },
null,
2
),
},
],
isError: true,
};
}
// Prune the remote
await git.remote(["prune", name]);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Pruned remote '${name}'`,
name: name,
},
null,
2
),
},
],
};
case "show":
if (!name) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Remote name is required for show action" },
null,
2
),
},
],
isError: true,
};
}
// Show remote details
const showOutput = await git.raw(["remote", "show", name]);
// Parse the output to extract useful information
const remoteLines = showOutput.trim().split("\n");
const remoteInfo = {
name: name,
fetch_url: "",
push_url: "",
head_branch: "",
remote_branches: [],
local_branches: [],
};
for (const line of remoteLines) {
const trimmed = line.trim();
if (trimmed.startsWith("Fetch URL:")) {
remoteInfo.fetch_url = trimmed
.substring("Fetch URL:".length)
.trim();
} else if (trimmed.startsWith("Push URL:")) {
remoteInfo.push_url = trimmed.substring("Push URL:".length).trim();
} else if (trimmed.startsWith("HEAD branch:")) {
remoteInfo.head_branch = trimmed
.substring("HEAD branch:".length)
.trim();
} else if (trimmed.startsWith("Remote branch")) {
// Skip the "Remote branches:" line
} else if (trimmed.startsWith("Local branch")) {
// Skip the "Local branches:" line
} else if (trimmed.includes("merges with remote")) {
const parts = trimmed.split("merges with remote");
if (parts.length === 2) {
const localBranch = parts[0].trim();
const remoteBranch = parts[1].trim();
remoteInfo.local_branches.push({
local: localBranch,
remote: remoteBranch,
});
}
} else if (trimmed.includes("tracked")) {
const branch = trimmed.split(" ")[0].trim();
if (branch) {
remoteInfo.remote_branches.push(branch);
}
}
}
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
remote: remoteInfo,
raw_output: showOutput,
},
null,
2
),
},
],
};
default:
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Unknown remote action: ${action}` },
null,
2
),
},
],
isError: true,
};
}
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to manage remote: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
```
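Every handler above builds the same MCP tool-result envelope by hand: a `content` array holding a single JSON-stringified `text` item, plus `isError: true` on failure. A minimal sketch of that shape (the `textResult`/`errorResult` helper names are hypothetical, not part of this codebase):

```javascript
// Hypothetical helpers illustrating the response envelope each handler returns inline.
function textResult(payload) {
  return {
    content: [{ type: "text", text: JSON.stringify(payload, null, 2) }],
  };
}

function errorResult(message) {
  // Same envelope, but flagged as an error for the MCP client.
  return { ...textResult({ error: message }), isError: true };
}

console.log(textResult({ success: true, name: "origin" }));
console.log(errorResult("Remote name is required"));
```

Factoring the envelope into helpers like these would also shrink the repeated `return { content: [...] }` blocks that appear in every switch branch.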
--------------------------------------------------------------------------------
/src/handlers/other-operations.js:
--------------------------------------------------------------------------------
```javascript
import { path, fs, simpleGit, execPromise } from "./common.js";
/**
 * Manages Git hooks in the repository
 * @param {string} repo_path - Path to the local repository
 * @param {string} action - Hook action (list, get, create)
 * @param {string} hook_name - Name of the hook (e.g., "pre-commit", "post-merge")
 * @param {string} script - Script content for the hook (for create action)
 * @returns {Object} - Hook operation result
 */
export async function handleGitHooks({
repo_path,
action,
hook_name = "",
script = "",
}) {
try {
// Path to the hooks directory
const hooksDir = path.join(repo_path, ".git", "hooks");
switch (action) {
case "list":
// Get all available hooks
const files = await fs.readdir(hooksDir);
const hooks = [];
for (const file of files) {
// Filter out sample hooks
if (!file.endsWith(".sample")) {
const hookPath = path.join(hooksDir, file);
const stats = await fs.stat(hookPath);
hooks.push({
name: file,
path: hookPath,
size: stats.size,
executable: (stats.mode & 0o111) !== 0, // Check if executable
});
}
}
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
hooks: hooks,
},
null,
2
),
},
],
};
case "get":
if (!hook_name) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Hook name is required for get action" },
null,
2
),
},
],
isError: true,
};
}
const hookPath = path.join(hooksDir, hook_name);
// Check if hook exists
if (!(await fs.pathExists(hookPath))) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Hook '${hook_name}' does not exist` },
null,
2
),
},
],
isError: true,
};
}
// Read hook content
const hookContent = await fs.readFile(hookPath, "utf8");
const stats = await fs.stat(hookPath);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
name: hook_name,
content: hookContent,
executable: (stats.mode & 0o111) !== 0,
},
null,
2
),
},
],
};
case "create":
if (!hook_name) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Hook name is required for create action" },
null,
2
),
},
],
isError: true,
};
}
if (!script) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Script content is required for create action" },
null,
2
),
},
],
isError: true,
};
}
const createHookPath = path.join(hooksDir, hook_name);
// Write hook content
await fs.writeFile(createHookPath, script);
// Make hook executable
await fs.chmod(createHookPath, 0o755);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Created hook '${hook_name}'`,
name: hook_name,
executable: true,
},
null,
2
),
},
],
};
default:
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Unknown hook action: ${action}` },
null,
2
),
},
],
isError: true,
};
}
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to manage hook: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
/**
 * Reverts a commit
 * @param {string} repo_path - Path to the local repository
 * @param {string} commit - Commit hash or reference to revert
 * @param {boolean} no_commit - Whether to stage the revert without committing
 * @returns {Object} - Revert result
 */
export async function handleGitRevert({
repo_path,
commit,
no_commit = false,
}) {
try {
const git = simpleGit(repo_path);
if (!commit) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Commit reference is required" },
null,
2
),
},
],
isError: true,
};
}
// Build the revert command
const revertOptions = [];
if (no_commit) {
revertOptions.push("--no-commit");
}
// Perform the revert
const result = await git.raw(["revert", ...revertOptions, commit]);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Reverted commit ${commit}`,
commit: commit,
result: result,
},
null,
2
),
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{
error: `Failed to revert commit: ${error.message}`,
conflicts: error.git ? error.git.conflicts : null,
},
null,
2
),
},
],
isError: true,
};
}
}
/**
 * Performs Git clean operations
 * @param {string} repo_path - Path to the local repository
 * @param {boolean} directories - Whether to remove directories as well
 * @param {boolean} force - Whether to force clean
 * @param {boolean} dry_run - Whether to perform a dry run (show what would be removed)
 * @returns {Object} - Clean result
 */
export async function handleGitClean({
repo_path,
directories = false,
force = false,
dry_run = true,
}) {
try {
const git = simpleGit(repo_path);
// At least one of force or dry_run must be true for safety
if (!force && !dry_run) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "For safety, either force or dry_run must be true" },
null,
2
),
},
],
isError: true,
};
}
// Build the clean command
const cleanOptions = [];
if (directories) {
cleanOptions.push("-d");
}
if (force) {
cleanOptions.push("-f");
}
if (dry_run) {
cleanOptions.push("-n");
}
    // Preview the files that would be removed (git.raw keeps the plain-text output)
    const preview = await git.raw([
      "clean",
      "--dry-run",
      ...(directories ? ["-d"] : []),
    ]);
    const filesToRemove = preview
      .split("\n")
      .filter((line) => line.startsWith("Would remove"))
      .map((line) => line.replace("Would remove ", "").trim());
    if (!dry_run) {
      // Perform the actual clean with the assembled flags
      await git.raw(["clean", ...cleanOptions]);
    }
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: dry_run
? `Would remove ${filesToRemove.length} files/directories`
: `Removed ${filesToRemove.length} files/directories`,
files: filesToRemove,
dry_run: dry_run,
},
null,
2
),
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to clean repository: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
/**
 * Fetches Git LFS objects
 * @param {string} repo_path - Path to the local repository
 * @param {boolean} dry_run - Whether to perform a dry run
 * @param {boolean} pointers - Whether to pass the --pointers flag
 * @returns {Object} - LFS fetch result
 */
export async function handleGitLFSFetch({
repo_path,
dry_run = false,
pointers = false,
}) {
try {
// Build the command
let command = `cd "${repo_path}" && git lfs fetch`;
if (dry_run) {
command += " --dry-run";
}
if (pointers) {
command += " --pointers";
}
// Execute the command
const { stdout, stderr } = await execPromise(command);
// Parse the output
const output = stdout.trim();
const errors = stderr.trim();
if (errors && !output) {
return {
content: [
{
type: "text",
text: JSON.stringify({ error: errors }, null, 2),
},
],
isError: true,
};
}
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: "Git LFS fetch completed",
output: output,
dry_run: dry_run,
},
null,
2
),
},
],
};
} catch (error) {
// Special handling for "git lfs not installed" error
if (error.message.includes("git: lfs is not a git command")) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Git LFS is not installed on the system" },
null,
2
),
},
],
isError: true,
};
}
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to fetch LFS objects: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
/**
 * Gets blame information for a file
 * @param {string} repo_path - Path to the local repository
 * @param {string} file_path - Path to the file
 * @param {string} rev - Revision to blame (default: HEAD)
 * @returns {Object} - Blame result
 */
export async function handleGitBlame({ repo_path, file_path, rev = "HEAD" }) {
try {
const git = simpleGit(repo_path);
// Run git blame
const blameResult = await git.raw([
"blame",
"--line-porcelain",
rev,
"--",
file_path,
]);
// Parse the output
const lines = blameResult.split("\n");
const blameInfo = [];
let currentCommit = null;
for (let i = 0; i < lines.length; i++) {
const line = lines[i];
// Start of a new blame entry
if (line.match(/^[0-9a-f]{40}/)) {
if (currentCommit) {
blameInfo.push(currentCommit);
}
const parts = line.split(" ");
currentCommit = {
hash: parts[0],
          originalLine: parseInt(parts[1], 10),
          finalLine: parseInt(parts[2], 10),
          lineCount: parseInt(parts[3] || "1", 10),
author: "",
authorMail: "",
authorTime: 0,
subject: "",
content: "",
};
} else if (line.startsWith("author ") && currentCommit) {
currentCommit.author = line.substring(7);
} else if (line.startsWith("author-mail ") && currentCommit) {
currentCommit.authorMail = line.substring(12).replace(/[<>]/g, "");
      } else if (line.startsWith("author-time ") && currentCommit) {
        currentCommit.authorTime = parseInt(line.substring(12), 10);
} else if (line.startsWith("summary ") && currentCommit) {
currentCommit.subject = line.substring(8);
} else if (line.startsWith("\t") && currentCommit) {
// This is the content line
currentCommit.content = line.substring(1);
blameInfo.push(currentCommit);
currentCommit = null;
}
}
// Add the last commit if there is one
if (currentCommit) {
blameInfo.push(currentCommit);
}
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
file: file_path,
blame: blameInfo,
},
null,
2
),
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to get blame information: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
/**
 * Manages git attributes for files
 * @param {string} repo_path - Path to the local repository
 * @param {string} action - Action (get, set, list)
 * @param {string} pattern - File pattern
 * @param {string} attribute - Attribute to set
 * @returns {Object} - Operation result
 */
export async function handleGitAttributes({
repo_path,
action,
pattern = "",
attribute = "",
}) {
try {
const attributesPath = path.join(repo_path, ".gitattributes");
switch (action) {
case "list":
// Check if .gitattributes exists
if (!(await fs.pathExists(attributesPath))) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
attributes: [],
message: ".gitattributes file does not exist",
},
null,
2
),
},
],
};
}
// Read and parse .gitattributes
const content = await fs.readFile(attributesPath, "utf8");
const lines = content
.split("\n")
.filter((line) => line.trim() && !line.startsWith("#"));
const attributes = lines.map((line) => {
const parts = line.trim().split(/\s+/);
return {
pattern: parts[0],
attributes: parts.slice(1),
};
});
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
attributes: attributes,
},
null,
2
),
},
],
};
case "get":
if (!pattern) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Pattern is required for get action" },
null,
2
),
},
],
isError: true,
};
}
// Check if .gitattributes exists
if (!(await fs.pathExists(attributesPath))) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
pattern: pattern,
attributes: [],
message: ".gitattributes file does not exist",
},
null,
2
),
},
],
};
}
// Read and find pattern
const getContent = await fs.readFile(attributesPath, "utf8");
const getLines = getContent.split("\n");
const matchingLines = getLines.filter((line) => {
const parts = line.trim().split(/\s+/);
return parts[0] === pattern;
});
if (matchingLines.length === 0) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
pattern: pattern,
attributes: [],
message: `No attributes found for pattern '${pattern}'`,
},
null,
2
),
},
],
};
}
// Parse attributes
const patternAttributes = matchingLines
.map((line) => {
const parts = line.trim().split(/\s+/);
return parts.slice(1);
})
.flat();
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
pattern: pattern,
attributes: patternAttributes,
},
null,
2
),
},
],
};
case "set":
if (!pattern) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Pattern is required for set action" },
null,
2
),
},
],
isError: true,
};
}
if (!attribute) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Attribute is required for set action" },
null,
2
),
},
],
isError: true,
};
}
// Check if .gitattributes exists, create if not
if (!(await fs.pathExists(attributesPath))) {
await fs.writeFile(attributesPath, "");
}
// Read current content
const setContent = await fs.readFile(attributesPath, "utf8");
const setLines = setContent.split("\n");
// Check if pattern already exists
const patternIndex = setLines.findIndex((line) => {
const parts = line.trim().split(/\s+/);
return parts[0] === pattern;
});
if (patternIndex !== -1) {
// Update existing pattern
const parts = setLines[patternIndex].trim().split(/\s+/);
// Check if attribute already exists
if (!parts.includes(attribute)) {
parts.push(attribute);
setLines[patternIndex] = parts.join(" ");
}
} else {
// Add new pattern
setLines.push(`${pattern} ${attribute}`);
}
// Write back to file
await fs.writeFile(
attributesPath,
setLines.filter(Boolean).join("\n") + "\n"
);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Set attribute '${attribute}' for pattern '${pattern}'`,
pattern: pattern,
attribute: attribute,
},
null,
2
),
},
],
};
default:
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Unknown attributes action: ${action}` },
null,
2
),
},
],
isError: true,
};
}
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to manage git attributes: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
/**
 * Creates a git archive (zip or tar)
 * @param {string} repo_path - Path to the local repository
 * @param {string} output_path - Output path for the archive
 * @param {string} format - Archive format (zip or tar)
 * @param {string} prefix - Prefix for files in the archive
 * @param {string} treeish - Tree-ish to archive (default: HEAD)
 * @returns {Object} - Archive result
 */
export async function handleGitArchive({
repo_path,
output_path,
format = "zip",
prefix = "",
treeish = "HEAD",
}) {
try {
const git = simpleGit(repo_path);
// Validate format
if (!["zip", "tar"].includes(format)) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{
error: `Invalid archive format: ${format}. Use 'zip' or 'tar'.`,
},
null,
2
),
},
],
isError: true,
};
}
// Build archive command
const archiveArgs = ["archive", `--format=${format}`];
if (prefix) {
archiveArgs.push(`--prefix=${prefix}/`);
}
archiveArgs.push("-o", output_path, treeish);
// Create archive
await git.raw(archiveArgs);
// Check if archive was created
if (!(await fs.pathExists(output_path))) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Failed to create archive: output file not found" },
null,
2
),
},
],
isError: true,
};
}
// Get file size
const stats = await fs.stat(output_path);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Created ${format} archive at ${output_path}`,
format: format,
output_path: output_path,
size_bytes: stats.size,
treeish: treeish,
},
null,
2
),
},
],
};
} catch (error) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to create archive: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
/**
 * Manages Git LFS (Large File Storage)
 * @param {string} repo_path - Path to the local repository
 * @param {string} action - LFS action (install, track, untrack, list)
 * @param {string|string[]} patterns - File patterns for track/untrack
 * @returns {Object} - Operation result
 */
export async function handleGitLFS({ repo_path, action, patterns = [] }) {
try {
// Make sure patterns is an array
const patternsArray = Array.isArray(patterns) ? patterns : [patterns];
switch (action) {
case "install":
// Install Git LFS in the repository
const { stdout: installOutput } = await execPromise(
`cd "${repo_path}" && git lfs install`
);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: "Git LFS installed successfully",
output: installOutput.trim(),
},
null,
2
),
},
],
};
case "track":
if (patternsArray.length === 0) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{
error: "At least one pattern is required for track action",
},
null,
2
),
},
],
isError: true,
};
}
// Track files with LFS
const trackResults = [];
for (const pattern of patternsArray) {
const { stdout: trackOutput } = await execPromise(
`cd "${repo_path}" && git lfs track "${pattern}"`
);
trackResults.push({
pattern: pattern,
output: trackOutput.trim(),
});
}
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Tracked ${patternsArray.length} pattern(s) with Git LFS`,
patterns: patternsArray,
results: trackResults,
},
null,
2
),
},
],
};
case "untrack":
if (patternsArray.length === 0) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{
error:
"At least one pattern is required for untrack action",
},
null,
2
),
},
],
isError: true,
};
}
// Untrack files from LFS
const untrackResults = [];
for (const pattern of patternsArray) {
const { stdout: untrackOutput } = await execPromise(
`cd "${repo_path}" && git lfs untrack "${pattern}"`
);
untrackResults.push({
pattern: pattern,
output: untrackOutput.trim(),
});
}
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
message: `Untracked ${patternsArray.length} pattern(s) from Git LFS`,
patterns: patternsArray,
results: untrackResults,
},
null,
2
),
},
],
};
case "list":
// List tracked patterns
const { stdout: listOutput } = await execPromise(
`cd "${repo_path}" && git lfs track`
);
        // Parse the output to extract patterns; listing lines look like
        // "    *.psd (.gitattributes)", not the 'Tracking "..."' confirmation message
        const trackedPatterns = listOutput
          .split("\n")
          .filter((line) => line.includes("("))
          .map((line) => {
            const match = line.trim().match(/^(.+?)\s+\(/);
            return match ? match[1] : null;
          })
          .filter(Boolean);
return {
content: [
{
type: "text",
text: JSON.stringify(
{
success: true,
tracked_patterns: trackedPatterns,
},
null,
2
),
},
],
};
default:
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Unknown LFS action: ${action}` },
null,
2
),
},
],
isError: true,
};
}
} catch (error) {
// Special handling for "git lfs not installed" error
if (error.message.includes("git: lfs is not a git command")) {
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: "Git LFS is not installed on the system" },
null,
2
),
},
],
isError: true,
};
}
return {
content: [
{
type: "text",
text: JSON.stringify(
{ error: `Failed to perform LFS operation: ${error.message}` },
null,
2
),
},
],
isError: true,
};
}
}
```
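`handleGitBlame` above parses `git blame --line-porcelain` output, where each file line is preceded by a header block: a `<hash> <orig-line> <final-line> [<num-lines>]` line, then fields such as `author`, `author-mail`, `author-time`, and `summary`, and finally the content line prefixed with a tab. A minimal, self-contained sketch of that parsing logic, run against a hard-coded sample rather than a live repository:

```javascript
// Sketch of the --line-porcelain parsing used by handleGitBlame.
function parseBlamePorcelain(output) {
  const entries = [];
  let current = null;
  for (const line of output.split("\n")) {
    if (/^[0-9a-f]{40}\b/.test(line)) {
      // Header line: hash, original line number, final line number.
      const parts = line.split(" ");
      current = {
        hash: parts[0],
        originalLine: parseInt(parts[1], 10),
        finalLine: parseInt(parts[2], 10),
        author: "",
        authorMail: "",
        authorTime: 0,
        subject: "",
        content: "",
      };
    } else if (current && line.startsWith("author ")) {
      current.author = line.substring(7);
    } else if (current && line.startsWith("author-mail ")) {
      current.authorMail = line.substring(12).replace(/[<>]/g, "");
    } else if (current && line.startsWith("author-time ")) {
      current.authorTime = parseInt(line.substring(12), 10);
    } else if (current && line.startsWith("summary ")) {
      current.subject = line.substring(8);
    } else if (current && line.startsWith("\t")) {
      // Tab-prefixed line is the file content; this closes the entry.
      current.content = line.substring(1);
      entries.push(current);
      current = null;
    }
  }
  return entries;
}

const sample = [
  "1234567890abcdef1234567890abcdef12345678 1 1 1",
  "author Jane Doe",
  "author-mail <jane@example.com>",
  "author-time 1700000000",
  "summary Initial commit",
  "\tconst x = 1;",
].join("\n");

console.log(parseBlamePorcelain(sample));
```

Unlike plain `git blame`, the `--line-porcelain` format repeats the full header for every line, which is what makes this one-pass state machine sufficient.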
--------------------------------------------------------------------------------
/src/server.js:
--------------------------------------------------------------------------------
```javascript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import {
CallToolRequestSchema,
ErrorCode,
ListToolsRequestSchema,
McpError,
} from "@modelcontextprotocol/sdk/types.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
handleGitDirectoryStructure,
handleGitReadFiles,
handleGitBranchDiff,
handleGitCommitHistory,
handleGitCommitsDetails,
handleGitLocalChanges,
handleGitSearchCode,
handleGitCommit,
handleGitTrack,
handleGitCheckoutBranch,
handleGitDeleteBranch,
handleGitMergeBranch,
handleGitPush,
handleGitPull,
handleGitStash,
handleGitCreateTag,
handleGitRebase,
handleGitConfig,
handleGitReset,
handleGitArchive,
handleGitAttributes,
handleGitBlame,
handleGitClean,
handleGitHooks,
handleGitLFS,
handleGitLFSFetch,
handleGitRevert,
} from "./handlers/index.js";
/**
* Main server class for the Git Repository Browser MCP server
*/
export class GitRepoBrowserServer {
/**
* Initialize the server
*/
constructor() {
this.server = new Server(
{
name: "mcp-git-repo-browser",
version: "0.1.0",
},
{
capabilities: {
tools: {},
},
}
);
this.setupToolHandlers();
// Error handling
this.server.onerror = (error) => console.error("[MCP Error]", error);
process.on("SIGINT", async () => {
await this.server.close();
process.exit(0);
});
}
/**
* Get all registered handler names
* @returns {string[]} Array of handler names
*/
getHandlerNames() {
return Object.keys(this.handlersMap || {});
}
/**
* Check if a handler exists
* @param {string} name - Handler name to check
* @returns {boolean} True if handler exists
*/
hasHandler(name) {
return Boolean(this.handlersMap && this.handlersMap[name]);
}
/**
* Set up tool handlers for the server
*/
setupToolHandlers() {
// Store tools list for dynamic updates
this.toolsList = [
// Basic Repository Operations
{
name: "git_directory_structure",
description:
"Clone a Git repository and return its directory structure in a tree format.",
inputSchema: {
type: "object",
properties: {
repo_url: {
type: "string",
description: "The URL of the Git repository",
},
},
required: ["repo_url"],
},
},
{
name: "git_read_files",
description:
"Read the contents of specified files in a given git repository.",
inputSchema: {
type: "object",
properties: {
repo_url: {
type: "string",
description: "The URL of the Git repository",
},
file_paths: {
type: "array",
items: { type: "string" },
description:
"List of file paths to read (relative to repository root)",
},
},
required: ["repo_url", "file_paths"],
},
},
// Branch Operations
{
name: "git_branch_diff",
description:
"Compare two branches and show files changed between them.",
inputSchema: {
type: "object",
properties: {
repo_url: {
type: "string",
description: "The URL of the Git repository",
},
source_branch: {
type: "string",
description: "The source branch name",
},
target_branch: {
type: "string",
description: "The target branch name",
},
show_patch: {
type: "boolean",
description: "Whether to include the actual diff patches",
default: false,
},
},
required: ["repo_url", "source_branch", "target_branch"],
},
},
{
name: "git_checkout_branch",
description: "Create and/or checkout a branch.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
branch_name: {
type: "string",
description: "The name of the branch to checkout",
},
start_point: {
type: "string",
description: "Starting point for the branch (optional)",
},
create: {
type: "boolean",
description: "Whether to create a new branch",
default: false,
},
},
required: ["repo_path", "branch_name"],
},
},
{
name: "git_delete_branch",
description: "Delete a branch from the repository.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
branch_name: {
type: "string",
description: "The name of the branch to delete",
},
force: {
type: "boolean",
description: "Whether to force deletion",
default: false,
},
},
required: ["repo_path", "branch_name"],
},
},
{
name: "git_merge_branch",
description: "Merge a source branch into the current or target branch.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
source_branch: {
type: "string",
description: "Branch to merge from",
},
target_branch: {
type: "string",
description:
"Branch to merge into (optional, uses current branch if not provided)",
},
no_fast_forward: {
type: "boolean",
description:
"Whether to create a merge commit even if fast-forward is possible",
default: false,
},
},
required: ["repo_path", "source_branch"],
},
},
// Commit Operations
{
name: "git_commit_history",
description: "Get commit history for a branch with optional filtering.",
inputSchema: {
type: "object",
properties: {
repo_url: {
type: "string",
description: "The URL of the Git repository",
},
branch: {
type: "string",
description: "The branch to get history from",
default: "main",
},
max_count: {
type: "integer",
description: "Maximum number of commits to retrieve",
default: 10,
},
author: {
type: "string",
description: "Filter by author (optional)",
},
since: {
type: "string",
description:
'Get commits after this date (e.g., "1 week ago", "2023-01-01")',
},
until: {
type: "string",
description:
'Get commits before this date (e.g., "yesterday", "2023-12-31")',
},
grep: {
type: "string",
description: "Filter commits by message content (optional)",
},
},
required: ["repo_url"],
},
},
{
name: "git_commits_details",
description:
"Get detailed information about commits including full messages and diffs.",
inputSchema: {
type: "object",
properties: {
repo_url: {
type: "string",
description: "The URL of the Git repository",
},
branch: {
type: "string",
description: "The branch to get commits from",
default: "main",
},
max_count: {
type: "integer",
description: "Maximum number of commits to retrieve",
default: 10,
},
include_diff: {
type: "boolean",
description: "Whether to include the commit diffs",
default: false,
},
since: {
type: "string",
description:
'Get commits after this date (e.g., "1 week ago", "2023-01-01")',
},
until: {
type: "string",
description:
'Get commits before this date (e.g., "yesterday", "2023-12-31")',
},
author: {
type: "string",
description: "Filter by author (optional)",
},
grep: {
type: "string",
description: "Filter commits by message content (optional)",
},
},
required: ["repo_url"],
},
},
{
name: "git_commit",
description: "Create a commit with the specified message.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
message: {
type: "string",
description: "The commit message",
},
},
required: ["repo_path", "message"],
},
},
{
name: "git_track",
description: "Track (stage) specific files or all files.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
files: {
type: "array",
items: { type: "string" },
description:
'Array of file paths to track/stage (use ["."] for all files)',
default: ["."],
},
},
required: ["repo_path"],
},
},
{
name: "git_local_changes",
description: "Get uncommitted changes in the working directory.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
},
required: ["repo_path"],
},
},
{
name: "git_search_code",
description: "Search for patterns in repository code.",
inputSchema: {
type: "object",
properties: {
repo_url: {
type: "string",
description: "The URL of the Git repository",
},
pattern: {
type: "string",
description: "Search pattern (regex or string)",
},
file_patterns: {
type: "array",
items: { type: "string" },
description: 'Optional file patterns to filter (e.g., "*.js")',
},
case_sensitive: {
type: "boolean",
description: "Whether the search is case sensitive",
default: false,
},
context_lines: {
type: "integer",
description: "Number of context lines to include",
default: 2,
},
},
required: ["repo_url", "pattern"],
},
},
// Remote Operations
{
name: "git_push",
description: "Push changes to a remote repository.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
remote: {
type: "string",
description: "Remote name",
default: "origin",
},
branch: {
type: "string",
description: "Branch to push (default: current branch)",
},
force: {
type: "boolean",
description: "Whether to force push",
default: false,
},
},
required: ["repo_path"],
},
},
{
name: "git_pull",
description: "Pull changes from a remote repository.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
remote: {
type: "string",
description: "Remote name",
default: "origin",
},
branch: {
type: "string",
description: "Branch to pull (default: current branch)",
},
rebase: {
type: "boolean",
description: "Whether to rebase instead of merge",
default: false,
},
},
required: ["repo_path"],
},
},
// Stash Operations
{
name: "git_stash",
description: "Create or apply a stash.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
action: {
type: "string",
description: "Stash action (save, pop, apply, list, drop)",
default: "save",
enum: ["save", "pop", "apply", "list", "drop"],
},
message: {
type: "string",
description: "Stash message (for save action)",
default: "",
},
index: {
type: "integer",
description: "Stash index (for pop, apply, drop actions)",
default: 0,
},
},
required: ["repo_path"],
},
},
// Tag Operations
{
name: "git_create_tag",
description: "Create a tag.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
tag_name: {
type: "string",
description: "Name of the tag",
},
message: {
type: "string",
description: "Tag message (for annotated tags)",
default: "",
},
annotated: {
type: "boolean",
description: "Whether to create an annotated tag",
default: true,
},
},
required: ["repo_path", "tag_name"],
},
},
// Advanced Operations
{
name: "git_rebase",
description: "Rebase the current branch onto another branch or commit.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
onto: {
type: "string",
description: "Branch or commit to rebase onto",
},
interactive: {
type: "boolean",
description: "Whether to perform an interactive rebase",
default: false,
},
},
required: ["repo_path", "onto"],
},
},
// Configuration
{
name: "git_config",
description: "Configure git settings for the repository.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
scope: {
type: "string",
description: "Configuration scope (local, global, system)",
default: "local",
enum: ["local", "global", "system"],
},
key: {
type: "string",
description: "Configuration key",
},
value: {
type: "string",
description: "Configuration value",
},
},
required: ["repo_path", "key", "value"],
},
},
// Repo Management
{
name: "git_reset",
description: "Reset repository to specified commit or state.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
mode: {
type: "string",
description: "Reset mode (soft, mixed, hard)",
default: "mixed",
enum: ["soft", "mixed", "hard"],
},
to: {
type: "string",
description: "Commit or reference to reset to",
default: "HEAD",
},
},
required: ["repo_path"],
},
},
// Archive Operations
{
name: "git_archive",
description: "Create a git archive (zip or tar).",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
output_path: {
type: "string",
description: "Output path for the archive",
},
format: {
type: "string",
description: "Archive format (zip or tar)",
default: "zip",
enum: ["zip", "tar"],
},
prefix: {
type: "string",
description: "Prefix for files in the archive",
},
treeish: {
type: "string",
description: "Tree-ish to archive (default: HEAD)",
default: "HEAD",
},
},
required: ["repo_path", "output_path"],
},
},
// Attributes Operations
{
name: "git_attributes",
description: "Manage git attributes for files.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
action: {
type: "string",
description: "Action (get, set, list)",
default: "list",
enum: ["get", "set", "list"],
},
pattern: {
type: "string",
description: "File pattern",
},
attribute: {
type: "string",
description: "Attribute to set",
},
},
        required: ["repo_path"],
},
},
// Blame Operations
{
name: "git_blame",
description: "Get blame information for a file.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
file_path: {
type: "string",
description: "Path to the file",
},
rev: {
type: "string",
description: "Revision to blame (default: HEAD)",
default: "HEAD",
},
},
required: ["repo_path", "file_path"],
},
},
// Clean Operations
{
name: "git_clean",
description: "Perform git clean operations.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
directories: {
type: "boolean",
description: "Whether to remove directories as well",
default: false,
},
force: {
type: "boolean",
description: "Whether to force clean",
default: false,
},
dry_run: {
type: "boolean",
description: "Whether to perform a dry run",
default: true,
},
},
required: ["repo_path"],
},
},
// Hooks Operations
{
name: "git_hooks",
description: "Manage git hooks in the repository.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
action: {
type: "string",
description: "Hook action (list, get, create, enable, disable)",
default: "list",
enum: ["list", "get", "create", "enable", "disable"],
},
hook_name: {
type: "string",
description:
"Name of the hook (e.g., 'pre-commit', 'post-merge')",
},
script: {
type: "string",
description: "Script content for the hook (for create action)",
},
},
        required: ["repo_path"],
},
},
// LFS Operations
{
name: "git_lfs",
description: "Manage Git LFS (Large File Storage).",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
action: {
type: "string",
description: "LFS action (install, track, untrack, list)",
default: "list",
enum: ["install", "track", "untrack", "list"],
},
patterns: {
type: "array",
description: "File patterns for track/untrack",
items: { type: "string" },
},
},
        required: ["repo_path"],
},
},
// LFS Fetch Operations
{
name: "git_lfs_fetch",
description: "Fetch LFS objects from the remote repository.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
dry_run: {
type: "boolean",
description: "Whether to perform a dry run",
default: false,
},
pointers: {
type: "boolean",
description: "Whether to convert pointers to objects",
default: false,
},
},
required: ["repo_path"],
},
},
// Revert Operations
{
name: "git_revert",
      description: "Create a new commit that undoes the changes introduced by a commit.",
inputSchema: {
type: "object",
properties: {
repo_path: {
type: "string",
description: "The path to the local Git repository",
},
commit: {
type: "string",
description: "Commit hash or reference to revert",
},
no_commit: {
type: "boolean",
description: "Whether to stage changes without committing",
default: false,
},
},
required: ["repo_path"],
},
},
];
// Set up dynamic tool listing handler
this.server.setRequestHandler(ListToolsRequestSchema, async () => ({
tools: this.toolsList,
}));
// Handler categories for organization and improved discoverability
this.handlerCategories = {
read: [
"git_directory_structure",
"git_read_files",
"git_branch_diff",
"git_commit_history",
"git_commits_details",
"git_local_changes",
"git_search_code",
],
write: ["git_commit", "git_track", "git_reset"],
branch: [
"git_checkout_branch",
"git_delete_branch",
"git_merge_branch",
"git_branch_diff",
],
remote: ["git_push", "git_pull"],
stash: ["git_stash"],
config: ["git_config"],
tag: ["git_create_tag"],
advanced: ["git_rebase"],
};
// Create handler aliases for improved usability
this.handlerAliases = {
git_ls: "git_directory_structure",
git_show: "git_read_files",
git_diff: "git_branch_diff",
git_log: "git_commit_history",
git_status: "git_local_changes",
git_grep: "git_search_code",
git_add: "git_track",
git_checkout: "git_checkout_branch",
      git_fetch: "git_pull", // approximation: git_pull fetches and merges; there is no fetch-only handler
};
// Initialize statistics tracking
this.handlerStats = new Map();
// Create a handlers mapping for O(1) lookup time
this.handlersMap = {
// Primary handlers
git_directory_structure: handleGitDirectoryStructure,
git_read_files: handleGitReadFiles,
git_branch_diff: handleGitBranchDiff,
git_commit_history: handleGitCommitHistory,
git_commits_details: handleGitCommitsDetails,
git_local_changes: handleGitLocalChanges,
git_search_code: handleGitSearchCode,
git_commit: handleGitCommit,
git_track: handleGitTrack,
git_checkout_branch: handleGitCheckoutBranch,
git_delete_branch: handleGitDeleteBranch,
git_merge_branch: handleGitMergeBranch,
git_push: handleGitPush,
git_pull: handleGitPull,
git_stash: handleGitStash,
git_create_tag: handleGitCreateTag,
git_rebase: handleGitRebase,
git_config: handleGitConfig,
git_reset: handleGitReset,
git_archive: handleGitArchive,
git_attributes: handleGitAttributes,
git_blame: handleGitBlame,
git_clean: handleGitClean,
git_hooks: handleGitHooks,
git_lfs: handleGitLFS,
git_lfs_fetch: handleGitLFSFetch,
git_revert: handleGitRevert,
};
// Register aliases for O(1) lookup
Object.entries(this.handlerAliases).forEach(([alias, target]) => {
if (this.handlersMap[target]) {
this.handlersMap[alias] = this.handlersMap[target];
}
});
// Log registered handlers
console.error(
`[INFO] Registered ${
Object.keys(this.handlersMap).length
} Git tool handlers`
);
// Add method to get handlers by category
this.getHandlersByCategory = (category) => {
return this.handlerCategories[category] || [];
};
// Add method to execute multiple Git operations in sequence
this.executeSequence = async (operations) => {
const results = [];
for (const op of operations) {
const { name, arguments: args } = op;
const handler = this.handlersMap[name];
if (!handler) {
throw new McpError(ErrorCode.MethodNotFound, `Unknown tool: ${name}`);
}
results.push(await handler(args));
}
return results;
};
// Add method to check if a repository is valid
    this.validateRepository = async (repoPath) => {
      try {
        // A path counts as a repository if it contains a .git entry
        // (a directory for normal checkouts, a file for worktrees/submodules)
        const fs = await import("node:fs/promises");
        const path = await import("node:path");
        await fs.access(path.join(repoPath, ".git"));
        return true;
      } catch (error) {
        return false;
      }
    };
this.server.setRequestHandler(CallToolRequestSchema, async (request) => {
const { name, arguments: args } = request.params;
const startTime = Date.now();
// Handle batch operations as a special case
if (name === "git_batch") {
if (!Array.isArray(args.operations)) {
throw new McpError(
ErrorCode.InvalidParams,
"Operations must be an array"
);
}
return await this.executeSequence(args.operations);
}
try {
// Resolve handler via direct match or alias
const handler = this.handlersMap[name];
if (handler) {
// Track usage statistics
const stats = this.handlerStats.get(name) || {
count: 0,
totalTime: 0,
};
stats.count++;
this.handlerStats.set(name, stats);
console.error(`[INFO] Executing Git tool: ${name}`);
const result = await handler(args);
const executionTime = Date.now() - startTime;
stats.totalTime += executionTime;
console.error(`[INFO] Completed ${name} in ${executionTime}ms`);
return result;
}
// Suggest similar commands if not found
const similarCommands = Object.keys(this.handlersMap)
.filter((cmd) => cmd.includes(name.replace(/^git_/, "")))
.slice(0, 3);
const suggestion =
similarCommands.length > 0
? `. Did you mean: ${similarCommands.join(", ")}?`
: "";
throw new McpError(
ErrorCode.MethodNotFound,
`Unknown tool: ${name}${suggestion}`
);
} catch (error) {
// Enhanced error handling
if (error instanceof McpError) {
throw error;
}
console.error(`[ERROR] Failed to execute ${name}: ${error.message}`);
throw new McpError(
ErrorCode.InternalError,
`Failed to execute ${name}: ${error.message}`
);
}
});
/**
* Register a new handler at runtime
* @param {string} name - The name of the handler
* @param {Function} handler - The handler function
* @param {Object} [toolInfo] - Optional tool information for ListToolsRequestSchema
* @returns {boolean} True if registration was successful
*/
this.registerHandler = (name, handler, toolInfo) => {
if (typeof handler !== "function") {
throw new Error(`Handler for ${name} must be a function`);
}
// Add to handlers map
this.handlersMap[name] = handler;
// Update tools list if toolInfo is provided
if (toolInfo && typeof toolInfo === "object") {
// Get current tools
const currentTools = this.toolsList || [];
// Add new tool info if not already present
const exists = currentTools.some((tool) => tool.name === name);
if (!exists) {
this.toolsList = [...currentTools, { name, ...toolInfo }];
}
}
console.error(`[INFO] Dynamically registered new handler: ${name}`);
return true;
};
/**
* Remove a handler
* @param {string} name - The name of the handler to remove
* @returns {boolean} True if removal was successful
*/
this.unregisterHandler = (name) => {
if (!this.handlersMap[name]) {
return false;
}
delete this.handlersMap[name];
console.error(`[INFO] Unregistered handler: ${name}`);
return true;
};
}
/**
* Start the server
*/
async run() {
const transport = new StdioServerTransport();
await this.server.connect(transport);
console.error("Git Repo Browser MCP server running on stdio");
}
}
```
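The dynamic-registration and batch-sequence helpers above can be exercised in isolation. The sketch below mirrors their logic with a plain object; the MCP server wiring, `McpError`, and the real Git handlers are stubbed out, and `git_echo` is a made-up demo tool, not part of the server's tool list:

```javascript
// Minimal stand-in mirroring registerHandler/executeSequence from the server,
// so the registration and batching behavior can be tried without the MCP SDK.
const srv = {
  handlersMap: {},
  toolsList: [],
  registerHandler(name, handler, toolInfo) {
    if (typeof handler !== "function") {
      throw new Error(`Handler for ${name} must be a function`);
    }
    this.handlersMap[name] = handler;
    // Only extend the advertised tool list if info was supplied and is new
    if (toolInfo && !this.toolsList.some((t) => t.name === name)) {
      this.toolsList.push({ name, ...toolInfo });
    }
    return true;
  },
  async executeSequence(operations) {
    const results = [];
    for (const { name, arguments: args } of operations) {
      const handler = this.handlersMap[name];
      if (!handler) throw new Error(`Unknown tool: ${name}`);
      results.push(await handler(args));
    }
    return results;
  },
};

// Register a hypothetical demo tool, then run a two-step batch.
srv.registerHandler("git_echo", async (args) => args.msg, {
  description: "Echo back a message (demo only).",
});

srv
  .executeSequence([
    { name: "git_echo", arguments: { msg: "hello" } },
    { name: "git_echo", arguments: { msg: "world" } },
  ])
  .then((results) => console.log(results.join(" "))); // → "hello world"
```

Note that `executeSequence` stops at the first unknown tool or failed handler, so a `git_batch` request is all-or-nothing up to the failing step.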