# Directory Structure ``` ├── .gitignore ├── config │ └── example.env ├── config-mcp.json ├── DELETION_TOOLS.md ├── docker.yml ├── HIERARCHICAL_FEATURES.md ├── LICENSE ├── pyproject.toml ├── README.md ├── requirements.txt ├── setup.sh ├── src │ └── wiki_mcp_server.py ├── start-server.sh └── test-server.sh ``` # Files -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- ``` # Environment files .env .env.local .env.*.local # Database files *.db *.sqlite *.sqlite3 wikijs_mappings.db # Logs *.log logs/ wikijs_mcp.log # Python __pycache__/ *.py[cod] *$py.class *.so .Python build/ develop-eggs/ dist/ downloads/ eggs/ .eggs/ lib/ lib64/ parts/ sdist/ var/ wheels/ *.egg-info/ .installed.cfg *.egg MANIFEST # Virtual environments venv/ env/ ENV/ env.bak/ venv.bak/ .venv/ # Poetry poetry.lock # IDE .vscode/ .idea/ *.swp *.swo *~ # OS generated files .DS_Store .DS_Store? ._* .Spotlight-V100 .Trashes ehthumbs.db Thumbs.db # Testing .coverage .pytest_cache/ .tox/ .nox/ htmlcov/ # MyPy .mypy_cache/ .dmypy.json dmypy.json # Jupyter Notebook .ipynb_checkpoints # pyenv .python-version # Temporary files tmp/ temp/ *.tmp *.bak *.backup # Docker volumes postgres_data/ wikijs_data/ # Resources resources/ ``` -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- ```markdown # Wiki.js MCP Server A comprehensive **Model Context Protocol (MCP) server** for Wiki.js integration with **hierarchical documentation** support and Docker deployment. Perfect for organizations managing multiple repositories and large-scale documentation. ## 🚀 Quick Start ### 1. Environment Setup First, clone this repository and set up environment variables: ```bash # Copy environment template cp config/example.env .env # Edit .env with your credentials: # - Set POSTGRES_PASSWORD to a secure password # - Update other settings as needed ``` ### 2. Docker Deployment (Recommended) ```bash # Start Wiki.js with Docker docker-compose -f docker.yml up -d ``` Wiki.js will be available at http://localhost:3000 Complete the initial setup in the web interface ### 3. Setup MCP Server ```bash # Install Python dependencies ./setup.sh # Update .env with Wiki.js API credentials: # - Get API key from Wiki.js admin panel # - Set WIKIJS_TOKEN in .env file # Test the connection ./test-server.sh # Start MCP server # (not needed for AI IDEs like Cursor, simply click on the refresh icon after editing mcp.json # and you should see a green dot with all tools listed. In existing open Cursor windows, # this refresh is necessary in order to use this MCP) ./start-server.sh ``` ### 4. Configure Cursor MCP Add to your `~/.cursor/mcp.json`: ```json { "mcpServers": { "wikijs": { "command": "/path/to/wiki-js-mcp/start-server.sh" } } } ``` ## 🎯 Enhanced Cursor Integration ### Global Rules for Documentation-First Development Add these **Global Rules** in Cursor to automatically leverage documentation before coding: ``` Before writing any code, always: 1. Search existing documentation using wikijs_search_pages to understand current patterns and architecture 2. Check for related components, functions, or modules that might already exist 3. If documentation exists for similar functionality, follow the established patterns and naming conventions 4. If no documentation exists, create it using wikijs_create_page or wikijs_create_nested_page before implementing 5. Always update documentation when making changes using wikijs_sync_file_docs 6. For new features, use wikijs_create_repo_structure to plan the documentation hierarchy first ``` These rules ensure that your AI assistant will: - ✅ Check documentation before suggesting implementations - ✅ Follow existing patterns and conventions - ✅ Maintain up-to-date documentation automatically - ✅ Create structured documentation for new features - ✅ Avoid duplicating existing functionality ### Usage Tips for Cursor ``` # Before starting a new feature "Search the documentation for authentication patterns before implementing login" # When creating components "Create nested documentation under frontend-app/components before building the React component" # For API development "Check existing API documentation and create endpoint docs using the established structure" # During refactoring "Update all related documentation pages for the files I'm about to modify" ``` ## 🚀 Key Features ### 📁 **Hierarchical Documentation** - **Repository-level organization**: Create structured docs for multiple repos - **Nested page creation**: Automatic parent-child relationships - **Auto-organization**: Smart categorization by file type (components, API, utils, etc.) - **Enterprise scalability**: Handle hundreds of repos and thousands of files ### 🔧 **Core Functionality** - **GraphQL API integration**: Full Wiki.js v2+ compatibility - **File-to-page mapping**: Automatic linking between source code and documentation - **Code structure analysis**: Extract classes, functions, and dependencies - **Bulk operations**: Update multiple pages simultaneously - **Change tracking**: Monitor file modifications and sync docs ### 🐳 **Docker Setup** - **One-command deployment**: Complete Wiki.js setup with PostgreSQL - **Persistent storage**: Data survives container restarts - **Health checks**: Automatic service monitoring - **Production-ready**: Optimized for development and deployment ### 🔍 **Smart Features** - **Repository context detection**: Auto-detect Git repositories - **Content generation**: Auto-create documentation from code structure - **Search integration**: Full-text search across hierarchical content - **Health monitoring**: Connection status and error handling ## 📊 MCP Tools (21 Total) ### 🏗️ **Hierarchical Documentation Tools** 1. **`wikijs_create_repo_structure`** - Create complete repository documentation structure 2. **`wikijs_create_nested_page`** - Create pages with hierarchical paths 3. **`wikijs_get_page_children`** - Navigate parent-child page relationships 4. **`wikijs_create_documentation_hierarchy`** - Auto-organize project files into docs ### 📝 **Core Page Management** 5. **`wikijs_create_page`** - Create new pages (now with parent support) 6. **`wikijs_update_page`** - Update existing pages 7. **`wikijs_get_page`** - Retrieve page content and metadata 8. **`wikijs_search_pages`** - Search pages by text (fixed GraphQL issues) ### 🗑️ **Deletion & Cleanup Tools** 9. **`wikijs_delete_page`** - Delete specific pages by ID or path 10. **`wikijs_batch_delete_pages`** - Batch delete with pattern matching and safety checks 11. **`wikijs_delete_hierarchy`** - Delete entire page hierarchies with multiple modes 12. **`wikijs_cleanup_orphaned_mappings`** - Clean up orphaned file-to-page mappings ### 🗂️ **Organization & Structure** 13. **`wikijs_list_spaces`** - List top-level documentation spaces 14. **`wikijs_create_space`** - Create new documentation spaces 15. **`wikijs_manage_collections`** - Manage page collections ### 🔗 **File Integration** 16. **`wikijs_link_file_to_page`** - Link source files to documentation pages 17. **`wikijs_sync_file_docs`** - Sync code changes to documentation 18. **`wikijs_generate_file_overview`** - Auto-generate file documentation ### 🚀 **Bulk Operations** 19. **`wikijs_bulk_update_project_docs`** - Batch update multiple pages ### 🔧 **System Tools** 20. **`wikijs_connection_status`** - Check API connection health 21. **`wikijs_repository_context`** - Show repository mappings and context ## 🏢 Enterprise Use Cases ### Multi-Repository Documentation ``` Company Documentation/ ├── frontend-web-app/ │ ├── Overview/ │ ├── Components/ │ │ ├── Button/ │ │ ├── Modal/ │ │ └── Form/ │ ├── API Integration/ │ └── Deployment/ ├── backend-api/ │ ├── Overview/ │ ├── Controllers/ │ ├── Models/ │ └── Database Schema/ ├── mobile-app/ │ ├── Overview/ │ ├── Screens/ │ └── Native Components/ └── shared-libraries/ ├── UI Components/ ├── Utilities/ └── Type Definitions/ ``` ### Automatic Organization The system intelligently categorizes files: - **Components**: React/Vue components, UI elements - **API**: Endpoints, controllers, routes - **Utils**: Helper functions, utilities - **Services**: Business logic, external integrations - **Models**: Data models, types, schemas - **Tests**: Unit tests, integration tests - **Config**: Configuration files, environment setup ## 📚 Usage Examples ### Create Repository Documentation ```python # Create complete repository structure await wikijs_create_repo_structure( "My Frontend App", "Modern React application with TypeScript", ["Overview", "Components", "API", "Testing", "Deployment"] ) # Create nested component documentation await wikijs_create_nested_page( "Button Component", "# Button Component\n\nReusable button with multiple variants...", "my-frontend-app/components" ) # Auto-organize entire project await wikijs_create_documentation_hierarchy( "My Project", [ {"file_path": "src/components/Button.tsx"}, {"file_path": "src/api/users.ts"}, {"file_path": "src/utils/helpers.ts"} ], auto_organize=True ) ``` ### Documentation Management ```python # Clean up and manage documentation # Preview what would be deleted (safe) preview = await wikijs_delete_hierarchy( "old-project", delete_mode="include_root", confirm_deletion=False ) # Delete entire deprecated project await wikijs_delete_hierarchy( "old-project", delete_mode="include_root", confirm_deletion=True ) # Batch delete test pages await wikijs_batch_delete_pages( path_pattern="*test*", confirm_deletion=True ) # Clean up orphaned file mappings await wikijs_cleanup_orphaned_mappings() ``` ## ⚙️ Configuration ### Environment Variables ```bash # Docker Database Configuration POSTGRES_DB=wikijs POSTGRES_USER=wikijs POSTGRES_PASSWORD=your_secure_password_here # Wiki.js Connection WIKIJS_API_URL=http://localhost:3000 WIKIJS_API_KEY=your_jwt_token_here # Alternative: Username/Password WIKIJS_USERNAME=your_username WIKIJS_PASSWORD=your_password # Database & Logging WIKIJS_MCP_DB=./wikijs_mappings.db LOG_LEVEL=INFO LOG_FILE=wikijs_mcp.log # Repository Settings REPOSITORY_ROOT=./ DEFAULT_SPACE_NAME=Documentation ``` ### Authentication Options 1. **JWT Token** (Recommended): Use API key from Wiki.js admin panel 2. **Username/Password**: Traditional login credentials ## 🔧 Technical Architecture ### GraphQL Integration - **Full GraphQL API support**: Native Wiki.js v2+ compatibility - **Optimized queries**: Efficient data fetching and mutations - **Error handling**: Comprehensive GraphQL error management - **Retry logic**: Automatic retry with exponential backoff ### Database Layer - **SQLite storage**: Local file-to-page mappings - **Repository context**: Git repository detection and tracking - **Change tracking**: File hash monitoring for sync detection - **Relationship management**: Parent-child page hierarchies ### Code Analysis - **AST parsing**: Extract Python classes, functions, imports - **Structure detection**: Identify code patterns and organization - **Documentation generation**: Auto-create comprehensive overviews - **Dependency mapping**: Track imports and relationships ## 📈 Performance & Scalability - **Async operations**: Non-blocking I/O for all API calls - **Bulk processing**: Efficient batch operations for large projects - **Caching**: Smart caching of page relationships and metadata - **Connection pooling**: Optimized HTTP client management ## 🛠️ Development ### Project Structure ``` wiki-js-mcp/ ├── src/ │ └── wiki_mcp_server.py # Main MCP server implementation ├── config/ │ └── example.env # Configuration template ├── docker.yml # Docker Compose setup ├── pyproject.toml # Poetry dependencies ├── requirements.txt # Pip dependencies ├── setup.sh # Environment setup script ├── start-server.sh # MCP server launcher ├── test-server.sh # Interactive testing script ├── HIERARCHICAL_FEATURES.md # Hierarchical documentation guide ├── DELETION_TOOLS.md # Deletion and cleanup guide ├── LICENSE # MIT License └── README.md # This file ``` ### Dependencies - **FastMCP**: Official Python MCP SDK - **httpx**: Async HTTP client for GraphQL - **SQLAlchemy**: Database ORM for mappings - **Pydantic**: Configuration and validation - **tenacity**: Retry logic for reliability ## 🔍 Troubleshooting ### Docker Issues ```bash # Check containers docker-compose -f docker.yml ps # View logs docker-compose -f docker.yml logs wiki docker-compose -f docker.yml logs postgres # Reset everything docker-compose -f docker.yml down -v docker-compose -f docker.yml up -d ``` ### Connection Issues ```bash # Check Wiki.js is running curl http://localhost:3000/graphql # Verify authentication ./test-server.sh # Debug mode export LOG_LEVEL=DEBUG ./start-server.sh ``` ### Common Problems - **Port conflicts**: Change port 3000 in `docker.yml` if needed - **Database issues**: Remove `postgres_data/` and restart - **API permissions**: Ensure API key has admin privileges - **Python dependencies**: Run `./setup.sh` to reinstall ## 📚 Documentation - **[Hierarchical Features Guide](HIERARCHICAL_FEATURES.md)** - Complete guide to enterprise documentation - **[Deletion Tools Guide](DELETION_TOOLS.md)** - Comprehensive deletion and cleanup tools - **[Configuration Examples](config/example.env)** - Environment setup ## 🤝 Contributing 1. Fork the repository 2. Create feature branch (`git checkout -b feature/amazing-feature`) 3. Commit changes (`git commit -m 'Add amazing feature'`) 4. Push to branch (`git push origin feature/amazing-feature`) 5. Open Pull Request ## 📄 License This project is licensed under the MIT License - see the LICENSE file for details. ## 🙏 Acknowledgments - **Wiki.js Team**: For the excellent documentation platform - **MCP Protocol**: For the standardized AI integration framework - **FastMCP**: For the Python MCP SDK --- **Ready to scale your documentation?** 🚀 Start with `wikijs_create_repo_structure` and build enterprise-grade documentation hierarchies! Use the Cursor global rules to ensure documentation-first development! 📚✨ ``` -------------------------------------------------------------------------------- /config-mcp.json: -------------------------------------------------------------------------------- ```json { "mcpServers": { "wikijs": { "command": "/absolute/path/to/wiki-js-mcp/start-server.sh" } } } ``` -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- ``` # Wiki.js MCP Server Dependencies # Python 3.12+ required # Core MCP Framework fastmcp>=0.1.0 # HTTP Client httpx>=0.27.0 # Data Validation and Settings pydantic>=2.0.0 pydantic-settings>=2.0.0 # Text Processing python-slugify>=8.0.0 markdown>=3.5.0 beautifulsoup4>=4.12.0 # Environment variable loading python-dotenv>=1.0.0 # Database sqlalchemy>=2.0.0 aiosqlite>=0.19.0 # Retry logic tenacity>=8.0.0 # Development Dependencies (optional) pytest>=7.4.0 pytest-asyncio>=0.21.0 black>=23.0.0 isort>=5.12.0 mypy>=1.5.0 ``` -------------------------------------------------------------------------------- /config/example.env: -------------------------------------------------------------------------------- ``` # Wiki.js MCP Server Configuration # Copy this file to .env and update with your actual values # Docker Database Configuration (for docker.yml) POSTGRES_DB=wikijs POSTGRES_USER=wikijs POSTGRES_PASSWORD=your_secure_password_here # Wiki.js Instance Configuration WIKIJS_API_URL=http://localhost:3000 WIKIJS_TOKEN=eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.your_token_here # Alternative: Username/Password Authentication # [email protected] # WIKIJS_PASSWORD=your_password # Local Database Configuration WIKIJS_MCP_DB=./wikijs_mappings.db # Optional: Logging Configuration LOG_LEVEL=INFO LOG_FILE=wikijs_mcp.log # Optional: Repository Context REPOSITORY_ROOT=./ DEFAULT_SPACE_NAME=Documentation # Example values (replace with your actual credentials): # POSTGRES_PASSWORD=MySecurePassword123! # WIKIJS_API_URL=http://localhost:3000 # WIKIJS_TOKEN=your_actual_jwt_token_here ``` -------------------------------------------------------------------------------- /docker.yml: -------------------------------------------------------------------------------- ```yaml services: db: image: postgres:15-alpine container_name: wikijs_db restart: unless-stopped environment: POSTGRES_DB: ${POSTGRES_DB:-wikijs} POSTGRES_USER: ${POSTGRES_USER:-wikijs} POSTGRES_PASSWORD: ${POSTGRES_PASSWORD} volumes: - db_data:/var/lib/postgresql/data healthcheck: test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-wikijs}"] interval: 30s timeout: 10s retries: 3 wiki: image: ghcr.io/requarks/wiki:2 container_name: wikijs_app depends_on: db: condition: service_healthy ports: - "3000:3000" restart: unless-stopped environment: DB_TYPE: postgres DB_HOST: db DB_PORT: 5432 DB_USER: ${POSTGRES_USER:-wikijs} DB_PASS: ${POSTGRES_PASSWORD} DB_NAME: ${POSTGRES_DB:-wikijs} PORT: 3000 healthcheck: test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:3000/healthz"] interval: 30s timeout: 10s retries: 3 start_period: 40s volumes: db_data: ``` -------------------------------------------------------------------------------- /pyproject.toml: -------------------------------------------------------------------------------- ```toml [tool.poetry] name = "wiki-js-mcp" version = "1.0.0" description = "Model Context Protocol (MCP) server for Wiki.js integration with hierarchical documentation support" authors = ["Sahil Pethe <[email protected]>"] license = "MIT" readme = "README.md" packages = [{include = "wiki_mcp_server.py", from = "src"}] [tool.poetry.dependencies] python = "^3.12" fastmcp = "^0.1.0" httpx = "^0.27.0" pydantic = "^2.0" pydantic-settings = "^2.0" python-slugify = "^8.0" markdown = "^3.5" beautifulsoup4 = "^4.12" python-dotenv = "^1.0.0" sqlalchemy = "^2.0" tenacity = "^8.0" aiosqlite = "^0.19.0" [tool.poetry.group.dev.dependencies] pytest = "^7.4.0" pytest-asyncio = "^0.21.0" black = "^23.0.0" isort = "^5.12.0" mypy = "^1.5.0" [build-system] requires = ["poetry-core"] build-backend = "poetry.core.masonry.api" [tool.poetry.scripts] wikijs-mcp = "wiki_mcp_server:main" [tool.black] line-length = 88 target-version = ['py312'] [tool.isort] profile = "black" line_length = 88 [tool.mypy] python_version = "3.12" warn_return_any = true warn_unused_configs = true disallow_untyped_defs = true ``` -------------------------------------------------------------------------------- /start-server.sh: -------------------------------------------------------------------------------- ```bash #!/bin/bash # Wiki.js MCP Server Start Script # This script activates the virtual environment and starts the MCP server set -e # Exit on any error # Get the directory where this script is located SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)" cd "$SCRIPT_DIR" # Check if virtual environment exists if [ ! -d "venv" ]; then echo "❌ Error: Virtual environment not found. Please run ./setup.sh first." >&2 exit 1 fi # Check if .env file exists if [ ! -f ".env" ]; then echo "❌ Error: .env file not found. Please copy config/example.env to .env and configure it." >&2 exit 1 fi # Activate virtual environment source venv/bin/activate # Check if the main server file exists if [ ! -f "src/wiki_mcp_server.py" ]; then echo "❌ Error: Server file src/wiki_mcp_server.py not found." >&2 exit 1 fi # Load environment variables for validation source .env # Validate required environment variables if [ -z "$WIKIJS_API_URL" ]; then echo "❌ Error: WIKIJS_API_URL not set in .env file" >&2 exit 1 fi if [ -z "$WIKIJS_TOKEN" ] && [ -z "$WIKIJS_USERNAME" ]; then echo "❌ Error: Either WIKIJS_TOKEN or WIKIJS_USERNAME must be set in .env file" >&2 exit 1 fi # Create logs directory if it doesn't exist mkdir -p logs # Start the MCP server (this will handle stdin/stdout communication with Cursor) exec python src/wiki_mcp_server.py ``` -------------------------------------------------------------------------------- /test-server.sh: -------------------------------------------------------------------------------- ```bash #!/bin/bash # Wiki.js MCP Server Test Script # This script is for interactive testing and debugging set -e # Exit on any error echo "🚀 Testing Wiki.js MCP Server..." # Get the directory where this script is located SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)" cd "$SCRIPT_DIR" # Check if virtual environment exists if [ ! -d "venv" ]; then echo "❌ Error: Virtual environment not found. Please run ./setup.sh first." exit 1 fi # Check if .env file exists if [ ! -f ".env" ]; then echo "❌ Error: .env file not found. Please copy config/example.env to .env and configure it." exit 1 fi # Activate virtual environment echo "🔧 Activating virtual environment..." source venv/bin/activate # Check if the main server file exists if [ ! -f "src/wiki_mcp_server.py" ]; then echo "❌ Error: Server file src/wiki_mcp_server.py not found." exit 1 fi # Load environment variables for validation echo "⚙️ Loading configuration..." source .env # Validate required environment variables if [ -z "$WIKIJS_API_URL" ]; then echo "❌ Error: WIKIJS_API_URL not set in .env file" exit 1 fi if [ -z "$WIKIJS_TOKEN" ] && [ -z "$WIKIJS_USERNAME" ]; then echo "❌ Error: Either WIKIJS_TOKEN or WIKIJS_USERNAME must be set in .env file" exit 1 fi echo "✅ Configuration validated" # Create logs directory if it doesn't exist mkdir -p logs # Display server information echo "" echo "📊 Server Configuration:" echo " API URL: $WIKIJS_API_URL" echo " Database: ${WIKIJS_MCP_DB:-./wikijs_mappings.db}" echo " Log Level: ${LOG_LEVEL:-INFO}" echo " Log File: ${LOG_FILE:-wikijs_mcp.log}" echo "" # Function to handle cleanup on exit cleanup() { echo "" echo "🛑 Shutting down Wiki.js MCP Server..." exit 0 } # Set up signal handlers trap cleanup SIGINT SIGTERM # Start the server echo "🌟 Starting MCP server for testing..." echo " Press Ctrl+C to stop the server" echo "" # Run the server with error handling if python src/wiki_mcp_server.py; then echo "✅ Server started successfully" else echo "❌ Server failed to start. Check the logs for details:" echo " Log file: ${LOG_FILE:-wikijs_mcp.log}" echo "" echo "💡 Troubleshooting tips:" echo " 1. Verify Wiki.js is running and accessible" echo " 2. Check your API token is valid" echo " 3. Ensure all dependencies are installed (run ./setup.sh)" echo " 4. Check the log file for detailed error messages" exit 1 fi ``` -------------------------------------------------------------------------------- /setup.sh: -------------------------------------------------------------------------------- ```bash #!/bin/bash # Wiki.js MCP Server Setup Script # This script sets up the Python virtual environment and installs dependencies set -e # Exit on any error echo "🚀 Setting up Wiki.js MCP Server..." # Get the directory where this script is located SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)" cd "$SCRIPT_DIR" # Check if Python 3.12+ is available echo "📋 Checking Python version..." if command -v python3 &> /dev/null; then PYTHON_CMD="python3" elif command -v python &> /dev/null; then PYTHON_CMD="python" else echo "❌ Error: Python is not installed or not in PATH" exit 1 fi # Check Python version PYTHON_VERSION=$($PYTHON_CMD --version 2>&1 | cut -d' ' -f2) PYTHON_MAJOR=$(echo $PYTHON_VERSION | cut -d'.' -f1) PYTHON_MINOR=$(echo $PYTHON_VERSION | cut -d'.' -f2) if [ "$PYTHON_MAJOR" -lt 3 ] || ([ "$PYTHON_MAJOR" -eq 3 ] && [ "$PYTHON_MINOR" -lt 12 ]); then echo "❌ Error: Python 3.12+ is required. Found: $PYTHON_VERSION" exit 1 fi echo "✅ Python $PYTHON_VERSION found" # Create virtual environment if it doesn't exist if [ ! -d "venv" ]; then echo "📦 Creating virtual environment..." $PYTHON_CMD -m venv venv echo "✅ Virtual environment created" else echo "📦 Virtual environment already exists" fi # Activate virtual environment echo "🔧 Activating virtual environment..." source venv/bin/activate # Upgrade pip echo "⬆️ Upgrading pip..." pip install --upgrade pip # Install dependencies echo "📚 Installing dependencies..." if [ -f "pyproject.toml" ] && command -v poetry &> /dev/null; then echo "📖 Using Poetry for dependency management..." poetry install elif [ -f "requirements.txt" ]; then echo "📖 Using pip for dependency management..." pip install -r requirements.txt else echo "❌ Error: No pyproject.toml or requirements.txt found" exit 1 fi # Create .env file from example if it doesn't exist if [ ! -f ".env" ] && [ -f "config/example.env" ]; then echo "⚙️ Creating .env file from example..." cp config/example.env .env echo "✅ .env file created. Please edit it with your Wiki.js credentials." else echo "⚙️ .env file already exists" fi # Create necessary directories echo "📁 Creating necessary directories..." mkdir -p logs # Set executable permissions for scripts echo "🔐 Setting executable permissions..." chmod +x setup.sh chmod +x start-server.sh echo "" echo "🎉 Setup completed successfully!" echo "" echo "📝 Next steps:" echo "1. Edit .env file with your Wiki.js credentials:" echo " - Set WIKIJS_API_URL (e.g., http://localhost:3000)" echo " - Set WIKIJS_TOKEN (your JWT token from Wiki.js)" echo "" echo "2. Test the server:" echo " ./start-server.sh" echo "" echo "3. Configure Cursor MCP:" echo " - Copy config-mcp.json settings to your Cursor MCP configuration" echo " - Update the absolute paths in the configuration" echo "" echo "📖 For more information, see README.md" ``` -------------------------------------------------------------------------------- /HIERARCHICAL_FEATURES.md: -------------------------------------------------------------------------------- ```markdown # Hierarchical Documentation Features ## Overview The Wiki.js MCP server now supports **hierarchical documentation structures** perfect for enterprise-scale repository documentation. This allows you to create organized, nested documentation that scales from individual files to entire company codebases. ## New MCP Tools ### 1. `wikijs_create_repo_structure` Creates a complete repository documentation structure with nested pages. **Usage:** ```python await wikijs_create_repo_structure( repo_name="Frontend App", description="A modern React frontend application", sections=["Overview", "Components", "API", "Deployment", "Testing"] ) ``` **Creates:** ``` Frontend App/ ├── Overview/ ├── Components/ ├── API/ ├── Deployment/ └── Testing/ ``` ### 2. `wikijs_create_nested_page` Creates pages with hierarchical paths, automatically creating parent pages if needed. **Usage:** ```python await wikijs_create_nested_page( title="Button Component", content="# Button Component\n\nA reusable button...", parent_path="frontend-app/components", create_parent_if_missing=True ) ``` **Creates:** `frontend-app/components/button-component` ### 3. `wikijs_get_page_children` Retrieves all direct child pages of a parent page for navigation. **Usage:** ```python await wikijs_get_page_children(page_path="frontend-app/components") ``` **Returns:** ```json { "parent": {"pageId": 18, "title": "Components", "path": "frontend-app/components"}, "children": [ {"pageId": 22, "title": "Button Component", "path": "frontend-app/components/button-component"}, {"pageId": 23, "title": "Modal Component", "path": "frontend-app/components/modal-component"} ], "total_children": 2 } ``` ### 4. `wikijs_create_documentation_hierarchy` Creates a complete documentation hierarchy for a project based on file structure with auto-organization. **Usage:** ```python await wikijs_create_documentation_hierarchy( project_name="My App", file_mappings=[ {"file_path": "src/components/Button.tsx", "doc_path": "components/button"}, {"file_path": "src/api/users.ts", "doc_path": "api/users"}, {"file_path": "src/utils/helpers.ts", "doc_path": "utils/helpers"} ], auto_organize=True ) ``` **Auto-organizes into sections:** - **Components**: Files with "component" or "/components/" in path - **API**: Files with "api", "/api/", or "endpoint" in path - **Utils**: Files with "util", "/utils/", or "/helpers/" in path - **Services**: Files with "service" or "/services/" in path - **Models**: Files with "model", "/models/", or "/types/" in path - **Tests**: Files with "test", "/tests/", or ".test." in path - **Config**: Files with "config", "/config/", or ".config." in path ## Enhanced Existing Tools ### `wikijs_create_page` (Enhanced) Now supports `parent_id` parameter for creating hierarchical pages: ```python await wikijs_create_page( title="API Endpoints", content="# API Documentation...", parent_id="16" # Creates as child of page 16 ) ``` ### `wikijs_search_pages` (Fixed) Fixed GraphQL query issues - now works properly: ```python await wikijs_search_pages("Button") # Returns: {"results": [...], "total": 1} ``` ## Enterprise Use Cases ### 1. Company-wide Repository Documentation ``` Company Docs/ ├── frontend-web-app/ │ ├── Overview/ │ ├── Components/ │ │ ├── Button/ │ │ ├── Modal/ │ │ └── Form/ │ ├── API/ │ └── Deployment/ ├── backend-api/ │ ├── Overview/ │ ├── Controllers/ │ ├── Models/ │ └── Database/ ├── mobile-app/ │ ├── Overview/ │ ├── Screens/ │ └── Components/ └── shared-libraries/ ├── UI Components/ ├── Utilities/ └── Types/ ``` ### 2. Automatic File-to-Documentation Mapping The system automatically: - Creates hierarchical page structures - Links source files to documentation pages - Organizes files by type (components, API, utils, etc.) - Maintains parent-child relationships - Enables easy navigation between related docs ### 3. Scalable Documentation Architecture - **Root Level**: Repository names only - **Section Level**: Logical groupings (Components, API, etc.) - **Page Level**: Individual files/features - **Sub-page Level**: Detailed documentation sections ## Benefits ✅ **Scalable**: Handles hundreds of repositories and thousands of files ✅ **Organized**: Auto-categorizes files into logical sections ✅ **Navigable**: Parent-child relationships enable easy browsing ✅ **Searchable**: Full-text search across all hierarchical content ✅ **Maintainable**: File-to-page mappings keep docs in sync with code ✅ **Enterprise-ready**: Perfect for large organizations with many repos ## Example: Complete Repository Setup ```python # 1. Create repository structure repo_result = await wikijs_create_repo_structure( "E-commerce Platform", "Full-stack e-commerce application", ["Overview", "Frontend", "Backend", "API", "Database", "Deployment"] ) # 2. Create component documentation button_result = await wikijs_create_nested_page( "Button Component", "# Button Component\n\nReusable button with variants...", "e-commerce-platform/frontend" ) # 3. Get navigation structure children = await wikijs_get_page_children(page_path="e-commerce-platform/frontend") # 4. Search across all docs search_results = await wikijs_search_pages("authentication") ``` This creates a professional, enterprise-grade documentation structure that scales with your organization's growth! ``` -------------------------------------------------------------------------------- /DELETION_TOOLS.md: -------------------------------------------------------------------------------- ```markdown # Deletion Tools Documentation ## Overview The Wiki.js MCP server includes **comprehensive deletion tools** for managing pages and hierarchies. These tools provide safe, flexible options for cleaning up documentation, reorganizing content, and maintaining your Wiki.js instance. ## 🛡️ Safety Features All deletion tools include **built-in safety mechanisms**: - **Confirmation required**: `confirm_deletion=True` must be explicitly set - **Preview mode**: See what will be deleted before confirming - **Detailed reporting**: Know exactly what was deleted or failed - **File mapping cleanup**: Automatically clean up orphaned database entries ## 🗑️ Deletion Tools (4 Total) ### 1. `wikijs_delete_page` Delete a specific page from Wiki.js. **Usage:** ```python # Delete by page ID await wikijs_delete_page(page_id=123) # Delete by page path await wikijs_delete_page(page_path="frontend-app/components/button") # Keep file mappings (don't clean up database) await wikijs_delete_page(page_id=123, remove_file_mapping=False) ``` **Returns:** ```json { "deleted": true, "pageId": 123, "title": "Button Component", "path": "frontend-app/components/button", "status": "deleted", "file_mapping_removed": true } ``` ### 2. `wikijs_batch_delete_pages` Delete multiple pages with flexible selection criteria. **Selection Methods:** **By Page IDs:** ```python await wikijs_batch_delete_pages( page_ids=[123, 124, 125], confirm_deletion=True ) ``` **By Page Paths:** ```python await wikijs_batch_delete_pages( page_paths=["frontend-app/old-component", "backend-api/deprecated"], confirm_deletion=True ) ``` **By Pattern Matching:** ```python # Delete all pages under frontend-app await wikijs_batch_delete_pages( path_pattern="frontend-app/*", confirm_deletion=True ) # Delete all test pages await wikijs_batch_delete_pages( path_pattern="*test*", confirm_deletion=True ) ``` **Safety Check (Preview Mode):** ```python # Preview what would be deleted (safe - won't actually delete) result = await wikijs_batch_delete_pages( path_pattern="frontend-app/*", confirm_deletion=False # Safety check ) # Returns: {"error": "confirm_deletion must be True...", "safety_note": "..."} ``` **Returns:** ```json { "total_found": 5, "deleted_count": 4, "failed_count": 1, "deleted_pages": [ {"pageId": 123, "title": "Button", "path": "frontend-app/button"}, {"pageId": 124, "title": "Modal", "path": "frontend-app/modal"} ], "failed_deletions": [ {"pageId": 125, "title": "Protected", "path": "frontend-app/protected", "error": "Access denied"} ], "status": "completed" } ``` ### 3. `wikijs_delete_hierarchy` Delete entire page hierarchies (folder structures) with precise control. **Deletion Modes:** **Children Only** (default): ```python # Delete all child pages but keep the root page await wikijs_delete_hierarchy( root_path="frontend-app", delete_mode="children_only", confirm_deletion=True ) ``` **Include Root**: ```python # Delete the entire hierarchy including the root page await wikijs_delete_hierarchy( root_path="frontend-app", delete_mode="include_root", confirm_deletion=True ) ``` **Root Only**: ```python # Delete only the root page, leave children orphaned await wikijs_delete_hierarchy( root_path="frontend-app", delete_mode="root_only", confirm_deletion=True ) ``` **Preview Mode:** ```python # Preview hierarchy deletion (safe) result = await wikijs_delete_hierarchy( root_path="frontend-app", delete_mode="include_root", confirm_deletion=False ) # Returns preview with safety warnings ``` **Returns:** ```json { "root_path": "frontend-app", "delete_mode": "children_only", "total_found": 8, "deleted_count": 7, "failed_count": 1, "deleted_pages": [ {"pageId": 124, "title": "Button", "path": "frontend-app/components/button", "depth": 2}, {"pageId": 125, "title": "Modal", "path": "frontend-app/components/modal", "depth": 2} ], "failed_deletions": [], "hierarchy_summary": { "root_page_found": true, "child_pages_found": 8, "max_depth": 3 }, "status": "completed" } ``` ### 4. `wikijs_cleanup_orphaned_mappings` Clean up file-to-page mappings for pages that no longer exist. **Usage:** ```python # Clean up orphaned mappings await wikijs_cleanup_orphaned_mappings() ``` **Returns:** ```json { "total_mappings": 25, "valid_mappings": 22, "orphaned_mappings": 3, "cleaned_count": 3, "orphaned_details": [ {"file_path": "src/deleted-component.tsx", "page_id": 999, "last_updated": "2024-01-15T10:30:00Z"}, {"file_path": "src/old-util.ts", "page_id": 998, "error": "Page not found"} ], "status": "completed" } ``` ## 🎯 Common Use Cases ### 1. Clean Up Test Documentation ```python # Remove all test pages await wikijs_batch_delete_pages( path_pattern="*test*", confirm_deletion=True ) ``` ### 2. Remove Deprecated Project ```python # Delete entire project hierarchy await wikijs_delete_hierarchy( root_path="old-mobile-app", delete_mode="include_root", confirm_deletion=True ) ``` ### 3. Reorganize Documentation Structure ```python # Step 1: Preview what will be affected preview = await wikijs_delete_hierarchy( root_path="frontend-app/old-structure", delete_mode="children_only", confirm_deletion=False ) # Step 2: Delete old structure await wikijs_delete_hierarchy( root_path="frontend-app/old-structure", delete_mode="children_only", confirm_deletion=True ) # Step 3: Clean up orphaned mappings await wikijs_cleanup_orphaned_mappings() ``` ### 4. Bulk Cleanup by Pattern ```python # Remove all draft pages await wikijs_batch_delete_pages( path_pattern="*draft*", confirm_deletion=True ) # Remove all pages from a specific author/date await wikijs_batch_delete_pages( path_pattern="temp-*", confirm_deletion=True ) ``` ### 5. Maintenance Operations ```python # Regular cleanup of orphaned mappings cleanup_result = await wikijs_cleanup_orphaned_mappings() print(f"Cleaned up {cleanup_result['cleaned_count']} orphaned mappings") # Remove specific outdated pages await wikijs_batch_delete_pages( page_paths=[ "old-api/v1/endpoints", "deprecated/legacy-components", "archive/old-docs" ], confirm_deletion=True ) ``` ## 🔒 Safety Best Practices ### 1. Always Preview First ```python # GOOD: Preview before deleting preview = await wikijs_delete_hierarchy("important-docs", confirm_deletion=False) print(f"Would delete {preview.get('total_found', 0)} pages") # Then confirm if safe if input("Proceed? (y/N): ").lower() == 'y': await wikijs_delete_hierarchy("important-docs", confirm_deletion=True) ``` ### 2. Use Specific Patterns ```python # GOOD: Specific pattern await wikijs_batch_delete_pages(path_pattern="test-project/temp/*", confirm_deletion=True) # DANGEROUS: Too broad # await wikijs_batch_delete_pages(path_pattern="*", confirm_deletion=True) # DON'T DO THIS ``` ### 3. Check Results ```python result = await wikijs_batch_delete_pages( path_pattern="old-docs/*", confirm_deletion=True ) print(f"Deleted: {result['deleted_count']}") print(f"Failed: {result['failed_count']}") # Check for failures if result['failed_deletions']: print("Failed deletions:") for failure in result['failed_deletions']: print(f" - {failure['title']}: {failure['error']}") ``` ### 4. Regular Maintenance ```python # Weekly cleanup routine async def weekly_cleanup(): # Clean up orphaned mappings cleanup = await wikijs_cleanup_orphaned_mappings() print(f"Cleaned {cleanup['cleaned_count']} orphaned mappings") # Remove temp/test pages temp_cleanup = await wikijs_batch_delete_pages( path_pattern="temp-*", confirm_deletion=True ) print(f"Removed {temp_cleanup['deleted_count']} temp pages") ``` ## ⚠️ Important Notes ### Deletion Order - **Hierarchy deletion** processes pages from deepest to shallowest to avoid dependency issues - **Child pages are deleted before parent pages** automatically - **Failed deletions** are reported with specific error messages ### File Mappings - **Automatic cleanup**: File-to-page mappings are removed by default when pages are deleted - **Manual control**: Set `remove_file_mappings=False` to preserve mappings - **Orphaned cleanup**: Use `wikijs_cleanup_orphaned_mappings()` for maintenance ### Pattern Matching - **Supports wildcards**: Use `*` for pattern matching (e.g., `"frontend-*"`, `"*test*"`) - **Case sensitive**: Patterns match exactly as written - **Path-based**: Patterns match against the full page path ### Error Handling - **Graceful failures**: Individual page deletion failures don't stop batch operations - **Detailed reporting**: All failures are logged with specific error messages - **Partial success**: Operations can succeed partially with detailed results ## 🧪 Testing All deletion tools have been thoroughly tested: - ✅ Single page deletion - ✅ Batch deletion with safety checks - ✅ Pattern-based deletion - ✅ Hierarchy deletion modes - ✅ Orphaned mappings cleanup - ✅ File mapping integration - ✅ Error handling and reporting The tools are **production-ready** and safe for enterprise use with proper confirmation procedures. ``` -------------------------------------------------------------------------------- /src/wiki_mcp_server.py: -------------------------------------------------------------------------------- ```python #!/usr/bin/env python3 """Wiki.js MCP server using FastMCP - GraphQL version.""" import os from dotenv import load_dotenv load_dotenv() import sys import datetime import json import hashlib import logging import ast import re from pathlib import Path from typing import Optional, List, Dict, Any, Union from dataclasses import dataclass import httpx from fastmcp import FastMCP from slugify import slugify import markdown from sqlalchemy import create_engine, Column, Integer, String, DateTime, Text from sqlalchemy.orm import declarative_base, sessionmaker from sqlalchemy.exc import SQLAlchemyError from tenacity import retry, stop_after_attempt, wait_exponential from pydantic import Field from pydantic_settings import BaseSettings # Create FastMCP server mcp = FastMCP("Wiki.js Integration") # Configuration class Settings(BaseSettings): WIKIJS_API_URL: str = Field(default="http://localhost:3000") WIKIJS_TOKEN: Optional[str] = Field(default=None) WIKIJS_API_KEY: Optional[str] = Field(default=None) # Alternative name for token WIKIJS_USERNAME: Optional[str] = Field(default=None) WIKIJS_PASSWORD: Optional[str] = Field(default=None) WIKIJS_MCP_DB: str = Field(default="./wikijs_mappings.db") LOG_LEVEL: str = Field(default="INFO") LOG_FILE: str = Field(default="wikijs_mcp.log") REPOSITORY_ROOT: str = Field(default="./") DEFAULT_SPACE_NAME: str = Field(default="Documentation") class Config: env_file = ".env" extra = "ignore" # Allow extra fields without validation errors @property def token(self) -> Optional[str]: """Get the token from either WIKIJS_TOKEN or WIKIJS_API_KEY.""" return self.WIKIJS_TOKEN or self.WIKIJS_API_KEY settings = Settings() # Setup logging logging.basicConfig( level=getattr(logging, settings.LOG_LEVEL.upper()), format='%(asctime)s - %(name)s - %(levelname)s - %(message)s', handlers=[ logging.FileHandler(settings.LOG_FILE), logging.StreamHandler() ] ) logger = logging.getLogger(__name__) # Database models Base = declarative_base() class FileMapping(Base): __tablename__ = 'file_mappings' id = Column(Integer, primary_key=True) file_path = Column(String, unique=True, nullable=False) page_id = Column(Integer, nullable=False) relationship_type = Column(String, nullable=False) last_updated = Column(DateTime, default=datetime.datetime.utcnow) file_hash = Column(String) repository_root = Column(String, default='') space_name = Column(String, default='') class RepositoryContext(Base): __tablename__ = 'repository_contexts' id = Column(Integer, primary_key=True) root_path = Column(String, unique=True, nullable=False) space_name = Column(String, nullable=False) space_id = Column(Integer) last_updated = Column(DateTime, default=datetime.datetime.utcnow) # Database setup engine = create_engine(f"sqlite:///{settings.WIKIJS_MCP_DB}") Base.metadata.create_all(engine) SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine) class WikiJSClient: """Wiki.js GraphQL API client for handling requests.""" def __init__(self): self.base_url = settings.WIKIJS_API_URL.rstrip('/') self.client = httpx.AsyncClient(timeout=30.0) self.authenticated = False async def authenticate(self) -> bool: """Set up authentication headers for GraphQL requests.""" if settings.token: self.client.headers.update({ "Authorization": f"Bearer {settings.token}", "Content-Type": "application/json" }) self.authenticated = True return True elif settings.WIKIJS_USERNAME and settings.WIKIJS_PASSWORD: # For username/password, we need to login via GraphQL mutation try: login_mutation = """ mutation($username: String!, $password: String!) { authentication { login(username: $username, password: $password) { succeeded jwt message } } } """ response = await self.graphql_request(login_mutation, { "username": settings.WIKIJS_USERNAME, "password": settings.WIKIJS_PASSWORD }) if response.get("data", {}).get("authentication", {}).get("login", {}).get("succeeded"): jwt_token = response["data"]["authentication"]["login"]["jwt"] self.client.headers.update({ "Authorization": f"Bearer {jwt_token}", "Content-Type": "application/json" }) self.authenticated = True return True else: logger.error(f"Login failed: {response}") return False except Exception as e: logger.error(f"Authentication failed: {e}") return False return False @retry(stop=stop_after_attempt(3), wait=wait_exponential(multiplier=1, min=4, max=10)) async def graphql_request(self, query: str, variables: Dict = None) -> Dict: """Make GraphQL request to Wiki.js.""" url = f"{self.base_url}/graphql" payload = {"query": query} if variables: payload["variables"] = variables try: response = await self.client.post(url, json=payload) response.raise_for_status() data = response.json() # Check for GraphQL errors if "errors" in data: error_msg = "; ".join([err.get("message", str(err)) for err in data["errors"]]) raise Exception(f"GraphQL error: {error_msg}") return data except httpx.HTTPStatusError as e: logger.error(f"Wiki.js GraphQL HTTP error {e.response.status_code}: {e.response.text}") raise Exception(f"Wiki.js GraphQL HTTP error {e.response.status_code}: {e.response.text}") except httpx.RequestError as e: logger.error(f"Wiki.js connection error: {str(e)}") raise Exception(f"Wiki.js connection error: {str(e)}") # Initialize client wikijs = WikiJSClient() def get_db(): """Get database session.""" db = SessionLocal() try: return db finally: db.close() def get_file_hash(file_path: str) -> str: """Calculate SHA256 hash of file content.""" try: with open(file_path, 'rb') as f: return hashlib.sha256(f.read()).hexdigest() except FileNotFoundError: return "" def markdown_to_html(content: str) -> str: """Convert markdown content to HTML.""" md = markdown.Markdown(extensions=['codehilite', 'fenced_code', 'tables']) return md.convert(content) def find_repository_root(start_path: str = None) -> Optional[str]: """Find the repository root by looking for .git directory or .wikijs_mcp file.""" if start_path is None: start_path = os.getcwd() current_path = Path(start_path).resolve() # Walk up the directory tree for path in [current_path] + list(current_path.parents): # Check for .git directory (Git repository) if (path / '.git').exists(): return str(path) # Check for .wikijs_mcp file (explicit Wiki.js repository marker) if (path / '.wikijs_mcp').exists(): return str(path) # If no repository markers found, use current directory return str(current_path) def extract_code_structure(file_path: str) -> Dict[str, Any]: """Extract classes and functions from Python files using AST.""" try: with open(file_path, 'r', encoding='utf-8') as f: content = f.read() tree = ast.parse(content) structure = { 'classes': [], 'functions': [], 'imports': [] } for node in ast.walk(tree): if isinstance(node, ast.ClassDef): structure['classes'].append({ 'name': node.name, 'line': node.lineno, 'docstring': ast.get_docstring(node) }) elif isinstance(node, ast.FunctionDef): structure['functions'].append({ 'name': node.name, 'line': node.lineno, 'docstring': ast.get_docstring(node) }) elif isinstance(node, (ast.Import, ast.ImportFrom)): if isinstance(node, ast.Import): for alias in node.names: structure['imports'].append(alias.name) else: module = node.module or '' for alias in node.names: structure['imports'].append(f"{module}.{alias.name}") return structure except Exception as e: logger.error(f"Error parsing {file_path}: {e}") return {'classes': [], 'functions': [], 'imports': []} # MCP Tools Implementation @mcp.tool() async def wikijs_create_page(title: str, content: str, space_id: str = "", parent_id: str = "") -> str: """ Create a new page in Wiki.js with support for hierarchical organization. Args: title: Page title content: Page content (markdown or HTML) space_id: Space ID (optional, uses default if not provided) parent_id: Parent page ID for hierarchical organization (optional) Returns: JSON string with page details: {'pageId': int, 'url': str} """ try: await wikijs.authenticate() # Generate path - if parent_id provided, create nested path if parent_id: # Get parent page to build nested path parent_query = """ query($id: Int!) { pages { single(id: $id) { path title } } } """ parent_response = await wikijs.graphql_request(parent_query, {"id": int(parent_id)}) parent_data = parent_response.get("data", {}).get("pages", {}).get("single") if parent_data: parent_path = parent_data["path"] # Create nested path: parent-path/child-title path = f"{parent_path}/{slugify(title)}" else: path = slugify(title) else: path = slugify(title) # GraphQL mutation to create a page mutation = """ mutation($content: String!, $description: String!, $editor: String!, $isPublished: Boolean!, $isPrivate: Boolean!, $locale: String!, $path: String!, $publishEndDate: Date, $publishStartDate: Date, $scriptCss: String, $scriptJs: String, $tags: [String]!, $title: String!) { pages { create(content: $content, description: $description, editor: $editor, isPublished: $isPublished, isPrivate: $isPrivate, locale: $locale, path: $path, publishEndDate: $publishEndDate, publishStartDate: $publishStartDate, scriptCss: $scriptCss, scriptJs: $scriptJs, tags: $tags, title: $title) { responseResult { succeeded errorCode slug message } page { id path title } } } } """ variables = { "content": content, "description": "", "editor": "markdown", "isPublished": True, "isPrivate": False, "locale": "en", "path": path, "publishEndDate": None, "publishStartDate": None, "scriptCss": "", "scriptJs": "", "tags": [], "title": title } response = await wikijs.graphql_request(mutation, variables) create_result = response.get("data", {}).get("pages", {}).get("create", {}) response_result = create_result.get("responseResult", {}) if response_result.get("succeeded"): page_data = create_result.get("page", {}) result = { "pageId": page_data.get("id"), "url": page_data.get("path"), "title": page_data.get("title"), "status": "created", "parentId": int(parent_id) if parent_id else None, "hierarchicalPath": path } logger.info(f"Created page: {title} (ID: {result['pageId']}) at path: {path}") return json.dumps(result) else: error_msg = response_result.get("message", "Unknown error") return json.dumps({"error": f"Failed to create page: {error_msg}"}) except Exception as e: error_msg = f"Failed to create page '{title}': {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_update_page(page_id: int, title: str = None, content: str = None) -> str: """ Update an existing page in Wiki.js. Args: page_id: Page ID to update title: New title (optional) content: New content (optional) Returns: JSON string with update status """ try: await wikijs.authenticate() # First get the current page data get_query = """ query($id: Int!) { pages { single(id: $id) { id path title content description isPrivate isPublished locale tags { tag } } } } """ get_response = await wikijs.graphql_request(get_query, {"id": page_id}) current_page = get_response.get("data", {}).get("pages", {}).get("single") if not current_page: return json.dumps({"error": f"Page with ID {page_id} not found"}) # GraphQL mutation to update a page mutation = """ mutation($id: Int!, $content: String!, $description: String!, $editor: String!, $isPrivate: Boolean!, $isPublished: Boolean!, $locale: String!, $path: String!, $scriptCss: String, $scriptJs: String, $tags: [String]!, $title: String!) { pages { update(id: $id, content: $content, description: $description, editor: $editor, isPrivate: $isPrivate, isPublished: $isPublished, locale: $locale, path: $path, scriptCss: $scriptCss, scriptJs: $scriptJs, tags: $tags, title: $title) { responseResult { succeeded errorCode slug message } page { id path title updatedAt } } } } """ # Use provided values or keep current ones new_title = title if title is not None else current_page["title"] new_content = content if content is not None else current_page["content"] variables = { "id": page_id, "content": new_content, "description": current_page.get("description", ""), "editor": "markdown", "isPrivate": current_page.get("isPrivate", False), "isPublished": current_page.get("isPublished", True), "locale": current_page.get("locale", "en"), "path": current_page["path"], "scriptCss": "", "scriptJs": "", "tags": [tag["tag"] for tag in current_page.get("tags", [])], "title": new_title } response = await wikijs.graphql_request(mutation, variables) update_result = response.get("data", {}).get("pages", {}).get("update", {}) response_result = update_result.get("responseResult", {}) if response_result.get("succeeded"): page_data = update_result.get("page", {}) result = { "pageId": page_id, "status": "updated", "title": page_data.get("title"), "lastModified": page_data.get("updatedAt") } logger.info(f"Updated page ID: {page_id}") return json.dumps(result) else: error_msg = response_result.get("message", "Unknown error") return json.dumps({"error": f"Failed to update page: {error_msg}"}) except Exception as e: error_msg = f"Failed to update page {page_id}: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_get_page(page_id: int = None, slug: str = None) -> str: """ Retrieve page metadata and content from Wiki.js. Args: page_id: Page ID (optional) slug: Page slug/path (optional) Returns: JSON string with page data """ try: await wikijs.authenticate() if page_id: query = """ query($id: Int!) { pages { single(id: $id) { id path title content description isPrivate isPublished locale createdAt updatedAt tags { tag } } } } """ variables = {"id": page_id} elif slug: query = """ query($path: String!) { pages { singleByPath(path: $path, locale: "en") { id path title content description isPrivate isPublished locale createdAt updatedAt tags { tag } } } } """ variables = {"path": slug} else: return json.dumps({"error": "Either page_id or slug must be provided"}) response = await wikijs.graphql_request(query, variables) page_data = None if page_id: page_data = response.get("data", {}).get("pages", {}).get("single") else: page_data = response.get("data", {}).get("pages", {}).get("singleByPath") if not page_data: return json.dumps({"error": "Page not found"}) result = { "pageId": page_data.get("id"), "title": page_data.get("title"), "content": page_data.get("content"), "contentType": "markdown", "lastModified": page_data.get("updatedAt"), "path": page_data.get("path"), "isPublished": page_data.get("isPublished"), "description": page_data.get("description"), "tags": [tag["tag"] for tag in page_data.get("tags", [])] } return json.dumps(result) except Exception as e: error_msg = f"Failed to get page: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_search_pages(query: str, space_id: str = None) -> str: """ Search pages by text in Wiki.js. Args: query: Search query space_id: Space ID to limit search (optional) Returns: JSON string with search results """ try: await wikijs.authenticate() # GraphQL query for search (fixed - removed invalid suggestions subfields) search_query = """ query($query: String!) { pages { search(query: $query, path: "", locale: "en") { results { id title description path locale } totalHits } } } """ variables = {"query": query} response = await wikijs.graphql_request(search_query, variables) search_data = response.get("data", {}).get("pages", {}).get("search", {}) results = [] for item in search_data.get("results", []): results.append({ "pageId": item.get("id"), "title": item.get("title"), "snippet": item.get("description", ""), "score": 1.0, # Wiki.js doesn't provide scores "path": item.get("path") }) return json.dumps({ "results": results, "total": search_data.get("totalHits", len(results)) }) except Exception as e: error_msg = f"Search failed: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_list_spaces() -> str: """ List all spaces (top-level Wiki.js containers). Note: Wiki.js doesn't have "spaces" like BookStack, but we can list pages at root level. Returns: JSON string with spaces list """ try: await wikijs.authenticate() # Get all pages and group by top-level paths query = """ query { pages { list { id title path description isPublished locale } } } """ response = await wikijs.graphql_request(query) pages = response.get("data", {}).get("pages", {}).get("list", []) # Group pages by top-level path (simulate spaces) spaces = {} for page in pages: path_parts = page["path"].split("/") if len(path_parts) > 0: top_level = path_parts[0] if path_parts[0] else "root" if top_level not in spaces: spaces[top_level] = { "spaceId": hash(top_level) % 10000, # Generate pseudo ID "name": top_level.replace("-", " ").title(), "slug": top_level, "description": f"Pages under /{top_level}", "pageCount": 0 } spaces[top_level]["pageCount"] += 1 return json.dumps({"spaces": list(spaces.values())}) except Exception as e: error_msg = f"Failed to list spaces: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_create_space(name: str, description: str = None) -> str: """ Create a new space in Wiki.js. Note: Wiki.js doesn't have spaces, so this creates a root-level page as a space placeholder. Args: name: Space name description: Space description (optional) Returns: JSON string with space details """ try: # Create a root page that acts as a space space_content = f"# {name}\n\n{description or 'This is the main page for the ' + name + ' section.'}\n\n## Pages in this section:\n\n*Pages will be listed here as they are created.*" result = await wikijs_create_page(name, space_content) result_data = json.loads(result) if "error" not in result_data: # Convert page result to space format space_result = { "spaceId": result_data.get("pageId"), "name": name, "slug": slugify(name), "status": "created", "description": description } logger.info(f"Created space (root page): {name} (ID: {space_result['spaceId']})") return json.dumps(space_result) else: return result except Exception as e: error_msg = f"Failed to create space '{name}': {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_link_file_to_page(file_path: str, page_id: int, relationship: str = "documents") -> str: """ Persist link between code file and Wiki.js page in local database. Args: file_path: Path to the source file page_id: Wiki.js page ID relationship: Type of relationship (documents, references, etc.) Returns: JSON string with link status """ try: db = get_db() # Calculate file hash file_hash = get_file_hash(file_path) repo_root = find_repository_root(file_path) # Create or update mapping mapping = db.query(FileMapping).filter(FileMapping.file_path == file_path).first() if mapping: mapping.page_id = page_id mapping.relationship_type = relationship mapping.file_hash = file_hash mapping.last_updated = datetime.datetime.utcnow() else: mapping = FileMapping( file_path=file_path, page_id=page_id, relationship_type=relationship, file_hash=file_hash, repository_root=repo_root or "" ) db.add(mapping) db.commit() result = { "linked": True, "file_path": file_path, "page_id": page_id, "relationship": relationship } logger.info(f"Linked file {file_path} to page {page_id}") return json.dumps(result) except Exception as e: error_msg = f"Failed to link file to page: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_sync_file_docs(file_path: str, change_summary: str, snippet: str = None) -> str: """ Sync a code change to the linked Wiki.js page. Args: file_path: Path to the changed file change_summary: Summary of changes made snippet: Code snippet showing changes (optional) Returns: JSON string with sync status """ try: db = get_db() # Look up page mapping mapping = db.query(FileMapping).filter(FileMapping.file_path == file_path).first() if not mapping: return json.dumps({"error": f"No page mapping found for {file_path}"}) # Get current page content page_response = await wikijs_get_page(page_id=mapping.page_id) page_data = json.loads(page_response) if "error" in page_data: return json.dumps({"error": f"Failed to get page: {page_data['error']}"}) # Append change summary to page content current_content = page_data.get("content", "") update_section = f"\n\n## Recent Changes\n\n**{datetime.datetime.now().strftime('%Y-%m-%d %H:%M')}**: {change_summary}\n" if snippet: update_section += f"\n```\n{snippet}\n```\n" new_content = current_content + update_section # Update the page update_response = await wikijs_update_page(mapping.page_id, content=new_content) update_data = json.loads(update_response) if "error" in update_data: return json.dumps({"error": f"Failed to update page: {update_data['error']}"}) # Update file hash mapping.file_hash = get_file_hash(file_path) mapping.last_updated = datetime.datetime.utcnow() db.commit() result = { "updated": True, "file_path": file_path, "page_id": mapping.page_id, "change_summary": change_summary } logger.info(f"Synced changes from {file_path} to page {mapping.page_id}") return json.dumps(result) except Exception as e: error_msg = f"Failed to sync file docs: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_generate_file_overview( file_path: str, include_functions: bool = True, include_classes: bool = True, include_dependencies: bool = True, include_examples: bool = False, target_page_id: int = None ) -> str: """ Create or update a structured overview page for a file. Args: file_path: Path to the source file include_functions: Include function documentation include_classes: Include class documentation include_dependencies: Include import/dependency information include_examples: Include usage examples target_page_id: Specific page ID to update (optional) Returns: JSON string with overview page details """ try: if not os.path.exists(file_path): return json.dumps({"error": f"File not found: {file_path}"}) # Extract code structure structure = extract_code_structure(file_path) # Generate documentation content content_parts = [f"# {os.path.basename(file_path)} Overview\n"] content_parts.append(f"**File Path**: `{file_path}`\n") content_parts.append(f"**Last Updated**: {datetime.datetime.now().strftime('%Y-%m-%d %H:%M')}\n") if include_dependencies and structure['imports']: content_parts.append("\n## Dependencies\n") for imp in structure['imports']: content_parts.append(f"- `{imp}`") content_parts.append("") if include_classes and structure['classes']: content_parts.append("\n## Classes\n") for cls in structure['classes']: content_parts.append(f"### {cls['name']} (Line {cls['line']})\n") if cls['docstring']: content_parts.append(f"{cls['docstring']}\n") if include_functions and structure['functions']: content_parts.append("\n## Functions\n") for func in structure['functions']: content_parts.append(f"### {func['name']}() (Line {func['line']})\n") if func['docstring']: content_parts.append(f"{func['docstring']}\n") if include_examples: content_parts.append("\n## Usage Examples\n") content_parts.append("```python\n# Add usage examples here\n```\n") content = "\n".join(content_parts) # Create or update page if target_page_id: # Update existing page response = await wikijs_update_page(target_page_id, content=content) result_data = json.loads(response) if "error" not in result_data: result_data["action"] = "updated" else: # Create new page title = f"{os.path.basename(file_path)} Documentation" response = await wikijs_create_page(title, content) result_data = json.loads(response) if "error" not in result_data: result_data["action"] = "created" # Link file to new page if "pageId" in result_data: await wikijs_link_file_to_page(file_path, result_data["pageId"], "documents") logger.info(f"Generated overview for {file_path}") return json.dumps(result_data) except Exception as e: error_msg = f"Failed to generate file overview: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_bulk_update_project_docs( summary: str, affected_files: List[str], context: str, auto_create_missing: bool = True ) -> str: """ Batch update pages for large changes across multiple files. Args: summary: Overall project change summary affected_files: List of file paths that were changed context: Additional context about the changes auto_create_missing: Create pages for files without mappings Returns: JSON string with bulk update results """ try: db = get_db() results = { "updated_pages": [], "created_pages": [], "errors": [] } # Process each affected file for file_path in affected_files: try: # Check if file has a mapping mapping = db.query(FileMapping).filter(FileMapping.file_path == file_path).first() if mapping: # Update existing page sync_response = await wikijs_sync_file_docs( file_path, f"Bulk update: {summary}", context ) sync_data = json.loads(sync_response) if "error" not in sync_data: results["updated_pages"].append({ "file_path": file_path, "page_id": mapping.page_id }) else: results["errors"].append({ "file_path": file_path, "error": sync_data["error"] }) elif auto_create_missing: # Create new overview page overview_response = await wikijs_generate_file_overview(file_path) overview_data = json.loads(overview_response) if "error" not in overview_data and "pageId" in overview_data: results["created_pages"].append({ "file_path": file_path, "page_id": overview_data["pageId"] }) else: results["errors"].append({ "file_path": file_path, "error": overview_data.get("error", "Failed to create page") }) except Exception as e: results["errors"].append({ "file_path": file_path, "error": str(e) }) results["summary"] = { "total_files": len(affected_files), "updated": len(results["updated_pages"]), "created": len(results["created_pages"]), "errors": len(results["errors"]) } logger.info(f"Bulk update completed: {results['summary']}") return json.dumps(results) except Exception as e: error_msg = f"Bulk update failed: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_manage_collections(collection_name: str, description: str = None, space_ids: List[int] = None) -> str: """ Manage Wiki.js collections (groups of spaces/pages). Note: This is a placeholder as Wiki.js collections API may vary by version. Args: collection_name: Name of the collection description: Collection description space_ids: List of space IDs to include Returns: JSON string with collection details """ try: # This is a conceptual implementation # Actual Wiki.js API for collections may differ result = { "collection_name": collection_name, "description": description, "space_ids": space_ids or [], "status": "managed", "note": "Collection management depends on Wiki.js version and configuration" } logger.info(f"Managed collection: {collection_name}") return json.dumps(result) except Exception as e: error_msg = f"Failed to manage collection: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_connection_status() -> str: """ Check the status of the Wiki.js connection and authentication. Returns: JSON string with connection status """ try: auth_success = await wikijs.authenticate() if auth_success: # Test with a simple API call response = await wikijs.graphql_request("query { pages { list { id } } }") result = { "connected": True, "authenticated": True, "api_url": settings.WIKIJS_API_URL, "auth_method": "token" if settings.token else "session", "status": "healthy" } else: result = { "connected": False, "authenticated": False, "api_url": settings.WIKIJS_API_URL, "status": "authentication_failed" } return json.dumps(result) except Exception as e: result = { "connected": False, "authenticated": False, "api_url": settings.WIKIJS_API_URL, "error": str(e), "status": "connection_failed" } return json.dumps(result) @mcp.tool() async def wikijs_repository_context() -> str: """ Show current repository context and Wiki.js organization. Returns: JSON string with repository context """ try: repo_root = find_repository_root() db = get_db() # Get repository context from database context = db.query(RepositoryContext).filter( RepositoryContext.root_path == repo_root ).first() # Get file mappings for this repository mappings = db.query(FileMapping).filter( FileMapping.repository_root == repo_root ).all() result = { "repository_root": repo_root, "space_name": context.space_name if context else settings.DEFAULT_SPACE_NAME, "space_id": context.space_id if context else None, "mapped_files": len(mappings), "mappings": [ { "file_path": m.file_path, "page_id": m.page_id, "relationship": m.relationship_type, "last_updated": m.last_updated.isoformat() if m.last_updated else None } for m in mappings[:10] # Limit to first 10 for brevity ] } return json.dumps(result) except Exception as e: error_msg = f"Failed to get repository context: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_create_repo_structure(repo_name: str, description: str = None, sections: List[str] = None) -> str: """ Create a complete repository documentation structure with nested pages. Args: repo_name: Repository name (will be the root page) description: Repository description sections: List of main sections to create (e.g., ["Overview", "API", "Components", "Deployment"]) Returns: JSON string with created structure details """ try: # Default sections if none provided if not sections: sections = ["Overview", "Getting Started", "Architecture", "API Reference", "Development", "Deployment"] # Create root repository page root_content = f"""# {repo_name} {description or f'Documentation for the {repo_name} repository.'} ## Repository Structure This documentation is organized into the following sections: """ for section in sections: root_content += f"- [{section}]({slugify(repo_name)}/{slugify(section)})\n" root_content += f""" ## Quick Links - [Repository Overview]({slugify(repo_name)}/overview) - [Getting Started Guide]({slugify(repo_name)}/getting-started) - [API Documentation]({slugify(repo_name)}/api-reference) --- *This documentation structure was created by the Wiki.js MCP server.* """ # Create root page root_result = await wikijs_create_page(repo_name, root_content) root_data = json.loads(root_result) if "error" in root_data: return json.dumps({"error": f"Failed to create root page: {root_data['error']}"}) root_page_id = root_data["pageId"] created_pages = [root_data] # Create section pages for section in sections: section_content = f"""# {section} This is the {section.lower()} section for {repo_name}. ## Contents *Content will be added here as the documentation grows.* ## Related Pages - [Back to {repo_name}]({slugify(repo_name)}) --- *This page is part of the {repo_name} documentation structure.* """ section_result = await wikijs_create_page(section, section_content, parent_id=str(root_page_id)) section_data = json.loads(section_result) if "error" not in section_data: created_pages.append(section_data) else: logger.warning(f"Failed to create section '{section}': {section_data['error']}") result = { "repository": repo_name, "root_page_id": root_page_id, "created_pages": len(created_pages), "sections": sections, "pages": created_pages, "status": "created" } logger.info(f"Created repository structure for {repo_name} with {len(created_pages)} pages") return json.dumps(result) except Exception as e: error_msg = f"Failed to create repository structure: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_create_nested_page(title: str, content: str, parent_path: str, create_parent_if_missing: bool = True) -> str: """ Create a nested page using hierarchical paths (e.g., "repo/api/endpoints"). Args: title: Page title content: Page content parent_path: Full path to parent (e.g., "my-repo/api") create_parent_if_missing: Create parent pages if they don't exist Returns: JSON string with page details """ try: await wikijs.authenticate() # Check if parent exists parent_query = """ query($path: String!) { pages { singleByPath(path: $path, locale: "en") { id path title } } } """ parent_response = await wikijs.graphql_request(parent_query, {"path": parent_path}) parent_data = parent_response.get("data", {}).get("pages", {}).get("singleByPath") if not parent_data and create_parent_if_missing: # Create parent structure path_parts = parent_path.split("/") current_path = "" parent_id = None for i, part in enumerate(path_parts): if current_path: current_path += f"/{part}" else: current_path = part # Check if this level exists check_response = await wikijs.graphql_request(parent_query, {"path": current_path}) existing = check_response.get("data", {}).get("pages", {}).get("singleByPath") if not existing: # Create this level part_title = part.replace("-", " ").title() part_content = f"""# {part_title} This is a section page for organizing documentation. ## Subsections *Subsections will appear here as they are created.* --- *This page was auto-created as part of the documentation hierarchy.* """ create_result = await wikijs_create_page(part_title, part_content, parent_id=str(parent_id) if parent_id else "") create_data = json.loads(create_result) if "error" not in create_data: parent_id = create_data["pageId"] else: return json.dumps({"error": f"Failed to create parent '{current_path}': {create_data['error']}"}) else: parent_id = existing["id"] elif parent_data: parent_id = parent_data["id"] else: return json.dumps({"error": f"Parent path '{parent_path}' not found and create_parent_if_missing is False"}) # Create the target page result = await wikijs_create_page(title, content, parent_id=str(parent_id)) result_data = json.loads(result) if "error" not in result_data: result_data["parent_path"] = parent_path result_data["full_path"] = f"{parent_path}/{slugify(title)}" return json.dumps(result_data) except Exception as e: error_msg = f"Failed to create nested page: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_get_page_children(page_id: int = None, page_path: str = None) -> str: """ Get all child pages of a given page for hierarchical navigation. Args: page_id: Parent page ID (optional) page_path: Parent page path (optional) Returns: JSON string with child pages list """ try: await wikijs.authenticate() # Get the parent page first if page_id: parent_query = """ query($id: Int!) { pages { single(id: $id) { id path title } } } """ parent_response = await wikijs.graphql_request(parent_query, {"id": page_id}) parent_data = parent_response.get("data", {}).get("pages", {}).get("single") elif page_path: parent_query = """ query($path: String!) { pages { singleByPath(path: $path, locale: "en") { id path title } } } """ parent_response = await wikijs.graphql_request(parent_query, {"path": page_path}) parent_data = parent_response.get("data", {}).get("pages", {}).get("singleByPath") else: return json.dumps({"error": "Either page_id or page_path must be provided"}) if not parent_data: return json.dumps({"error": "Parent page not found"}) parent_path = parent_data["path"] # Get all pages and filter for children all_pages_query = """ query { pages { list { id title path description isPublished updatedAt } } } """ response = await wikijs.graphql_request(all_pages_query) all_pages = response.get("data", {}).get("pages", {}).get("list", []) # Filter for direct children (path starts with parent_path/ but no additional slashes) children = [] for page in all_pages: page_path_str = page["path"] if page_path_str.startswith(f"{parent_path}/"): # Check if it's a direct child (no additional slashes after parent) remaining_path = page_path_str[len(parent_path) + 1:] if "/" not in remaining_path: # Direct child children.append({ "pageId": page["id"], "title": page["title"], "path": page["path"], "description": page.get("description", ""), "lastModified": page.get("updatedAt"), "isPublished": page.get("isPublished", True) }) result = { "parent": { "pageId": parent_data["id"], "title": parent_data["title"], "path": parent_data["path"] }, "children": children, "total_children": len(children) } return json.dumps(result) except Exception as e: error_msg = f"Failed to get page children: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_create_documentation_hierarchy(project_name: str, file_mappings: List[Dict[str, str]], auto_organize: bool = True) -> str: """ Create a complete documentation hierarchy for a project based on file structure. Args: project_name: Name of the project/repository file_mappings: List of {"file_path": "src/components/Button.tsx", "doc_path": "components/button"} mappings auto_organize: Automatically organize files into logical sections Returns: JSON string with created hierarchy details """ try: # Auto-organize files into sections if requested if auto_organize: sections = { "components": [], "api": [], "utils": [], "services": [], "models": [], "tests": [], "config": [], "docs": [] } for mapping in file_mappings: file_path = mapping["file_path"].lower() if "component" in file_path or "/components/" in file_path: sections["components"].append(mapping) elif "api" in file_path or "/api/" in file_path or "endpoint" in file_path: sections["api"].append(mapping) elif "util" in file_path or "/utils/" in file_path or "/helpers/" in file_path: sections["utils"].append(mapping) elif "service" in file_path or "/services/" in file_path: sections["services"].append(mapping) elif "model" in file_path or "/models/" in file_path or "/types/" in file_path: sections["models"].append(mapping) elif "test" in file_path or "/tests/" in file_path or ".test." in file_path: sections["tests"].append(mapping) elif "config" in file_path or "/config/" in file_path or ".config." in file_path: sections["config"].append(mapping) else: sections["docs"].append(mapping) # Create root project structure section_names = [name.title() for name, files in sections.items() if files] if auto_organize else ["Documentation"] repo_result = await wikijs_create_repo_structure(project_name, f"Documentation for {project_name}", section_names) repo_data = json.loads(repo_result) if "error" in repo_data: return repo_result created_pages = [] created_mappings = [] if auto_organize: # Create pages for each section for section_name, files in sections.items(): if not files: continue section_title = section_name.title() for file_mapping in files: file_path = file_mapping["file_path"] doc_path = file_mapping.get("doc_path", slugify(os.path.basename(file_path))) # Generate documentation content for the file file_overview_result = await wikijs_generate_file_overview(file_path, target_page_id=None) overview_data = json.loads(file_overview_result) if "error" not in overview_data: created_pages.append(overview_data) # Create mapping mapping_result = await wikijs_link_file_to_page(file_path, overview_data["pageId"], "documents") mapping_data = json.loads(mapping_result) if "error" not in mapping_data: created_mappings.append(mapping_data) else: # Create pages without auto-organization for file_mapping in file_mappings: file_path = file_mapping["file_path"] doc_path = file_mapping.get("doc_path", f"{project_name}/{slugify(os.path.basename(file_path))}") # Create nested page nested_result = await wikijs_create_nested_page( os.path.basename(file_path), f"# {os.path.basename(file_path)}\n\nDocumentation for {file_path}", doc_path ) nested_data = json.loads(nested_result) if "error" not in nested_data: created_pages.append(nested_data) # Create mapping mapping_result = await wikijs_link_file_to_page(file_path, nested_data["pageId"], "documents") mapping_data = json.loads(mapping_result) if "error" not in mapping_data: created_mappings.append(mapping_data) result = { "project": project_name, "root_structure": repo_data, "created_pages": len(created_pages), "created_mappings": len(created_mappings), "auto_organized": auto_organize, "sections": list(sections.keys()) if auto_organize else ["manual"], "pages": created_pages[:10], # Limit output "mappings": created_mappings[:10], # Limit output "status": "completed" } logger.info(f"Created documentation hierarchy for {project_name}: {len(created_pages)} pages, {len(created_mappings)} mappings") return json.dumps(result) except Exception as e: error_msg = f"Failed to create documentation hierarchy: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_delete_page(page_id: int = None, page_path: str = None, remove_file_mapping: bool = True) -> str: """ Delete a specific page from Wiki.js. Args: page_id: Page ID to delete (optional) page_path: Page path to delete (optional) remove_file_mapping: Also remove file-to-page mapping from local database Returns: JSON string with deletion status """ try: await wikijs.authenticate() # Get page info first if page_id: get_query = """ query($id: Int!) { pages { single(id: $id) { id path title } } } """ get_response = await wikijs.graphql_request(get_query, {"id": page_id}) page_data = get_response.get("data", {}).get("pages", {}).get("single") elif page_path: get_query = """ query($path: String!) { pages { singleByPath(path: $path, locale: "en") { id path title } } } """ get_response = await wikijs.graphql_request(get_query, {"path": page_path}) page_data = get_response.get("data", {}).get("pages", {}).get("singleByPath") if page_data: page_id = page_data["id"] else: return json.dumps({"error": "Either page_id or page_path must be provided"}) if not page_data: return json.dumps({"error": "Page not found"}) # Delete the page using GraphQL mutation delete_mutation = """ mutation($id: Int!) { pages { delete(id: $id) { responseResult { succeeded errorCode slug message } } } } """ response = await wikijs.graphql_request(delete_mutation, {"id": page_id}) delete_result = response.get("data", {}).get("pages", {}).get("delete", {}) response_result = delete_result.get("responseResult", {}) if response_result.get("succeeded"): result = { "deleted": True, "pageId": page_id, "title": page_data["title"], "path": page_data["path"], "status": "deleted" } # Remove file mapping if requested if remove_file_mapping: db = get_db() mapping = db.query(FileMapping).filter(FileMapping.page_id == page_id).first() if mapping: db.delete(mapping) db.commit() result["file_mapping_removed"] = True else: result["file_mapping_removed"] = False logger.info(f"Deleted page: {page_data['title']} (ID: {page_id})") return json.dumps(result) else: error_msg = response_result.get("message", "Unknown error") return json.dumps({"error": f"Failed to delete page: {error_msg}"}) except Exception as e: error_msg = f"Failed to delete page: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_batch_delete_pages( page_ids: List[int] = None, page_paths: List[str] = None, path_pattern: str = None, confirm_deletion: bool = False, remove_file_mappings: bool = True ) -> str: """ Batch delete multiple pages from Wiki.js. Args: page_ids: List of page IDs to delete (optional) page_paths: List of page paths to delete (optional) path_pattern: Pattern to match paths (e.g., "frontend-app/*" for all pages under frontend-app) confirm_deletion: Must be True to actually delete pages (safety check) remove_file_mappings: Also remove file-to-page mappings from local database Returns: JSON string with batch deletion results """ try: if not confirm_deletion: return json.dumps({ "error": "confirm_deletion must be True to proceed with batch deletion", "safety_note": "This is a safety check to prevent accidental deletions" }) await wikijs.authenticate() pages_to_delete = [] # Collect pages by IDs if page_ids: for page_id in page_ids: get_query = """ query($id: Int!) { pages { single(id: $id) { id path title } } } """ get_response = await wikijs.graphql_request(get_query, {"id": page_id}) page_data = get_response.get("data", {}).get("pages", {}).get("single") if page_data: pages_to_delete.append(page_data) # Collect pages by paths if page_paths: for page_path in page_paths: get_query = """ query($path: String!) { pages { singleByPath(path: $path, locale: "en") { id path title } } } """ get_response = await wikijs.graphql_request(get_query, {"path": page_path}) page_data = get_response.get("data", {}).get("pages", {}).get("singleByPath") if page_data: pages_to_delete.append(page_data) # Collect pages by pattern if path_pattern: # Get all pages and filter by pattern all_pages_query = """ query { pages { list { id title path } } } """ response = await wikijs.graphql_request(all_pages_query) all_pages = response.get("data", {}).get("pages", {}).get("list", []) # Simple pattern matching (supports * wildcard) import fnmatch for page in all_pages: if fnmatch.fnmatch(page["path"], path_pattern): pages_to_delete.append(page) if not pages_to_delete: return json.dumps({"error": "No pages found to delete"}) # Remove duplicates unique_pages = {} for page in pages_to_delete: unique_pages[page["id"]] = page pages_to_delete = list(unique_pages.values()) # Delete pages deleted_pages = [] failed_deletions = [] for page in pages_to_delete: try: delete_result = await wikijs_delete_page( page_id=page["id"], remove_file_mapping=remove_file_mappings ) delete_data = json.loads(delete_result) if "error" not in delete_data: deleted_pages.append({ "pageId": page["id"], "title": page["title"], "path": page["path"] }) else: failed_deletions.append({ "pageId": page["id"], "title": page["title"], "path": page["path"], "error": delete_data["error"] }) except Exception as e: failed_deletions.append({ "pageId": page["id"], "title": page["title"], "path": page["path"], "error": str(e) }) result = { "total_found": len(pages_to_delete), "deleted_count": len(deleted_pages), "failed_count": len(failed_deletions), "deleted_pages": deleted_pages, "failed_deletions": failed_deletions, "status": "completed" } logger.info(f"Batch deletion completed: {len(deleted_pages)} deleted, {len(failed_deletions)} failed") return json.dumps(result) except Exception as e: error_msg = f"Batch deletion failed: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_delete_hierarchy( root_path: str, delete_mode: str = "children_only", confirm_deletion: bool = False, remove_file_mappings: bool = True ) -> str: """ Delete an entire page hierarchy (folder structure) from Wiki.js. Args: root_path: Root path of the hierarchy to delete (e.g., "frontend-app" or "frontend-app/components") delete_mode: Deletion mode - "children_only", "include_root", or "root_only" confirm_deletion: Must be True to actually delete pages (safety check) remove_file_mappings: Also remove file-to-page mappings from local database Returns: JSON string with hierarchy deletion results """ try: if not confirm_deletion: return json.dumps({ "error": "confirm_deletion must be True to proceed with hierarchy deletion", "safety_note": "This is a safety check to prevent accidental deletions", "preview_mode": "Set confirm_deletion=True to actually delete" }) valid_modes = ["children_only", "include_root", "root_only"] if delete_mode not in valid_modes: return json.dumps({ "error": f"Invalid delete_mode. Must be one of: {valid_modes}" }) await wikijs.authenticate() # Get all pages to find hierarchy all_pages_query = """ query { pages { list { id title path } } } """ response = await wikijs.graphql_request(all_pages_query) all_pages = response.get("data", {}).get("pages", {}).get("list", []) # Find root page root_page = None for page in all_pages: if page["path"] == root_path: root_page = page break if not root_page and delete_mode in ["include_root", "root_only"]: return json.dumps({"error": f"Root page not found: {root_path}"}) # Find child pages child_pages = [] for page in all_pages: page_path = page["path"] if page_path.startswith(f"{root_path}/"): child_pages.append(page) # Determine pages to delete based on mode pages_to_delete = [] if delete_mode == "children_only": pages_to_delete = child_pages elif delete_mode == "include_root": pages_to_delete = child_pages + ([root_page] if root_page else []) elif delete_mode == "root_only": pages_to_delete = [root_page] if root_page else [] if not pages_to_delete: return json.dumps({ "message": f"No pages found to delete for path: {root_path}", "delete_mode": delete_mode, "root_found": root_page is not None, "children_found": len(child_pages) }) # Sort by depth (deepest first) to avoid dependency issues pages_to_delete.sort(key=lambda x: x["path"].count("/"), reverse=True) # Delete pages deleted_pages = [] failed_deletions = [] for page in pages_to_delete: try: delete_result = await wikijs_delete_page( page_id=page["id"], remove_file_mapping=remove_file_mappings ) delete_data = json.loads(delete_result) if "error" not in delete_data: deleted_pages.append({ "pageId": page["id"], "title": page["title"], "path": page["path"], "depth": page["path"].count("/") }) else: failed_deletions.append({ "pageId": page["id"], "title": page["title"], "path": page["path"], "error": delete_data["error"] }) except Exception as e: failed_deletions.append({ "pageId": page["id"], "title": page["title"], "path": page["path"], "error": str(e) }) result = { "root_path": root_path, "delete_mode": delete_mode, "total_found": len(pages_to_delete), "deleted_count": len(deleted_pages), "failed_count": len(failed_deletions), "deleted_pages": deleted_pages, "failed_deletions": failed_deletions, "hierarchy_summary": { "root_page_found": root_page is not None, "child_pages_found": len(child_pages), "max_depth": max([p["path"].count("/") for p in pages_to_delete]) if pages_to_delete else 0 }, "status": "completed" } logger.info(f"Hierarchy deletion completed for {root_path}: {len(deleted_pages)} deleted, {len(failed_deletions)} failed") return json.dumps(result) except Exception as e: error_msg = f"Hierarchy deletion failed: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) @mcp.tool() async def wikijs_cleanup_orphaned_mappings() -> str: """ Clean up file-to-page mappings for pages that no longer exist in Wiki.js. Returns: JSON string with cleanup results """ try: await wikijs.authenticate() db = get_db() # Get all file mappings mappings = db.query(FileMapping).all() if not mappings: return json.dumps({ "message": "No file mappings found", "cleaned_count": 0 }) # Check which pages still exist orphaned_mappings = [] valid_mappings = [] for mapping in mappings: try: get_query = """ query($id: Int!) { pages { single(id: $id) { id title path } } } """ get_response = await wikijs.graphql_request(get_query, {"id": mapping.page_id}) page_data = get_response.get("data", {}).get("pages", {}).get("single") if page_data: valid_mappings.append({ "file_path": mapping.file_path, "page_id": mapping.page_id, "page_title": page_data["title"] }) else: orphaned_mappings.append({ "file_path": mapping.file_path, "page_id": mapping.page_id, "last_updated": mapping.last_updated.isoformat() if mapping.last_updated else None }) # Delete orphaned mapping db.delete(mapping) except Exception as e: # If we can't check the page, consider it orphaned orphaned_mappings.append({ "file_path": mapping.file_path, "page_id": mapping.page_id, "error": str(e) }) db.delete(mapping) db.commit() result = { "total_mappings": len(mappings), "valid_mappings": len(valid_mappings), "orphaned_mappings": len(orphaned_mappings), "cleaned_count": len(orphaned_mappings), "orphaned_details": orphaned_mappings, "status": "completed" } logger.info(f"Cleaned up {len(orphaned_mappings)} orphaned file mappings") return json.dumps(result) except Exception as e: error_msg = f"Cleanup failed: {str(e)}" logger.error(error_msg) return json.dumps({"error": error_msg}) def main(): """Main entry point for the MCP server.""" import asyncio async def run_server(): await wikijs.authenticate() logger.info("Wiki.js MCP Server started") # Run the server mcp.run() if __name__ == "__main__": main() ```