# Directory Structure

```
├── .gitignore
├── config
│   └── example.env
├── config-mcp.json
├── DELETION_TOOLS.md
├── docker.yml
├── HIERARCHICAL_FEATURES.md
├── LICENSE
├── pyproject.toml
├── README.md
├── requirements.txt
├── setup.sh
├── src
│   └── wiki_mcp_server.py
├── start-server.sh
└── test-server.sh
```

# Files

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------

```
 1 | # Environment files
 2 | .env
 3 | .env.local
 4 | .env.*.local
 5 | 
 6 | # Database files
 7 | *.db
 8 | *.sqlite
 9 | *.sqlite3
10 | wikijs_mappings.db
11 | 
12 | # Logs
13 | *.log
14 | logs/
15 | wikijs_mcp.log
16 | 
17 | # Python
18 | __pycache__/
19 | *.py[cod]
20 | *$py.class
21 | *.so
22 | .Python
23 | build/
24 | develop-eggs/
25 | dist/
26 | downloads/
27 | eggs/
28 | .eggs/
29 | lib/
30 | lib64/
31 | parts/
32 | sdist/
33 | var/
34 | wheels/
35 | *.egg-info/
36 | .installed.cfg
37 | *.egg
38 | MANIFEST
39 | 
40 | # Virtual environments
41 | venv/
42 | env/
43 | ENV/
44 | env.bak/
45 | venv.bak/
46 | .venv/
47 | 
48 | # Poetry
49 | poetry.lock
50 | 
51 | # IDE
52 | .vscode/
53 | .idea/
54 | *.swp
55 | *.swo
56 | *~
57 | 
58 | # OS generated files
59 | .DS_Store
60 | .DS_Store?
61 | ._*
62 | .Spotlight-V100
63 | .Trashes
64 | ehthumbs.db
65 | Thumbs.db
66 | 
67 | # Testing
68 | .coverage
69 | .pytest_cache/
70 | .tox/
71 | .nox/
72 | htmlcov/
73 | 
74 | # MyPy
75 | .mypy_cache/
76 | .dmypy.json
77 | dmypy.json
78 | 
79 | # Jupyter Notebook
80 | .ipynb_checkpoints
81 | 
82 | # pyenv
83 | .python-version
84 | 
85 | # Temporary files
86 | tmp/
87 | temp/
88 | *.tmp
89 | *.bak
90 | *.backup 
91 | 
92 | # Docker volumes
93 | postgres_data/
94 | wikijs_data/
95 | 
96 | # Resources
97 | resources/ 
```

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------

```markdown
  1 | # Wiki.js MCP Server
  2 | 
  3 | A comprehensive **Model Context Protocol (MCP) server** for Wiki.js integration with **hierarchical documentation** support and Docker deployment. Perfect for organizations managing multiple repositories and large-scale documentation.
  4 | 
  5 | ## 🚀 Quick Start
  6 | 
  7 | ### 1. Environment Setup
  8 | 
  9 | First, clone this repository and set up environment variables:
 10 | ```bash
 11 | # Copy environment template
 12 | cp config/example.env .env
 13 | 
 14 | # Edit .env with your credentials:
 15 | # - Set POSTGRES_PASSWORD to a secure password
 16 | # - Update other settings as needed
 17 | ```
 18 | 
 19 | ### 2. Docker Deployment (Recommended)
 20 | 
 21 | ```bash
 22 | # Start Wiki.js with Docker
 23 | docker-compose -f docker.yml up -d
 24 | ```
 25 | Wiki.js will be available at http://localhost:3000.
 26 | 
 27 | Complete the initial setup in the web interface.
 28 | 
 29 | ### 3. Setup MCP Server
 30 | ```bash
 31 | # Install Python dependencies
 32 | ./setup.sh
 33 | 
 34 | # Update .env with Wiki.js API credentials:
 35 | # - Get API key from Wiki.js admin panel  
 36 | # - Set WIKIJS_TOKEN in .env file
 37 | 
 38 | # Test the connection
 39 | ./test-server.sh
 40 | 
 41 | # Start MCP server
 42 | # (not needed for AI IDEs like Cursor: after editing mcp.json, click the refresh
 43 | # icon and you should see a green dot with all tools listed. Already-open Cursor
 44 | # windows require this refresh before the MCP can be used)
 45 | ./start-server.sh
 46 | ```
 47 | 
 48 | ### 4. Configure Cursor MCP
 49 | Add to your `~/.cursor/mcp.json`:
 50 | ```json
 51 | {
 52 |   "mcpServers": {
 53 |     "wikijs": {
 54 |       "command": "/path/to/wiki-js-mcp/start-server.sh"
 55 |     }
 56 |   }
 57 | }
 58 | ```
 59 | 
 60 | ## 🎯 Enhanced Cursor Integration
 61 | 
 62 | ### Global Rules for Documentation-First Development
 63 | Add these **Global Rules** in Cursor to automatically leverage documentation before coding:
 64 | 
 65 | ```
 66 | Before writing any code, always:
 67 | 1. Search existing documentation using wikijs_search_pages to understand current patterns and architecture
 68 | 2. Check for related components, functions, or modules that might already exist
 69 | 3. If documentation exists for similar functionality, follow the established patterns and naming conventions
 70 | 4. If no documentation exists, create it using wikijs_create_page or wikijs_create_nested_page before implementing
 71 | 5. Always update documentation when making changes using wikijs_sync_file_docs
 72 | 6. For new features, use wikijs_create_repo_structure to plan the documentation hierarchy first
 73 | ```
 74 | 
 75 | These rules ensure that your AI assistant will:
 76 | - ✅ Check documentation before suggesting implementations
 77 | - ✅ Follow existing patterns and conventions
 78 | - ✅ Maintain up-to-date documentation automatically
 79 | - ✅ Create structured documentation for new features
 80 | - ✅ Avoid duplicating existing functionality
 81 | 
 82 | ### Usage Tips for Cursor
 83 | ```
 84 | # Before starting a new feature
 85 | "Search the documentation for authentication patterns before implementing login"
 86 | 
 87 | # When creating components
 88 | "Create nested documentation under frontend-app/components before building the React component"
 89 | 
 90 | # For API development
 91 | "Check existing API documentation and create endpoint docs using the established structure"
 92 | 
 93 | # During refactoring
 94 | "Update all related documentation pages for the files I'm about to modify"
 95 | ```
 96 | 
 97 | ## 🚀 Key Features
 98 | 
 99 | ### 📁 **Hierarchical Documentation**
100 | - **Repository-level organization**: Create structured docs for multiple repos
101 | - **Nested page creation**: Automatic parent-child relationships
102 | - **Auto-organization**: Smart categorization by file type (components, API, utils, etc.)
103 | - **Enterprise scalability**: Handle hundreds of repos and thousands of files
104 | 
105 | ### 🔧 **Core Functionality**
106 | - **GraphQL API integration**: Full Wiki.js v2+ compatibility
107 | - **File-to-page mapping**: Automatic linking between source code and documentation
108 | - **Code structure analysis**: Extract classes, functions, and dependencies
109 | - **Bulk operations**: Update multiple pages simultaneously
110 | - **Change tracking**: Monitor file modifications and sync docs
111 | 
112 | ### 🐳 **Docker Setup**
113 | - **One-command deployment**: Complete Wiki.js setup with PostgreSQL
114 | - **Persistent storage**: Data survives container restarts
115 | - **Health checks**: Automatic service monitoring
116 | - **Production-ready**: Optimized for development and deployment
117 | 
118 | ### 🔍 **Smart Features**
119 | - **Repository context detection**: Auto-detect Git repositories
120 | - **Content generation**: Auto-create documentation from code structure
121 | - **Search integration**: Full-text search across hierarchical content
122 | - **Health monitoring**: Connection status and error handling
123 | 
124 | ## 📊 MCP Tools (21 Total)
125 | 
126 | ### 🏗️ **Hierarchical Documentation Tools**
127 | 1. **`wikijs_create_repo_structure`** - Create complete repository documentation structure
128 | 2. **`wikijs_create_nested_page`** - Create pages with hierarchical paths
129 | 3. **`wikijs_get_page_children`** - Navigate parent-child page relationships
130 | 4. **`wikijs_create_documentation_hierarchy`** - Auto-organize project files into docs
131 | 
132 | ### 📝 **Core Page Management**
133 | 5. **`wikijs_create_page`** - Create new pages (now with parent support)
134 | 6. **`wikijs_update_page`** - Update existing pages
135 | 7. **`wikijs_get_page`** - Retrieve page content and metadata
136 | 8. **`wikijs_search_pages`** - Search pages by text (fixed GraphQL issues)
137 | 
138 | ### 🗑️ **Deletion & Cleanup Tools**
139 | 9. **`wikijs_delete_page`** - Delete specific pages by ID or path
140 | 10. **`wikijs_batch_delete_pages`** - Batch delete with pattern matching and safety checks
141 | 11. **`wikijs_delete_hierarchy`** - Delete entire page hierarchies with multiple modes
142 | 12. **`wikijs_cleanup_orphaned_mappings`** - Clean up orphaned file-to-page mappings
143 | 
144 | ### 🗂️ **Organization & Structure**
145 | 13. **`wikijs_list_spaces`** - List top-level documentation spaces
146 | 14. **`wikijs_create_space`** - Create new documentation spaces
147 | 15. **`wikijs_manage_collections`** - Manage page collections
148 | 
149 | ### 🔗 **File Integration**
150 | 16. **`wikijs_link_file_to_page`** - Link source files to documentation pages
151 | 17. **`wikijs_sync_file_docs`** - Sync code changes to documentation
152 | 18. **`wikijs_generate_file_overview`** - Auto-generate file documentation
153 | 
154 | ### 🚀 **Bulk Operations**
155 | 19. **`wikijs_bulk_update_project_docs`** - Batch update multiple pages
156 | 
157 | ### 🔧 **System Tools**
158 | 20. **`wikijs_connection_status`** - Check API connection health
159 | 21. **`wikijs_repository_context`** - Show repository mappings and context
160 | 
161 | ## 🏢 Enterprise Use Cases
162 | 
163 | ### Multi-Repository Documentation
164 | ```
165 | Company Documentation/
166 | ├── frontend-web-app/
167 | │   ├── Overview/
168 | │   ├── Components/
169 | │   │   ├── Button/
170 | │   │   ├── Modal/
171 | │   │   └── Form/
172 | │   ├── API Integration/
173 | │   └── Deployment/
174 | ├── backend-api/
175 | │   ├── Overview/
176 | │   ├── Controllers/
177 | │   ├── Models/
178 | │   └── Database Schema/
179 | ├── mobile-app/
180 | │   ├── Overview/
181 | │   ├── Screens/
182 | │   └── Native Components/
183 | └── shared-libraries/
184 |     ├── UI Components/
185 |     ├── Utilities/
186 |     └── Type Definitions/
187 | ```
188 | 
189 | ### Automatic Organization
190 | The system intelligently categorizes files:
191 | - **Components**: React/Vue components, UI elements
192 | - **API**: Endpoints, controllers, routes
193 | - **Utils**: Helper functions, utilities
194 | - **Services**: Business logic, external integrations
195 | - **Models**: Data models, types, schemas
196 | - **Tests**: Unit tests, integration tests
197 | - **Config**: Configuration files, environment setup
198 | 
199 | ## 📚 Usage Examples
200 | 
201 | ### Create Repository Documentation
202 | ```python
203 | # Create complete repository structure
204 | await wikijs_create_repo_structure(
205 |     "My Frontend App",
206 |     "Modern React application with TypeScript",
207 |     ["Overview", "Components", "API", "Testing", "Deployment"]
208 | )
209 | 
210 | # Create nested component documentation
211 | await wikijs_create_nested_page(
212 |     "Button Component",
213 |     "# Button Component\n\nReusable button with multiple variants...",
214 |     "my-frontend-app/components"
215 | )
216 | 
217 | # Auto-organize entire project
218 | await wikijs_create_documentation_hierarchy(
219 |     "My Project",
220 |     [
221 |         {"file_path": "src/components/Button.tsx"},
222 |         {"file_path": "src/api/users.ts"},
223 |         {"file_path": "src/utils/helpers.ts"}
224 |     ],
225 |     auto_organize=True
226 | )
227 | ```
228 | 
229 | ### Documentation Management
230 | ```python
231 | # Clean up and manage documentation
232 | # Preview what would be deleted (safe)
233 | preview = await wikijs_delete_hierarchy(
234 |     "old-project",
235 |     delete_mode="include_root",
236 |     confirm_deletion=False
237 | )
238 | 
239 | # Delete entire deprecated project
240 | await wikijs_delete_hierarchy(
241 |     "old-project",
242 |     delete_mode="include_root", 
243 |     confirm_deletion=True
244 | )
245 | 
246 | # Batch delete test pages
247 | await wikijs_batch_delete_pages(
248 |     path_pattern="*test*",
249 |     confirm_deletion=True
250 | )
251 | 
252 | # Clean up orphaned file mappings
253 | await wikijs_cleanup_orphaned_mappings()
254 | ```
255 | 
256 | ## ⚙️ Configuration
257 | 
258 | ### Environment Variables
259 | ```bash
260 | # Docker Database Configuration
261 | POSTGRES_DB=wikijs
262 | POSTGRES_USER=wikijs
263 | POSTGRES_PASSWORD=your_secure_password_here
264 | 
265 | # Wiki.js Connection
266 | WIKIJS_API_URL=http://localhost:3000
267 | WIKIJS_TOKEN=your_jwt_token_here
268 | 
269 | # Alternative: Username/Password
270 | WIKIJS_USERNAME=your_username
271 | WIKIJS_PASSWORD=your_password
272 | 
273 | # Database & Logging
274 | WIKIJS_MCP_DB=./wikijs_mappings.db
275 | LOG_LEVEL=INFO
276 | LOG_FILE=wikijs_mcp.log
277 | 
278 | # Repository Settings
279 | REPOSITORY_ROOT=./
280 | DEFAULT_SPACE_NAME=Documentation
281 | ```
282 | 
283 | ### Authentication Options
284 | 1. **JWT Token** (Recommended): Use API key from Wiki.js admin panel
285 | 2. **Username/Password**: Traditional login credentials
286 | 
287 | ## 🔧 Technical Architecture
288 | 
289 | ### GraphQL Integration
290 | - **Full GraphQL API support**: Native Wiki.js v2+ compatibility
291 | - **Optimized queries**: Efficient data fetching and mutations
292 | - **Error handling**: Comprehensive GraphQL error management
293 | - **Retry logic**: Automatic retry with exponential backoff
294 | 
295 | ### Database Layer
296 | - **SQLite storage**: Local file-to-page mappings
297 | - **Repository context**: Git repository detection and tracking
298 | - **Change tracking**: File hash monitoring for sync detection
299 | - **Relationship management**: Parent-child page hierarchies
300 | 
301 | ### Code Analysis
302 | - **AST parsing**: Extract Python classes, functions, imports
303 | - **Structure detection**: Identify code patterns and organization
304 | - **Documentation generation**: Auto-create comprehensive overviews
305 | - **Dependency mapping**: Track imports and relationships
306 | 
307 | ## 📈 Performance & Scalability
308 | 
309 | - **Async operations**: Non-blocking I/O for all API calls
310 | - **Bulk processing**: Efficient batch operations for large projects
311 | - **Caching**: Smart caching of page relationships and metadata
312 | - **Connection pooling**: Optimized HTTP client management
313 | 
314 | ## 🛠️ Development
315 | 
316 | ### Project Structure
317 | ```
318 | wiki-js-mcp/
319 | ├── src/
320 | │   └── wiki_mcp_server.py      # Main MCP server implementation
321 | ├── config/
322 | │   └── example.env             # Configuration template
323 | ├── docker.yml                  # Docker Compose setup
324 | ├── pyproject.toml              # Poetry dependencies
325 | ├── requirements.txt            # Pip dependencies
326 | ├── setup.sh                    # Environment setup script
327 | ├── start-server.sh             # MCP server launcher
328 | ├── test-server.sh              # Interactive testing script
329 | ├── HIERARCHICAL_FEATURES.md    # Hierarchical documentation guide
330 | ├── DELETION_TOOLS.md           # Deletion and cleanup guide
331 | ├── LICENSE                     # MIT License
332 | └── README.md                   # This file
333 | ```
334 | 
335 | ### Dependencies
336 | - **FastMCP**: Official Python MCP SDK
337 | - **httpx**: Async HTTP client for GraphQL
338 | - **SQLAlchemy**: Database ORM for mappings
339 | - **Pydantic**: Configuration and validation
340 | - **tenacity**: Retry logic for reliability
341 | 
342 | ## 🔍 Troubleshooting
343 | 
344 | ### Docker Issues
345 | ```bash
346 | # Check containers
347 | docker-compose -f docker.yml ps
348 | 
349 | # View logs
350 | docker-compose -f docker.yml logs wiki
351 | docker-compose -f docker.yml logs postgres
352 | 
353 | # Reset everything
354 | docker-compose -f docker.yml down -v
355 | docker-compose -f docker.yml up -d
356 | ```
357 | 
358 | ### Connection Issues
359 | ```bash
360 | # Check Wiki.js is running
361 | curl http://localhost:3000/graphql
362 | 
363 | # Verify authentication
364 | ./test-server.sh
365 | 
366 | # Debug mode
367 | export LOG_LEVEL=DEBUG
368 | ./start-server.sh
369 | ```
370 | 
371 | ### Common Problems
372 | - **Port conflicts**: Change port 3000 in `docker.yml` if needed
373 | - **Database issues**: Remove `postgres_data/` and restart
374 | - **API permissions**: Ensure API key has admin privileges
375 | - **Python dependencies**: Run `./setup.sh` to reinstall
376 | 
377 | ## 📚 Documentation
378 | 
379 | - **[Hierarchical Features Guide](HIERARCHICAL_FEATURES.md)** - Complete guide to enterprise documentation
380 | - **[Deletion Tools Guide](DELETION_TOOLS.md)** - Comprehensive deletion and cleanup tools
381 | - **[Configuration Examples](config/example.env)** - Environment setup
382 | 
383 | ## 🤝 Contributing
384 | 
385 | 1. Fork the repository
386 | 2. Create feature branch (`git checkout -b feature/amazing-feature`)
387 | 3. Commit changes (`git commit -m 'Add amazing feature'`)
388 | 4. Push to branch (`git push origin feature/amazing-feature`)
389 | 5. Open Pull Request
390 | 
391 | ## 📄 License
392 | 
393 | This project is licensed under the MIT License - see the LICENSE file for details.
394 | 
395 | ## 🙏 Acknowledgments
396 | 
397 | - **Wiki.js Team**: For the excellent documentation platform
398 | - **MCP Protocol**: For the standardized AI integration framework
399 | - **FastMCP**: For the Python MCP SDK
400 | 
401 | ---
402 | 
403 | **Ready to scale your documentation?** 🚀 Start with `wikijs_create_repo_structure` and build enterprise-grade documentation hierarchies! Use the Cursor global rules to ensure documentation-first development! 📚✨ 
```
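
A quick way to act on the "Connection Issues" steps above is a standalone probe of the GraphQL endpoint. Below is a minimal sketch mirroring the bearer-token header setup the packed server uses; the script itself and the `{ __typename }` probe query are illustrative assumptions (only `WIKIJS_API_URL`, `WIKIJS_TOKEN`/`WIKIJS_API_KEY`, and the `/graphql` path come from the repo):

```python
#!/usr/bin/env python3
"""Minimal Wiki.js GraphQL connectivity probe (illustrative sketch)."""
import asyncio
import os

import httpx  # same async HTTP client the server itself uses


async def check_wikijs() -> None:
    base_url = os.environ.get("WIKIJS_API_URL", "http://localhost:3000").rstrip("/")
    token = os.environ.get("WIKIJS_TOKEN") or os.environ.get("WIKIJS_API_KEY")
    headers = {"Content-Type": "application/json"}
    if token:
        headers["Authorization"] = f"Bearer {token}"
    async with httpx.AsyncClient(timeout=30.0) as client:
        # `__typename` is a standard GraphQL meta-field, so this round-trips
        # against any schema without assuming Wiki.js specifics.
        resp = await client.post(f"{base_url}/graphql",
                                 json={"query": "{ __typename }"},
                                 headers=headers)
        resp.raise_for_status()
        print("GraphQL endpoint reachable:", resp.json())


if __name__ == "__main__":
    asyncio.run(check_wikijs())
```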

--------------------------------------------------------------------------------
/config-mcp.json:
--------------------------------------------------------------------------------

```json
1 | {
2 |   "mcpServers": {
3 |     "wikijs": {
4 |       "command": "/absolute/path/to/wiki-js-mcp/start-server.sh"
5 |     }
6 |   }
7 | } 
```

--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------

```
 1 | # Wiki.js MCP Server Dependencies
 2 | # Python 3.12+ required
 3 | 
 4 | # Core MCP Framework
 5 | fastmcp>=0.1.0
 6 | 
 7 | # HTTP Client
 8 | httpx>=0.27.0
 9 | 
10 | # Data Validation and Settings
11 | pydantic>=2.0.0
12 | pydantic-settings>=2.0.0
13 | 
14 | # Text Processing
15 | python-slugify>=8.0.0
16 | markdown>=3.5.0
17 | beautifulsoup4>=4.12.0
18 | 
19 | # Environment variable loading
20 | python-dotenv>=1.0.0
21 | 
22 | # Database
23 | sqlalchemy>=2.0.0
24 | aiosqlite>=0.19.0
25 | 
26 | # Retry logic
27 | tenacity>=8.0.0
28 | 
29 | # Development Dependencies (optional)
30 | pytest>=7.4.0
31 | pytest-asyncio>=0.21.0
32 | black>=23.0.0
33 | isort>=5.12.0
34 | mypy>=1.5.0 
```

--------------------------------------------------------------------------------
/config/example.env:
--------------------------------------------------------------------------------

```
 1 | # Wiki.js MCP Server Configuration
 2 | # Copy this file to .env and update with your actual values
 3 | 
 4 | # Docker Database Configuration (for docker.yml)
 5 | POSTGRES_DB=wikijs
 6 | POSTGRES_USER=wikijs
 7 | POSTGRES_PASSWORD=your_secure_password_here
 8 | 
 9 | # Wiki.js Instance Configuration
10 | WIKIJS_API_URL=http://localhost:3000
11 | WIKIJS_TOKEN=eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.your_token_here
12 | 
13 | # Alternative: Username/Password Authentication
14 | # WIKIJS_USERNAME=your_email@example.com
15 | # WIKIJS_PASSWORD=your_password
16 | 
17 | # Local Database Configuration
18 | WIKIJS_MCP_DB=./wikijs_mappings.db
19 | 
20 | # Optional: Logging Configuration
21 | LOG_LEVEL=INFO
22 | LOG_FILE=wikijs_mcp.log
23 | 
24 | # Optional: Repository Context
25 | REPOSITORY_ROOT=./
26 | DEFAULT_SPACE_NAME=Documentation
27 | 
28 | # Example values (replace with your actual credentials):
29 | # POSTGRES_PASSWORD=MySecurePassword123!
30 | # WIKIJS_API_URL=http://localhost:3000
31 | # WIKIJS_TOKEN=your_actual_jwt_token_here 
```
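
For reference, the server reads these variables through pydantic-settings. The sketch below condenses the `Settings` class from `src/wiki_mcp_server.py` (shown in full later in this pack) down to the token handling, to show that `WIKIJS_TOKEN` takes precedence over the `WIKIJS_API_KEY` alias:

```python
"""Condensed sketch of the server's settings loading (see src/wiki_mcp_server.py)."""
from typing import Optional

from pydantic import Field
from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    WIKIJS_API_URL: str = Field(default="http://localhost:3000")
    WIKIJS_TOKEN: Optional[str] = Field(default=None)
    WIKIJS_API_KEY: Optional[str] = Field(default=None)  # accepted alias

    class Config:
        env_file = ".env"   # values above are read from this file
        extra = "ignore"    # unrelated .env entries are not an error

    @property
    def token(self) -> Optional[str]:
        # WIKIJS_TOKEN wins when both are set, matching the server code.
        return self.WIKIJS_TOKEN or self.WIKIJS_API_KEY


if __name__ == "__main__":
    print(Settings().WIKIJS_API_URL)
```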

--------------------------------------------------------------------------------
/docker.yml:
--------------------------------------------------------------------------------

```yaml
 1 | services:
 2 |   db:
 3 |     image: postgres:15-alpine
 4 |     container_name: wikijs_db
 5 |     restart: unless-stopped
 6 |     environment:
 7 |       POSTGRES_DB: ${POSTGRES_DB:-wikijs}
 8 |       POSTGRES_USER: ${POSTGRES_USER:-wikijs}
 9 |       POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
10 |     volumes:
11 |       - db_data:/var/lib/postgresql/data
12 |     healthcheck:
13 |       test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-wikijs}"]
14 |       interval: 30s
15 |       timeout: 10s
16 |       retries: 3
17 | 
18 |   wiki:
19 |     image: ghcr.io/requarks/wiki:2
20 |     container_name: wikijs_app
21 |     depends_on:
22 |       db:
23 |         condition: service_healthy
24 |     ports:
25 |       - "3000:3000"
26 |     restart: unless-stopped
27 |     environment:
28 |       DB_TYPE: postgres
29 |       DB_HOST: db
30 |       DB_PORT: 5432
31 |       DB_USER: ${POSTGRES_USER:-wikijs}
32 |       DB_PASS: ${POSTGRES_PASSWORD}
33 |       DB_NAME: ${POSTGRES_DB:-wikijs}
34 |       PORT: 3000
35 |     healthcheck:
36 |       test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:3000/healthz"]
37 |       interval: 30s
38 |       timeout: 10s
39 |       retries: 3
40 |       start_period: 40s
41 | 
42 | volumes:
43 |   db_data:
```
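
The `wiki` service's health check polls `/healthz` inside the container. A host-side script can wait on the same endpoint before running `./setup.sh` or `./test-server.sh`; the sketch below is a convenience under stated assumptions (only the port and the `/healthz` path come from the compose file, the timeout and retry interval are made up):

```python
"""Wait for the Dockerized Wiki.js to become healthy (illustrative sketch)."""
import time

import httpx


def wait_for_wikijs(url: str = "http://localhost:3000/healthz",
                    timeout_s: float = 120.0, interval_s: float = 2.0) -> bool:
    """Poll the health endpoint until it answers 200 or the deadline passes."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            if httpx.get(url, timeout=5.0).status_code == 200:
                return True
        except httpx.RequestError:
            pass  # container still starting; retry after the interval
        time.sleep(interval_s)
    return False


if __name__ == "__main__":
    print("Wiki.js healthy:", wait_for_wikijs())
```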

--------------------------------------------------------------------------------
/pyproject.toml:
--------------------------------------------------------------------------------

```toml
 1 | [tool.poetry]
 2 | name = "wiki-js-mcp"
 3 | version = "1.0.0"
 4 | description = "Model Context Protocol (MCP) server for Wiki.js integration with hierarchical documentation support"
 5 | authors = ["Sahil Pethe <[email protected]>"]
 6 | license = "MIT"
 7 | readme = "README.md"
 8 | packages = [{include = "wiki_mcp_server.py", from = "src"}]
 9 | 
10 | [tool.poetry.dependencies]
11 | python = "^3.12"
12 | fastmcp = "^0.1.0"
13 | httpx = "^0.27.0"
14 | pydantic = "^2.0"
15 | pydantic-settings = "^2.0"
16 | python-slugify = "^8.0"
17 | markdown = "^3.5"
18 | beautifulsoup4 = "^4.12"
19 | python-dotenv = "^1.0.0"
20 | sqlalchemy = "^2.0"
21 | tenacity = "^8.0"
22 | aiosqlite = "^0.19.0"
23 | 
24 | [tool.poetry.group.dev.dependencies]
25 | pytest = "^7.4.0"
26 | pytest-asyncio = "^0.21.0"
27 | black = "^23.0.0"
28 | isort = "^5.12.0"
29 | mypy = "^1.5.0"
30 | 
31 | [build-system]
32 | requires = ["poetry-core"]
33 | build-backend = "poetry.core.masonry.api"
34 | 
35 | [tool.poetry.scripts]
36 | wikijs-mcp = "wiki_mcp_server:main"
37 | 
38 | [tool.black]
39 | line-length = 88
40 | target-version = ['py312']
41 | 
42 | [tool.isort]
43 | profile = "black"
44 | line_length = 88
45 | 
46 | [tool.mypy]
47 | python_version = "3.12"
48 | warn_return_any = true
49 | warn_unused_configs = true
50 | disallow_untyped_defs = true 
```
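
The `[tool.poetry.scripts]` stanza expects a `main()` callable in `wiki_mcp_server`; the packed source below is truncated before any such definition. With FastMCP, a plausible entry point would look like this (a sketch, not the repo's actual code):

```python
"""Plausible entry point matching the [tool.poetry.scripts] stanza (a sketch)."""
from fastmcp import FastMCP

mcp = FastMCP("Wiki.js Integration")  # as created in src/wiki_mcp_server.py


def main() -> None:
    # FastMCP's run() starts the stdio transport that Cursor talks to.
    mcp.run()


if __name__ == "__main__":
    main()
```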

--------------------------------------------------------------------------------
/start-server.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | 
 3 | # Wiki.js MCP Server Start Script
 4 | # This script activates the virtual environment and starts the MCP server
 5 | 
 6 | set -e  # Exit on any error
 7 | 
 8 | # Get the directory where this script is located
 9 | SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
10 | cd "$SCRIPT_DIR"
11 | 
12 | # Check if virtual environment exists
13 | if [ ! -d "venv" ]; then
14 |     echo "❌ Error: Virtual environment not found. Please run ./setup.sh first." >&2
15 |     exit 1
16 | fi
17 | 
18 | # Check if .env file exists
19 | if [ ! -f ".env" ]; then
20 |     echo "❌ Error: .env file not found. Please copy config/example.env to .env and configure it." >&2
21 |     exit 1
22 | fi
23 | 
24 | # Activate virtual environment
25 | source venv/bin/activate
26 | 
27 | # Check if the main server file exists
28 | if [ ! -f "src/wiki_mcp_server.py" ]; then
29 |     echo "❌ Error: Server file src/wiki_mcp_server.py not found." >&2
30 |     exit 1
31 | fi
32 | 
33 | # Load environment variables for validation
34 | source .env
35 | 
36 | # Validate required environment variables
37 | if [ -z "$WIKIJS_API_URL" ]; then
38 |     echo "❌ Error: WIKIJS_API_URL not set in .env file" >&2
39 |     exit 1
40 | fi
41 | 
42 | if [ -z "$WIKIJS_TOKEN" ] && [ -z "$WIKIJS_USERNAME" ]; then
43 |     echo "❌ Error: Either WIKIJS_TOKEN or WIKIJS_USERNAME must be set in .env file" >&2
44 |     exit 1
45 | fi
46 | 
47 | # Create logs directory if it doesn't exist
48 | mkdir -p logs
49 | 
50 | # Start the MCP server (this will handle stdin/stdout communication with Cursor)
51 | exec python src/wiki_mcp_server.py 
```

--------------------------------------------------------------------------------
/test-server.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | 
 3 | # Wiki.js MCP Server Test Script
 4 | # This script is for interactive testing and debugging
 5 | 
 6 | set -e  # Exit on any error
 7 | 
 8 | echo "🚀 Testing Wiki.js MCP Server..."
 9 | 
10 | # Get the directory where this script is located
11 | SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
12 | cd "$SCRIPT_DIR"
13 | 
14 | # Check if virtual environment exists
15 | if [ ! -d "venv" ]; then
16 |     echo "❌ Error: Virtual environment not found. Please run ./setup.sh first."
17 |     exit 1
18 | fi
19 | 
20 | # Check if .env file exists
21 | if [ ! -f ".env" ]; then
22 |     echo "❌ Error: .env file not found. Please copy config/example.env to .env and configure it."
23 |     exit 1
24 | fi
25 | 
26 | # Activate virtual environment
27 | echo "🔧 Activating virtual environment..."
28 | source venv/bin/activate
29 | 
30 | # Check if the main server file exists
31 | if [ ! -f "src/wiki_mcp_server.py" ]; then
32 |     echo "❌ Error: Server file src/wiki_mcp_server.py not found."
33 |     exit 1
34 | fi
35 | 
36 | # Load environment variables for validation
37 | echo "⚙️  Loading configuration..."
38 | source .env
39 | 
40 | # Validate required environment variables
41 | if [ -z "$WIKIJS_API_URL" ]; then
42 |     echo "❌ Error: WIKIJS_API_URL not set in .env file"
43 |     exit 1
44 | fi
45 | 
46 | if [ -z "$WIKIJS_TOKEN" ] && [ -z "$WIKIJS_USERNAME" ]; then
47 |     echo "❌ Error: Either WIKIJS_TOKEN or WIKIJS_USERNAME must be set in .env file"
48 |     exit 1
49 | fi
50 | 
51 | echo "✅ Configuration validated"
52 | 
53 | # Create logs directory if it doesn't exist
54 | mkdir -p logs
55 | 
56 | # Display server information
57 | echo ""
58 | echo "📊 Server Configuration:"
59 | echo "   API URL: $WIKIJS_API_URL"
60 | echo "   Database: ${WIKIJS_MCP_DB:-./wikijs_mappings.db}"
61 | echo "   Log Level: ${LOG_LEVEL:-INFO}"
62 | echo "   Log File: ${LOG_FILE:-wikijs_mcp.log}"
63 | echo ""
64 | 
65 | # Function to handle cleanup on exit
66 | cleanup() {
67 |     echo ""
68 |     echo "🛑 Shutting down Wiki.js MCP Server..."
69 |     exit 0
70 | }
71 | 
72 | # Set up signal handlers
73 | trap cleanup SIGINT SIGTERM
74 | 
75 | # Start the server
76 | echo "🌟 Starting MCP server for testing..."
77 | echo "   Press Ctrl+C to stop the server"
78 | echo ""
79 | 
80 | # Run the server with error handling
81 | if python src/wiki_mcp_server.py; then
82 |     echo "✅ Server exited cleanly"
83 | else
84 |     echo "❌ Server failed to start. Check the logs for details:"
85 |     echo "   Log file: ${LOG_FILE:-wikijs_mcp.log}"
86 |     echo ""
87 |     echo "💡 Troubleshooting tips:"
88 |     echo "   1. Verify Wiki.js is running and accessible"
89 |     echo "   2. Check your API token is valid"
90 |     echo "   3. Ensure all dependencies are installed (run ./setup.sh)"
91 |     echo "   4. Check the log file for detailed error messages"
92 |     exit 1
93 | fi 
```

--------------------------------------------------------------------------------
/setup.sh:
--------------------------------------------------------------------------------

```bash
  1 | #!/bin/bash
  2 | 
  3 | # Wiki.js MCP Server Setup Script
  4 | # This script sets up the Python virtual environment and installs dependencies
  5 | 
  6 | set -e  # Exit on any error
  7 | 
  8 | echo "🚀 Setting up Wiki.js MCP Server..."
  9 | 
 10 | # Get the directory where this script is located
 11 | SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 12 | cd "$SCRIPT_DIR"
 13 | 
 14 | # Check if Python 3.12+ is available
 15 | echo "📋 Checking Python version..."
 16 | if command -v python3 &> /dev/null; then
 17 |     PYTHON_CMD="python3"
 18 | elif command -v python &> /dev/null; then
 19 |     PYTHON_CMD="python"
 20 | else
 21 |     echo "❌ Error: Python is not installed or not in PATH"
 22 |     exit 1
 23 | fi
 24 | 
 25 | # Check Python version
 26 | PYTHON_VERSION=$($PYTHON_CMD --version 2>&1 | cut -d' ' -f2)
 27 | PYTHON_MAJOR=$(echo $PYTHON_VERSION | cut -d'.' -f1)
 28 | PYTHON_MINOR=$(echo $PYTHON_VERSION | cut -d'.' -f2)
 29 | 
 30 | if [ "$PYTHON_MAJOR" -lt 3 ] || ([ "$PYTHON_MAJOR" -eq 3 ] && [ "$PYTHON_MINOR" -lt 12 ]); then
 31 |     echo "❌ Error: Python 3.12+ is required. Found: $PYTHON_VERSION"
 32 |     exit 1
 33 | fi
 34 | 
 35 | echo "✅ Python $PYTHON_VERSION found"
 36 | 
 37 | # Create virtual environment if it doesn't exist
 38 | if [ ! -d "venv" ]; then
 39 |     echo "📦 Creating virtual environment..."
 40 |     $PYTHON_CMD -m venv venv
 41 |     echo "✅ Virtual environment created"
 42 | else
 43 |     echo "📦 Virtual environment already exists"
 44 | fi
 45 | 
 46 | # Activate virtual environment
 47 | echo "🔧 Activating virtual environment..."
 48 | source venv/bin/activate
 49 | 
 50 | # Upgrade pip
 51 | echo "⬆️  Upgrading pip..."
 52 | pip install --upgrade pip
 53 | 
 54 | # Install dependencies
 55 | echo "📚 Installing dependencies..."
 56 | if [ -f "pyproject.toml" ] && command -v poetry &> /dev/null; then
 57 |     echo "📖 Using Poetry for dependency management..."
 58 |     poetry install
 59 | elif [ -f "requirements.txt" ]; then
 60 |     echo "📖 Using pip for dependency management..."
 61 |     pip install -r requirements.txt
 62 | else
 63 |     echo "❌ Error: No pyproject.toml or requirements.txt found"
 64 |     exit 1
 65 | fi
 66 | 
 67 | # Create .env file from example if it doesn't exist
 68 | if [ ! -f ".env" ] && [ -f "config/example.env" ]; then
 69 |     echo "⚙️  Creating .env file from example..."
 70 |     cp config/example.env .env
 71 |     echo "✅ .env file created. Please edit it with your Wiki.js credentials."
 72 | else
 73 |     echo "⚙️  .env file already exists"
 74 | fi
 75 | 
 76 | # Create necessary directories
 77 | echo "📁 Creating necessary directories..."
 78 | mkdir -p logs
 79 | 
 80 | # Set executable permissions for scripts
 81 | echo "🔐 Setting executable permissions..."
 82 | chmod +x setup.sh
 83 | chmod +x start-server.sh test-server.sh
 84 | 
 85 | echo ""
 86 | echo "🎉 Setup completed successfully!"
 87 | echo ""
 88 | echo "📝 Next steps:"
 89 | echo "1. Edit .env file with your Wiki.js credentials:"
 90 | echo "   - Set WIKIJS_API_URL (e.g., http://localhost:3000)"
 91 | echo "   - Set WIKIJS_TOKEN (your JWT token from Wiki.js)"
 92 | echo ""
 93 | echo "2. Test the server:"
 94 | echo "   ./test-server.sh"
 95 | echo ""
 96 | echo "3. Configure Cursor MCP:"
 97 | echo "   - Copy config-mcp.json settings to your Cursor MCP configuration"
 98 | echo "   - Update the absolute paths in the configuration"
 99 | echo ""
100 | echo "📖 For more information, see README.md" 
```

--------------------------------------------------------------------------------
/HIERARCHICAL_FEATURES.md:
--------------------------------------------------------------------------------

```markdown
  1 | # Hierarchical Documentation Features
  2 | 
  3 | ## Overview
  4 | 
  5 | The Wiki.js MCP server now supports **hierarchical documentation structures** perfect for enterprise-scale repository documentation. This allows you to create organized, nested documentation that scales from individual files to entire company codebases.
  6 | 
  7 | ## New MCP Tools
  8 | 
  9 | ### 1. `wikijs_create_repo_structure`
 10 | Creates a complete repository documentation structure with nested pages.
 11 | 
 12 | **Usage:**
 13 | ```python
 14 | await wikijs_create_repo_structure(
 15 |     repo_name="Frontend App",
 16 |     description="A modern React frontend application", 
 17 |     sections=["Overview", "Components", "API", "Deployment", "Testing"]
 18 | )
 19 | ```
 20 | 
 21 | **Creates:**
 22 | ```
 23 | Frontend App/
 24 | ├── Overview/
 25 | ├── Components/
 26 | ├── API/
 27 | ├── Deployment/
 28 | └── Testing/
 29 | ```
 30 | 
 31 | ### 2. `wikijs_create_nested_page`
 32 | Creates pages with hierarchical paths, automatically creating parent pages if needed.
 33 | 
 34 | **Usage:**
 35 | ```python
 36 | await wikijs_create_nested_page(
 37 |     title="Button Component",
 38 |     content="# Button Component\n\nA reusable button...",
 39 |     parent_path="frontend-app/components",
 40 |     create_parent_if_missing=True
 41 | )
 42 | ```
 43 | 
 44 | **Creates:** `frontend-app/components/button-component`
 45 | 
 46 | ### 3. `wikijs_get_page_children`
 47 | Retrieves all direct child pages of a parent page for navigation.
 48 | 
 49 | **Usage:**
 50 | ```python
 51 | await wikijs_get_page_children(page_path="frontend-app/components")
 52 | ```
 53 | 
 54 | **Returns:**
 55 | ```json
 56 | {
 57 |   "parent": {"pageId": 18, "title": "Components", "path": "frontend-app/components"},
 58 |   "children": [
 59 |     {"pageId": 22, "title": "Button Component", "path": "frontend-app/components/button-component"},
 60 |     {"pageId": 23, "title": "Modal Component", "path": "frontend-app/components/modal-component"}
 61 |   ],
 62 |   "total_children": 2
 63 | }
 64 | ```
 65 | 
 66 | ### 4. `wikijs_create_documentation_hierarchy`
 67 | Creates a complete documentation hierarchy for a project based on file structure with auto-organization.
 68 | 
 69 | **Usage:**
 70 | ```python
 71 | await wikijs_create_documentation_hierarchy(
 72 |     project_name="My App",
 73 |     file_mappings=[
 74 |         {"file_path": "src/components/Button.tsx", "doc_path": "components/button"},
 75 |         {"file_path": "src/api/users.ts", "doc_path": "api/users"},
 76 |         {"file_path": "src/utils/helpers.ts", "doc_path": "utils/helpers"}
 77 |     ],
 78 |     auto_organize=True
 79 | )
 80 | ```
 81 | 
 82 | **Auto-organizes into sections:**
 83 | - **Components**: Files with "component" or "/components/" in path
 84 | - **API**: Files with "api", "/api/", or "endpoint" in path  
 85 | - **Utils**: Files with "util", "/utils/", or "/helpers/" in path
 86 | - **Services**: Files with "service" or "/services/" in path
 87 | - **Models**: Files with "model", "/models/", or "/types/" in path
 88 | - **Tests**: Files with "test", "/tests/", or ".test." in path
 89 | - **Config**: Files with "config", "/config/", or ".config." in path
 90 | 
 91 | ## Enhanced Existing Tools
 92 | 
 93 | ### `wikijs_create_page` (Enhanced)
 94 | Now supports `parent_id` parameter for creating hierarchical pages:
 95 | 
 96 | ```python
 97 | await wikijs_create_page(
 98 |     title="API Endpoints", 
 99 |     content="# API Documentation...",
100 |     parent_id="16"  # Creates as child of page 16
101 | )
102 | ```
103 | 
104 | ### `wikijs_search_pages` (Fixed)
105 | Fixed GraphQL query issues - now works properly:
106 | 
107 | ```python
108 | await wikijs_search_pages("Button")
109 | # Returns: {"results": [...], "total": 1}
110 | ```
111 | 
112 | ## Enterprise Use Cases
113 | 
114 | ### 1. Company-wide Repository Documentation
115 | ```
116 | Company Docs/
117 | ├── frontend-web-app/
118 | │   ├── Overview/
119 | │   ├── Components/
120 | │   │   ├── Button/
121 | │   │   ├── Modal/
122 | │   │   └── Form/
123 | │   ├── API/
124 | │   └── Deployment/
125 | ├── backend-api/
126 | │   ├── Overview/
127 | │   ├── Controllers/
128 | │   ├── Models/
129 | │   └── Database/
130 | ├── mobile-app/
131 | │   ├── Overview/
132 | │   ├── Screens/
133 | │   └── Components/
134 | └── shared-libraries/
135 |     ├── UI Components/
136 |     ├── Utilities/
137 |     └── Types/
138 | ```
139 | 
140 | ### 2. Automatic File-to-Documentation Mapping
141 | The system automatically:
142 | - Creates hierarchical page structures
143 | - Links source files to documentation pages
144 | - Organizes files by type (components, API, utils, etc.)
145 | - Maintains parent-child relationships
146 | - Enables easy navigation between related docs
147 | 
148 | ### 3. Scalable Documentation Architecture
149 | - **Root Level**: Repository names only
150 | - **Section Level**: Logical groupings (Components, API, etc.)
151 | - **Page Level**: Individual files/features
152 | - **Sub-page Level**: Detailed documentation sections
153 | 
154 | ## Benefits
155 | 
156 | ✅ **Scalable**: Handles hundreds of repositories and thousands of files  
157 | ✅ **Organized**: Auto-categorizes files into logical sections  
158 | ✅ **Navigable**: Parent-child relationships enable easy browsing  
159 | ✅ **Searchable**: Full-text search across all hierarchical content  
160 | ✅ **Maintainable**: File-to-page mappings keep docs in sync with code  
161 | ✅ **Enterprise-ready**: Perfect for large organizations with many repos  
162 | 
163 | ## Example: Complete Repository Setup
164 | 
165 | ```python
166 | # 1. Create repository structure
167 | repo_result = await wikijs_create_repo_structure(
168 |     "E-commerce Platform",
169 |     "Full-stack e-commerce application",
170 |     ["Overview", "Frontend", "Backend", "API", "Database", "Deployment"]
171 | )
172 | 
173 | # 2. Create component documentation
174 | button_result = await wikijs_create_nested_page(
175 |     "Button Component",
176 |     "# Button Component\n\nReusable button with variants...",
177 |     "e-commerce-platform/frontend"
178 | )
179 | 
180 | # 3. Get navigation structure
181 | children = await wikijs_get_page_children(page_path="e-commerce-platform/frontend")
182 | 
183 | # 4. Search across all docs
184 | search_results = await wikijs_search_pages("authentication")
185 | ```
186 | 
187 | This creates a professional, enterprise-grade documentation structure that scales with your organization's growth! 
```
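
The auto-organization described above amounts to substring checks on file paths. As a minimal sketch, the documented section mapping could be implemented like this (an illustrative reimplementation, not the server's actual code; the `Overview` fallback for unmatched files is an assumption):

```python
"""Illustrative sketch of the documented auto-organize rules."""
# Rules restated from the list above; checked in order, first match wins.
SECTION_RULES = [
    ("Components", ("component", "/components/")),
    ("API", ("api", "/api/", "endpoint")),
    ("Utils", ("util", "/utils/", "/helpers/")),
    ("Services", ("service", "/services/")),
    ("Models", ("model", "/models/", "/types/")),
    ("Tests", ("test", "/tests/", ".test.")),
    ("Config", ("config", "/config/", ".config.")),
]


def categorize(file_path: str) -> str:
    path = file_path.lower()
    for section, needles in SECTION_RULES:
        if any(needle in path for needle in needles):
            return section
    return "Overview"  # assumed fallback for files matching no rule


assert categorize("src/components/Button.tsx") == "Components"
assert categorize("src/api/users.ts") == "API"
assert categorize("src/utils/helpers.ts") == "Utils"
```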

--------------------------------------------------------------------------------
/DELETION_TOOLS.md:
--------------------------------------------------------------------------------

```markdown
  1 | # Deletion Tools Documentation
  2 | 
  3 | ## Overview
  4 | 
  5 | The Wiki.js MCP server includes **comprehensive deletion tools** for managing pages and hierarchies. These tools provide safe, flexible options for cleaning up documentation, reorganizing content, and maintaining your Wiki.js instance.
  6 | 
  7 | ## 🛡️ Safety Features
  8 | 
  9 | All deletion tools include **built-in safety mechanisms**:
 10 | - **Confirmation required**: `confirm_deletion=True` must be explicitly set
 11 | - **Preview mode**: See what will be deleted before confirming
 12 | - **Detailed reporting**: Know exactly what was deleted or failed
 13 | - **File mapping cleanup**: Automatically clean up orphaned database entries
 14 | 
 15 | ## 🗑️ Deletion Tools (4 Total)
 16 | 
 17 | ### 1. `wikijs_delete_page`
 18 | Delete a specific page from Wiki.js.
 19 | 
 20 | **Usage:**
 21 | ```python
 22 | # Delete by page ID
 23 | await wikijs_delete_page(page_id=123)
 24 | 
 25 | # Delete by page path
 26 | await wikijs_delete_page(page_path="frontend-app/components/button")
 27 | 
 28 | # Keep file mappings (don't clean up database)
 29 | await wikijs_delete_page(page_id=123, remove_file_mapping=False)
 30 | ```
 31 | 
 32 | **Returns:**
 33 | ```json
 34 | {
 35 |   "deleted": true,
 36 |   "pageId": 123,
 37 |   "title": "Button Component",
 38 |   "path": "frontend-app/components/button",
 39 |   "status": "deleted",
 40 |   "file_mapping_removed": true
 41 | }
 42 | ```
 43 | 
 44 | ### 2. `wikijs_batch_delete_pages`
 45 | Delete multiple pages with flexible selection criteria.
 46 | 
 47 | **Selection Methods:**
 48 | 
 49 | **By Page IDs:**
 50 | ```python
 51 | await wikijs_batch_delete_pages(
 52 |     page_ids=[123, 124, 125],
 53 |     confirm_deletion=True
 54 | )
 55 | ```
 56 | 
 57 | **By Page Paths:**
 58 | ```python
 59 | await wikijs_batch_delete_pages(
 60 |     page_paths=["frontend-app/old-component", "backend-api/deprecated"],
 61 |     confirm_deletion=True
 62 | )
 63 | ```
 64 | 
 65 | **By Pattern Matching:**
 66 | ```python
 67 | # Delete all pages under frontend-app
 68 | await wikijs_batch_delete_pages(
 69 |     path_pattern="frontend-app/*",
 70 |     confirm_deletion=True
 71 | )
 72 | 
 73 | # Delete all test pages
 74 | await wikijs_batch_delete_pages(
 75 |     path_pattern="*test*",
 76 |     confirm_deletion=True
 77 | )
 78 | ```
 79 | 
 80 | **Safety Check (Preview Mode):**
 81 | ```python
 82 | # Preview what would be deleted (safe - won't actually delete)
 83 | result = await wikijs_batch_delete_pages(
 84 |     path_pattern="frontend-app/*",
 85 |     confirm_deletion=False  # Safety check
 86 | )
 87 | # Returns: {"error": "confirm_deletion must be True...", "safety_note": "..."}
 88 | ```
 89 | 
 90 | **Returns:**
 91 | ```json
 92 | {
 93 |   "total_found": 5,
 94 |   "deleted_count": 4,
 95 |   "failed_count": 1,
 96 |   "deleted_pages": [
 97 |     {"pageId": 123, "title": "Button", "path": "frontend-app/button"},
 98 |     {"pageId": 124, "title": "Modal", "path": "frontend-app/modal"}
 99 |   ],
100 |   "failed_deletions": [
101 |     {"pageId": 125, "title": "Protected", "path": "frontend-app/protected", "error": "Access denied"}
102 |   ],
103 |   "status": "completed"
104 | }
105 | ```
106 | 
107 | ### 3. `wikijs_delete_hierarchy`
108 | Delete entire page hierarchies (folder structures) with precise control.
109 | 
110 | **Deletion Modes:**
111 | 
112 | **Children Only** (default):
113 | ```python
114 | # Delete all child pages but keep the root page
115 | await wikijs_delete_hierarchy(
116 |     root_path="frontend-app",
117 |     delete_mode="children_only",
118 |     confirm_deletion=True
119 | )
120 | ```
121 | 
122 | **Include Root**:
123 | ```python
124 | # Delete the entire hierarchy including the root page
125 | await wikijs_delete_hierarchy(
126 |     root_path="frontend-app",
127 |     delete_mode="include_root",
128 |     confirm_deletion=True
129 | )
130 | ```
131 | 
132 | **Root Only**:
133 | ```python
134 | # Delete only the root page, leave children orphaned
135 | await wikijs_delete_hierarchy(
136 |     root_path="frontend-app",
137 |     delete_mode="root_only",
138 |     confirm_deletion=True
139 | )
140 | ```
141 | 
142 | **Preview Mode:**
143 | ```python
144 | # Preview hierarchy deletion (safe)
145 | result = await wikijs_delete_hierarchy(
146 |     root_path="frontend-app",
147 |     delete_mode="include_root",
148 |     confirm_deletion=False
149 | )
150 | # Returns preview with safety warnings
151 | ```
152 | 
153 | **Returns:**
154 | ```json
155 | {
156 |   "root_path": "frontend-app",
157 |   "delete_mode": "children_only",
158 |   "total_found": 8,
159 |   "deleted_count": 7,
160 |   "failed_count": 1,
161 |   "deleted_pages": [
162 |     {"pageId": 124, "title": "Button", "path": "frontend-app/components/button", "depth": 2},
163 |     {"pageId": 125, "title": "Modal", "path": "frontend-app/components/modal", "depth": 2}
164 |   ],
165 |   "failed_deletions": [],
166 |   "hierarchy_summary": {
167 |     "root_page_found": true,
168 |     "child_pages_found": 8,
169 |     "max_depth": 3
170 |   },
171 |   "status": "completed"
172 | }
173 | ```
174 | 
175 | ### 4. `wikijs_cleanup_orphaned_mappings`
176 | Clean up file-to-page mappings for pages that no longer exist.
177 | 
178 | **Usage:**
179 | ```python
180 | # Clean up orphaned mappings
181 | await wikijs_cleanup_orphaned_mappings()
182 | ```
183 | 
184 | **Returns:**
185 | ```json
186 | {
187 |   "total_mappings": 25,
188 |   "valid_mappings": 22,
189 |   "orphaned_mappings": 3,
190 |   "cleaned_count": 3,
191 |   "orphaned_details": [
192 |     {"file_path": "src/deleted-component.tsx", "page_id": 999, "last_updated": "2024-01-15T10:30:00Z"},
193 |     {"file_path": "src/old-util.ts", "page_id": 998, "error": "Page not found"}
194 |   ],
195 |   "status": "completed"
196 | }
197 | ```
198 | 
199 | ## 🎯 Common Use Cases
200 | 
201 | ### 1. Clean Up Test Documentation
202 | ```python
203 | # Remove all test pages
204 | await wikijs_batch_delete_pages(
205 |     path_pattern="*test*",
206 |     confirm_deletion=True
207 | )
208 | ```
209 | 
210 | ### 2. Remove Deprecated Project
211 | ```python
212 | # Delete entire project hierarchy
213 | await wikijs_delete_hierarchy(
214 |     root_path="old-mobile-app",
215 |     delete_mode="include_root",
216 |     confirm_deletion=True
217 | )
218 | ```
219 | 
220 | ### 3. Reorganize Documentation Structure
221 | ```python
222 | # Step 1: Preview what will be affected
223 | preview = await wikijs_delete_hierarchy(
224 |     root_path="frontend-app/old-structure",
225 |     delete_mode="children_only",
226 |     confirm_deletion=False
227 | )
228 | 
229 | # Step 2: Delete old structure
230 | await wikijs_delete_hierarchy(
231 |     root_path="frontend-app/old-structure",
232 |     delete_mode="children_only", 
233 |     confirm_deletion=True
234 | )
235 | 
236 | # Step 3: Clean up orphaned mappings
237 | await wikijs_cleanup_orphaned_mappings()
238 | ```
239 | 
240 | ### 4. Bulk Cleanup by Pattern
241 | ```python
242 | # Remove all draft pages
243 | await wikijs_batch_delete_pages(
244 |     path_pattern="*draft*",
245 |     confirm_deletion=True
246 | )
247 | 
248 | # Remove all temporary pages created with a known prefix
249 | await wikijs_batch_delete_pages(
250 |     path_pattern="temp-*",
251 |     confirm_deletion=True
252 | )
253 | ```
254 | 
255 | ### 5. Maintenance Operations
256 | ```python
257 | # Regular cleanup of orphaned mappings
258 | cleanup_result = await wikijs_cleanup_orphaned_mappings()
259 | print(f"Cleaned up {cleanup_result['cleaned_count']} orphaned mappings")
260 | 
261 | # Remove specific outdated pages
262 | await wikijs_batch_delete_pages(
263 |     page_paths=[
264 |         "old-api/v1/endpoints",
265 |         "deprecated/legacy-components",
266 |         "archive/old-docs"
267 |     ],
268 |     confirm_deletion=True
269 | )
270 | ```
271 | 
272 | ## 🔒 Safety Best Practices
273 | 
274 | ### 1. Always Preview First
275 | ```python
276 | # GOOD: Preview before deleting
277 | preview = await wikijs_delete_hierarchy("important-docs", confirm_deletion=False)
278 | print(f"Would delete {preview.get('total_found', 0)} pages")
279 | 
280 | # Then confirm if safe
281 | if input("Proceed? (y/N): ").lower() == 'y':
282 |     await wikijs_delete_hierarchy("important-docs", confirm_deletion=True)
283 | ```
284 | 
285 | ### 2. Use Specific Patterns
286 | ```python
287 | # GOOD: Specific pattern
288 | await wikijs_batch_delete_pages(path_pattern="test-project/temp/*", confirm_deletion=True)
289 | 
290 | # DANGEROUS: Too broad
291 | # await wikijs_batch_delete_pages(path_pattern="*", confirm_deletion=True)  # DON'T DO THIS
292 | ```
293 | 
294 | ### 3. Check Results
295 | ```python
296 | result = await wikijs_batch_delete_pages(
297 |     path_pattern="old-docs/*",
298 |     confirm_deletion=True
299 | )
300 | 
301 | print(f"Deleted: {result['deleted_count']}")
302 | print(f"Failed: {result['failed_count']}")
303 | 
304 | # Check for failures
305 | if result['failed_deletions']:
306 |     print("Failed deletions:")
307 |     for failure in result['failed_deletions']:
308 |         print(f"  - {failure['title']}: {failure['error']}")
309 | ```
310 | 
311 | ### 4. Regular Maintenance
312 | ```python
313 | # Weekly cleanup routine
314 | async def weekly_cleanup():
315 |     # Clean up orphaned mappings
316 |     cleanup = await wikijs_cleanup_orphaned_mappings()
317 |     print(f"Cleaned {cleanup['cleaned_count']} orphaned mappings")
318 |     
319 |     # Remove temp/test pages
320 |     temp_cleanup = await wikijs_batch_delete_pages(
321 |         path_pattern="temp-*",
322 |         confirm_deletion=True
323 |     )
324 |     print(f"Removed {temp_cleanup['deleted_count']} temp pages")
325 | ```
326 | 
327 | ## ⚠️ Important Notes
328 | 
329 | ### Deletion Order
330 | - **Hierarchy deletion** processes pages from deepest to shallowest to avoid dependency issues
331 | - **Child pages are deleted before parent pages** automatically
332 | - **Failed deletions** are reported with specific error messages
333 | 
334 | ### File Mappings
335 | - **Automatic cleanup**: File-to-page mappings are removed by default when pages are deleted
336 | - **Manual control**: Set `remove_file_mappings=False` to preserve mappings
337 | - **Orphaned cleanup**: Use `wikijs_cleanup_orphaned_mappings()` for maintenance
338 | 
339 | ### Pattern Matching
340 | - **Supports wildcards**: Use `*` for pattern matching (e.g., `"frontend-*"`, `"*test*"`)
341 | - **Case sensitive**: Patterns match exactly as written
342 | - **Path-based**: Patterns match against the full page path
343 | 
344 | ### Error Handling
345 | - **Graceful failures**: Individual page deletion failures don't stop batch operations
346 | - **Detailed reporting**: All failures are logged with specific error messages
347 | - **Partial success**: Operations can succeed partially with detailed results
348 | 
349 | ## 🧪 Testing
350 | 
351 | All deletion tools have been thoroughly tested:
352 | - ✅ Single page deletion
353 | - ✅ Batch deletion with safety checks
354 | - ✅ Pattern-based deletion
355 | - ✅ Hierarchy deletion modes
356 | - ✅ Orphaned mappings cleanup
357 | - ✅ File mapping integration
358 | - ✅ Error handling and reporting
359 | 
360 | The tools are **production-ready** and safe for enterprise use with proper confirmation procedures. 
```
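
The wildcard semantics above (case-sensitive `*` patterns matched against the full page path) behave like shell-style globbing, so Python's `fnmatch.fnmatchcase` is one plausible way to implement them. A small sketch under that assumption (the server's actual matching code is not shown in this pack):

```python
"""Shell-style page-path matching (sketch; fnmatch is an assumed choice)."""
from fnmatch import fnmatchcase  # case-sensitive, matching the doc above


def match_pages(paths: list[str], pattern: str) -> list[str]:
    # Patterns run against the full page path, e.g. "*test*" or "temp-*";
    # "*" also crosses "/" so "frontend-app/*" covers nested pages.
    return [p for p in paths if fnmatchcase(p, pattern)]


pages = ["frontend-app/components/button", "temp-notes", "qa/test-plan"]
print(match_pages(pages, "*test*"))          # ['qa/test-plan']
print(match_pages(pages, "temp-*"))          # ['temp-notes']
print(match_pages(pages, "frontend-app/*"))  # ['frontend-app/components/button']
```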

--------------------------------------------------------------------------------
/src/wiki_mcp_server.py:
--------------------------------------------------------------------------------

```python
   1 | #!/usr/bin/env python3
   2 | """Wiki.js MCP server using FastMCP - GraphQL version."""
   3 | 
   4 | import os
   5 | from dotenv import load_dotenv
   6 | load_dotenv()
   7 | 
   8 | import sys
   9 | import datetime
  10 | import json
  11 | import hashlib
  12 | import logging
  13 | import ast
  14 | import re
  15 | from pathlib import Path
  16 | from typing import Optional, List, Dict, Any, Union
  17 | from dataclasses import dataclass
  18 | 
  19 | import httpx
  20 | from fastmcp import FastMCP
  21 | from slugify import slugify
  22 | import markdown
  23 | from sqlalchemy import create_engine, Column, Integer, String, DateTime, Text
  24 | from sqlalchemy.orm import declarative_base, sessionmaker
  25 | from sqlalchemy.exc import SQLAlchemyError
  26 | from tenacity import retry, stop_after_attempt, wait_exponential
  27 | from pydantic import Field
  28 | from pydantic_settings import BaseSettings
  29 | 
  30 | # Create FastMCP server
  31 | mcp = FastMCP("Wiki.js Integration")
  32 | 
  33 | # Configuration
  34 | class Settings(BaseSettings):
  35 |     WIKIJS_API_URL: str = Field(default="http://localhost:3000")
  36 |     WIKIJS_TOKEN: Optional[str] = Field(default=None)
  37 |     WIKIJS_API_KEY: Optional[str] = Field(default=None)  # Alternative name for token
  38 |     WIKIJS_USERNAME: Optional[str] = Field(default=None)
  39 |     WIKIJS_PASSWORD: Optional[str] = Field(default=None)
  40 |     WIKIJS_MCP_DB: str = Field(default="./wikijs_mappings.db")
  41 |     LOG_LEVEL: str = Field(default="INFO")
  42 |     LOG_FILE: str = Field(default="wikijs_mcp.log")
  43 |     REPOSITORY_ROOT: str = Field(default="./")
  44 |     DEFAULT_SPACE_NAME: str = Field(default="Documentation")
  45 |     
  46 |     class Config:
  47 |         env_file = ".env"
  48 |         extra = "ignore"  # Allow extra fields without validation errors
  49 |     
  50 |     @property
  51 |     def token(self) -> Optional[str]:
  52 |         """Get the token from either WIKIJS_TOKEN or WIKIJS_API_KEY."""
  53 |         return self.WIKIJS_TOKEN or self.WIKIJS_API_KEY
  54 | 
  55 | settings = Settings()
  56 | 
  57 | # Setup logging
  58 | logging.basicConfig(
  59 |     level=getattr(logging, settings.LOG_LEVEL.upper()),
  60 |     format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
  61 |     handlers=[
  62 |         logging.FileHandler(settings.LOG_FILE),
  63 |         logging.StreamHandler()
  64 |     ]
  65 | )
  66 | logger = logging.getLogger(__name__)
  67 | 
  68 | # Database models
  69 | Base = declarative_base()
  70 | 
  71 | class FileMapping(Base):
  72 |     __tablename__ = 'file_mappings'
  73 |     
  74 |     id = Column(Integer, primary_key=True)
  75 |     file_path = Column(String, unique=True, nullable=False)
  76 |     page_id = Column(Integer, nullable=False)
  77 |     relationship_type = Column(String, nullable=False)
  78 |     last_updated = Column(DateTime, default=datetime.datetime.utcnow)
  79 |     file_hash = Column(String)
  80 |     repository_root = Column(String, default='')
  81 |     space_name = Column(String, default='')
  82 | 
  83 | class RepositoryContext(Base):
  84 |     __tablename__ = 'repository_contexts'
  85 |     
  86 |     id = Column(Integer, primary_key=True)
  87 |     root_path = Column(String, unique=True, nullable=False)
  88 |     space_name = Column(String, nullable=False)
  89 |     space_id = Column(Integer)
  90 |     last_updated = Column(DateTime, default=datetime.datetime.utcnow)
  91 | 
  92 | # Database setup
  93 | engine = create_engine(f"sqlite:///{settings.WIKIJS_MCP_DB}")
  94 | Base.metadata.create_all(engine)
  95 | SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
  96 | 
  97 | class WikiJSClient:
  98 |     """Wiki.js GraphQL API client for handling requests."""
  99 |     
 100 |     def __init__(self):
 101 |         self.base_url = settings.WIKIJS_API_URL.rstrip('/')
 102 |         self.client = httpx.AsyncClient(timeout=30.0)
 103 |         self.authenticated = False
 104 |         
 105 |     async def authenticate(self) -> bool:
 106 |         """Set up authentication headers for GraphQL requests."""
 107 |         if settings.token:
 108 |             self.client.headers.update({
 109 |                 "Authorization": f"Bearer {settings.token}",
 110 |                 "Content-Type": "application/json"
 111 |             })
 112 |             self.authenticated = True
 113 |             return True
 114 |         elif settings.WIKIJS_USERNAME and settings.WIKIJS_PASSWORD:
 115 |             # For username/password, we need to login via GraphQL mutation
 116 |             try:
 117 |                 login_mutation = """
 118 |                 mutation($username: String!, $password: String!) {
 119 |                     authentication {
 120 |                         login(username: $username, password: $password) {
 121 |                             succeeded
 122 |                             jwt
 123 |                             message
 124 |                         }
 125 |                     }
 126 |                 }
 127 |                 """
 128 |                 
 129 |                 response = await self.graphql_request(login_mutation, {
 130 |                     "username": settings.WIKIJS_USERNAME,
 131 |                     "password": settings.WIKIJS_PASSWORD
 132 |                 })
 133 |                 
 134 |                 if response.get("data", {}).get("authentication", {}).get("login", {}).get("succeeded"):
 135 |                     jwt_token = response["data"]["authentication"]["login"]["jwt"]
 136 |                     self.client.headers.update({
 137 |                         "Authorization": f"Bearer {jwt_token}",
 138 |                         "Content-Type": "application/json"
 139 |                     })
 140 |                     self.authenticated = True
 141 |                     return True
 142 |                 else:
 143 |                     logger.error(f"Login failed: {response}")
 144 |                     return False
 145 |             except Exception as e:
 146 |                 logger.error(f"Authentication failed: {e}")
 147 |                 return False
 148 |         return False
 149 |     
 150 |     @retry(stop=stop_after_attempt(3), wait=wait_exponential(multiplier=1, min=4, max=10))
 151 |     async def graphql_request(self, query: str, variables: Dict = None) -> Dict:
 152 |         """Make GraphQL request to Wiki.js."""
 153 |         url = f"{self.base_url}/graphql"
 154 |         
 155 |         payload = {"query": query}
 156 |         if variables:
 157 |             payload["variables"] = variables
 158 |         
 159 |         try:
 160 |             response = await self.client.post(url, json=payload)
 161 |             response.raise_for_status()
 162 |             
 163 |             data = response.json()
 164 |             
 165 |             # Check for GraphQL errors
 166 |             if "errors" in data:
 167 |                 error_msg = "; ".join([err.get("message", str(err)) for err in data["errors"]])
 168 |                 raise Exception(f"GraphQL error: {error_msg}")
 169 |             
 170 |             return data
 171 |         except httpx.HTTPStatusError as e:
 172 |             logger.error(f"Wiki.js GraphQL HTTP error {e.response.status_code}: {e.response.text}")
 173 |             raise Exception(f"Wiki.js GraphQL HTTP error {e.response.status_code}: {e.response.text}")
 174 |         except httpx.RequestError as e:
 175 |             logger.error(f"Wiki.js connection error: {str(e)}")
 176 |             raise Exception(f"Wiki.js connection error: {str(e)}")
 177 | 
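     | # Usage sketch (assumes a reachable Wiki.js instance and valid credentials in
     | # settings; the page ID is hypothetical). graphql_request returns the full
     | # GraphQL envelope, so callers unwrap the "data" key themselves:
     | #
     | #     client = WikiJSClient()
     | #     await client.authenticate()
     | #     data = await client.graphql_request(
     | #         "query($id: Int!) { pages { single(id: $id) { id title } } }",
     | #         {"id": 1},
     | #     )
     | #     title = data["data"]["pages"]["single"]["title"]
     | 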
 178 | # Initialize client
 179 | wikijs = WikiJSClient()
 180 | 
 181 | def get_db():
 182 |     """Get a database session; the caller is responsible for closing it."""
 183 |     # A try/finally here would close the session before the caller ever
 184 |     # uses it, so cleanup is left to the call site.
 185 |     return SessionLocal()
 188 | 
 189 | def get_file_hash(file_path: str) -> str:
 190 |     """Calculate SHA256 hash of file content."""
 191 |     try:
 192 |         with open(file_path, 'rb') as f:
 193 |             return hashlib.sha256(f.read()).hexdigest()
 194 |     except FileNotFoundError:
 195 |         return ""
 196 | 
 197 | def markdown_to_html(content: str) -> str:
 198 |     """Convert markdown content to HTML."""
 199 |     md = markdown.Markdown(extensions=['codehilite', 'fenced_code', 'tables'])
 200 |     return md.convert(content)
 201 | 
 202 | def find_repository_root(start_path: Optional[str] = None) -> Optional[str]:
 203 |     """Find the repository root by looking for .git directory or .wikijs_mcp file."""
 204 |     if start_path is None:
 205 |         start_path = os.getcwd()
 206 |     
 207 |     current_path = Path(start_path).resolve()
 208 |     
 209 |     # Walk up the directory tree
 210 |     for path in [current_path] + list(current_path.parents):
 211 |         # Check for .git directory (Git repository)
 212 |         if (path / '.git').exists():
 213 |             return str(path)
 214 |         # Check for .wikijs_mcp file (explicit Wiki.js repository marker)
 215 |         if (path / '.wikijs_mcp').exists():
 216 |             return str(path)
 217 |     
 218 |     # No repository marker found; fall back to the resolved starting directory
 219 |     return str(current_path)
 220 | 
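     | # Example (hypothetical paths): the nearest ancestor containing .git or a
     | # .wikijs_mcp marker wins; with no marker, the resolved starting directory
     | # itself is returned:
     | #
     | #     find_repository_root("/home/dev/myrepo/src/utils")
     | #     # -> "/home/dev/myrepo"  (when /home/dev/myrepo/.git exists)
     | 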
 221 | def extract_code_structure(file_path: str) -> Dict[str, Any]:
 222 |     """Extract classes and functions from Python files using AST."""
 223 |     try:
 224 |         with open(file_path, 'r', encoding='utf-8') as f:
 225 |             content = f.read()
 226 |         
 227 |         tree = ast.parse(content)
 228 |         structure = {
 229 |             'classes': [],
 230 |             'functions': [],
 231 |             'imports': []
 232 |         }
 233 |         
 234 |         for node in ast.walk(tree):
 235 |             if isinstance(node, ast.ClassDef):
 236 |                 structure['classes'].append({
 237 |                     'name': node.name,
 238 |                     'line': node.lineno,
 239 |                     'docstring': ast.get_docstring(node)
 240 |                 })
 241 |             elif isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):  # include async defs
 242 |                 structure['functions'].append({
 243 |                     'name': node.name,
 244 |                     'line': node.lineno,
 245 |                     'docstring': ast.get_docstring(node)
 246 |                 })
 247 |             elif isinstance(node, (ast.Import, ast.ImportFrom)):
 248 |                 if isinstance(node, ast.Import):
 249 |                     for alias in node.names:
 250 |                         structure['imports'].append(alias.name)
 251 |                 else:
 252 |                     module = node.module or ''
 253 |                     for alias in node.names:
 254 |                         structure['imports'].append(f"{module}.{alias.name}")
 255 |         
 256 |         return structure
 257 |     except Exception as e:
 258 |         logger.error(f"Error parsing {file_path}: {e}")
 259 |         return {'classes': [], 'functions': [], 'imports': []}
 260 | 
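     | # For a module containing `class Greeter:` with a method `hello`, the result
     | # looks roughly like:
     | #
     | #     {"classes":   [{"name": "Greeter", "line": 1, "docstring": None}],
     | #      "functions": [{"name": "hello", "line": 2, "docstring": None}],
     | #      "imports":   []}
     | #
     | # Note that ast.walk() descends into class bodies, so methods are listed
     | # under 'functions' alongside module-level functions.
     | 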
 261 | # MCP Tools Implementation
 262 | 
 263 | @mcp.tool()
 264 | async def wikijs_create_page(title: str, content: str, space_id: str = "", parent_id: str = "") -> str:
 265 |     """
 266 |     Create a new page in Wiki.js with support for hierarchical organization.
 267 |     
 268 |     Args:
 269 |         title: Page title
 270 |         content: Page content (markdown or HTML)
 271 |         space_id: Space ID (optional; currently unused, placement is driven by parent_id and path)
 272 |         parent_id: Parent page ID for hierarchical organization (optional)
 273 |     
 274 |     Returns:
 275 |         JSON string with page details: {'pageId': int, 'url': str}
 276 |     """
 277 |     try:
 278 |         await wikijs.authenticate()
 279 |         
 280 |         # Generate path - if parent_id provided, create nested path
 281 |         if parent_id:
 282 |             # Get parent page to build nested path
 283 |             parent_query = """
 284 |             query($id: Int!) {
 285 |                 pages {
 286 |                     single(id: $id) {
 287 |                         path
 288 |                         title
 289 |                     }
 290 |                 }
 291 |             }
 292 |             """
 293 |             parent_response = await wikijs.graphql_request(parent_query, {"id": int(parent_id)})
 294 |             parent_data = parent_response.get("data", {}).get("pages", {}).get("single")
 295 |             
 296 |             if parent_data:
 297 |                 parent_path = parent_data["path"]
 298 |                 # Create nested path: parent-path/child-title
 299 |                 path = f"{parent_path}/{slugify(title)}"
 300 |             else:
 301 |                 path = slugify(title)
 302 |         else:
 303 |             path = slugify(title)
 304 |         
 305 |         # GraphQL mutation to create a page
 306 |         mutation = """
 307 |         mutation($content: String!, $description: String!, $editor: String!, $isPublished: Boolean!, $isPrivate: Boolean!, $locale: String!, $path: String!, $publishEndDate: Date, $publishStartDate: Date, $scriptCss: String, $scriptJs: String, $tags: [String]!, $title: String!) {
 308 |             pages {
 309 |                 create(content: $content, description: $description, editor: $editor, isPublished: $isPublished, isPrivate: $isPrivate, locale: $locale, path: $path, publishEndDate: $publishEndDate, publishStartDate: $publishStartDate, scriptCss: $scriptCss, scriptJs: $scriptJs, tags: $tags, title: $title) {
 310 |                     responseResult {
 311 |                         succeeded
 312 |                         errorCode
 313 |                         slug
 314 |                         message
 315 |                     }
 316 |                     page {
 317 |                         id
 318 |                         path
 319 |                         title
 320 |                     }
 321 |                 }
 322 |             }
 323 |         }
 324 |         """
 325 |         
 326 |         variables = {
 327 |             "content": content,
 328 |             "description": "",
 329 |             "editor": "markdown",
 330 |             "isPublished": True,
 331 |             "isPrivate": False,
 332 |             "locale": "en",
 333 |             "path": path,
 334 |             "publishEndDate": None,
 335 |             "publishStartDate": None,
 336 |             "scriptCss": "",
 337 |             "scriptJs": "",
 338 |             "tags": [],
 339 |             "title": title
 340 |         }
 341 |         
 342 |         response = await wikijs.graphql_request(mutation, variables)
 343 |         
 344 |         create_result = response.get("data", {}).get("pages", {}).get("create", {})
 345 |         response_result = create_result.get("responseResult", {})
 346 |         
 347 |         if response_result.get("succeeded"):
 348 |             page_data = create_result.get("page", {})
 349 |             result = {
 350 |                 "pageId": page_data.get("id"),
 351 |                 "url": page_data.get("path"),
 352 |                 "title": page_data.get("title"),
 353 |                 "status": "created",
 354 |                 "parentId": int(parent_id) if parent_id else None,
 355 |                 "hierarchicalPath": path
 356 |             }
 357 |             logger.info(f"Created page: {title} (ID: {result['pageId']}) at path: {path}")
 358 |             return json.dumps(result)
 359 |         else:
 360 |             error_msg = response_result.get("message", "Unknown error")
 361 |             return json.dumps({"error": f"Failed to create page: {error_msg}"})
 362 |         
 363 |     except Exception as e:
 364 |         error_msg = f"Failed to create page '{title}': {str(e)}"
 365 |         logger.error(error_msg)
 366 |         return json.dumps({"error": error_msg})
 367 | 
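     | # Usage sketch (IDs hypothetical): creating under a parent nests the slugified
     | # title below the parent's path, e.g. a parent at "my-repo" plus the title
     | # "API Guide" yields the path "my-repo/api-guide":
     | #
     | #     result = await wikijs_create_page("API Guide", "# API Guide", parent_id="12")
     | #     # -> '{"pageId": ..., "url": "my-repo/api-guide", "status": "created", ...}'
     | 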
 368 | @mcp.tool()
 369 | async def wikijs_update_page(page_id: int, title: Optional[str] = None, content: Optional[str] = None) -> str:
 370 |     """
 371 |     Update an existing page in Wiki.js.
 372 |     
 373 |     Args:
 374 |         page_id: Page ID to update
 375 |         title: New title (optional)
 376 |         content: New content (optional)
 377 |     
 378 |     Returns:
 379 |         JSON string with update status
 380 |     """
 381 |     try:
 382 |         await wikijs.authenticate()
 383 |         
 384 |         # First get the current page data
 385 |         get_query = """
 386 |         query($id: Int!) {
 387 |             pages {
 388 |                 single(id: $id) {
 389 |                     id
 390 |                     path
 391 |                     title
 392 |                     content
 393 |                     description
 394 |                     isPrivate
 395 |                     isPublished
 396 |                     locale
 397 |                     tags {
 398 |                         tag
 399 |                     }
 400 |                 }
 401 |             }
 402 |         }
 403 |         """
 404 |         
 405 |         get_response = await wikijs.graphql_request(get_query, {"id": page_id})
 406 |         current_page = get_response.get("data", {}).get("pages", {}).get("single")
 407 |         
 408 |         if not current_page:
 409 |             return json.dumps({"error": f"Page with ID {page_id} not found"})
 410 |         
 411 |         # GraphQL mutation to update a page
 412 |         mutation = """
 413 |         mutation($id: Int!, $content: String!, $description: String!, $editor: String!, $isPrivate: Boolean!, $isPublished: Boolean!, $locale: String!, $path: String!, $scriptCss: String, $scriptJs: String, $tags: [String]!, $title: String!) {
 414 |             pages {
 415 |                 update(id: $id, content: $content, description: $description, editor: $editor, isPrivate: $isPrivate, isPublished: $isPublished, locale: $locale, path: $path, scriptCss: $scriptCss, scriptJs: $scriptJs, tags: $tags, title: $title) {
 416 |                     responseResult {
 417 |                         succeeded
 418 |                         errorCode
 419 |                         slug
 420 |                         message
 421 |                     }
 422 |                     page {
 423 |                         id
 424 |                         path
 425 |                         title
 426 |                         updatedAt
 427 |                     }
 428 |                 }
 429 |             }
 430 |         }
 431 |         """
 432 |         
 433 |         # Use provided values or keep current ones
 434 |         new_title = title if title is not None else current_page["title"]
 435 |         new_content = content if content is not None else current_page["content"]
 436 |         
 437 |         variables = {
 438 |             "id": page_id,
 439 |             "content": new_content,
 440 |             "description": current_page.get("description", ""),
 441 |             "editor": "markdown",
 442 |             "isPrivate": current_page.get("isPrivate", False),
 443 |             "isPublished": current_page.get("isPublished", True),
 444 |             "locale": current_page.get("locale", "en"),
 445 |             "path": current_page["path"],
 446 |             "scriptCss": "",
 447 |             "scriptJs": "",
 448 |             "tags": [tag["tag"] for tag in current_page.get("tags", [])],
 449 |             "title": new_title
 450 |         }
 451 |         
 452 |         response = await wikijs.graphql_request(mutation, variables)
 453 |         
 454 |         update_result = response.get("data", {}).get("pages", {}).get("update", {})
 455 |         response_result = update_result.get("responseResult", {})
 456 |         
 457 |         if response_result.get("succeeded"):
 458 |             page_data = update_result.get("page", {})
 459 |             result = {
 460 |                 "pageId": page_id,
 461 |                 "status": "updated",
 462 |                 "title": page_data.get("title"),
 463 |                 "lastModified": page_data.get("updatedAt")
 464 |             }
 465 |             logger.info(f"Updated page ID: {page_id}")
 466 |             return json.dumps(result)
 467 |         else:
 468 |             error_msg = response_result.get("message", "Unknown error")
 469 |             return json.dumps({"error": f"Failed to update page: {error_msg}"})
 470 |         
 471 |     except Exception as e:
 472 |         error_msg = f"Failed to update page {page_id}: {str(e)}"
 473 |         logger.error(error_msg)
 474 |         return json.dumps({"error": error_msg})
 475 | 
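     | # Usage sketch (page ID hypothetical): fields left as None are carried over
     | # from the current page, so a content-only update keeps the existing title,
     | # tags, path, and publish state:
     | #
     | #     await wikijs_update_page(34, content="# Revised content")
     | 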
 476 | @mcp.tool()
 477 | async def wikijs_get_page(page_id: Optional[int] = None, slug: Optional[str] = None) -> str:
 478 |     """
 479 |     Retrieve page metadata and content from Wiki.js.
 480 |     
 481 |     Args:
 482 |         page_id: Page ID (optional)
 483 |         slug: Page slug/path (optional)
 484 |     
 485 |     Returns:
 486 |         JSON string with page data
 487 |     """
 488 |     try:
 489 |         await wikijs.authenticate()
 490 |         
 491 |         if page_id:
 492 |             query = """
 493 |             query($id: Int!) {
 494 |                 pages {
 495 |                     single(id: $id) {
 496 |                         id
 497 |                         path
 498 |                         title
 499 |                         content
 500 |                         description
 501 |                         isPrivate
 502 |                         isPublished
 503 |                         locale
 504 |                         createdAt
 505 |                         updatedAt
 506 |                         tags {
 507 |                             tag
 508 |                         }
 509 |                     }
 510 |                 }
 511 |             }
 512 |             """
 513 |             variables = {"id": page_id}
 514 |         elif slug:
 515 |             query = """
 516 |             query($path: String!) {
 517 |                 pages {
 518 |                     singleByPath(path: $path, locale: "en") {
 519 |                         id
 520 |                         path
 521 |                         title
 522 |                         content
 523 |                         description
 524 |                         isPrivate
 525 |                         isPublished
 526 |                         locale
 527 |                         createdAt
 528 |                         updatedAt
 529 |                         tags {
 530 |                             tag
 531 |                         }
 532 |                     }
 533 |                 }
 534 |             }
 535 |             """
 536 |             variables = {"path": slug}
 537 |         else:
 538 |             return json.dumps({"error": "Either page_id or slug must be provided"})
 539 |         
 540 |         response = await wikijs.graphql_request(query, variables)
 541 |         
 542 |         page_data = None
 543 |         if page_id:
 544 |             page_data = response.get("data", {}).get("pages", {}).get("single")
 545 |         else:
 546 |             page_data = response.get("data", {}).get("pages", {}).get("singleByPath")
 547 |         
 548 |         if not page_data:
 549 |             return json.dumps({"error": "Page not found"})
 550 |         
 551 |         result = {
 552 |             "pageId": page_data.get("id"),
 553 |             "title": page_data.get("title"),
 554 |             "content": page_data.get("content"),
 555 |             "contentType": "markdown",
 556 |             "lastModified": page_data.get("updatedAt"),
 557 |             "path": page_data.get("path"),
 558 |             "isPublished": page_data.get("isPublished"),
 559 |             "description": page_data.get("description"),
 560 |             "tags": [tag["tag"] for tag in page_data.get("tags", [])]
 561 |         }
 562 |         
 563 |         return json.dumps(result)
 564 |         
 565 |     except Exception as e:
 566 |         error_msg = f"Failed to get page: {str(e)}"
 567 |         logger.error(error_msg)
 568 |         return json.dumps({"error": error_msg})
 569 | 
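     | # Usage sketch (values hypothetical): pages can be fetched by numeric ID or by
     | # path slug; page_id takes precedence when both are given:
     | #
     | #     await wikijs_get_page(page_id=34)
     | #     await wikijs_get_page(slug="my-repo/api-guide")
     | 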
 570 | @mcp.tool()
 571 | async def wikijs_search_pages(query: str, space_id: Optional[str] = None) -> str:
 572 |     """
 573 |     Search pages by text in Wiki.js.
 574 |     
 575 |     Args:
 576 |         query: Search query
 577 |         space_id: Space ID to limit search (optional; not applied here, the search is wiki-wide)
 578 |     
 579 |     Returns:
 580 |         JSON string with search results
 581 |     """
 582 |     try:
 583 |         await wikijs.authenticate()
 584 |         
 585 |         # GraphQL full-text page search (the search type exposes results and totalHits only)
 586 |         search_query = """
 587 |         query($query: String!) {
 588 |             pages {
 589 |                 search(query: $query, path: "", locale: "en") {
 590 |                     results {
 591 |                         id
 592 |                         title
 593 |                         description
 594 |                         path
 595 |                         locale
 596 |                     }
 597 |                     totalHits
 598 |                 }
 599 |             }
 600 |         }
 601 |         """
 602 |         
 603 |         variables = {"query": query}
 604 |         
 605 |         response = await wikijs.graphql_request(search_query, variables)
 606 |         
 607 |         search_data = response.get("data", {}).get("pages", {}).get("search", {})
 608 |         
 609 |         results = []
 610 |         for item in search_data.get("results", []):
 611 |             results.append({
 612 |                 "pageId": item.get("id"),
 613 |                 "title": item.get("title"),
 614 |                 "snippet": item.get("description", ""),
 615 |                 "score": 1.0,  # Wiki.js doesn't provide scores
 616 |                 "path": item.get("path")
 617 |             })
 618 |         
 619 |         return json.dumps({
 620 |             "results": results, 
 621 |             "total": search_data.get("totalHits", len(results))
 622 |         })
 623 |         
 624 |     except Exception as e:
 625 |         error_msg = f"Search failed: {str(e)}"
 626 |         logger.error(error_msg)
 627 |         return json.dumps({"error": error_msg})
 628 | 
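     | # Usage sketch: the result is a JSON string; since Wiki.js reports no
     | # relevance scores, every hit carries a constant score of 1.0:
     | #
     | #     hits = json.loads(await wikijs_search_pages("deployment"))
     | #     # -> {"results": [{"pageId": ..., "title": ..., "path": ..., ...}], "total": ...}
     | 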
 629 | @mcp.tool()
 630 | async def wikijs_list_spaces() -> str:
 631 |     """
 632 |     List all spaces (top-level Wiki.js containers).
 633 |     Note: Wiki.js has no native "spaces" (unlike BookStack); top-level path segments are grouped into pseudo-spaces instead.
 634 |     
 635 |     Returns:
 636 |         JSON string with spaces list
 637 |     """
 638 |     try:
 639 |         await wikijs.authenticate()
 640 |         
 641 |         # Get all pages and group by top-level paths
 642 |         query = """
 643 |         query {
 644 |             pages {
 645 |                 list {
 646 |                     id
 647 |                     title
 648 |                     path
 649 |                     description
 650 |                     isPublished
 651 |                     locale
 652 |                 }
 653 |             }
 654 |         }
 655 |         """
 656 |         
 657 |         response = await wikijs.graphql_request(query)
 658 |         
 659 |         pages = response.get("data", {}).get("pages", {}).get("list", [])
 660 |         
 661 |         # Group pages by top-level path (simulate spaces)
 662 |         spaces = {}
 663 |         for page in pages:
 664 |             # split("/") always returns at least one element, so no length check is needed
 665 |             top_level = page["path"].split("/")[0] or "root"
 666 |             if top_level not in spaces:
 667 |                 spaces[top_level] = {
 668 |                     # Stable pseudo ID (the built-in hash() is salted per process)
 669 |                     "spaceId": int(hashlib.sha256(top_level.encode()).hexdigest(), 16) % 10000,
 670 |                     "name": top_level.replace("-", " ").title(),
 671 |                     "slug": top_level,
 672 |                     "description": f"Pages under /{top_level}",
 673 |                     "pageCount": 0
 674 |                 }
 675 |             spaces[top_level]["pageCount"] += 1
 676 |         
 677 |         return json.dumps({"spaces": list(spaces.values())})
 678 |         
 679 |     except Exception as e:
 680 |         error_msg = f"Failed to list spaces: {str(e)}"
 681 |         logger.error(error_msg)
 682 |         return json.dumps({"error": error_msg})
 683 | 
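     | # Grouping sketch (paths hypothetical): pages at "my-repo/api" and
     | # "my-repo/deployment" both count toward one pseudo-space named "My Repo"
     | # with slug "my-repo".
     | 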
 684 | @mcp.tool()
 685 | async def wikijs_create_space(name: str, description: Optional[str] = None) -> str:
 686 |     """
 687 |     Create a new space in Wiki.js.
 688 |     Note: Wiki.js doesn't have spaces, so this creates a root-level page as a space placeholder.
 689 |     
 690 |     Args:
 691 |         name: Space name
 692 |         description: Space description (optional)
 693 |     
 694 |     Returns:
 695 |         JSON string with space details
 696 |     """
 697 |     try:
 698 |         # Create a root page that acts as a space
 699 |         space_content = f"# {name}\n\n{description or 'This is the main page for the ' + name + ' section.'}\n\n## Pages in this section:\n\n*Pages will be listed here as they are created.*"
 700 |         
 701 |         result = await wikijs_create_page(name, space_content)
 702 |         result_data = json.loads(result)
 703 |         
 704 |         if "error" not in result_data:
 705 |             # Convert page result to space format
 706 |             space_result = {
 707 |                 "spaceId": result_data.get("pageId"),
 708 |                 "name": name,
 709 |                 "slug": slugify(name),
 710 |                 "status": "created",
 711 |                 "description": description
 712 |             }
 713 |             logger.info(f"Created space (root page): {name} (ID: {space_result['spaceId']})")
 714 |             return json.dumps(space_result)
 715 |         else:
 716 |             return result
 717 |         
 718 |     except Exception as e:
 719 |         error_msg = f"Failed to create space '{name}': {str(e)}"
 720 |         logger.error(error_msg)
 721 |         return json.dumps({"error": error_msg})
 722 | 
 723 | @mcp.tool()
 724 | async def wikijs_link_file_to_page(file_path: str, page_id: int, relationship: str = "documents") -> str:
 725 |     """
 726 |     Persist link between code file and Wiki.js page in local database.
 727 |     
 728 |     Args:
 729 |         file_path: Path to the source file
 730 |         page_id: Wiki.js page ID
 731 |         relationship: Type of relationship (documents, references, etc.)
 732 |     
 733 |     Returns:
 734 |         JSON string with link status
 735 |     """
 736 |     try:
 737 |         db = get_db()
 738 |         
 739 |         # Calculate file hash
 740 |         file_hash = get_file_hash(file_path)
 741 |         repo_root = find_repository_root(file_path)
 742 |         
 743 |         # Create or update mapping
 744 |         mapping = db.query(FileMapping).filter(FileMapping.file_path == file_path).first()
 745 |         if mapping:
 746 |             mapping.page_id = page_id
 747 |             mapping.relationship_type = relationship
 748 |             mapping.file_hash = file_hash
 749 |             mapping.last_updated = datetime.datetime.utcnow()
 750 |         else:
 751 |             mapping = FileMapping(
 752 |                 file_path=file_path,
 753 |                 page_id=page_id,
 754 |                 relationship_type=relationship,
 755 |                 file_hash=file_hash,
 756 |                 repository_root=repo_root or ""
 757 |             )
 758 |             db.add(mapping)
 759 |         
 760 |         db.commit()
 761 |         
 762 |         result = {
 763 |             "linked": True,
 764 |             "file_path": file_path,
 765 |             "page_id": page_id,
 766 |             "relationship": relationship
 767 |         }
 768 |         
 769 |         logger.info(f"Linked file {file_path} to page {page_id}")
 770 |         return json.dumps(result)
 771 |         
 772 |     except Exception as e:
 773 |         error_msg = f"Failed to link file to page: {str(e)}"
 774 |         logger.error(error_msg)
 775 |         return json.dumps({"error": error_msg})
 776 | 
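     | # Usage sketch (ID hypothetical): the mapping is an upsert keyed on file_path,
     | # so linking the same file again simply repoints it to the new page:
     | #
     | #     await wikijs_link_file_to_page("src/app.py", 34, "documents")
     | 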
 777 | @mcp.tool()
 778 | async def wikijs_sync_file_docs(file_path: str, change_summary: str, snippet: Optional[str] = None) -> str:
 779 |     """
 780 |     Sync a code change to the linked Wiki.js page.
 781 |     
 782 |     Args:
 783 |         file_path: Path to the changed file
 784 |         change_summary: Summary of changes made
 785 |         snippet: Code snippet showing changes (optional)
 786 |     
 787 |     Returns:
 788 |         JSON string with sync status
 789 |     """
 790 |     try:
 791 |         db = get_db()
 792 |         
 793 |         # Look up page mapping
 794 |         mapping = db.query(FileMapping).filter(FileMapping.file_path == file_path).first()
 795 |         if not mapping:
 796 |             return json.dumps({"error": f"No page mapping found for {file_path}"})
 797 |         
 798 |         # Get current page content
 799 |         page_response = await wikijs_get_page(page_id=mapping.page_id)
 800 |         page_data = json.loads(page_response)
 801 |         
 802 |         if "error" in page_data:
 803 |             return json.dumps({"error": f"Failed to get page: {page_data['error']}"})
 804 |         
 805 |         # Append a timestamped change entry; each sync call adds another "Recent Changes" block
 806 |         current_content = page_data.get("content", "")
 807 |         
 808 |         update_section = f"\n\n## Recent Changes\n\n**{datetime.datetime.now().strftime('%Y-%m-%d %H:%M')}**: {change_summary}\n"
 809 |         if snippet:
 810 |             update_section += f"\n```\n{snippet}\n```\n"
 811 |         
 812 |         new_content = current_content + update_section
 813 |         
 814 |         # Update the page
 815 |         update_response = await wikijs_update_page(mapping.page_id, content=new_content)
 816 |         update_data = json.loads(update_response)
 817 |         
 818 |         if "error" in update_data:
 819 |             return json.dumps({"error": f"Failed to update page: {update_data['error']}"})
 820 |         
 821 |         # Update file hash
 822 |         mapping.file_hash = get_file_hash(file_path)
 823 |         mapping.last_updated = datetime.datetime.utcnow()
 824 |         db.commit()
 825 |         
 826 |         result = {
 827 |             "updated": True,
 828 |             "file_path": file_path,
 829 |             "page_id": mapping.page_id,
 830 |             "change_summary": change_summary
 831 |         }
 832 |         
 833 |         logger.info(f"Synced changes from {file_path} to page {mapping.page_id}")
 834 |         return json.dumps(result)
 835 |         
 836 |     except Exception as e:
 837 |         error_msg = f"Failed to sync file docs: {str(e)}"
 838 |         logger.error(error_msg)
 839 |         return json.dumps({"error": error_msg})
 840 | 
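     | # Usage sketch (path hypothetical): each sync appends a timestamped entry to
     | # the mapped page and refreshes the stored file hash:
     | #
     | #     await wikijs_sync_file_docs("src/app.py", "Added retry logic",
     | #                                 snippet="@retry(stop=stop_after_attempt(3))")
     | 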
 841 | @mcp.tool()
 842 | async def wikijs_generate_file_overview(
 843 |     file_path: str, 
 844 |     include_functions: bool = True, 
 845 |     include_classes: bool = True,
 846 |     include_dependencies: bool = True,
 847 |     include_examples: bool = False,
 848 |     target_page_id: Optional[int] = None
 849 | ) -> str:
 850 |     """
 851 |     Create or update a structured overview page for a file.
 852 |     
 853 |     Args:
 854 |         file_path: Path to the source file
 855 |         include_functions: Include function documentation
 856 |         include_classes: Include class documentation
 857 |         include_dependencies: Include import/dependency information
 858 |         include_examples: Include usage examples
 859 |         target_page_id: Specific page ID to update (optional)
 860 |     
 861 |     Returns:
 862 |         JSON string with overview page details
 863 |     """
 864 |     try:
 865 |         if not os.path.exists(file_path):
 866 |             return json.dumps({"error": f"File not found: {file_path}"})
 867 |         
 868 |         # Extract code structure
 869 |         structure = extract_code_structure(file_path)
 870 |         
 871 |         # Generate documentation content
 872 |         content_parts = [f"# {os.path.basename(file_path)} Overview\n"]
 873 |         content_parts.append(f"**File Path**: `{file_path}`\n")
 874 |         content_parts.append(f"**Last Updated**: {datetime.datetime.now().strftime('%Y-%m-%d %H:%M')}\n")
 875 |         
 876 |         if include_dependencies and structure['imports']:
 877 |             content_parts.append("\n## Dependencies\n")
 878 |             for imp in structure['imports']:
 879 |                 content_parts.append(f"- `{imp}`")
 880 |             content_parts.append("")
 881 |         
 882 |         if include_classes and structure['classes']:
 883 |             content_parts.append("\n## Classes\n")
 884 |             for cls in structure['classes']:
 885 |                 content_parts.append(f"### {cls['name']} (Line {cls['line']})\n")
 886 |                 if cls['docstring']:
 887 |                     content_parts.append(f"{cls['docstring']}\n")
 888 |         
 889 |         if include_functions and structure['functions']:
 890 |             content_parts.append("\n## Functions\n")
 891 |             for func in structure['functions']:
 892 |                 content_parts.append(f"### {func['name']}() (Line {func['line']})\n")
 893 |                 if func['docstring']:
 894 |                     content_parts.append(f"{func['docstring']}\n")
 895 |         
 896 |         if include_examples:
 897 |             content_parts.append("\n## Usage Examples\n")
 898 |             content_parts.append("```python\n# Add usage examples here\n```\n")
 899 |         
 900 |         content = "\n".join(content_parts)
 901 |         
 902 |         # Create or update page
 903 |         if target_page_id:
 904 |             # Update existing page
 905 |             response = await wikijs_update_page(target_page_id, content=content)
 906 |             result_data = json.loads(response)
 907 |             if "error" not in result_data:
 908 |                 result_data["action"] = "updated"
 909 |         else:
 910 |             # Create new page
 911 |             title = f"{os.path.basename(file_path)} Documentation"
 912 |             response = await wikijs_create_page(title, content)
 913 |             result_data = json.loads(response)
 914 |             if "error" not in result_data:
 915 |                 result_data["action"] = "created"
 916 |                 # Link file to new page
 917 |                 if "pageId" in result_data:
 918 |                     await wikijs_link_file_to_page(file_path, result_data["pageId"], "documents")
 919 |         
 920 |         logger.info(f"Generated overview for {file_path}")
 921 |         return json.dumps(result_data)
 922 |         
 923 |     except Exception as e:
 924 |         error_msg = f"Failed to generate file overview: {str(e)}"
 925 |         logger.error(error_msg)
 926 |         return json.dumps({"error": error_msg})
 927 | 
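     | # Usage sketch (path hypothetical): without target_page_id a new
     | # "<file> Documentation" page is created and linked to the file; with it,
     | # that page is rewritten in place:
     | #
     | #     await wikijs_generate_file_overview("src/app.py", include_examples=True)
     | 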
 928 | @mcp.tool()
 929 | async def wikijs_bulk_update_project_docs(
 930 |     summary: str, 
 931 |     affected_files: List[str], 
 932 |     context: str,
 933 |     auto_create_missing: bool = True
 934 | ) -> str:
 935 |     """
 936 |     Batch update pages for large changes across multiple files.
 937 |     
 938 |     Args:
 939 |         summary: Overall project change summary
 940 |         affected_files: List of file paths that were changed
 941 |         context: Additional context about the changes
 942 |         auto_create_missing: Create pages for files without mappings
 943 |     
 944 |     Returns:
 945 |         JSON string with bulk update results
 946 |     """
 947 |     try:
 948 |         db = get_db()
 949 |         results = {
 950 |             "updated_pages": [],
 951 |             "created_pages": [],
 952 |             "errors": []
 953 |         }
 954 |         
 955 |         # Process each affected file
 956 |         for file_path in affected_files:
 957 |             try:
 958 |                 # Check if file has a mapping
 959 |                 mapping = db.query(FileMapping).filter(FileMapping.file_path == file_path).first()
 960 |                 
 961 |                 if mapping:
 962 |                     # Update existing page
 963 |                     sync_response = await wikijs_sync_file_docs(
 964 |                         file_path, 
 965 |                         f"Bulk update: {summary}", 
 966 |                         context
 967 |                     )
 968 |                     sync_data = json.loads(sync_response)
 969 |                     if "error" not in sync_data:
 970 |                         results["updated_pages"].append({
 971 |                             "file_path": file_path,
 972 |                             "page_id": mapping.page_id
 973 |                         })
 974 |                     else:
 975 |                         results["errors"].append({
 976 |                             "file_path": file_path,
 977 |                             "error": sync_data["error"]
 978 |                         })
 979 |                 
 980 |                 elif auto_create_missing:
 981 |                     # Create new overview page
 982 |                     overview_response = await wikijs_generate_file_overview(file_path)
 983 |                     overview_data = json.loads(overview_response)
 984 |                     if "error" not in overview_data and "pageId" in overview_data:
 985 |                         results["created_pages"].append({
 986 |                             "file_path": file_path,
 987 |                             "page_id": overview_data["pageId"]
 988 |                         })
 989 |                     else:
 990 |                         results["errors"].append({
 991 |                             "file_path": file_path,
 992 |                             "error": overview_data.get("error", "Failed to create page")
 993 |                         })
 994 |                 
 995 |             except Exception as e:
 996 |                 results["errors"].append({
 997 |                     "file_path": file_path,
 998 |                     "error": str(e)
 999 |                 })
1000 |         
1001 |         results["summary"] = {
1002 |             "total_files": len(affected_files),
1003 |             "updated": len(results["updated_pages"]),
1004 |             "created": len(results["created_pages"]),
1005 |             "errors": len(results["errors"])
1006 |         }
1007 |         
1008 |         logger.info(f"Bulk update completed: {results['summary']}")
1009 |         return json.dumps(results)
1010 |         
1011 |     except Exception as e:
1012 |         error_msg = f"Bulk update failed: {str(e)}"
1013 |         logger.error(error_msg)
1014 |         return json.dumps({"error": error_msg})
1015 | 
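     | # Usage sketch (paths hypothetical): files with an existing mapping receive a
     | # sync entry; unmapped files get a fresh overview page when
     | # auto_create_missing is True:
     | #
     | #     await wikijs_bulk_update_project_docs(
     | #         "Refactored auth flow",
     | #         ["src/auth.py", "src/session.py"],
     | #         context="JWT validation moved into middleware",
     | #     )
     | 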
1016 | @mcp.tool()
1017 | async def wikijs_manage_collections(collection_name: str, description: Optional[str] = None, space_ids: Optional[List[int]] = None) -> str:
1018 |     """
1019 |     Manage Wiki.js collections (groups of spaces/pages).
1020 |     Note: This is a placeholder as Wiki.js collections API may vary by version.
1021 |     
1022 |     Args:
1023 |         collection_name: Name of the collection
1024 |         description: Collection description
1025 |         space_ids: List of space IDs to include
1026 |     
1027 |     Returns:
1028 |         JSON string with collection details
1029 |     """
1030 |     try:
1031 |         # This is a conceptual implementation
1032 |         # Actual Wiki.js API for collections may differ
1033 |         result = {
1034 |             "collection_name": collection_name,
1035 |             "description": description,
1036 |             "space_ids": space_ids or [],
1037 |             "status": "managed",
1038 |             "note": "Collection management depends on Wiki.js version and configuration"
1039 |         }
1040 |         
1041 |         logger.info(f"Managed collection: {collection_name}")
1042 |         return json.dumps(result)
1043 |         
1044 |     except Exception as e:
1045 |         error_msg = f"Failed to manage collection: {str(e)}"
1046 |         logger.error(error_msg)
1047 |         return json.dumps({"error": error_msg})
1048 | 
1049 | @mcp.tool()
1050 | async def wikijs_connection_status() -> str:
1051 |     """
1052 |     Check the status of the Wiki.js connection and authentication.
1053 |     
1054 |     Returns:
1055 |         JSON string with connection status
1056 |     """
1057 |     try:
1058 |         auth_success = await wikijs.authenticate()
1059 |         
1060 |         if auth_success:
1061 |             # Test with a simple API call
1062 |             response = await wikijs.graphql_request("query { pages { list { id } } }")
1063 |             
1064 |             result = {
1065 |                 "connected": True,
1066 |                 "authenticated": True,
1067 |                 "api_url": settings.WIKIJS_API_URL,
1068 |                 "auth_method": "token" if settings.token else "session",
1069 |                 "status": "healthy"
1070 |             }
1071 |         else:
1072 |             result = {
1073 |                 "connected": False,
1074 |                 "authenticated": False,
1075 |                 "api_url": settings.WIKIJS_API_URL,
1076 |                 "status": "authentication_failed"
1077 |             }
1078 |         
1079 |         return json.dumps(result)
1080 |         
1081 |     except Exception as e:
1082 |         result = {
1083 |             "connected": False,
1084 |             "authenticated": False,
1085 |             "api_url": settings.WIKIJS_API_URL,
1086 |             "error": str(e),
1087 |             "status": "connection_failed"
1088 |         }
1089 |         return json.dumps(result)
1090 | 
1091 | @mcp.tool()
1092 | async def wikijs_repository_context() -> str:
1093 |     """
1094 |     Show current repository context and Wiki.js organization.
1095 |     
1096 |     Returns:
1097 |         JSON string with repository context
1098 |     """
1099 |     try:
1100 |         repo_root = find_repository_root()
1101 |         db = get_db()
1102 |         
1103 |         # Get repository context from database
1104 |         context = db.query(RepositoryContext).filter(
1105 |             RepositoryContext.root_path == repo_root
1106 |         ).first()
1107 |         
1108 |         # Get file mappings for this repository
1109 |         mappings = db.query(FileMapping).filter(
1110 |             FileMapping.repository_root == repo_root
1111 |         ).all()
1112 |         
1113 |         result = {
1114 |             "repository_root": repo_root,
1115 |             "space_name": context.space_name if context else settings.DEFAULT_SPACE_NAME,
1116 |             "space_id": context.space_id if context else None,
1117 |             "mapped_files": len(mappings),
1118 |             "mappings": [
1119 |                 {
1120 |                     "file_path": m.file_path,
1121 |                     "page_id": m.page_id,
1122 |                     "relationship": m.relationship_type,
1123 |                     "last_updated": m.last_updated.isoformat() if m.last_updated else None
1124 |                 }
1125 |                 for m in mappings[:10]  # Limit to first 10 for brevity
1126 |             ]
1127 |         }
1128 |         
1129 |         return json.dumps(result)
1130 |         
1131 |     except Exception as e:
1132 |         error_msg = f"Failed to get repository context: {str(e)}"
1133 |         logger.error(error_msg)
1134 |         return json.dumps({"error": error_msg})
1135 | 
1136 | @mcp.tool()
1137 | async def wikijs_create_repo_structure(repo_name: str, description: Optional[str] = None, sections: Optional[List[str]] = None) -> str:
1138 |     """
1139 |     Create a complete repository documentation structure with nested pages.
1140 |     
1141 |     Args:
1142 |         repo_name: Repository name (will be the root page)
1143 |         description: Repository description
1144 |         sections: List of main sections to create (e.g., ["Overview", "API", "Components", "Deployment"])
1145 |     
1146 |     Returns:
1147 |         JSON string with created structure details
1148 |     """
1149 |     try:
1150 |         # Default sections if none provided
1151 |         if not sections:
1152 |             sections = ["Overview", "Getting Started", "Architecture", "API Reference", "Development", "Deployment"]
1153 |         
1154 |         # Create root repository page
1155 |         root_content = f"""# {repo_name}
1156 | 
1157 | {description or f'Documentation for the {repo_name} repository.'}
1158 | 
1159 | ## Repository Structure
1160 | 
1161 | This documentation is organized into the following sections:
1162 | 
1163 | """
1164 |         
1165 |         for section in sections:
1166 |             root_content += f"- [{section}]({slugify(repo_name)}/{slugify(section)})\n"
1167 |         
1168 |         root_content += f"""
1169 | 
1170 | ## Quick Links
1171 | 
1172 | - [Repository Overview]({slugify(repo_name)}/overview)
1173 | - [Getting Started Guide]({slugify(repo_name)}/getting-started)
1174 | - [API Documentation]({slugify(repo_name)}/api-reference)
1175 | 
1176 | ---
1177 | *This documentation structure was created by the Wiki.js MCP server.*
1178 | """
1179 |         
1180 |         # Create root page
1181 |         root_result = await wikijs_create_page(repo_name, root_content)
1182 |         root_data = json.loads(root_result)
1183 |         
1184 |         if "error" in root_data:
1185 |             return json.dumps({"error": f"Failed to create root page: {root_data['error']}"})
1186 |         
1187 |         root_page_id = root_data["pageId"]
1188 |         created_pages = [root_data]
1189 |         
1190 |         # Create section pages
1191 |         for section in sections:
1192 |             section_content = f"""# {section}
1193 | 
1194 | This is the {section.lower()} section for {repo_name}.
1195 | 
1196 | ## Contents
1197 | 
1198 | *Content will be added here as the documentation grows.*
1199 | 
1200 | ## Related Pages
1201 | 
1202 | - [Back to {repo_name}]({slugify(repo_name)})
1203 | 
1204 | ---
1205 | *This page is part of the {repo_name} documentation structure.*
1206 | """
1207 |             
1208 |             section_result = await wikijs_create_page(section, section_content, parent_id=str(root_page_id))
1209 |             section_data = json.loads(section_result)
1210 |             
1211 |             if "error" not in section_data:
1212 |                 created_pages.append(section_data)
1213 |             else:
1214 |                 logger.warning(f"Failed to create section '{section}': {section_data['error']}")
1215 |         
1216 |         result = {
1217 |             "repository": repo_name,
1218 |             "root_page_id": root_page_id,
1219 |             "created_pages": len(created_pages),
1220 |             "sections": sections,
1221 |             "pages": created_pages,
1222 |             "status": "created"
1223 |         }
1224 |         
1225 |         logger.info(f"Created repository structure for {repo_name} with {len(created_pages)} pages")
1226 |         return json.dumps(result)
1227 |         
1228 |     except Exception as e:
1229 |         error_msg = f"Failed to create repository structure: {str(e)}"
1230 |         logger.error(error_msg)
1231 |         return json.dumps({"error": error_msg})
1232 | 
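     | # Usage sketch: omitting sections falls back to the six defaults above; each
     | # section page is created as a child of the repository root page:
     | #
     | #     await wikijs_create_repo_structure("my-repo", "Demo repository",
     | #                                        sections=["Overview", "API Reference"])
     | 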
1233 | @mcp.tool()
1234 | async def wikijs_create_nested_page(title: str, content: str, parent_path: str, create_parent_if_missing: bool = True) -> str:
1235 |     """
1236 |     Create a nested page using hierarchical paths (e.g., "repo/api/endpoints").
1237 |     
1238 |     Args:
1239 |         title: Page title
1240 |         content: Page content
1241 |         parent_path: Full path to parent (e.g., "my-repo/api")
1242 |         create_parent_if_missing: Create parent pages if they don't exist
1243 |     
1244 |     Returns:
1245 |         JSON string with page details
1246 |     """
1247 |     try:
1248 |         await wikijs.authenticate()
1249 |         
1250 |         # Check if parent exists
1251 |         parent_query = """
1252 |         query($path: String!) {
1253 |             pages {
1254 |                 singleByPath(path: $path, locale: "en") {
1255 |                     id
1256 |                     path
1257 |                     title
1258 |                 }
1259 |             }
1260 |         }
1261 |         """
1262 |         
1263 |         parent_response = await wikijs.graphql_request(parent_query, {"path": parent_path})
1264 |         parent_data = parent_response.get("data", {}).get("pages", {}).get("singleByPath")
1265 |         
1266 |         if not parent_data and create_parent_if_missing:
1267 |             # Create parent structure
1268 |             path_parts = parent_path.split("/")
1269 |             current_path = ""
1270 |             parent_id = None
1271 |             
1272 |             for part in path_parts:
1273 |                 if current_path:
1274 |                     current_path += f"/{part}"
1275 |                 else:
1276 |                     current_path = part
1277 |                 
1278 |                 # Check if this level exists
1279 |                 check_response = await wikijs.graphql_request(parent_query, {"path": current_path})
1280 |                 existing = check_response.get("data", {}).get("pages", {}).get("singleByPath")
1281 |                 
1282 |                 if not existing:
1283 |                     # Create this level
1284 |                     part_title = part.replace("-", " ").title()
1285 |                     part_content = f"""# {part_title}
1286 | 
1287 | This is a section page for organizing documentation.
1288 | 
1289 | ## Subsections
1290 | 
1291 | *Subsections will appear here as they are created.*
1292 | 
1293 | ---
1294 | *This page was auto-created as part of the documentation hierarchy.*
1295 | """
1296 |                     
1297 |                     create_result = await wikijs_create_page(part_title, part_content, parent_id=str(parent_id) if parent_id else "")
1298 |                     create_data = json.loads(create_result)
1299 |                     
1300 |                     if "error" not in create_data:
1301 |                         parent_id = create_data["pageId"]
1302 |                     else:
1303 |                         return json.dumps({"error": f"Failed to create parent '{current_path}': {create_data['error']}"})
1304 |                 else:
1305 |                     parent_id = existing["id"]
1306 |         
1307 |         elif parent_data:
1308 |             parent_id = parent_data["id"]
1309 |         else:
1310 |             return json.dumps({"error": f"Parent path '{parent_path}' not found and create_parent_if_missing is False"})
1311 |         
1312 |         # Create the target page
1313 |         result = await wikijs_create_page(title, content, parent_id=str(parent_id))
1314 |         result_data = json.loads(result)
1315 |         
1316 |         if "error" not in result_data:
1317 |             result_data["parent_path"] = parent_path
1318 |             result_data["full_path"] = f"{parent_path}/{slugify(title)}"
1319 |         
1320 |         return json.dumps(result_data)
1321 |         
1322 |     except Exception as e:
1323 |         error_msg = f"Failed to create nested page: {str(e)}"
1324 |         logger.error(error_msg)
1325 |         return json.dumps({"error": error_msg})
1326 | 
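     | # Usage sketch (paths hypothetical): missing parents are created level by
     | # level, so with an empty wiki this yields "my-repo", "my-repo/api", and
     | # finally "my-repo/api/endpoints":
     | #
     | #     await wikijs_create_nested_page("Endpoints", "# Endpoints", "my-repo/api")
     | 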
1327 | @mcp.tool()
1328 | async def wikijs_get_page_children(page_id: Optional[int] = None, page_path: Optional[str] = None) -> str:
1329 |     """
1330 |     Get all child pages of a given page for hierarchical navigation.
1331 |     
1332 |     Args:
1333 |         page_id: Parent page ID (optional)
1334 |         page_path: Parent page path (optional)
1335 |     
1336 |     Returns:
1337 |         JSON string with child pages list
1338 |     """
1339 |     try:
1340 |         await wikijs.authenticate()
1341 |         
1342 |         # Get the parent page first
1343 |         if page_id:
1344 |             parent_query = """
1345 |             query($id: Int!) {
1346 |                 pages {
1347 |                     single(id: $id) {
1348 |                         id
1349 |                         path
1350 |                         title
1351 |                     }
1352 |                 }
1353 |             }
1354 |             """
1355 |             parent_response = await wikijs.graphql_request(parent_query, {"id": page_id})
1356 |             parent_data = parent_response.get("data", {}).get("pages", {}).get("single")
1357 |         elif page_path:
1358 |             parent_query = """
1359 |             query($path: String!) {
1360 |                 pages {
1361 |                     singleByPath(path: $path, locale: "en") {
1362 |                         id
1363 |                         path
1364 |                         title
1365 |                     }
1366 |                 }
1367 |             }
1368 |             """
1369 |             parent_response = await wikijs.graphql_request(parent_query, {"path": page_path})
1370 |             parent_data = parent_response.get("data", {}).get("pages", {}).get("singleByPath")
1371 |         else:
1372 |             return json.dumps({"error": "Either page_id or page_path must be provided"})
1373 |         
1374 |         if not parent_data:
1375 |             return json.dumps({"error": "Parent page not found"})
1376 |         
1377 |         parent_path = parent_data["path"]
1378 |         
1379 |         # Get all pages and filter for children
1380 |         all_pages_query = """
1381 |         query {
1382 |             pages {
1383 |                 list {
1384 |                     id
1385 |                     title
1386 |                     path
1387 |                     description
1388 |                     isPublished
1389 |                     updatedAt
1390 |                 }
1391 |             }
1392 |         }
1393 |         """
1394 |         
1395 |         response = await wikijs.graphql_request(all_pages_query)
1396 |         all_pages = response.get("data", {}).get("pages", {}).get("list", [])
1397 |         
1398 |         # Filter for direct children (path starts with parent_path/ but no additional slashes)
1399 |         children = []
1400 |         for page in all_pages:
1401 |             page_path_str = page["path"]
1402 |             if page_path_str.startswith(f"{parent_path}/"):
1403 |                 # Check if it's a direct child (no additional slashes after parent)
1404 |                 remaining_path = page_path_str[len(parent_path) + 1:]
1405 |                 if "/" not in remaining_path:  # Direct child
1406 |                     children.append({
1407 |                         "pageId": page["id"],
1408 |                         "title": page["title"],
1409 |                         "path": page["path"],
1410 |                         "description": page.get("description", ""),
1411 |                         "lastModified": page.get("updatedAt"),
1412 |                         "isPublished": page.get("isPublished", True)
1413 |                     })
1414 |         
1415 |         result = {
1416 |             "parent": {
1417 |                 "pageId": parent_data["id"],
1418 |                 "title": parent_data["title"],
1419 |                 "path": parent_data["path"]
1420 |             },
1421 |             "children": children,
1422 |             "total_children": len(children)
1423 |         }
1424 |         
1425 |         return json.dumps(result)
1426 |         
1427 |     except Exception as e:
1428 |         error_msg = f"Failed to get page children: {str(e)}"
1429 |         logger.error(error_msg)
1430 |         return json.dumps({"error": error_msg})
1431 | 
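     | # Direct-child filter sketch, for a parent at "repo/api":
     | #
     | #     "repo/api/endpoints"      -> kept (no "/" left after the parent prefix)
     | #     "repo/api/endpoints/auth" -> excluded (grandchild)
     | 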
1432 | @mcp.tool()
1433 | async def wikijs_create_documentation_hierarchy(project_name: str, file_mappings: List[Dict[str, str]], auto_organize: bool = True) -> str:
1434 |     """
1435 |     Create a complete documentation hierarchy for a project based on file structure.
1436 |     
1437 |     Args:
1438 |         project_name: Name of the project/repository
1439 |         file_mappings: List of {"file_path": "src/components/Button.tsx", "doc_path": "components/button"} mappings
1440 |         auto_organize: Automatically organize files into logical sections
1441 |     
1442 |     Returns:
1443 |         JSON string with created hierarchy details
1444 |     """
1445 |     try:
1446 |         # Auto-organize files into sections if requested
1447 |         if auto_organize:
1448 |             sections = {
1449 |                 "components": [],
1450 |                 "api": [],
1451 |                 "utils": [],
1452 |                 "services": [],
1453 |                 "models": [],
1454 |                 "tests": [],
1455 |                 "config": [],
1456 |                 "docs": []
1457 |             }
1458 |             
1459 |             for mapping in file_mappings:
1460 |                 file_path = mapping["file_path"].lower()
1461 |                 
1462 |                 if "component" in file_path or "/components/" in file_path:
1463 |                     sections["components"].append(mapping)
1464 |                 elif "api" in file_path or "/api/" in file_path or "endpoint" in file_path:
1465 |                     sections["api"].append(mapping)
1466 |                 elif "util" in file_path or "/utils/" in file_path or "/helpers/" in file_path:
1467 |                     sections["utils"].append(mapping)
1468 |                 elif "service" in file_path or "/services/" in file_path:
1469 |                     sections["services"].append(mapping)
1470 |                 elif "model" in file_path or "/models/" in file_path or "/types/" in file_path:
1471 |                     sections["models"].append(mapping)
1472 |                 elif "test" in file_path or "/tests/" in file_path or ".test." in file_path:
1473 |                     sections["tests"].append(mapping)
1474 |                 elif "config" in file_path or "/config/" in file_path or ".config." in file_path:
1475 |                     sections["config"].append(mapping)
1476 |                 else:
1477 |                     sections["docs"].append(mapping)
1478 |         
1479 |         # Create root project structure
1480 |         section_names = [name.title() for name, files in sections.items() if files] if auto_organize else ["Documentation"]
1481 |         
1482 |         repo_result = await wikijs_create_repo_structure(project_name, f"Documentation for {project_name}", section_names)
1483 |         repo_data = json.loads(repo_result)
1484 |         
1485 |         if "error" in repo_data:
1486 |             return repo_result
1487 |         
1488 |         created_pages = []
1489 |         created_mappings = []
1490 |         
1491 |         if auto_organize:
1492 |             # Create pages for each section
1493 |             for section_name, files in sections.items():
1494 |                 if not files:
1495 |                     continue
1496 |                 
1497 |                 section_title = section_name.title()  # note: currently unused below
1498 |                 
1499 |                 for file_mapping in files:
1500 |                     file_path = file_mapping["file_path"]
1501 |                     doc_path = file_mapping.get("doc_path", slugify(os.path.basename(file_path)))  # note: not used by the overview call below
1502 |                     
1503 |                     # Generate documentation content for the file
1504 |                     file_overview_result = await wikijs_generate_file_overview(file_path, target_page_id=None)
1505 |                     overview_data = json.loads(file_overview_result)
1506 |                     
1507 |                     if "error" not in overview_data:
1508 |                         created_pages.append(overview_data)
1509 |                         
1510 |                         # Create mapping
1511 |                         mapping_result = await wikijs_link_file_to_page(file_path, overview_data["pageId"], "documents")
1512 |                         mapping_data = json.loads(mapping_result)
1513 |                         
1514 |                         if "error" not in mapping_data:
1515 |                             created_mappings.append(mapping_data)
1516 |         else:
1517 |             # Create pages without auto-organization
1518 |             for file_mapping in file_mappings:
1519 |                 file_path = file_mapping["file_path"]
1520 |                 doc_path = file_mapping.get("doc_path", f"{project_name}/{slugify(os.path.basename(file_path))}")
1521 |                 
1522 |                 # Create nested page
1523 |                 nested_result = await wikijs_create_nested_page(
1524 |                     os.path.basename(file_path),
1525 |                     f"# {os.path.basename(file_path)}\n\nDocumentation for {file_path}",
1526 |                     doc_path
1527 |                 )
1528 |                 nested_data = json.loads(nested_result)
1529 |                 
1530 |                 if "error" not in nested_data:
1531 |                     created_pages.append(nested_data)
1532 |                     
1533 |                     # Create mapping
1534 |                     mapping_result = await wikijs_link_file_to_page(file_path, nested_data["pageId"], "documents")
1535 |                     mapping_data = json.loads(mapping_result)
1536 |                     
1537 |                     if "error" not in mapping_data:
1538 |                         created_mappings.append(mapping_data)
1539 |         
1540 |         result = {
1541 |             "project": project_name,
1542 |             "root_structure": repo_data,
1543 |             "created_pages": len(created_pages),
1544 |             "created_mappings": len(created_mappings),
1545 |             "auto_organized": auto_organize,
1546 |             "sections": [name for name, files in sections.items() if files] if auto_organize else ["manual"],  # only sections that received files
1547 |             "pages": created_pages[:10],  # Limit output
1548 |             "mappings": created_mappings[:10],  # Limit output
1549 |             "status": "completed"
1550 |         }
1551 |         
1552 |         logger.info(f"Created documentation hierarchy for {project_name}: {len(created_pages)} pages, {len(created_mappings)} mappings")
1553 |         return json.dumps(result)
1554 |         
1555 |     except Exception as e:
1556 |         error_msg = f"Failed to create documentation hierarchy: {str(e)}"
1557 |         logger.error(error_msg)
1558 |         return json.dumps({"error": error_msg})
1559 | 
1560 | @mcp.tool()
1561 | async def wikijs_delete_page(page_id: int = None, page_path: str = None, remove_file_mapping: bool = True) -> str:
1562 |     """
1563 |     Delete a specific page from Wiki.js.
1564 |     
1565 |     Args:
1566 |         page_id: Page ID to delete (optional)
1567 |         page_path: Page path to delete (optional)
1568 |         remove_file_mapping: Also remove file-to-page mapping from local database
1569 |     
1570 |     Returns:
1571 |         JSON string with deletion status
1572 |     """
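     |     # Illustrative calls (a sketch; the ID and path are hypothetical):
     |     #   await wikijs_delete_page(page_id=42)
     |     #   await wikijs_delete_page(page_path="frontend-app/components/button")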
1573 |     try:
1574 |         await wikijs.authenticate()
1575 |         
1576 |         # Get page info first
1577 |         if page_id:
1578 |             get_query = """
1579 |             query($id: Int!) {
1580 |                 pages {
1581 |                     single(id: $id) {
1582 |                         id
1583 |                         path
1584 |                         title
1585 |                     }
1586 |                 }
1587 |             }
1588 |             """
1589 |             get_response = await wikijs.graphql_request(get_query, {"id": page_id})
1590 |             page_data = get_response.get("data", {}).get("pages", {}).get("single")
1591 |         elif page_path:
1592 |             get_query = """
1593 |             query($path: String!) {
1594 |                 pages {
1595 |                     singleByPath(path: $path, locale: "en") {
1596 |                         id
1597 |                         path
1598 |                         title
1599 |                     }
1600 |                 }
1601 |             }
1602 |             """
1603 |             get_response = await wikijs.graphql_request(get_query, {"path": page_path})
1604 |             page_data = get_response.get("data", {}).get("pages", {}).get("singleByPath")
1605 |             if page_data:
1606 |                 page_id = page_data["id"]
1607 |         else:
1608 |             return json.dumps({"error": "Either page_id or page_path must be provided"})
1609 |         
1610 |         if not page_data:
1611 |             return json.dumps({"error": "Page not found"})
1612 |         
1613 |         # Delete the page using GraphQL mutation
1614 |         delete_mutation = """
1615 |         mutation($id: Int!) {
1616 |             pages {
1617 |                 delete(id: $id) {
1618 |                     responseResult {
1619 |                         succeeded
1620 |                         errorCode
1621 |                         slug
1622 |                         message
1623 |                     }
1624 |                 }
1625 |             }
1626 |         }
1627 |         """
1628 |         
1629 |         response = await wikijs.graphql_request(delete_mutation, {"id": page_id})
1630 |         
1631 |         delete_result = response.get("data", {}).get("pages", {}).get("delete", {})
1632 |         response_result = delete_result.get("responseResult", {})
1633 |         
1634 |         if response_result.get("succeeded"):
1635 |             result = {
1636 |                 "deleted": True,
1637 |                 "pageId": page_id,
1638 |                 "title": page_data["title"],
1639 |                 "path": page_data["path"],
1640 |                 "status": "deleted"
1641 |             }
1642 |             
1643 |             # Remove file mapping if requested
1644 |             if remove_file_mapping:
1645 |                 db = get_db()
1646 |                 mapping = db.query(FileMapping).filter(FileMapping.page_id == page_id).first()
1647 |                 if mapping:
1648 |                     db.delete(mapping)
1649 |                     db.commit()
1650 |                     result["file_mapping_removed"] = True
1651 |                 else:
1652 |                     result["file_mapping_removed"] = False
1653 |             
1654 |             logger.info(f"Deleted page: {page_data['title']} (ID: {page_id})")
1655 |             return json.dumps(result)
1656 |         else:
1657 |             error_msg = response_result.get("message", "Unknown error")
1658 |             return json.dumps({"error": f"Failed to delete page: {error_msg}"})
1659 |         
1660 |     except Exception as e:
1661 |         error_msg = f"Failed to delete page: {str(e)}"
1662 |         logger.error(error_msg)
1663 |         return json.dumps({"error": error_msg})
1664 | 
1665 | @mcp.tool()
1666 | async def wikijs_batch_delete_pages(
1667 |     page_ids: List[int] = None, 
1668 |     page_paths: List[str] = None,
1669 |     path_pattern: str = None,
1670 |     confirm_deletion: bool = False,
1671 |     remove_file_mappings: bool = True
1672 | ) -> str:
1673 |     """
1674 |     Batch delete multiple pages from Wiki.js.
1675 |     
1676 |     Args:
1677 |         page_ids: List of page IDs to delete (optional)
1678 |         page_paths: List of page paths to delete (optional)
1679 |         path_pattern: fnmatch-style pattern (e.g., "frontend-app/*" for all pages under frontend-app, including nested ones)
1680 |         confirm_deletion: Must be True to actually delete pages (safety check)
1681 |         remove_file_mappings: Also remove file-to-page mappings from local database
1682 |     
1683 |     Returns:
1684 |         JSON string with batch deletion results
1685 |     """
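     |     # Illustrative invocation (a sketch; the pattern is hypothetical).
     |     # Note that confirm_deletion must be passed explicitly:
     |     #   await wikijs_batch_delete_pages(
     |     #       path_pattern="frontend-app/*",
     |     #       confirm_deletion=True,
     |     #   )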
1686 |     try:
1687 |         if not confirm_deletion:
1688 |             return json.dumps({
1689 |                 "error": "confirm_deletion must be True to proceed with batch deletion",
1690 |                 "safety_note": "This is a safety check to prevent accidental deletions"
1691 |             })
1692 |         
1693 |         await wikijs.authenticate()
1694 |         
1695 |         pages_to_delete = []
1696 |         
1697 |         # Collect pages by IDs
1698 |         if page_ids:
1699 |             for page_id in page_ids:
1700 |                 get_query = """
1701 |                 query($id: Int!) {
1702 |                     pages {
1703 |                         single(id: $id) {
1704 |                             id
1705 |                             path
1706 |                             title
1707 |                         }
1708 |                     }
1709 |                 }
1710 |                 """
1711 |                 get_response = await wikijs.graphql_request(get_query, {"id": page_id})
1712 |                 page_data = get_response.get("data", {}).get("pages", {}).get("single")
1713 |                 if page_data:
1714 |                     pages_to_delete.append(page_data)
1715 |         
1716 |         # Collect pages by paths
1717 |         if page_paths:
1718 |             for page_path in page_paths:
1719 |                 get_query = """
1720 |                 query($path: String!) {
1721 |                     pages {
1722 |                         singleByPath(path: $path, locale: "en") {
1723 |                             id
1724 |                             path
1725 |                             title
1726 |                         }
1727 |                     }
1728 |                 }
1729 |                 """
1730 |                 get_response = await wikijs.graphql_request(get_query, {"path": page_path})
1731 |                 page_data = get_response.get("data", {}).get("pages", {}).get("singleByPath")
1732 |                 if page_data:
1733 |                     pages_to_delete.append(page_data)
1734 |         
1735 |         # Collect pages by pattern
1736 |         if path_pattern:
1737 |             # Get all pages and filter by pattern
1738 |             all_pages_query = """
1739 |             query {
1740 |                 pages {
1741 |                     list {
1742 |                         id
1743 |                         title
1744 |                         path
1745 |                     }
1746 |                 }
1747 |             }
1748 |             """
1749 |             
1750 |             response = await wikijs.graphql_request(all_pages_query)
1751 |             all_pages = response.get("data", {}).get("pages", {}).get("list", [])
1752 |             
1753 |             # Simple pattern matching (supports * wildcard)
1754 |             import fnmatch
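     |             # fnmatch-style matching: "*" also crosses "/" here, so
     |             # "frontend-app/*" matches nested paths such as
     |             # "frontend-app/components/button" as well.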
1755 |             for page in all_pages:
1756 |                 if fnmatch.fnmatch(page["path"], path_pattern):
1757 |                     pages_to_delete.append(page)
1758 |         
1759 |         if not pages_to_delete:
1760 |             return json.dumps({"error": "No pages found to delete"})
1761 |         
1762 |         # Remove duplicates
1763 |         unique_pages = {}
1764 |         for page in pages_to_delete:
1765 |             unique_pages[page["id"]] = page
1766 |         pages_to_delete = list(unique_pages.values())
1767 |         
1768 |         # Delete pages
1769 |         deleted_pages = []
1770 |         failed_deletions = []
1771 |         
1772 |         for page in pages_to_delete:
1773 |             try:
1774 |                 delete_result = await wikijs_delete_page(
1775 |                     page_id=page["id"], 
1776 |                     remove_file_mapping=remove_file_mappings
1777 |                 )
1778 |                 delete_data = json.loads(delete_result)
1779 |                 
1780 |                 if "error" not in delete_data:
1781 |                     deleted_pages.append({
1782 |                         "pageId": page["id"],
1783 |                         "title": page["title"],
1784 |                         "path": page["path"]
1785 |                     })
1786 |                 else:
1787 |                     failed_deletions.append({
1788 |                         "pageId": page["id"],
1789 |                         "title": page["title"],
1790 |                         "path": page["path"],
1791 |                         "error": delete_data["error"]
1792 |                     })
1793 |             except Exception as e:
1794 |                 failed_deletions.append({
1795 |                     "pageId": page["id"],
1796 |                     "title": page["title"],
1797 |                     "path": page["path"],
1798 |                     "error": str(e)
1799 |                 })
1800 |         
1801 |         result = {
1802 |             "total_found": len(pages_to_delete),
1803 |             "deleted_count": len(deleted_pages),
1804 |             "failed_count": len(failed_deletions),
1805 |             "deleted_pages": deleted_pages,
1806 |             "failed_deletions": failed_deletions,
1807 |             "status": "completed"
1808 |         }
1809 |         
1810 |         logger.info(f"Batch deletion completed: {len(deleted_pages)} deleted, {len(failed_deletions)} failed")
1811 |         return json.dumps(result)
1812 |         
1813 |     except Exception as e:
1814 |         error_msg = f"Batch deletion failed: {str(e)}"
1815 |         logger.error(error_msg)
1816 |         return json.dumps({"error": error_msg})
1817 | 
1818 | @mcp.tool()
1819 | async def wikijs_delete_hierarchy(
1820 |     root_path: str,
1821 |     delete_mode: str = "children_only",
1822 |     confirm_deletion: bool = False,
1823 |     remove_file_mappings: bool = True
1824 | ) -> str:
1825 |     """
1826 |     Delete an entire page hierarchy (folder structure) from Wiki.js.
1827 |     
1828 |     Args:
1829 |         root_path: Root path of the hierarchy to delete (e.g., "frontend-app" or "frontend-app/components")
1830 |         delete_mode: Deletion mode - "children_only", "include_root", or "root_only"
1831 |         confirm_deletion: Must be True to actually delete pages (safety check)
1832 |         remove_file_mappings: Also remove file-to-page mappings from local database
1833 |     
1834 |     Returns:
1835 |         JSON string with hierarchy deletion results
1836 |     """
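     |     # Mode semantics, illustrated with root_path="frontend-app":
     |     #   children_only -> deletes every page under frontend-app/,
     |     #                    keeps frontend-app itself
     |     #   include_root  -> deletes frontend-app and everything under it
     |     #   root_only     -> deletes only the frontend-app page itself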
1837 |     try:
1838 |         if not confirm_deletion:
1839 |             return json.dumps({
1840 |                 "error": "confirm_deletion must be True to proceed with hierarchy deletion",
1841 |                 "safety_note": "This is a safety check to prevent accidental deletions",
1842 |                 "preview_mode": "Set confirm_deletion=True to actually delete"
1843 |             })
1844 |         
1845 |         valid_modes = ["children_only", "include_root", "root_only"]
1846 |         if delete_mode not in valid_modes:
1847 |             return json.dumps({
1848 |                 "error": f"Invalid delete_mode. Must be one of: {valid_modes}"
1849 |             })
1850 |         
1851 |         await wikijs.authenticate()
1852 |         
1853 |         # Get all pages to find hierarchy
1854 |         all_pages_query = """
1855 |         query {
1856 |             pages {
1857 |                 list {
1858 |                     id
1859 |                     title
1860 |                     path
1861 |                 }
1862 |             }
1863 |         }
1864 |         """
1865 |         
1866 |         response = await wikijs.graphql_request(all_pages_query)
1867 |         all_pages = response.get("data", {}).get("pages", {}).get("list", [])
1868 |         
1869 |         # Find root page
1870 |         root_page = None
1871 |         for page in all_pages:
1872 |             if page["path"] == root_path:
1873 |                 root_page = page
1874 |                 break
1875 |         
1876 |         if not root_page and delete_mode in ["include_root", "root_only"]:
1877 |             return json.dumps({"error": f"Root page not found: {root_path}"})
1878 |         
1879 |         # Find child pages
1880 |         child_pages = []
1881 |         for page in all_pages:
1882 |             page_path = page["path"]
1883 |             if page_path.startswith(f"{root_path}/"):
1884 |                 child_pages.append(page)
1885 |         
1886 |         # Determine pages to delete based on mode
1887 |         pages_to_delete = []
1888 |         
1889 |         if delete_mode == "children_only":
1890 |             pages_to_delete = child_pages
1891 |         elif delete_mode == "include_root":
1892 |             pages_to_delete = child_pages + ([root_page] if root_page else [])
1893 |         elif delete_mode == "root_only":
1894 |             pages_to_delete = [root_page] if root_page else []
1895 |         
1896 |         if not pages_to_delete:
1897 |             return json.dumps({
1898 |                 "message": f"No pages found to delete for path: {root_path}",
1899 |                 "delete_mode": delete_mode,
1900 |                 "root_found": root_page is not None,
1901 |                 "children_found": len(child_pages)
1902 |             })
1903 |         
1904 |         # Sort by depth (deepest first) to avoid dependency issues
1905 |         pages_to_delete.sort(key=lambda x: x["path"].count("/"), reverse=True)
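     |         # e.g. depths 2, 1, 0: "a/b/c" is deleted before "a/b",
     |         # which is deleted before "a".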
1906 |         
1907 |         # Delete pages
1908 |         deleted_pages = []
1909 |         failed_deletions = []
1910 |         
1911 |         for page in pages_to_delete:
1912 |             try:
1913 |                 delete_result = await wikijs_delete_page(
1914 |                     page_id=page["id"], 
1915 |                     remove_file_mapping=remove_file_mappings
1916 |                 )
1917 |                 delete_data = json.loads(delete_result)
1918 |                 
1919 |                 if "error" not in delete_data:
1920 |                     deleted_pages.append({
1921 |                         "pageId": page["id"],
1922 |                         "title": page["title"],
1923 |                         "path": page["path"],
1924 |                         "depth": page["path"].count("/")
1925 |                     })
1926 |                 else:
1927 |                     failed_deletions.append({
1928 |                         "pageId": page["id"],
1929 |                         "title": page["title"],
1930 |                         "path": page["path"],
1931 |                         "error": delete_data["error"]
1932 |                     })
1933 |             except Exception as e:
1934 |                 failed_deletions.append({
1935 |                     "pageId": page["id"],
1936 |                     "title": page["title"],
1937 |                     "path": page["path"],
1938 |                     "error": str(e)
1939 |                 })
1940 |         
1941 |         result = {
1942 |             "root_path": root_path,
1943 |             "delete_mode": delete_mode,
1944 |             "total_found": len(pages_to_delete),
1945 |             "deleted_count": len(deleted_pages),
1946 |             "failed_count": len(failed_deletions),
1947 |             "deleted_pages": deleted_pages,
1948 |             "failed_deletions": failed_deletions,
1949 |             "hierarchy_summary": {
1950 |                 "root_page_found": root_page is not None,
1951 |                 "child_pages_found": len(child_pages),
1952 |                 "max_depth": max([p["path"].count("/") for p in pages_to_delete]) if pages_to_delete else 0
1953 |             },
1954 |             "status": "completed"
1955 |         }
1956 |         
1957 |         logger.info(f"Hierarchy deletion completed for {root_path}: {len(deleted_pages)} deleted, {len(failed_deletions)} failed")
1958 |         return json.dumps(result)
1959 |         
1960 |     except Exception as e:
1961 |         error_msg = f"Hierarchy deletion failed: {str(e)}"
1962 |         logger.error(error_msg)
1963 |         return json.dumps({"error": error_msg})
1964 | 
1965 | @mcp.tool()
1966 | async def wikijs_cleanup_orphaned_mappings() -> str:
1967 |     """
1968 |     Clean up file-to-page mappings for pages that no longer exist in Wiki.js.
1969 |     
1970 |     Returns:
1971 |         JSON string with cleanup results
1972 |     """
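     |     # Safe to run repeatedly: valid mappings are left untouched and only
     |     # mappings whose pages no longer resolve in Wiki.js are removed.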
1973 |     try:
1974 |         await wikijs.authenticate()
1975 |         db = get_db()
1976 |         
1977 |         # Get all file mappings
1978 |         mappings = db.query(FileMapping).all()
1979 |         
1980 |         if not mappings:
1981 |             return json.dumps({
1982 |                 "message": "No file mappings found",
1983 |                 "cleaned_count": 0
1984 |             })
1985 |         
1986 |         # Check which pages still exist
1987 |         orphaned_mappings = []
1988 |         valid_mappings = []
1989 |         
1990 |         for mapping in mappings:
1991 |             try:
1992 |                 get_query = """
1993 |                 query($id: Int!) {
1994 |                     pages {
1995 |                         single(id: $id) {
1996 |                             id
1997 |                             title
1998 |                             path
1999 |                         }
2000 |                     }
2001 |                 }
2002 |                 """
2003 |                 get_response = await wikijs.graphql_request(get_query, {"id": mapping.page_id})
2004 |                 page_data = get_response.get("data", {}).get("pages", {}).get("single")
2005 |                 
2006 |                 if page_data:
2007 |                     valid_mappings.append({
2008 |                         "file_path": mapping.file_path,
2009 |                         "page_id": mapping.page_id,
2010 |                         "page_title": page_data["title"]
2011 |                     })
2012 |                 else:
2013 |                     orphaned_mappings.append({
2014 |                         "file_path": mapping.file_path,
2015 |                         "page_id": mapping.page_id,
2016 |                         "last_updated": mapping.last_updated.isoformat() if mapping.last_updated else None
2017 |                     })
2018 |                     # Delete orphaned mapping
2019 |                     db.delete(mapping)
2020 |                     
2021 |             except Exception as e:
2022 |                 # If we can't check the page, consider it orphaned
2023 |                 orphaned_mappings.append({
2024 |                     "file_path": mapping.file_path,
2025 |                     "page_id": mapping.page_id,
2026 |                     "error": str(e)
2027 |                 })
2028 |                 db.delete(mapping)
2029 |         
2030 |         db.commit()
2031 |         
2032 |         result = {
2033 |             "total_mappings": len(mappings),
2034 |             "valid_mappings": len(valid_mappings),
2035 |             "orphaned_mappings": len(orphaned_mappings),
2036 |             "cleaned_count": len(orphaned_mappings),
2037 |             "orphaned_details": orphaned_mappings,
2038 |             "status": "completed"
2039 |         }
2040 |         
2041 |         logger.info(f"Cleaned up {len(orphaned_mappings)} orphaned file mappings")
2042 |         return json.dumps(result)
2043 |         
2044 |     except Exception as e:
2045 |         error_msg = f"Cleanup failed: {str(e)}"
2046 |         logger.error(error_msg)
2047 |         return json.dumps({"error": error_msg})
2048 | 
2049 | def main():
2050 |     """Main entry point for the MCP server."""
2051 |     # Each tool authenticates on demand via wikijs.authenticate(), so the
2052 |     # server loop can be started directly.
2053 |     logger.info("Starting Wiki.js MCP Server")
2054 |     mcp.run()
2055 | 
2056 | if __name__ == "__main__":
2057 |     main()
```