This is page 19 of 27. Use http://codebase.md/basicmachines-co/basic-memory?lines=true&page={x} to view the full context.

# Directory Structure

```
├── .claude
│   ├── commands
│   │   ├── release
│   │   │   ├── beta.md
│   │   │   ├── changelog.md
│   │   │   ├── release-check.md
│   │   │   └── release.md
│   │   ├── spec.md
│   │   └── test-live.md
│   └── settings.json
├── .dockerignore
├── .env.example
├── .github
│   ├── dependabot.yml
│   ├── ISSUE_TEMPLATE
│   │   ├── bug_report.md
│   │   ├── config.yml
│   │   ├── documentation.md
│   │   └── feature_request.md
│   └── workflows
│       ├── claude-code-review.yml
│       ├── claude-issue-triage.yml
│       ├── claude.yml
│       ├── dev-release.yml
│       ├── docker.yml
│       ├── pr-title.yml
│       ├── release.yml
│       └── test.yml
├── .gitignore
├── .python-version
├── CHANGELOG.md
├── CITATION.cff
├── CLA.md
├── CLAUDE.md
├── CODE_OF_CONDUCT.md
├── CONTRIBUTING.md
├── docker-compose-postgres.yml
├── docker-compose.yml
├── Dockerfile
├── docs
│   ├── ai-assistant-guide-extended.md
│   ├── ARCHITECTURE.md
│   ├── character-handling.md
│   ├── cloud-cli.md
│   ├── Docker.md
│   └── testing-coverage.md
├── justfile
├── LICENSE
├── llms-install.md
├── pyproject.toml
├── README.md
├── SECURITY.md
├── smithery.yaml
├── specs
│   ├── SPEC-1 Specification-Driven Development Process.md
│   ├── SPEC-10 Unified Deployment Workflow and Event Tracking.md
│   ├── SPEC-11 Basic Memory API Performance Optimization.md
│   ├── SPEC-12 OpenTelemetry Observability.md
│   ├── SPEC-13 CLI Authentication with Subscription Validation.md
│   ├── SPEC-14 Cloud Git Versioning & GitHub Backup.md
│   ├── SPEC-14- Cloud Git Versioning & GitHub Backup.md
│   ├── SPEC-15 Configuration Persistence via Tigris for Cloud Tenants.md
│   ├── SPEC-16 MCP Cloud Service Consolidation.md
│   ├── SPEC-17 Semantic Search with ChromaDB.md
│   ├── SPEC-18 AI Memory Management Tool.md
│   ├── SPEC-19 Sync Performance and Memory Optimization.md
│   ├── SPEC-2 Slash Commands Reference.md
│   ├── SPEC-20 Simplified Project-Scoped Rclone Sync.md
│   ├── SPEC-3 Agent Definitions.md
│   ├── SPEC-4 Notes Web UI Component Architecture.md
│   ├── SPEC-5 CLI Cloud Upload via WebDAV.md
│   ├── SPEC-6 Explicit Project Parameter Architecture.md
│   ├── SPEC-7 POC to spike Tigris Turso for local access to cloud data.md
│   ├── SPEC-8 TigrisFS Integration.md
│   ├── SPEC-9 Multi-Project Bidirectional Sync Architecture.md
│   ├── SPEC-9 Signed Header Tenant Information.md
│   └── SPEC-9-1 Follow-Ups- Conflict, Sync, and Observability.md
├── src
│   └── basic_memory
│       ├── __init__.py
│       ├── alembic
│       │   ├── alembic.ini
│       │   ├── env.py
│       │   ├── migrations.py
│       │   ├── script.py.mako
│       │   └── versions
│       │       ├── 314f1ea54dc4_add_postgres_full_text_search_support_.py
│       │       ├── 3dae7c7b1564_initial_schema.py
│       │       ├── 502b60eaa905_remove_required_from_entity_permalink.py
│       │       ├── 5fe1ab1ccebe_add_projects_table.py
│       │       ├── 647e7a75e2cd_project_constraint_fix.py
│       │       ├── 6830751f5fb6_merge_multiple_heads.py
│       │       ├── 9d9c1cb7d8f5_add_mtime_and_size_columns_to_entity_.py
│       │       ├── a1b2c3d4e5f6_fix_project_foreign_keys.py
│       │       ├── a2b3c4d5e6f7_add_search_index_entity_cascade.py
│       │       ├── b3c3938bacdb_relation_to_name_unique_index.py
│       │       ├── cc7172b46608_update_search_index_schema.py
│       │       ├── e7e1f4367280_add_scan_watermark_tracking_to_project.py
│       │       ├── f8a9b2c3d4e5_add_pg_trgm_for_fuzzy_link_resolution.py
│       │       └── g9a0b3c4d5e6_add_external_id_to_project_and_entity.py
│       ├── api
│       │   ├── __init__.py
│       │   ├── app.py
│       │   ├── container.py
│       │   ├── routers
│       │   │   ├── __init__.py
│       │   │   ├── directory_router.py
│       │   │   ├── importer_router.py
│       │   │   ├── knowledge_router.py
│       │   │   ├── management_router.py
│       │   │   ├── memory_router.py
│       │   │   ├── project_router.py
│       │   │   ├── prompt_router.py
│       │   │   ├── resource_router.py
│       │   │   ├── search_router.py
│       │   │   └── utils.py
│       │   ├── template_loader.py
│       │   └── v2
│       │       ├── __init__.py
│       │       └── routers
│       │           ├── __init__.py
│       │           ├── directory_router.py
│       │           ├── importer_router.py
│       │           ├── knowledge_router.py
│       │           ├── memory_router.py
│       │           ├── project_router.py
│       │           ├── prompt_router.py
│       │           ├── resource_router.py
│       │           └── search_router.py
│       ├── cli
│       │   ├── __init__.py
│       │   ├── app.py
│       │   ├── auth.py
│       │   ├── commands
│       │   │   ├── __init__.py
│       │   │   ├── cloud
│       │   │   │   ├── __init__.py
│       │   │   │   ├── api_client.py
│       │   │   │   ├── bisync_commands.py
│       │   │   │   ├── cloud_utils.py
│       │   │   │   ├── core_commands.py
│       │   │   │   ├── rclone_commands.py
│       │   │   │   ├── rclone_config.py
│       │   │   │   ├── rclone_installer.py
│       │   │   │   ├── upload_command.py
│       │   │   │   └── upload.py
│       │   │   ├── command_utils.py
│       │   │   ├── db.py
│       │   │   ├── format.py
│       │   │   ├── import_chatgpt.py
│       │   │   ├── import_claude_conversations.py
│       │   │   ├── import_claude_projects.py
│       │   │   ├── import_memory_json.py
│       │   │   ├── mcp.py
│       │   │   ├── project.py
│       │   │   ├── status.py
│       │   │   ├── telemetry.py
│       │   │   └── tool.py
│       │   ├── container.py
│       │   └── main.py
│       ├── config.py
│       ├── db.py
│       ├── deps
│       │   ├── __init__.py
│       │   ├── config.py
│       │   ├── db.py
│       │   ├── importers.py
│       │   ├── projects.py
│       │   ├── repositories.py
│       │   └── services.py
│       ├── deps.py
│       ├── file_utils.py
│       ├── ignore_utils.py
│       ├── importers
│       │   ├── __init__.py
│       │   ├── base.py
│       │   ├── chatgpt_importer.py
│       │   ├── claude_conversations_importer.py
│       │   ├── claude_projects_importer.py
│       │   ├── memory_json_importer.py
│       │   └── utils.py
│       ├── markdown
│       │   ├── __init__.py
│       │   ├── entity_parser.py
│       │   ├── markdown_processor.py
│       │   ├── plugins.py
│       │   ├── schemas.py
│       │   └── utils.py
│       ├── mcp
│       │   ├── __init__.py
│       │   ├── async_client.py
│       │   ├── clients
│       │   │   ├── __init__.py
│       │   │   ├── directory.py
│       │   │   ├── knowledge.py
│       │   │   ├── memory.py
│       │   │   ├── project.py
│       │   │   ├── resource.py
│       │   │   └── search.py
│       │   ├── container.py
│       │   ├── project_context.py
│       │   ├── prompts
│       │   │   ├── __init__.py
│       │   │   ├── ai_assistant_guide.py
│       │   │   ├── continue_conversation.py
│       │   │   ├── recent_activity.py
│       │   │   ├── search.py
│       │   │   └── utils.py
│       │   ├── resources
│       │   │   ├── ai_assistant_guide.md
│       │   │   └── project_info.py
│       │   ├── server.py
│       │   └── tools
│       │       ├── __init__.py
│       │       ├── build_context.py
│       │       ├── canvas.py
│       │       ├── chatgpt_tools.py
│       │       ├── delete_note.py
│       │       ├── edit_note.py
│       │       ├── list_directory.py
│       │       ├── move_note.py
│       │       ├── project_management.py
│       │       ├── read_content.py
│       │       ├── read_note.py
│       │       ├── recent_activity.py
│       │       ├── search.py
│       │       ├── utils.py
│       │       ├── view_note.py
│       │       └── write_note.py
│       ├── models
│       │   ├── __init__.py
│       │   ├── base.py
│       │   ├── knowledge.py
│       │   ├── project.py
│       │   └── search.py
│       ├── project_resolver.py
│       ├── repository
│       │   ├── __init__.py
│       │   ├── entity_repository.py
│       │   ├── observation_repository.py
│       │   ├── postgres_search_repository.py
│       │   ├── project_info_repository.py
│       │   ├── project_repository.py
│       │   ├── relation_repository.py
│       │   ├── repository.py
│       │   ├── search_index_row.py
│       │   ├── search_repository_base.py
│       │   ├── search_repository.py
│       │   └── sqlite_search_repository.py
│       ├── runtime.py
│       ├── schemas
│       │   ├── __init__.py
│       │   ├── base.py
│       │   ├── cloud.py
│       │   ├── delete.py
│       │   ├── directory.py
│       │   ├── importer.py
│       │   ├── memory.py
│       │   ├── project_info.py
│       │   ├── prompt.py
│       │   ├── request.py
│       │   ├── response.py
│       │   ├── search.py
│       │   ├── sync_report.py
│       │   └── v2
│       │       ├── __init__.py
│       │       ├── entity.py
│       │       └── resource.py
│       ├── services
│       │   ├── __init__.py
│       │   ├── context_service.py
│       │   ├── directory_service.py
│       │   ├── entity_service.py
│       │   ├── exceptions.py
│       │   ├── file_service.py
│       │   ├── initialization.py
│       │   ├── link_resolver.py
│       │   ├── project_service.py
│       │   ├── search_service.py
│       │   └── service.py
│       ├── sync
│       │   ├── __init__.py
│       │   ├── background_sync.py
│       │   ├── coordinator.py
│       │   ├── sync_service.py
│       │   └── watch_service.py
│       ├── telemetry.py
│       ├── templates
│       │   └── prompts
│       │       ├── continue_conversation.hbs
│       │       └── search.hbs
│       └── utils.py
├── test-int
│   ├── BENCHMARKS.md
│   ├── cli
│   │   ├── test_project_commands_integration.py
│   │   └── test_version_integration.py
│   ├── conftest.py
│   ├── mcp
│   │   ├── test_build_context_underscore.py
│   │   ├── test_build_context_validation.py
│   │   ├── test_chatgpt_tools_integration.py
│   │   ├── test_default_project_mode_integration.py
│   │   ├── test_delete_note_integration.py
│   │   ├── test_edit_note_integration.py
│   │   ├── test_lifespan_shutdown_sync_task_cancellation_integration.py
│   │   ├── test_list_directory_integration.py
│   │   ├── test_move_note_integration.py
│   │   ├── test_project_management_integration.py
│   │   ├── test_project_state_sync_integration.py
│   │   ├── test_read_content_integration.py
│   │   ├── test_read_note_integration.py
│   │   ├── test_search_integration.py
│   │   ├── test_single_project_mcp_integration.py
│   │   └── test_write_note_integration.py
│   ├── test_db_wal_mode.py
│   └── test_disable_permalinks_integration.py
├── tests
│   ├── __init__.py
│   ├── api
│   │   ├── conftest.py
│   │   ├── test_api_container.py
│   │   ├── test_async_client.py
│   │   ├── test_continue_conversation_template.py
│   │   ├── test_directory_router.py
│   │   ├── test_importer_router.py
│   │   ├── test_knowledge_router.py
│   │   ├── test_management_router.py
│   │   ├── test_memory_router.py
│   │   ├── test_project_router_operations.py
│   │   ├── test_project_router.py
│   │   ├── test_prompt_router.py
│   │   ├── test_relation_background_resolution.py
│   │   ├── test_resource_router.py
│   │   ├── test_search_router.py
│   │   ├── test_search_template.py
│   │   ├── test_template_loader_helpers.py
│   │   ├── test_template_loader.py
│   │   └── v2
│   │       ├── __init__.py
│   │       ├── conftest.py
│   │       ├── test_directory_router.py
│   │       ├── test_importer_router.py
│   │       ├── test_knowledge_router.py
│   │       ├── test_memory_router.py
│   │       ├── test_project_router.py
│   │       ├── test_prompt_router.py
│   │       ├── test_resource_router.py
│   │       └── test_search_router.py
│   ├── cli
│   │   ├── cloud
│   │   │   ├── test_cloud_api_client_and_utils.py
│   │   │   ├── test_rclone_config_and_bmignore_filters.py
│   │   │   └── test_upload_path.py
│   │   ├── conftest.py
│   │   ├── test_auth_cli_auth.py
│   │   ├── test_cli_container.py
│   │   ├── test_cli_exit.py
│   │   ├── test_cli_tool_exit.py
│   │   ├── test_cli_tools.py
│   │   ├── test_cloud_authentication.py
│   │   ├── test_ignore_utils.py
│   │   ├── test_import_chatgpt.py
│   │   ├── test_import_claude_conversations.py
│   │   ├── test_import_claude_projects.py
│   │   ├── test_import_memory_json.py
│   │   ├── test_project_add_with_local_path.py
│   │   └── test_upload.py
│   ├── conftest.py
│   ├── db
│   │   └── test_issue_254_foreign_key_constraints.py
│   ├── importers
│   │   ├── test_conversation_indexing.py
│   │   ├── test_importer_base.py
│   │   └── test_importer_utils.py
│   ├── markdown
│   │   ├── __init__.py
│   │   ├── test_date_frontmatter_parsing.py
│   │   ├── test_entity_parser_error_handling.py
│   │   ├── test_entity_parser.py
│   │   ├── test_markdown_plugins.py
│   │   ├── test_markdown_processor.py
│   │   ├── test_observation_edge_cases.py
│   │   ├── test_parser_edge_cases.py
│   │   ├── test_relation_edge_cases.py
│   │   └── test_task_detection.py
│   ├── mcp
│   │   ├── clients
│   │   │   ├── __init__.py
│   │   │   └── test_clients.py
│   │   ├── conftest.py
│   │   ├── test_async_client_modes.py
│   │   ├── test_mcp_container.py
│   │   ├── test_obsidian_yaml_formatting.py
│   │   ├── test_permalink_collision_file_overwrite.py
│   │   ├── test_project_context.py
│   │   ├── test_prompts.py
│   │   ├── test_recent_activity_prompt_modes.py
│   │   ├── test_resources.py
│   │   ├── test_server_lifespan_branches.py
│   │   ├── test_tool_build_context.py
│   │   ├── test_tool_canvas.py
│   │   ├── test_tool_delete_note.py
│   │   ├── test_tool_edit_note.py
│   │   ├── test_tool_list_directory.py
│   │   ├── test_tool_move_note.py
│   │   ├── test_tool_project_management.py
│   │   ├── test_tool_read_content.py
│   │   ├── test_tool_read_note.py
│   │   ├── test_tool_recent_activity.py
│   │   ├── test_tool_resource.py
│   │   ├── test_tool_search.py
│   │   ├── test_tool_utils.py
│   │   ├── test_tool_view_note.py
│   │   ├── test_tool_write_note_kebab_filenames.py
│   │   ├── test_tool_write_note.py
│   │   └── tools
│   │       └── test_chatgpt_tools.py
│   ├── Non-MarkdownFileSupport.pdf
│   ├── README.md
│   ├── repository
│   │   ├── test_entity_repository_upsert.py
│   │   ├── test_entity_repository.py
│   │   ├── test_entity_upsert_issue_187.py
│   │   ├── test_observation_repository.py
│   │   ├── test_postgres_search_repository.py
│   │   ├── test_project_info_repository.py
│   │   ├── test_project_repository.py
│   │   ├── test_relation_repository.py
│   │   ├── test_repository.py
│   │   ├── test_search_repository_edit_bug_fix.py
│   │   └── test_search_repository.py
│   ├── schemas
│   │   ├── test_base_timeframe_minimum.py
│   │   ├── test_memory_serialization.py
│   │   ├── test_memory_url_validation.py
│   │   ├── test_memory_url.py
│   │   ├── test_relation_response_reference_resolution.py
│   │   ├── test_schemas.py
│   │   └── test_search.py
│   ├── Screenshot.png
│   ├── services
│   │   ├── test_context_service.py
│   │   ├── test_directory_service.py
│   │   ├── test_entity_service_disable_permalinks.py
│   │   ├── test_entity_service.py
│   │   ├── test_file_service.py
│   │   ├── test_initialization_cloud_mode_branches.py
│   │   ├── test_initialization.py
│   │   ├── test_link_resolver.py
│   │   ├── test_project_removal_bug.py
│   │   ├── test_project_service_operations.py
│   │   ├── test_project_service.py
│   │   └── test_search_service.py
│   ├── sync
│   │   ├── test_character_conflicts.py
│   │   ├── test_coordinator.py
│   │   ├── test_sync_service_incremental.py
│   │   ├── test_sync_service.py
│   │   ├── test_sync_wikilink_issue.py
│   │   ├── test_tmp_files.py
│   │   ├── test_watch_service_atomic_adds.py
│   │   ├── test_watch_service_edge_cases.py
│   │   ├── test_watch_service_reload.py
│   │   └── test_watch_service.py
│   ├── test_config.py
│   ├── test_deps.py
│   ├── test_production_cascade_delete.py
│   ├── test_project_resolver.py
│   ├── test_rclone_commands.py
│   ├── test_runtime.py
│   ├── test_telemetry.py
│   └── utils
│       ├── test_file_utils.py
│       ├── test_frontmatter_obsidian_compatible.py
│       ├── test_parse_tags.py
│       ├── test_permalink_formatting.py
│       ├── test_timezone_utils.py
│       ├── test_utf8_handling.py
│       └── test_validate_project_path.py
└── uv.lock
```

# Files

--------------------------------------------------------------------------------
/specs/SPEC-16 MCP Cloud Service Consolidation.md:
--------------------------------------------------------------------------------

```markdown
  1 | ---
  2 | title: 'SPEC-16: MCP Cloud Service Consolidation'
  3 | type: spec
  4 | permalink: specs/spec-16-mcp-cloud-service-consolidation
  5 | tags:
  6 | - architecture
  7 | - mcp
  8 | - cloud
  9 | - performance
 10 | - deployment
 11 | status: in-progress
 12 | ---
 13 | 
 14 | ## Status Update
 15 | 
 16 | **Phase 0 (Basic Memory Refactor): ✅ COMPLETE**
 17 | - basic-memory PR #344: async_client context manager pattern implemented
 18 | - All 17 MCP tools updated to use `async with get_client() as client:`
 19 | - CLI commands updated to use context manager
 20 | - Removed `inject_auth_header()` and `headers.py` (~100 lines deleted)
 21 | - Factory pattern enables clean dependency injection
 22 | - Tests passing, typecheck clean
 23 | 
 24 | **Phase 0 Integration: ✅ COMPLETE**
 25 | - basic-memory-cloud updated to use async-client-context-manager branch
 26 | - Implemented `tenant_direct_client_factory()` with proper context manager pattern
 27 | - Removed module-level client override hacks
 28 | - Removed unnecessary `/proxy` prefix stripping (tools pass relative URLs)
 29 | - Typecheck and lint passing with proper noqa hints
 30 | - MCP tools confirmed working via inspector (local testing)
 31 | 
 32 | **Phase 1 (Code Consolidation): ✅ COMPLETE**
 33 | - MCP server mounted on Cloud FastAPI app at /mcp endpoint
 34 | - AuthKitProvider configured with WorkOS settings
 35 | - Combined lifespans (Cloud + MCP) working correctly
 36 | - JWT context middleware integrated
 37 | - All routes and MCP tools functional
 38 | 
 39 | **Phase 2 (Direct Tenant Transport): ✅ COMPLETE**
 40 | - TenantDirectTransport implemented with custom httpx transport
 41 | - Per-request JWT extraction via FastMCP DI
 42 | - Tenant lookup and signed header generation working
 43 | - Direct routing to tenant APIs (eliminating HTTP hop)
 44 | - Transport tests passing (11/11)
 45 | 
 46 | **Phase 3 (Testing & Validation): ✅ COMPLETE**
 47 | - Typecheck and lint passing across all services
 48 | - MCP OAuth authentication working in preview environment
 49 | - Tenant isolation via signed headers verified
 50 | - Fixed BM_TENANT_HEADER_SECRET mismatch between environments
 51 | - MCP tools successfully calling tenant APIs in preview
 52 | 
 53 | **Phase 4 (Deployment Configuration): ✅ COMPLETE**
 54 | - Updated apps/cloud/fly.template.toml with MCP environment variables
 55 | - Added HTTP/2 backend support for better MCP performance
 56 | - Added OAuth protected resource health check
 57 | - Removed MCP from preview deployment workflow
 58 | - Successfully deployed to preview environment (PR #113)
 59 | - All services operational at pr-113-basic-memory-cloud.fly.dev
 60 | 
 61 | **Next Steps:**
 62 | - Phase 5: Cleanup (remove apps/mcp directory)
 63 | - Phase 6: Production rollout and performance measurement
 64 | 
 65 | # SPEC-16: MCP Cloud Service Consolidation
 66 | 
 67 | ## Why
 68 | 
 69 | ### Original Architecture Constraints (Now Removed)
 70 | 
 71 | The current architecture deploys MCP Gateway and Cloud Service as separate Fly.io apps:
 72 | 
 73 | **Current Flow:**
 74 | ```
 75 | LLM Client → MCP Gateway (OAuth) → Cloud Proxy (JWT + header signing) → Tenant API (JWT + header validation)
 76 |             apps/mcp                apps/cloud /proxy                    apps/api
 77 | ```
 78 | 
 79 | This separation was originally necessary because:
 80 | 1. **Stateful SSE requirement** - MCP needed server-sent events with session state for active project tracking
 81 | 2. **fastmcp.run limitation** - The FastMCP demo helper didn't support worker processes
 82 | 
 83 | ### Why These Constraints No Longer Apply
 84 | 
 85 | 1. **State externalized** - Project state moved from in-memory to LLM context (external state)
 86 | 2. **HTTP transport enabled** - Switched from SSE to stateless HTTP for MCP tools
 87 | 3. **Worker support added** - Converted from `fastmcp.run()` to `uvicorn.run()` with workers
 88 | 
 89 | ### Current Problems
 90 | 
 91 | - **Unnecessary HTTP hop** - MCP tools call the Cloud /proxy endpoint, which in turn calls the tenant API
 92 | - **Higher latency** - Extra network round trip for every MCP operation
 93 | - **Increased costs** - Two separate Fly.io apps instead of one
 94 | - **Complex deployment** - Two services to deploy, monitor, and maintain
 95 | - **Resource waste** - Separate database connections, HTTP clients, telemetry overhead
 96 | 
 97 | ## What
 98 | 
 99 | ### Services Affected
100 | 
101 | 1. **apps/mcp** - MCP Gateway service (to be merged)
102 | 2. **apps/cloud** - Cloud service (will receive MCP functionality)
103 | 3. **basic-memory** - Update `async_client.py` to use direct calls
104 | 4. **Deployment** - Consolidate Fly.io deployment to single app
105 | 
106 | ### Components Changed
107 | 
108 | **Merged:**
109 | - MCP middleware and telemetry into Cloud app
110 | - MCP tools mounted on Cloud FastAPI instance
111 | - ProxyService used directly by MCP tools (not via HTTP)
112 | 
113 | **Kept:**
114 | - `/proxy` endpoint (still needed by web UI)
115 | - All existing Cloud routes (provisioning, webhooks, etc.)
116 | - Dual validation in tenant API (JWT + signed headers)
117 | 
118 | **Removed:**
119 | - apps/mcp directory
120 | - Separate MCP Fly.io deployment
121 | - HTTP calls from MCP tools to /proxy endpoint
122 | 
123 | ## How (High Level)
124 | 
125 | ### 1. Mount FastMCP on Cloud FastAPI App
126 | 
127 | ```python
128 | # apps/cloud/src/basic_memory_cloud/main.py
129 | 
130 | from basic_memory.mcp.server import mcp
131 | from basic_memory_cloud_mcp.middleware import TelemetryMiddleware
132 | 
133 | # Configure MCP OAuth
134 | auth_provider = AuthKitProvider(
135 |     authkit_domain=settings.authkit_domain,
136 |     base_url=settings.authkit_base_url,
137 |     required_scopes=[],
138 | )
139 | mcp.auth = auth_provider
140 | mcp.add_middleware(TelemetryMiddleware())
141 | 
142 | # Mount MCP at /mcp endpoint
143 | mcp_app = mcp.http_app(path="/mcp", stateless_http=True)
144 | app.mount("/mcp", mcp_app)
145 | 
146 | # Existing Cloud routes stay at root
147 | app.include_router(proxy_router)
148 | app.include_router(provisioning_router)
149 | # ... etc
150 | ```
151 | 
152 | ### 2. Direct Tenant Transport (No HTTP Hop)
153 | 
154 | Instead of calling `/proxy`, MCP tools call tenant APIs directly via custom httpx transport.
155 | 
156 | **Important:** No URL prefix stripping needed. The transport receives relative URLs like `/main/resource/notes/my-note` which are correctly routed to tenant APIs. The `/proxy` prefix only exists for web UI requests to the proxy router, not for MCP tools using the custom transport.
157 | 
158 | ```python
159 | # apps/cloud/src/basic_memory_cloud/transports/tenant_direct.py
160 | 
161 | from httpx import AsyncBaseTransport, Request, Response
162 | from fastmcp.server.dependencies import get_http_headers
163 | import jwt
164 | 
165 | class TenantDirectTransport(AsyncBaseTransport):
166 |     """Direct transport to tenant APIs, bypassing /proxy endpoint."""
167 |     # __init__ (omitted): creates self.client (a shared httpx.AsyncClient) and wires tenant_service/settings
168 |     async def handle_async_request(self, request: Request) -> Response:
169 |         # 1. Get JWT from current MCP request (via FastMCP DI)
170 |         http_headers = get_http_headers()
171 |         auth_header = http_headers.get("authorization") or http_headers.get("Authorization")
172 |         token = auth_header.replace("Bearer ", "")
173 |         claims = jwt.decode(token, options={"verify_signature": False})
174 |         workos_user_id = claims["sub"]
175 | 
176 |         # 2. Look up tenant for user
177 |         tenant = await tenant_service.get_tenant_by_user_id(workos_user_id)
178 | 
179 |         # 3. Build tenant app URL with signed headers
180 |         fly_app_name = f"{settings.tenant_prefix}-{tenant.id}"
181 |         target_url = f"https://{fly_app_name}.fly.dev{request.url.path}"
182 | 
183 |         headers = dict(request.headers)
184 |         signer = create_signer(settings.bm_tenant_header_secret)
185 |         headers.update(signer.sign_tenant_headers(tenant.id))
186 | 
187 |         # 4. Make direct call to tenant API
188 |         response = await self.client.request(
189 |             method=request.method, url=target_url,
190 |             headers=headers, content=request.content
191 |         )
192 |         return response
193 | ```
194 | 
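The unverified decode in step 1 (PyJWT's `jwt.decode(..., options={"verify_signature": False})`) simply base64url-decodes the payload segment without checking the signature. A stdlib-only sketch of what the transport reads (the helper names and token here are illustrative, not part of the codebase):

```python
# Stdlib illustration of the unverified claim extraction done in step 1 above;
# decode_claims_unverified / _seg are hypothetical helpers for this demo only.
import base64
import json

def decode_claims_unverified(token: str) -> dict:
    """Read a JWT's claims without checking the signature."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def _seg(obj: dict) -> str:
    """base64url-encode one JWT segment, padding stripped."""
    return base64.urlsafe_b64encode(json.dumps(obj).encode()).decode().rstrip("=")

# A throwaway header.payload.signature token just for the demo
token = ".".join([_seg({"alg": "none"}), _seg({"sub": "user_123"}), ""])
print(decode_claims_unverified(token)["sub"])  # -> user_123
```

Because nothing is verified at this layer, the tenant API must still validate the JWT itself, which is exactly the dual-validation requirement in section 4.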
195 | Then configure basic-memory's client factory before mounting MCP:
196 | 
197 | ```python
198 | # apps/cloud/src/basic_memory_cloud/main.py
199 | import httpx
200 | from contextlib import asynccontextmanager
201 | from basic_memory.mcp import async_client
202 | from basic_memory_cloud.transports.tenant_direct import TenantDirectTransport
203 | 
204 | # Configure factory for basic-memory's async_client
205 | @asynccontextmanager
206 | async def tenant_direct_client_factory():
207 |     """Factory for creating clients with tenant direct transport."""
208 |     client = httpx.AsyncClient(
209 |         transport=TenantDirectTransport(),
210 |         base_url="http://direct",
211 |     )
212 |     try:
213 |         yield client
214 |     finally:
215 |         await client.aclose()
216 | 
217 | # Set factory BEFORE importing MCP tools
218 | async_client.set_client_factory(tenant_direct_client_factory)
219 | 
220 | # NOW import - tools will use our factory
221 | import basic_memory.mcp.tools
222 | import basic_memory.mcp.prompts
223 | from basic_memory.mcp.server import mcp
224 | 
225 | # Mount MCP (mcp_app from the mcp.http_app() call in step 1) - tools use the factory's direct transport
226 | app.mount("/mcp", mcp_app)
227 | ```
228 | 
229 | **Key benefits:**
230 | - Clean dependency injection via factory pattern
231 | - Per-request tenant resolution via FastMCP DI
232 | - Proper resource cleanup (client.aclose() guaranteed)
233 | - Eliminates HTTP hop entirely
234 | - /proxy endpoint remains for web UI
235 | 
236 | ### 3. Keep /proxy Endpoint for Web UI
237 | 
238 | The existing `/proxy` HTTP endpoint remains functional for:
239 | - Web UI requests
240 | - Future external API consumers
241 | - Backward compatibility
242 | 
243 | ### 4. Security: Maintain Dual Validation
244 | 
245 | **Do NOT remove JWT validation from tenant API.** Keep defense in depth:
246 | 
247 | ```python
248 | # apps/api - Keep both validations
249 | # 1. JWT validation (from WorkOS token)
250 | # 2. Signed header validation (from Cloud/MCP)
251 | ```
252 | 
253 | This ensures if the Cloud service is compromised, attackers still cannot access tenant APIs without valid JWTs.
254 | 
255 | ### 5. Deployment Changes
256 | 
257 | **Before:**
258 | - `apps/mcp/fly.template.toml` → MCP Gateway deployment
259 | - `apps/cloud/fly.template.toml` → Cloud Service deployment
260 | 
261 | **After:**
262 | - Remove `apps/mcp/fly.template.toml`
263 | - Update `apps/cloud/fly.template.toml` to expose port 8000 for both /mcp and /proxy
264 | - Update deployment scripts to deploy single consolidated app
265 | 
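A rough sketch of the consolidated Fly config (the keys shown are standard fly.toml options, but the values and check path here are illustrative, not the actual template):

```toml
# Illustrative fragment only - see apps/cloud/fly.template.toml for the real one
[http_service]
  internal_port = 8000      # single app now serves both /mcp and /proxy
  h2_backend = true         # HTTP/2 to the backend for MCP performance

  [[http_service.checks]]
    method = "GET"
    path = "/.well-known/oauth-protected-resource"  # assumed health-check path
    interval = "30s"
    timeout = "5s"
```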
266 | 
267 | ## Basic Memory Dependency: Async Client Refactor
268 | 
269 | ### Problem
270 | The current `basic_memory.mcp.async_client` creates a module-level `client` at import time:
271 | ```python
272 | client = create_client()  # Runs immediately when module is imported
273 | ```
274 | 
275 | This prevents dependency injection - by the time we can override it, tools have already imported it.
276 | 
277 | ### Solution: Context Manager Pattern with Auth at Client Creation
278 | 
279 | Refactor basic-memory to use httpx's context manager pattern instead of module-level client.
280 | 
281 | **Key principle:** Authentication happens at client creation time, not per-request.
282 | 
283 | ```python
284 | # basic_memory/src/basic_memory/mcp/async_client.py
285 | from contextlib import asynccontextmanager
286 | from httpx import AsyncClient, ASGITransport, Timeout
287 | # (ConfigManager and fastapi_app imports omitted for brevity)
288 | # Optional factory override for dependency injection
289 | _client_factory = None
290 | 
291 | def set_client_factory(factory):
292 |     """Override the default client factory (for cloud app, testing, etc)."""
293 |     global _client_factory
294 |     _client_factory = factory
295 | 
296 | @asynccontextmanager
297 | async def get_client():
298 |     """Get an AsyncClient as a context manager.
299 | 
300 |     Usage:
301 |         async with get_client() as client:
302 |             response = await client.get(...)
303 |     """
304 |     if _client_factory:
305 |         # Cloud app: custom transport handles everything
306 |         async with _client_factory() as client:
307 |             yield client
308 |     else:
309 |         # Default: create based on config
310 |         config = ConfigManager().config
311 |         timeout = Timeout(connect=10.0, read=30.0, write=30.0, pool=30.0)
312 | 
313 |         if config.cloud_mode_enabled:
314 |             # CLI cloud mode: inject auth when creating client
315 |             from basic_memory.cli.auth import CLIAuth
316 | 
317 |             auth = CLIAuth(
318 |                 client_id=config.cloud_client_id,
319 |                 authkit_domain=config.cloud_domain
320 |             )
321 |             token = await auth.get_valid_token()
322 | 
323 |             if not token:
324 |                 raise RuntimeError(
325 |                     "Cloud mode enabled but not authenticated. "
326 |                     "Run 'basic-memory cloud login' first."
327 |                 )
328 | 
329 |             # Auth header set ONCE at client creation
330 |             async with AsyncClient(
331 |                 base_url=f"{config.cloud_host}/proxy",
332 |                 headers={"Authorization": f"Bearer {token}"},
333 |                 timeout=timeout
334 |             ) as client:
335 |                 yield client
336 |         else:
337 |             # Local mode: ASGI transport
338 |             async with AsyncClient(
339 |                 transport=ASGITransport(app=fastapi_app),
340 |                 base_url="http://test",
341 |                 timeout=timeout
342 |             ) as client:
343 |                 yield client
344 | ```
345 | 
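The factory-override mechanics above can be exercised without httpx at all; this stdlib-only model (string stand-ins for clients, names mirroring the spec) shows the dispatch:

```python
# Minimal stdlib model of the set_client_factory()/get_client() pattern above;
# strings stand in for httpx.AsyncClient instances.
import asyncio
from contextlib import asynccontextmanager

_client_factory = None

def set_client_factory(factory):
    """Override the default client factory (cloud app, tests, etc.)."""
    global _client_factory
    _client_factory = factory

@asynccontextmanager
async def get_client():
    if _client_factory:
        async with _client_factory() as client:
            yield client
    else:
        yield "default-client"  # stands in for the config-driven default

@asynccontextmanager
async def fake_cloud_factory():
    yield "tenant-direct-client"  # stands in for the custom-transport client

async def main():
    async with get_client() as c:
        default = c
    set_client_factory(fake_cloud_factory)  # cloud app does this before tool imports
    async with get_client() as c:
        overridden = c
    return default, overridden

print(asyncio.run(main()))  # -> ('default-client', 'tenant-direct-client')
```

The same ordering constraint applies here as in the real code: the factory must be set before any caller enters `get_client()`.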
346 | **Tool Updates:**
347 | ```python
348 | # Before: from basic_memory.mcp.async_client import client
349 | from basic_memory.mcp.async_client import get_client
350 | 
351 | async def read_note(...):
352 |     # Before: response = await call_get(client, path, ...)
353 |     async with get_client() as client:
354 |         response = await call_get(client, path, ...)
355 |         # ... use response
356 | ```
357 | 
358 | **Cloud Usage:**
359 | ```python
360 | from contextlib import asynccontextmanager
361 | from basic_memory.mcp import async_client
362 | import httpx
363 | @asynccontextmanager
364 | async def tenant_direct_client():
365 |     """Factory for creating clients with tenant direct transport."""
366 |     client = httpx.AsyncClient(
367 |         transport=TenantDirectTransport(),
368 |         base_url="http://direct",
369 |     )
370 |     try:
371 |         yield client
372 |     finally:
373 |         await client.aclose()
374 | 
375 | # Before importing MCP tools:
376 | async_client.set_client_factory(tenant_direct_client)
377 | 
378 | # Now import - tools will use our factory
379 | import basic_memory.mcp.tools
380 | ```
381 | 
382 | ### Benefits
383 | - **No module-level state** - client created only when needed
384 | - **Proper cleanup** - context manager ensures `aclose()` is called
385 | - **Easy dependency injection** - factory pattern allows custom clients
386 | - **httpx best practices** - follows official recommendations
387 | - **Works for all modes** - stdio, cloud, testing
388 | 
389 | ### Architecture Simplification: Auth at Client Creation
390 | 
391 | **Key design principle:** Authentication happens when creating the client, not on every request.
392 | 
393 | **Three modes, three approaches:**
394 | 
395 | 1. **Local mode (ASGI)**
396 |    - No auth needed
397 |    - Direct in-process calls via ASGITransport
398 | 
399 | 2. **CLI cloud mode (HTTP)**
400 |    - Auth token from CLIAuth (stored in ~/.basic-memory/basic-memory-cloud.json)
401 |    - Injected as default header when creating AsyncClient
402 |    - Single auth check at client creation time
403 | 
404 | 3. **Cloud app mode (Custom Transport)**
405 |    - TenantDirectTransport handles everything
406 |    - Extracts JWT from FastMCP context per-request
407 |    - No interaction with inject_auth_header() logic
408 | 
409 | **What this removes:**
410 | - `src/basic_memory/mcp/tools/headers.py` - entire file deleted
411 | - `inject_auth_header()` calls in all request helpers (call_get, call_post, etc.)
412 | - Per-request header manipulation complexity
413 | - Circular dependency concerns between async_client and auth logic
414 | 
415 | **Benefits:**
416 | - Cleaner separation of concerns
417 | - Simpler request helper functions
418 | - Auth happens at the right layer (client creation)
419 | - Cloud app transport is completely independent
420 | 
421 | ### Refactor Summary
422 | 
423 | This refactor achieves:
424 | 
425 | **Simplification:**
426 | - Removes ~100 lines of per-request header injection logic
427 | - Deletes entire `headers.py` module
428 | - Auth happens once at client creation, not per-request
429 | 
430 | **Decoupling:**
431 | - Cloud app's custom transport is completely independent
432 | - No interaction with basic-memory's auth logic
433 | - Each mode (local, CLI cloud, cloud app) has clean separation
434 | 
435 | **Better Design:**
436 | - Follows httpx best practices (context managers)
437 | - Proper resource cleanup (client.aclose() guaranteed)
438 | - Easier testing via factory injection
439 | - No circular import risks
440 | 
441 | **Three Distinct Modes:**
442 | 1. Local: ASGI transport, no auth
443 | 2. CLI cloud: HTTP transport with CLIAuth token injection
444 | 3. Cloud app: Custom transport with per-request tenant routing
445 | 
446 | ### Implementation Plan Summary
447 | 1. Create branch `async-client-context-manager` in basic-memory
448 | 2. Update `async_client.py` with context manager pattern and CLIAuth integration
449 | 3. Remove `inject_auth_header()` from all request helpers
450 | 4. Delete `src/basic_memory/mcp/tools/headers.py`
451 | 5. Update all MCP tools to use `async with get_client() as client:`
452 | 6. Update CLI commands to use context manager and remove manual auth
453 | 7. Remove `api_url` config field
454 | 8. Update tests
455 | 9. Update basic-memory-cloud to use branch: `basic-memory @ git+https://github.com/basicmachines-co/basic-memory.git@async-client-context-manager`
456 | 
457 | Detailed breakdown in Phase 0 tasks below.
458 | 
459 | ### Implementation Notes
460 | 
461 | **Potential Issues & Solutions:**
462 | 
463 | 1. **Circular Import** (async_client imports CLIAuth)
464 |    - **Risk:** CLIAuth might import something from async_client
465 |    - **Solution:** Use lazy import inside `get_client()` function
466 |    - **Already done:** Import is inside the function, not at module level
467 | 
468 | 2. **Test Fixtures**
469 |    - **Risk:** Tests using module-level client will break
470 |    - **Solution:** Update fixtures to use factory pattern
471 |    - **Example:**
472 |      ```python
473 |      @pytest.fixture
474 |      def mock_client_factory():
475 |          @asynccontextmanager
476 |          async def factory():
477 |              async with AsyncClient(...) as client:
478 |                  yield client
479 |          return factory
480 |      ```
481 | 
482 | 3. **Performance**
483 |    - **Risk:** Creating client per tool call might be expensive
483 |    - **Reality:** httpx is designed for this pattern; connection pooling happens at the transport level
485 |    - **Mitigation:** Monitor performance, can optimize later if needed
486 | 
487 | 4. **CLI Cloud Commands Edge Cases**
488 |    - **Risk:** Token expires mid-operation
489 |    - **Solution:** CLIAuth.get_valid_token() already handles refresh
490 |    - **Validation:** Test cloud login → use tools → token refresh flow
491 | 
492 | 5. **Backward Compatibility**
493 |    - **Risk:** External code importing `client` directly
494 |    - **Solution:** Keep `create_client()` and `client` for one version, deprecate
495 |    - **Timeline:** Remove in next major version
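
The lazy-import timing in issue 1 can be demonstrated with stubs (the recording list stands in for the real CLIAuth import, purely to show *when* it runs):

```python
from contextlib import asynccontextmanager

IMPORTED = []


@asynccontextmanager
async def get_client():
    # The auth import happens here, at call time rather than module import
    # time, so an "async_client -> auth -> async_client" cycle never forms.
    IMPORTED.append("CLIAuth")  # stand-in for the real CLIAuth import
    yield "client"


async def demo():
    assert IMPORTED == []  # merely defining get_client imports nothing
    async with get_client() as client:
        return client
```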
496 | 
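One possible shape for the backward-compatibility shim in issue 5 (the body is a stub; only the deprecation-warning mechanics are the point):

```python
import warnings


def create_client():
    """Deprecated: kept for one version so external imports keep working."""
    warnings.warn(
        "create_client() is deprecated; use 'async with get_client() as client:' instead",
        DeprecationWarning,
        stacklevel=2,
    )
    return None  # real code would construct and return the AsyncClient here
```
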
497 | ## Implementation Tasks
498 | 
499 | ### Phase 0: Basic Memory Refactor (Prerequisite)
500 | 
501 | #### 0.1 Core Refactor - async_client.py
502 | - [x] Create branch `async-client-context-manager` in basic-memory repo
503 | - [x] Implement `get_client()` context manager
504 | - [x] Implement `set_client_factory()` for dependency injection
505 | - [x] Add CLI cloud mode auth injection (CLIAuth integration)
506 | - [x] Remove `api_url` config field (legacy, unused)
507 | - [x] Keep `create_client()` temporarily for backward compatibility (deprecate later)
508 | 
509 | #### 0.2 Simplify Request Helpers - tools/utils.py
510 | - [x] Remove `inject_auth_header()` calls from `call_get()`
511 | - [x] Remove `inject_auth_header()` calls from `call_post()`
512 | - [x] Remove `inject_auth_header()` calls from `call_put()`
513 | - [x] Remove `inject_auth_header()` calls from `call_patch()`
514 | - [x] Remove `inject_auth_header()` calls from `call_delete()`
515 | - [x] Delete `src/basic_memory/mcp/tools/headers.py` entirely
516 | - [x] Update imports in utils.py
517 | 
518 | #### 0.3 Update MCP Tools (~16 files)
519 | Convert from `from async_client import client` to `async with get_client() as client:`
520 | 
521 | - [x] `tools/write_note.py` (34/34 tests passing)
522 | - [x] `tools/read_note.py` (21/21 tests passing)
523 | - [x] `tools/view_note.py` (12/12 tests passing - no changes needed, delegates to read_note)
524 | - [x] `tools/delete_note.py` (2/2 tests passing)
525 | - [x] `tools/read_content.py` (20/20 tests passing)
526 | - [x] `tools/list_directory.py` (11/11 tests passing)
527 | - [x] `tools/move_note.py` (34/34 tests passing, 90% coverage)
528 | - [x] `tools/search.py` (16/16 tests passing, 96% coverage)
529 | - [x] `tools/recent_activity.py` (4/4 tests passing, 82% coverage)
530 | - [x] `tools/project_management.py` (3 functions: list_memory_projects, create_memory_project, delete_project - typecheck passed)
531 | - [x] `tools/edit_note.py` (17/17 tests passing)
532 | - [x] `tools/canvas.py` (5/5 tests passing)
533 | - [x] `tools/build_context.py` (6/6 tests passing)
534 | - [x] `tools/sync_status.py` (typecheck passed)
535 | - [x] `prompts/continue_conversation.py` (typecheck passed)
536 | - [x] `prompts/search.py` (typecheck passed)
537 | - [x] `resources/project_info.py` (typecheck passed)
538 | 
539 | #### 0.4 Update CLI Commands (~3 files)
540 | Remove manual auth header passing, use context manager:
541 | 
542 | - [x] `cli/commands/project.py` - removed get_authenticated_headers() calls, use context manager
543 | - [x] `cli/commands/status.py` - use context manager
544 | - [x] `cli/commands/command_utils.py` - use context manager
545 | 
546 | #### 0.5 Update Config
547 | - [x] Remove `api_url` field from `BasicMemoryConfig` in config.py
548 | - [x] Update any lingering references/docs (added deprecation notice to v15-docs/cloud-mode-usage.md)
549 | 
550 | #### 0.6 Testing
551 | - [-] Update test fixtures to use factory pattern
552 | - [x] Run full test suite in basic-memory
553 | - [x] Verify cloud_mode_enabled works with CLIAuth injection
554 | - [x] Run typecheck and linting
555 | 
556 | #### 0.7 Cloud Integration Prep
557 | - [x] Update basic-memory-cloud pyproject.toml to use branch
558 | - [x] Implement factory pattern in cloud app main.py
559 | - [x] Remove `/proxy` prefix stripping logic (not needed - tools pass relative URLs)
560 | 
561 | #### 0.8 Phase 0 Validation
562 | 
563 | **Before merging async-client-context-manager branch:**
564 | 
565 | - [x] All tests pass locally
566 | - [x] Typecheck passes (pyright/mypy)
567 | - [x] Linting passes (ruff)
568 | - [x] Manual test: local mode works (ASGI transport)
569 | - [x] Manual test: cloud login → cloud mode works (HTTP transport with auth)
570 | - [x] No import of `inject_auth_header` anywhere
571 | - [x] `headers.py` file deleted
572 | - [x] `api_url` config removed
573 | - [x] Tool functions properly scoped (client inside async with)
574 | - [ ] CLI commands properly scoped (client inside async with)
575 | 
576 | **Integration validation:**
577 | - [x] basic-memory-cloud can import and use factory pattern
578 | - [x] TenantDirectTransport works without touching header injection
579 | - [x] No circular imports or lazy import issues
580 | - [x] MCP tools work via inspector (local testing confirmed)
581 | 
582 | ### Phase 1: Code Consolidation
583 | - [x] Create feature branch `consolidate-mcp-cloud`
584 | - [x] Update `apps/cloud/src/basic_memory_cloud/config.py`:
585 |   - [x] Add `authkit_base_url` field (already has authkit_domain)
586 |   - [x] Workers config already exists ✓
587 | - [x] Update `apps/cloud/src/basic_memory_cloud/telemetry.py`:
588 |   - [x] Add `logfire.instrument_mcp()` to existing setup
589 |   - [x] Skip complex two-phase setup - use Cloud's simpler approach
590 | - [x] Create `apps/cloud/src/basic_memory_cloud/middleware/jwt_context.py`:
591 |   - [x] FastAPI middleware to extract JWT claims from Authorization header
592 |   - [x] Add tenant context (workos_user_id) to logfire baggage
593 |   - [x] Simpler than FastMCP middleware version
594 | - [x] Update `apps/cloud/src/basic_memory_cloud/main.py`:
595 |   - [x] Import FastMCP server from basic-memory
596 |   - [x] Configure AuthKitProvider with WorkOS settings
597 |   - [x] No FastMCP telemetry middleware needed (using FastAPI middleware instead)
598 |   - [x] Create MCP ASGI app: `mcp_app = mcp.http_app(path='/mcp', stateless_http=True)`
599 |   - [x] Combine lifespans (Cloud + MCP) using nested async context managers
600 |   - [x] Mount MCP: `app.mount("/mcp", mcp_app)`
601 |   - [x] Add JWT context middleware to FastAPI app
602 | - [x] Run typecheck - passes ✓
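
The combined-lifespan step above can be sketched with stub lifespans; the nesting is the point, since it gives LIFO teardown (MCP stops before Cloud does):

```python
from contextlib import asynccontextmanager

EVENTS = []


@asynccontextmanager
async def cloud_lifespan(app):  # stand-in for the existing Cloud lifespan
    EVENTS.append("cloud start")
    yield
    EVENTS.append("cloud stop")


@asynccontextmanager
async def mcp_lifespan(app):  # stand-in for mcp_app's lifespan
    EVENTS.append("mcp start")
    yield
    EVENTS.append("mcp stop")


@asynccontextmanager
async def combined_lifespan(app):
    # MCP starts inside Cloud's lifespan and tears down first.
    async with cloud_lifespan(app), mcp_lifespan(app):
        yield
```

FastAPI would receive this as `FastAPI(lifespan=combined_lifespan)` alongside the `app.mount("/mcp", mcp_app)` call from the checklist.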
603 | 
604 | ### Phase 2: Direct Tenant Transport
605 | - [x] Create `apps/cloud/src/basic_memory_cloud/transports/tenant_direct.py`:
606 |   - [x] Implement `TenantDirectTransport(AsyncBaseTransport)`
607 |   - [x] Use FastMCP DI (`get_http_headers()`) to extract JWT per-request
608 |   - [x] Decode JWT to get `workos_user_id`
609 |   - [x] Look up/create tenant via `TenantRepository.get_or_create_tenant_for_workos_user()`
610 |   - [x] Build tenant app URL and add signed headers
611 |   - [x] Make direct httpx call to tenant API
612 |   - [x] No `/proxy` prefix stripping needed (tools pass relative URLs like `/main/resource/...`)
613 | - [x] Update `apps/cloud/src/basic_memory_cloud/main.py`:
614 |   - [x] Refactored to use factory pattern instead of module-level override
615 |   - [x] Implement `tenant_direct_client_factory()` context manager
616 |   - [x] Call `async_client.set_client_factory()` before importing MCP tools
617 |   - [x] Clean imports, proper noqa hints for lint
618 | - [x] Basic-memory refactor integrated (PR #344)
619 | - [x] Run typecheck - passes ✓
620 | - [x] Run lint - passes ✓
621 | 
622 | ### Phase 3: Testing & Validation
623 | - [x] Run `just typecheck` in apps/cloud
624 | - [x] Run `just check` in project
625 | - [x] Run `just fix` - all lint errors fixed ✓
626 | - [x] Write comprehensive transport tests (11 tests passing) ✓
627 | - [x] Test MCP tools locally with consolidated service (inspector confirmed working)
628 | - [x] Verify OAuth authentication works (requires full deployment)
629 | - [x] Verify tenant isolation via signed headers (requires full deployment)
630 | - [x] Test /proxy endpoint still works for web UI
631 | - [ ] Measure latency before/after consolidation
632 | - [ ] Check telemetry traces span correctly
633 | 
634 | ### Phase 4: Deployment Configuration
635 | - [x] Update `apps/cloud/fly.template.toml`:
636 |   - [x] Merged MCP-specific environment variables (AUTHKIT_BASE_URL, FASTMCP_LOG_LEVEL, BASIC_MEMORY_*)
637 |   - [x] Added HTTP/2 backend support (`h2_backend = true`) for better MCP performance
638 |   - [x] Added health check for MCP OAuth endpoint (`/.well-known/oauth-protected-resource`)
639 |   - [x] Port 8000 already exposed - serves both Cloud routes and /mcp endpoint
640 |   - [x] Workers configured (UVICORN_WORKERS = 4)
641 | - [x] Update `.env.example`:
642 |   - [x] Consolidated MCP Gateway section into Cloud app section
643 |   - [x] Added AUTHKIT_BASE_URL, FASTMCP_LOG_LEVEL, BASIC_MEMORY_HOME
644 |   - [x] Added LOG_LEVEL to Development Settings
645 |   - [x] Documented that MCP now served at /mcp on Cloud service (port 8000)
646 | - [x] Test deployment to preview environment (PR #113)
647 |   - [x] OAuth authentication verified
648 |   - [x] MCP tools successfully calling tenant APIs
649 |   - [x] Fixed BM_TENANT_HEADER_SECRET synchronization issue
650 | 
651 | ### Phase 5: Cleanup
652 | - [x] Remove `apps/mcp/` directory entirely
653 | - [x] Remove MCP-specific fly.toml and deployment configs
654 | - [x] Update repository documentation
655 | - [x] Update CLAUDE.md with new architecture
656 | - [-] Archive old MCP deployment configs (if needed)
657 | 
658 | ### Phase 6: Production Rollout
659 | - [ ] Deploy to development and validate
660 | - [ ] Monitor metrics and logs
661 | - [ ] Deploy to production
662 | - [ ] Verify production functionality
663 | - [ ] Document performance improvements
664 | 
665 | ## Migration Plan
666 | 
667 | ### Phase 1: Preparation
668 | 1. Create feature branch `consolidate-mcp-cloud`
669 | 2. Update basic-memory async_client.py for direct ProxyService calls
670 | 3. Update apps/cloud/main.py to mount MCP
671 | 
672 | ### Phase 2: Testing
673 | 1. Local testing with consolidated app
674 | 2. Deploy to development environment
675 | 3. Run full test suite
676 | 4. Performance benchmarking
677 | 
678 | ### Phase 3: Deployment
679 | 1. Deploy to development
680 | 2. Validate all functionality
681 | 3. Deploy to production
682 | 4. Monitor for issues
683 | 
684 | ### Phase 4: Cleanup
685 | 1. Remove apps/mcp directory
686 | 2. Update documentation
687 | 3. Update deployment scripts
688 | 4. Archive old MCP deployment configs
689 | 
690 | ## Rollback Plan
691 | 
692 | If issues arise:
693 | 1. Revert feature branch
694 | 2. Redeploy separate apps/mcp and apps/cloud services
695 | 3. Restore previous fly.toml configurations
696 | 4. Document issues encountered
697 | 
698 | The well-organized code structure makes splitting back out feasible if future scaling needs diverge.
699 | 
700 | ## How to Evaluate
701 | 
702 | ### 1. Functional Testing
703 | 
704 | **MCP Tools:**
705 | - [ ] All 17 MCP tools work via consolidated /mcp endpoint
706 | - [x] OAuth authentication validates correctly
707 | - [x] Tenant isolation maintained via signed headers
708 | - [x] Project management tools function correctly
709 | 
710 | **Cloud Routes:**
711 | - [x] /proxy endpoint still works for web UI
712 | - [x] /provisioning routes functional
713 | - [x] /webhooks routes functional
714 | - [x] /tenants routes functional
715 | 
716 | **API Validation:**
717 | - [x] Tenant API validates both JWT and signed headers
718 | - [x] Unauthorized requests rejected appropriately
719 | - [x] Multi-tenant isolation verified
720 | 
721 | ### 2. Performance Testing
722 | 
723 | **Latency Reduction:**
724 | - [x] Measure MCP tool latency before consolidation
725 | - [x] Measure MCP tool latency after consolidation
726 | - [x] Verify reduction from eliminated HTTP hop (expected: 20-50ms improvement)
727 | 
728 | **Resource Usage:**
729 | - [x] Single app uses less total memory than two apps
730 | - [x] Database connection pooling more efficient
731 | - [x] HTTP client overhead reduced
732 | 
733 | ### 3. Deployment Testing
734 | 
735 | **Fly.io Deployment:**
736 | - [x] Single app deploys successfully
737 | - [x] Health checks pass for consolidated service
738 | - [x] No apps/mcp deployment required
739 | - [x] Environment variables configured correctly
740 | 
741 | **Local Development:**
742 | - [x] `just setup` works with consolidated architecture
743 | - [x] Local testing shows MCP tools working
744 | - [x] No regression in developer experience
745 | 
746 | ### 4. Security Validation
747 | 
748 | **Defense in Depth:**
749 | - [x] Tenant API still validates JWT tokens
750 | - [x] Tenant API still validates signed headers
751 | - [x] No access possible with only signed headers (JWT required)
752 | - [x] No access possible with only JWT (signed headers required)
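
The invariant being verified can be stated as a small predicate. The header names and validator callables here are hypothetical stand-ins, not the tenant API's actual ones:

```python
def authorize(headers: dict, valid_jwt, valid_signature) -> int:
    """Allow a request only when BOTH credentials check out."""
    jwt_ok = valid_jwt(headers.get("authorization", ""))
    sig_ok = valid_signature(headers.get("x-tenant-signature", ""))  # hypothetical header name
    return 200 if (jwt_ok and sig_ok) else 401
```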
753 | 
754 | **Authorization:**
755 | - [x] Users can only access their own tenant data
756 | - [x] Cross-tenant requests rejected
757 | - [x] Admin operations require proper authentication
758 | 
759 | ### 5. Observability
760 | 
761 | **Telemetry:**
762 | - [x] OpenTelemetry traces span across MCP → ProxyService → Tenant API
763 | - [x] Logfire shows consolidated traces correctly
764 | - [x] Error tracking and debugging still functional
765 | - [x] Performance metrics accurate
766 | 
767 | **Logging:**
768 | - [x] Structured logs show proper context (tenant_id, operation, etc.)
769 | - [x] Error logs contain actionable information
770 | - [x] Log volume reasonable for single app
771 | 
772 | ## Success Criteria
773 | 
774 | 1. **Functionality**: All MCP tools and Cloud routes work identically to before
775 | 2. **Performance**: Measurable latency reduction (>20ms average)
776 | 3. **Cost**: Single Fly.io app instead of two (50% infrastructure reduction)
777 | 4. **Security**: Dual validation maintained, no security regression
778 | 5. **Deployment**: Simplified deployment process, single app to manage
779 | 6. **Observability**: Telemetry and logging work correctly
780 | 
781 | 
782 | 
783 | ## Notes
784 | 
785 | ### Future Considerations
786 | 
787 | - **Independent scaling**: If MCP and Cloud need different scaling profiles in future, code organization supports splitting back out
788 | - **Regional deployment**: Consolidated app can still be deployed to multiple regions
789 | - **Edge caching**: Could add edge caching layer in front of consolidated service
790 | 
791 | ### Dependencies
792 | 
793 | - SPEC-9: Signed Header Tenant Information (already implemented)
794 | - SPEC-12: OpenTelemetry Observability (telemetry must work across merged services)
795 | 
796 | ### Related Work
797 | 
798 | - basic-memory v0.13.x: MCP server implementation
799 | - FastMCP documentation: Mounting on existing FastAPI apps
800 | - Fly.io multi-service patterns
801 | 
```

--------------------------------------------------------------------------------
/tests/api/test_project_router.py:
--------------------------------------------------------------------------------

```python
  1 | """Tests for the project router API endpoints."""
  2 | 
  3 | import tempfile
  4 | from pathlib import Path
  5 | 
  6 | import pytest
  7 | 
  8 | from basic_memory.schemas.project_info import ProjectItem
  9 | 
 10 | 
 11 | @pytest.mark.asyncio
 12 | async def test_get_project_item(test_graph, client, project_config, test_project, project_url):
 13 |     """Test the project item endpoint returns correctly structured data."""
 14 |     # Set up some test data in the database
 15 | 
 16 |     # Call the endpoint
 17 |     response = await client.get(f"{project_url}/project/item")
 18 | 
 19 |     # Verify response
 20 |     assert response.status_code == 200
 21 |     project_info = ProjectItem.model_validate(response.json())
 22 |     assert project_info.name == test_project.name
 23 |     assert project_info.path == test_project.path
 24 |     assert project_info.is_default == test_project.is_default
 25 | 
 26 | 
 27 | @pytest.mark.asyncio
 28 | async def test_get_project_item_not_found(
 29 |     test_graph, client, project_config, test_project, project_url
 30 | ):
 31 |     """Test the project item endpoint returns correctly structured data."""
 32 |     # Set up some test data in the database
 33 | 
 34 |     # Call the endpoint
 35 |     response = await client.get("/not-found/project/item")
 36 | 
 37 |     # Verify response
 38 |     assert response.status_code == 404
 39 | 
 40 | 
 41 | @pytest.mark.asyncio
 42 | async def test_get_default_project(test_graph, client, project_config, test_project, project_url):
 43 |     """Test the default project item endpoint returns the default project."""
 44 |     # Set up some test data in the database
 45 | 
 46 |     # Call the endpoint
 47 |     response = await client.get("/projects/default")
 48 | 
 49 |     # Verify response
 50 |     assert response.status_code == 200
 51 |     project_info = ProjectItem.model_validate(response.json())
 52 |     assert project_info.name == test_project.name
 53 |     assert project_info.path == test_project.path
 54 |     assert project_info.is_default == test_project.is_default
 55 | 
 56 | 
 57 | @pytest.mark.asyncio
 58 | async def test_get_project_info_endpoint(test_graph, client, project_config, project_url):
 59 |     """Test the project-info endpoint returns correctly structured data."""
 60 |     # Set up some test data in the database
 61 | 
 62 |     # Call the endpoint
 63 |     response = await client.get(f"{project_url}/project/info")
 64 | 
 65 |     # Verify response
 66 |     assert response.status_code == 200
 67 |     data = response.json()
 68 | 
 69 |     # Check top-level keys
 70 |     assert "project_name" in data
 71 |     assert "project_path" in data
 72 |     assert "available_projects" in data
 73 |     assert "default_project" in data
 74 |     assert "statistics" in data
 75 |     assert "activity" in data
 76 |     assert "system" in data
 77 | 
 78 |     # Check statistics
 79 |     stats = data["statistics"]
 80 |     assert "total_entities" in stats
 81 |     assert stats["total_entities"] >= 0
 82 |     assert "total_observations" in stats
 83 |     assert stats["total_observations"] >= 0
 84 |     assert "total_relations" in stats
 85 |     assert stats["total_relations"] >= 0
 86 | 
 87 |     # Check activity
 88 |     activity = data["activity"]
 89 |     assert "recently_created" in activity
 90 |     assert "recently_updated" in activity
 91 |     assert "monthly_growth" in activity
 92 | 
 93 |     # Check system
 94 |     system = data["system"]
 95 |     assert "version" in system
 96 |     assert "database_path" in system
 97 |     assert "database_size" in system
 98 |     assert "timestamp" in system
 99 | 
100 | 
101 | @pytest.mark.asyncio
102 | async def test_get_project_info_content(test_graph, client, project_config, project_url):
103 |     """Test that project-info contains actual data from the test database."""
104 |     # Call the endpoint
105 |     response = await client.get(f"{project_url}/project/info")
106 | 
107 |     # Verify response
108 |     assert response.status_code == 200
109 |     data = response.json()
110 | 
111 |     # Check that test_graph content is reflected in statistics
112 |     stats = data["statistics"]
113 | 
114 |     # Our test graph should have at least a few entities
115 |     assert stats["total_entities"] > 0
116 | 
117 |     # It should also have some observations
118 |     assert stats["total_observations"] > 0
119 | 
120 |     # And relations
121 |     assert stats["total_relations"] > 0
122 | 
123 |     # Check that entity types include 'test'
124 |     assert "test" in stats["entity_types"] or "entity" in stats["entity_types"]
125 | 
126 | 
127 | @pytest.mark.asyncio
128 | async def test_list_projects_endpoint(test_config, test_graph, client, project_config, project_url):
129 |     """Test the list projects endpoint returns correctly structured data."""
130 |     # Call the endpoint
131 |     response = await client.get("/projects/projects")
132 | 
133 |     # Verify response
134 |     assert response.status_code == 200
135 |     data = response.json()
136 | 
137 |     # Check that the response contains expected fields
138 |     assert "projects" in data
139 |     assert "default_project" in data
140 | 
141 |     # Check that projects is a list
142 |     assert isinstance(data["projects"], list)
143 | 
144 |     # There should be at least one project (the test project)
145 |     assert len(data["projects"]) > 0
146 | 
147 |     # Verify project item structure
148 |     if data["projects"]:
149 |         project = data["projects"][0]
150 |         assert "name" in project
151 |         assert "path" in project
152 |         assert "is_default" in project
153 | 
154 |         # Default project should be marked
155 |         default_project = next((p for p in data["projects"] if p["is_default"]), None)
156 |         assert default_project is not None
157 |         assert default_project["name"] == data["default_project"]
158 | 
159 | 
160 | @pytest.mark.asyncio
161 | async def test_remove_project_endpoint(test_config, client, project_service):
162 |     """Test the remove project endpoint."""
163 |     # First create a test project to remove
164 |     test_project_name = "test-remove-project"
165 |     await project_service.add_project(test_project_name, "/tmp/test-remove-project")
166 | 
167 |     # Verify it exists
168 |     project = await project_service.get_project(test_project_name)
169 |     assert project is not None
170 | 
171 |     # Remove the project
172 |     response = await client.delete(f"/projects/{test_project_name}")
173 | 
174 |     # Verify response
175 |     assert response.status_code == 200
176 |     data = response.json()
177 | 
178 |     # Check response structure
179 |     assert "message" in data
180 |     assert "status" in data
181 |     assert data["status"] == "success"
182 |     assert "old_project" in data
183 |     assert data["old_project"]["name"] == test_project_name
184 | 
185 |     # Verify project is actually removed
186 |     removed_project = await project_service.get_project(test_project_name)
187 |     assert removed_project is None
188 | 
189 | 
190 | @pytest.mark.asyncio
191 | async def test_set_default_project_endpoint(test_config, client, project_service):
192 |     """Test the set default project endpoint."""
193 |     # Create a test project to set as default
194 |     test_project_name = "test-default-project"
195 |     await project_service.add_project(test_project_name, "/tmp/test-default-project")
196 | 
197 |     # Set it as default
198 |     response = await client.put(f"/projects/{test_project_name}/default")
199 | 
200 |     # Verify response
201 |     assert response.status_code == 200
202 |     data = response.json()
203 | 
204 |     # Check response structure
205 |     assert "message" in data
206 |     assert "status" in data
207 |     assert data["status"] == "success"
208 |     assert "new_project" in data
209 |     assert data["new_project"]["name"] == test_project_name
210 | 
211 |     # Verify it's actually set as default
212 |     assert project_service.default_project == test_project_name
213 | 
214 | 
215 | @pytest.mark.asyncio
216 | async def test_update_project_path_endpoint(test_config, client, project_service, project_url):
217 |     """Test the update project endpoint for changing project path."""
218 |     # Create a test project to update
219 |     test_project_name = "test-update-project"
220 |     with tempfile.TemporaryDirectory() as temp_dir:
221 |         test_root = Path(temp_dir)
222 |         old_path = test_root / "old-location"
223 |         new_path = test_root / "new-location"
224 | 
225 |         await project_service.add_project(test_project_name, str(old_path))
226 | 
227 |         try:
228 |             # Verify initial state
229 |             project = await project_service.get_project(test_project_name)
230 |             assert project is not None
231 |             assert Path(project.path) == old_path
232 | 
233 |             # Update the project path
234 |             response = await client.patch(
235 |                 f"{project_url}/project/{test_project_name}", json={"path": str(new_path)}
236 |             )
237 | 
238 |             # Verify response
239 |             assert response.status_code == 200
240 |             data = response.json()
241 | 
242 |             # Check response structure
243 |             assert "message" in data
244 |             assert "status" in data
245 |             assert data["status"] == "success"
246 |             assert "old_project" in data
247 |             assert "new_project" in data
248 | 
249 |             # Check old project data
250 |             assert data["old_project"]["name"] == test_project_name
251 |             assert Path(data["old_project"]["path"]) == old_path
252 | 
253 |             # Check new project data
254 |             assert data["new_project"]["name"] == test_project_name
255 |             assert Path(data["new_project"]["path"]) == new_path
256 | 
257 |             # Verify project was actually updated in database
258 |             updated_project = await project_service.get_project(test_project_name)
259 |             assert updated_project is not None
260 |             assert Path(updated_project.path) == new_path
261 | 
262 |         finally:
263 |             # Clean up
264 |             try:
265 |                 await project_service.remove_project(test_project_name)
266 |             except Exception:
267 |                 pass
268 | 
269 | 
270 | @pytest.mark.asyncio
271 | async def test_update_project_is_active_endpoint(test_config, client, project_service, project_url):
272 |     """Test the update project endpoint for changing is_active status."""
273 |     # Create a test project to update
274 |     test_project_name = "test-update-active-project"
275 |     test_path = "/tmp/test-update-active"
276 | 
277 |     await project_service.add_project(test_project_name, test_path)
278 | 
279 |     try:
280 |         # Update the project is_active status
281 |         response = await client.patch(
282 |             f"{project_url}/project/{test_project_name}", json={"is_active": False}
283 |         )
284 | 
285 |         # Verify response
286 |         assert response.status_code == 200
287 |         data = response.json()
288 | 
289 |         # Check response structure
290 |         assert "message" in data
291 |         assert "status" in data
292 |         assert data["status"] == "success"
293 |         assert f"Project '{test_project_name}' updated successfully" == data["message"]
294 | 
295 |     finally:
296 |         # Clean up
297 |         try:
298 |             await project_service.remove_project(test_project_name)
299 |         except Exception:
300 |             pass
301 | 
302 | 
303 | @pytest.mark.asyncio
304 | async def test_update_project_both_params_endpoint(
305 |     test_config, client, project_service, project_url
306 | ):
307 |     """Test the update project endpoint with both path and is_active parameters."""
308 |     # Create a test project to update
309 |     test_project_name = "test-update-both-project"
310 |     with tempfile.TemporaryDirectory() as temp_dir:
311 |         test_root = Path(temp_dir)
312 |         old_path = (test_root / "old-location").as_posix()
313 |         new_path = (test_root / "new-location").as_posix()
314 | 
315 |         await project_service.add_project(test_project_name, old_path)
316 | 
317 |         try:
318 |             # Update both path and is_active (path should take precedence)
319 |             response = await client.patch(
320 |                 f"{project_url}/project/{test_project_name}",
321 |                 json={"path": new_path, "is_active": False},
322 |             )
323 | 
324 |             # Verify response
325 |             assert response.status_code == 200
326 |             data = response.json()
327 | 
328 |             # Check that path update was performed (takes precedence)
329 |             assert data["new_project"]["path"] == new_path
330 | 
331 |             # Verify project was actually updated in database
332 |             updated_project = await project_service.get_project(test_project_name)
333 |             assert updated_project is not None
334 |             assert updated_project.path == new_path
335 | 
336 |         finally:
337 |             # Clean up
338 |             try:
339 |                 await project_service.remove_project(test_project_name)
340 |             except Exception:
341 |                 pass
342 | 
343 | 
344 | @pytest.mark.asyncio
345 | async def test_update_project_nonexistent_endpoint(client, project_url, tmp_path):
346 |     """Test the update project endpoint with a nonexistent project."""
347 |     # Try to update a project that doesn't exist
348 |     # Use tmp_path for cross-platform absolute path compatibility
349 |     new_path = str(tmp_path / "new-path")
350 |     response = await client.patch(
351 |         f"{project_url}/project/nonexistent-project", json={"path": new_path}
352 |     )
353 | 
354 |     # Should return 400 error
355 |     assert response.status_code == 400
356 |     data = response.json()
357 |     assert "detail" in data
358 |     assert "not found in configuration" in data["detail"]
359 | 
360 | 
361 | @pytest.mark.asyncio
362 | async def test_update_project_relative_path_error_endpoint(
363 |     test_config, client, project_service, project_url
364 | ):
365 |     """Test the update project endpoint with relative path (should fail)."""
366 |     # Create a test project to update
367 |     test_project_name = "test-update-relative-project"
368 |     test_path = "/tmp/test-update-relative"
369 | 
370 |     await project_service.add_project(test_project_name, test_path)
371 | 
372 |     try:
373 |         # Try to update with relative path
374 |         response = await client.patch(
375 |             f"{project_url}/project/{test_project_name}", json={"path": "./relative-path"}
376 |         )
377 | 
378 |         # Should return 400 error
379 |         assert response.status_code == 400
380 |         data = response.json()
381 |         assert "detail" in data
382 |         assert "Path must be absolute" in data["detail"]
383 | 
384 |     finally:
385 |         # Clean up
386 |         try:
387 |             await project_service.remove_project(test_project_name)
388 |         except Exception:
389 |             pass
390 | 
391 | 
392 | @pytest.mark.asyncio
393 | async def test_update_project_no_params_endpoint(test_config, client, project_service, project_url):
394 |     """Test the update project endpoint with no parameters (should fail)."""
395 |     # Create a test project to update
396 |     test_project_name = "test-update-no-params-project"
397 |     test_path = "/tmp/test-update-no-params"
398 | 
399 |     await project_service.add_project(test_project_name, test_path)
400 |     proj_info = await project_service.get_project(test_project_name)
401 |     assert proj_info.name == test_project_name
402 |     # On Windows the path is prepended with a drive letter
403 |     assert test_path in proj_info.path
404 | 
405 |     try:
406 |         # Try to update with no parameters
407 |         response = await client.patch(f"{project_url}/project/{test_project_name}", json={})
408 | 
409 |         # Should return 200 (no-op)
410 |         assert response.status_code == 200
411 |         proj_info = await project_service.get_project(test_project_name)
412 |         assert proj_info.name == test_project_name
413 |         # On Windows the path is prepended with a drive letter
414 |         assert test_path in proj_info.path
415 | 
416 |     finally:
417 |         # Clean up
418 |         try:
419 |             await project_service.remove_project(test_project_name)
420 |         except Exception:
421 |             pass
422 | 
423 | 
424 | @pytest.mark.asyncio
425 | async def test_update_project_empty_path_endpoint(
426 |     test_config, client, project_service, project_url
427 | ):
428 |     """Test the update project endpoint with empty path parameter."""
429 |     # Create a test project to update
430 |     test_project_name = "test-update-empty-path-project"
431 |     test_path = "/tmp/test-update-empty-path"
432 | 
433 |     await project_service.add_project(test_project_name, test_path)
434 | 
435 |     try:
436 |         # Try to update with empty/null path - should be treated as no path update
437 |         response = await client.patch(
438 |             f"{project_url}/project/{test_project_name}", json={"path": None, "is_active": True}
439 |         )
440 | 
441 |         # Should succeed and perform is_active update
442 |         assert response.status_code == 200
443 |         data = response.json()
444 |         assert data["status"] == "success"
445 | 
446 |     finally:
447 |         # Clean up
448 |         try:
449 |             await project_service.remove_project(test_project_name)
450 |         except Exception:
451 |             pass
452 | 
453 | 
454 | @pytest.mark.asyncio
455 | async def test_sync_project_endpoint(test_graph, client, project_url):
456 |     """Test the project sync endpoint initiates background sync."""
457 |     # Call the sync endpoint
458 |     response = await client.post(f"{project_url}/project/sync")
459 | 
460 |     # Verify response
461 |     assert response.status_code == 200
462 |     data = response.json()
463 | 
464 |     # Check response structure
465 |     assert "status" in data
466 |     assert "message" in data
467 |     assert data["status"] == "sync_started"
468 |     assert "Filesystem sync initiated" in data["message"]
469 | 
470 | 
471 | @pytest.mark.asyncio
472 | async def test_sync_project_endpoint_with_force_full(test_graph, client, project_url):
473 |     """Test the project sync endpoint with force_full parameter."""
474 |     # Call the sync endpoint with force_full=true
475 |     response = await client.post(f"{project_url}/project/sync?force_full=true")
476 | 
477 |     # Verify response
478 |     assert response.status_code == 200
479 |     data = response.json()
480 | 
481 |     # Check response structure
482 |     assert "status" in data
483 |     assert "message" in data
484 |     assert data["status"] == "sync_started"
485 |     assert "Filesystem sync initiated" in data["message"]
486 | 
487 | 
488 | @pytest.mark.asyncio
489 | async def test_sync_project_endpoint_with_force_full_false(test_graph, client, project_url):
490 |     """Test the project sync endpoint with force_full=false."""
491 |     # Call the sync endpoint with force_full=false
492 |     response = await client.post(f"{project_url}/project/sync?force_full=false")
493 | 
494 |     # Verify response
495 |     assert response.status_code == 200
496 |     data = response.json()
497 | 
498 |     # Check response structure
499 |     assert "status" in data
500 |     assert "message" in data
501 |     assert data["status"] == "sync_started"
502 |     assert "Filesystem sync initiated" in data["message"]
503 | 
504 | 
505 | @pytest.mark.asyncio
506 | async def test_sync_project_endpoint_not_found(client):
507 |     """Test the project sync endpoint with nonexistent project."""
508 |     # Call the sync endpoint for a project that doesn't exist
509 |     response = await client.post("/nonexistent-project/project/sync")
510 | 
511 |     # Should return 404
512 |     assert response.status_code == 404
513 | 
514 | 
515 | @pytest.mark.asyncio
516 | async def test_sync_project_endpoint_foreground(test_graph, client, project_url):
517 |     """Test the project sync endpoint with run_in_background=false returns sync report."""
518 |     # Call the sync endpoint with run_in_background=false
519 |     response = await client.post(f"{project_url}/project/sync?run_in_background=false")
520 | 
521 |     # Verify response
522 |     assert response.status_code == 200
523 |     data = response.json()
524 | 
525 |     # Check that we get a sync report instead of status message
526 |     assert "new" in data
527 |     assert "modified" in data
528 |     assert "deleted" in data
529 |     assert "moves" in data
530 |     assert "checksums" in data
531 |     assert "skipped_files" in data
532 |     assert "total" in data
533 | 
534 |     # Verify these are the right types
535 |     assert isinstance(data["new"], list)
536 |     assert isinstance(data["modified"], list)
537 |     assert isinstance(data["deleted"], list)
538 |     assert isinstance(data["moves"], dict)
539 |     assert isinstance(data["checksums"], dict)
540 |     assert isinstance(data["skipped_files"], list)
541 |     assert isinstance(data["total"], int)
542 | 
543 | 
544 | @pytest.mark.asyncio
545 | async def test_sync_project_endpoint_foreground_with_force_full(test_graph, client, project_url):
546 |     """Test the project sync endpoint with run_in_background=false and force_full=true."""
547 |     # Call the sync endpoint with both parameters
548 |     response = await client.post(
549 |         f"{project_url}/project/sync?run_in_background=false&force_full=true"
550 |     )
551 | 
552 |     # Verify response
553 |     assert response.status_code == 200
554 |     data = response.json()
555 | 
556 |     # Check that we get a sync report with all expected fields
557 |     assert "new" in data
558 |     assert "modified" in data
559 |     assert "deleted" in data
560 |     assert "moves" in data
561 |     assert "checksums" in data
562 |     assert "skipped_files" in data
563 |     assert "total" in data
564 | 
565 | 
566 | @pytest.mark.asyncio
567 | async def test_sync_project_endpoint_foreground_with_changes(
568 |     test_graph, client, project_config, project_url, tmpdir
569 | ):
570 |     """Test foreground sync detects actual file changes."""
571 |     # Create a new file in the project directory
572 |     import os
573 |     from pathlib import Path
574 | 
575 |     test_file = Path(project_config.home) / "new_test_file.md"
576 |     test_file.write_text("# New Test File\n\nThis is a test file for sync detection.")
577 | 
578 |     try:
579 |         # Call the sync endpoint with run_in_background=false
580 |         response = await client.post(f"{project_url}/project/sync?run_in_background=false")
581 | 
582 |         # Verify response
583 |         assert response.status_code == 200
584 |         data = response.json()
585 | 
586 |         # The sync report should list the new file among detected changes
587 |         assert data["total"] >= 0
588 |         assert "new" in data
589 |         assert "modified" in data
590 |         assert "deleted" in data
591 | 
592 |         # Either the new file shows up as a change, or the report is an empty (already-synced) sync
593 |         has_changes = len(data["new"]) > 0 or len(data["modified"]) > 0 or len(data["deleted"]) > 0
594 |         assert has_changes or data["total"] == 0
595 | 
596 |     finally:
597 |         # Clean up the test file
598 |         if test_file.exists():
599 |             os.remove(test_file)
600 | 
601 | 
602 | @pytest.mark.asyncio
603 | async def test_remove_default_project_fails(test_config, client, project_service):
604 |     """Test that removing the default project returns an error."""
605 |     # Get the current default project
606 |     default_project_name = project_service.default_project
607 | 
608 |     # Try to remove the default project
609 |     response = await client.delete(f"/projects/{default_project_name}")
610 | 
611 |     # Should return 400 with helpful error message
612 |     assert response.status_code == 400
613 |     data = response.json()
614 |     assert "detail" in data
615 |     assert "Cannot delete default project" in data["detail"]
616 |     assert default_project_name in data["detail"]
617 | 
618 | 
619 | @pytest.mark.asyncio
620 | async def test_remove_default_project_with_alternatives(test_config, client, project_service):
621 |     """Test that error message includes alternative projects when trying to delete default."""
622 |     # Get the current default project
623 |     default_project_name = project_service.default_project
624 | 
625 |     # Create another project so there are alternatives
626 |     test_project_name = "test-alternative-project"
627 |     await project_service.add_project(test_project_name, "/tmp/test-alternative")
628 | 
629 |     try:
630 |         # Try to remove the default project
631 |         response = await client.delete(f"/projects/{default_project_name}")
632 | 
633 |         # Should return 400 with helpful error message including alternatives
634 |         assert response.status_code == 400
635 |         data = response.json()
636 |         assert "detail" in data
637 |         assert "Cannot delete default project" in data["detail"]
638 |         assert "Set another project as default first" in data["detail"]
639 |         assert test_project_name in data["detail"]
640 | 
641 |     finally:
642 |         # Clean up
643 |         try:
644 |             await project_service.remove_project(test_project_name)
645 |         except Exception:
646 |             pass
647 | 
648 | 
649 | @pytest.mark.asyncio
650 | async def test_remove_non_default_project_succeeds(test_config, client, project_service):
651 |     """Test that removing a non-default project succeeds."""
652 |     # Create a test project to remove
653 |     test_project_name = "test-remove-non-default"
654 |     await project_service.add_project(test_project_name, "/tmp/test-remove-non-default")
655 | 
656 |     # Verify it's not the default
657 |     assert project_service.default_project != test_project_name
658 | 
659 |     # Remove the project
660 |     response = await client.delete(f"/projects/{test_project_name}")
661 | 
662 |     # Should succeed
663 |     assert response.status_code == 200
664 |     data = response.json()
665 |     assert data["status"] == "success"
666 | 
667 |     # Verify project is removed
668 |     removed_project = await project_service.get_project(test_project_name)
669 |     assert removed_project is None
670 | 
671 | 
672 | @pytest.mark.asyncio
673 | async def test_set_nonexistent_project_as_default_fails(test_config, client, project_service):
674 |     """Test that setting a non-existent project as default returns 404."""
675 |     # Try to set a project that doesn't exist as default
676 |     response = await client.put("/projects/nonexistent-project/default")
677 | 
678 |     # Should return 404
679 |     assert response.status_code == 404
680 |     data = response.json()
681 |     assert "detail" in data
682 |     assert "does not exist" in data["detail"]
683 | 
684 | 
685 | @pytest.mark.asyncio
686 | async def test_create_project_idempotent_same_path(test_config, client, project_service):
687 |     """Test that creating a project with same name and same path is idempotent."""
688 |     # Create a project with platform-independent path
689 |     test_project_name = "test-idempotent"
690 |     with tempfile.TemporaryDirectory() as temp_dir:
691 |         test_project_path = (Path(temp_dir) / "test-idempotent").as_posix()
692 | 
693 |         response1 = await client.post(
694 |             "/projects/projects",
695 |             json={"name": test_project_name, "path": test_project_path, "set_default": False},
696 |         )
697 | 
698 |         # Should succeed with 201 Created
699 |         assert response1.status_code == 201
700 |         data1 = response1.json()
701 |         assert data1["status"] == "success"
702 |         assert data1["new_project"]["name"] == test_project_name
703 | 
704 |         # Try to create the same project again with same name and path
705 |         response2 = await client.post(
706 |             "/projects/projects",
707 |             json={"name": test_project_name, "path": test_project_path, "set_default": False},
708 |         )
709 | 
710 |         # Should also succeed (idempotent)
711 |         assert response2.status_code == 200
712 |         data2 = response2.json()
713 |         assert data2["status"] == "success"
714 |         assert "already exists" in data2["message"]
715 |         assert data2["new_project"]["name"] == test_project_name
716 |         # Normalize paths for cross-platform comparison
717 |         assert Path(data2["new_project"]["path"]).resolve() == Path(test_project_path).resolve()
718 | 
719 |         # Clean up
720 |         try:
721 |             await project_service.remove_project(test_project_name)
722 |         except Exception:
723 |             pass
724 | 
725 | 
726 | @pytest.mark.asyncio
727 | async def test_create_project_fails_different_path(test_config, client, project_service):
728 |     """Test that creating a project with same name but different path fails."""
729 |     # Create a project
730 |     test_project_name = "test-path-conflict"
731 |     test_project_path1 = "/tmp/test-path-conflict-1"
732 | 
733 |     response1 = await client.post(
734 |         "/projects/projects",
735 |         json={"name": test_project_name, "path": test_project_path1, "set_default": False},
736 |     )
737 | 
738 |     # Should succeed with 201 Created
739 |     assert response1.status_code == 201
740 | 
741 |     # Try to create the same project with different path
742 |     test_project_path2 = "/tmp/test-path-conflict-2"
743 |     response2 = await client.post(
744 |         "/projects/projects",
745 |         json={"name": test_project_name, "path": test_project_path2, "set_default": False},
746 |     )
747 | 
748 |     # Should fail with 400
749 |     assert response2.status_code == 400
750 |     data2 = response2.json()
751 |     assert "detail" in data2
752 |     assert "already exists with different path" in data2["detail"]
753 |     assert test_project_path1 in data2["detail"]
754 |     assert test_project_path2 in data2["detail"]
755 | 
756 |     # Clean up
757 |     try:
758 |         await project_service.remove_project(test_project_name)
759 |     except Exception:
760 |         pass
761 | 
762 | 
763 | @pytest.mark.asyncio
764 | async def test_remove_project_with_delete_notes_false(test_config, client, project_service):
765 |     """Test that removing a project with delete_notes=False leaves directory intact."""
766 |     # Create a test project with actual directory
767 |     test_project_name = "test-remove-keep-files"
768 |     with tempfile.TemporaryDirectory() as temp_dir:
769 |         test_path = Path(temp_dir) / "test-project"
770 |         test_path.mkdir()
771 |         test_file = test_path / "test.md"
772 |         test_file.write_text("# Test Note")
773 | 
774 |         await project_service.add_project(test_project_name, str(test_path))
775 | 
776 |         # Remove the project without deleting files (default)
777 |         response = await client.delete(f"/projects/{test_project_name}")
778 | 
779 |         # Verify response
780 |         assert response.status_code == 200
781 |         data = response.json()
782 |         assert data["status"] == "success"
783 | 
784 |         # Verify project is removed from config/db
785 |         removed_project = await project_service.get_project(test_project_name)
786 |         assert removed_project is None
787 | 
788 |         # Verify directory still exists
789 |         assert test_path.exists()
790 |         assert test_file.exists()
791 | 
792 | 
793 | @pytest.mark.asyncio
794 | async def test_remove_project_with_delete_notes_true(test_config, client, project_service):
795 |     """Test that removing a project with delete_notes=True deletes the directory."""
796 |     # Create a test project with actual directory
797 |     test_project_name = "test-remove-delete-files"
798 |     with tempfile.TemporaryDirectory() as temp_dir:
799 |         test_path = Path(temp_dir) / "test-project"
800 |         test_path.mkdir()
801 |         test_file = test_path / "test.md"
802 |         test_file.write_text("# Test Note")
803 | 
804 |         await project_service.add_project(test_project_name, str(test_path))
805 | 
806 |         # Remove the project with delete_notes=True
807 |         response = await client.delete(f"/projects/{test_project_name}?delete_notes=true")
808 | 
809 |         # Verify response
810 |         assert response.status_code == 200
811 |         data = response.json()
812 |         assert data["status"] == "success"
813 | 
814 |         # Verify project is removed from config/db
815 |         removed_project = await project_service.get_project(test_project_name)
816 |         assert removed_project is None
817 | 
818 |         # Verify directory is deleted
819 |         assert not test_path.exists()
820 | 
821 | 
822 | @pytest.mark.asyncio
823 | async def test_remove_project_delete_notes_nonexistent_directory(
824 |     test_config, client, project_service
825 | ):
826 |     """Test that removing a project with delete_notes=True handles missing directory gracefully."""
827 |     # Create a project pointing to a non-existent path
828 |     test_project_name = "test-remove-missing-dir"
829 |     test_path = "/tmp/this-directory-does-not-exist-12345"
830 | 
831 |     await project_service.add_project(test_project_name, test_path)
832 | 
833 |     # Remove the project with delete_notes=True (should not fail even if dir doesn't exist)
834 |     response = await client.delete(f"/projects/{test_project_name}?delete_notes=true")
835 | 
836 |     # Should succeed
837 |     assert response.status_code == 200
838 |     data = response.json()
839 |     assert data["status"] == "success"
840 | 
841 |     # Verify project is removed
842 |     removed_project = await project_service.get_project(test_project_name)
843 |     assert removed_project is None
844 | 
```

--------------------------------------------------------------------------------
/tests/mcp/test_tool_move_note.py:
--------------------------------------------------------------------------------

```python
  1 | """Tests for the move_note MCP tool."""
  2 | 
  3 | import pytest
  4 | 
  5 | from basic_memory.mcp.tools.move_note import move_note, _format_move_error_response
  6 | from basic_memory.mcp.tools.write_note import write_note
  7 | from basic_memory.mcp.tools.read_note import read_note
  8 | 
  9 | 
 10 | @pytest.mark.asyncio
 11 | async def test_detect_cross_project_move_attempt_is_defensive_on_api_error(monkeypatch):
 12 |     """Cross-project detection should fail open (return None) if the projects API errors."""
 13 |     import importlib
 14 | 
 15 |     clients_mod = importlib.import_module("basic_memory.mcp.clients")
 16 | 
 17 |     # Mock ProjectClient to raise an exception on list_projects
 18 |     class MockProjectClient:
 19 |         def __init__(self, *args, **kwargs):
 20 |             pass
 21 | 
 22 |         async def list_projects(self, *args, **kwargs):
 23 |             raise RuntimeError("boom")
 24 | 
 25 |     monkeypatch.setattr(clients_mod, "ProjectClient", MockProjectClient)
 26 | 
 27 |     move_note_module = importlib.import_module("basic_memory.mcp.tools.move_note")
 28 | 
 29 |     result = await move_note_module._detect_cross_project_move_attempt(
 30 |         client=None,
 31 |         identifier="source/note",
 32 |         destination_path="somewhere/note",
 33 |         current_project="test-project",
 34 |     )
 35 |     assert result is None
 36 | 
 37 | 
 38 | @pytest.mark.asyncio
 39 | async def test_move_note_success(app, client, test_project):
 40 |     """Test successfully moving a note to a new location."""
 41 |     # Create initial note
 42 |     await write_note.fn(
 43 |         project=test_project.name,
 44 |         title="Test Note",
 45 |         folder="source",
 46 |         content="# Test Note\nOriginal content here.",
 47 |     )
 48 | 
 49 |     # Move note
 50 |     result = await move_note.fn(
 51 |         project=test_project.name,
 52 |         identifier="source/test-note",
 53 |         destination_path="target/MovedNote.md",
 54 |     )
 55 | 
 56 |     assert isinstance(result, str)
 57 |     assert "✅ Note moved successfully" in result
 58 | 
 59 |     # Verify original location no longer exists
 60 |     try:
 61 |         await read_note.fn("source/test-note", project=test_project.name)
 62 |         pytest.fail("Original note should not exist after move")
 63 |     except Exception:
 64 |         pass  # Expected - note should not exist at original location
 65 | 
 66 |     # Verify note exists at new location with same content
 67 |     content = await read_note.fn("target/moved-note", project=test_project.name)
 68 |     assert "# Test Note" in content
 69 |     assert "Original content here" in content
 70 |     assert "permalink: target/moved-note" in content
 71 | 
 72 | 
 73 | @pytest.mark.asyncio
 74 | async def test_move_note_with_folder_creation(client, test_project):
 75 |     """Test moving note creates necessary folders."""
 76 |     # Create initial note
 77 |     await write_note.fn(
 78 |         project=test_project.name,
 79 |         title="Deep Note",
 80 |         folder="",
 81 |         content="# Deep Note\nContent in root folder.",
 82 |     )
 83 | 
 84 |     # Move to deeply nested path
 85 |     result = await move_note.fn(
 86 |         project=test_project.name,
 87 |         identifier="deep-note",
 88 |         destination_path="deeply/nested/folder/DeepNote.md",
 89 |     )
 90 | 
 91 |     assert isinstance(result, str)
 92 |     assert "✅ Note moved successfully" in result
 93 | 
 94 |     # Verify note exists at new location
 95 |     content = await read_note.fn("deeply/nested/folder/deep-note", project=test_project.name)
 96 |     assert "# Deep Note" in content
 97 |     assert "Content in root folder" in content
 98 | 
 99 | 
100 | @pytest.mark.asyncio
101 | async def test_move_note_with_observations_and_relations(app, client, test_project):
102 |     """Test moving note preserves observations and relations."""
103 |     # Create note with complex semantic content
104 |     await write_note.fn(
105 |         project=test_project.name,
106 |         title="Complex Entity",
107 |         folder="source",
108 |         content="""# Complex Entity
109 | 
110 | ## Observations
111 | - [note] Important observation #tag1
112 | - [feature] Key feature #feature
113 | 
114 | ## Relations
115 | - relation to [[SomeOtherEntity]]
116 | - depends on [[Dependency]]
117 | 
118 | Some additional content.
119 |         """,
120 |     )
121 | 
122 |     # Move note
123 |     result = await move_note.fn(
124 |         project=test_project.name,
125 |         identifier="source/complex-entity",
126 |         destination_path="target/MovedComplex.md",
127 |     )
128 | 
129 |     assert isinstance(result, str)
130 |     assert "✅ Note moved successfully" in result
131 | 
132 |     # Verify moved note preserves all content
133 |     content = await read_note.fn("target/moved-complex", project=test_project.name)
134 |     assert "Important observation #tag1" in content
135 |     assert "Key feature #feature" in content
136 |     assert "[[SomeOtherEntity]]" in content
137 |     assert "[[Dependency]]" in content
138 |     assert "Some additional content" in content
139 | 
140 | 
141 | @pytest.mark.asyncio
142 | async def test_move_note_by_title(client, test_project):
143 |     """Test moving note using title as identifier."""
144 |     # Create note with unique title
145 |     await write_note.fn(
146 |         project=test_project.name,
147 |         title="UniqueTestTitle",
148 |         folder="source",
149 |         content="# UniqueTestTitle\nTest content.",
150 |     )
151 | 
152 |     # Move using title as identifier
153 |     result = await move_note.fn(
154 |         project=test_project.name,
155 |         identifier="UniqueTestTitle",
156 |         destination_path="target/MovedByTitle.md",
157 |     )
158 | 
159 |     assert isinstance(result, str)
160 |     assert "✅ Note moved successfully" in result
161 | 
162 |     # Verify note exists at new location
163 |     content = await read_note.fn("target/moved-by-title", project=test_project.name)
164 |     assert "# UniqueTestTitle" in content
165 |     assert "Test content" in content
166 | 
167 | 
168 | @pytest.mark.asyncio
169 | async def test_move_note_by_file_path(client, test_project):
170 |     """Test moving note using file path as identifier."""
171 |     # Create initial note
172 |     await write_note.fn(
173 |         project=test_project.name,
174 |         title="PathTest",
175 |         folder="source",
176 |         content="# PathTest\nContent for path test.",
177 |     )
178 | 
179 |     # Move using file path as identifier
180 |     result = await move_note.fn(
181 |         project=test_project.name,
182 |         identifier="source/PathTest.md",
183 |         destination_path="target/MovedByPath.md",
184 |     )
185 | 
186 |     assert isinstance(result, str)
187 |     assert "✅ Note moved successfully" in result
188 | 
189 |     # Verify note exists at new location
190 |     content = await read_note.fn("target/moved-by-path", project=test_project.name)
191 |     assert "# PathTest" in content
192 |     assert "Content for path test" in content
193 | 
194 | 
195 | @pytest.mark.asyncio
196 | async def test_move_note_nonexistent_note(client, test_project):
197 |     """Test moving a note that doesn't exist."""
198 |     result = await move_note.fn(
199 |         project=test_project.name,
200 |         identifier="nonexistent/note",
201 |         destination_path="target/SomeFile.md",
202 |     )
203 | 
204 |     # Should return user-friendly error message string
205 |     assert isinstance(result, str)
206 |     assert "# Move Failed - Note Not Found" in result
207 |     assert "could not be found for moving" in result
208 |     assert "Search for the note first" in result
209 | 
210 | 
211 | @pytest.mark.asyncio
212 | async def test_move_note_invalid_destination_path(client, test_project):
213 |     """Test moving note with invalid destination path."""
214 |     # Create initial note
215 |     await write_note.fn(
216 |         project=test_project.name,
217 |         title="TestNote",
218 |         folder="source",
219 |         content="# TestNote\nTest content.",
220 |     )
221 | 
222 |     # Test absolute path (should be rejected by validation)
223 |     result = await move_note.fn(
224 |         project=test_project.name,
225 |         identifier="source/test-note",
226 |         destination_path="/absolute/path.md",
227 |     )
228 | 
229 |     # Should return user-friendly error message string
230 |     assert isinstance(result, str)
231 |     assert "# Move Failed" in result
232 |     assert "/absolute/path.md" in result or "Invalid" in result or "path" in result
233 | 
234 | 
235 | @pytest.mark.asyncio
236 | async def test_move_note_missing_file_extension(client, test_project):
237 |     """Test moving note without file extension in destination path."""
238 |     # Create initial note
239 |     await write_note.fn(
240 |         project=test_project.name,
241 |         title="ExtensionTest",
242 |         folder="source",
243 |         content="# Extension Test\nTesting extension validation.",
244 |     )
245 | 
246 |     # Test path without extension
247 |     result = await move_note.fn(
248 |         project=test_project.name,
249 |         identifier="source/extension-test",
250 |         destination_path="target/renamed-note",
251 |     )
252 | 
253 |     # Should return error about missing extension
254 |     assert isinstance(result, str)
255 |     assert "# Move Failed - File Extension Required" in result
256 |     assert "must include a file extension" in result
257 |     assert ".md" in result
258 |     assert "renamed-note.md" in result  # Should suggest adding .md
259 | 
260 |     # Test path with empty extension (edge case)
261 |     result = await move_note.fn(
262 |         project=test_project.name,
263 |         identifier="source/extension-test",
264 |         destination_path="target/renamed-note.",
265 |     )
266 | 
267 |     assert isinstance(result, str)
268 |     assert "# Move Failed - File Extension Required" in result
269 |     assert "must include a file extension" in result
270 | 
271 |     # Test that note still exists at original location
272 |     content = await read_note.fn("source/extension-test", project=test_project.name)
273 |     assert "# Extension Test" in content
274 |     assert "Testing extension validation" in content
275 | 
276 | 
277 | @pytest.mark.asyncio
278 | async def test_move_note_file_extension_mismatch(client, test_project):
279 |     """Test that moving note with different extension is blocked."""
280 |     # Create initial note with .md extension
281 |     await write_note.fn(
282 |         project=test_project.name,
283 |         title="MarkdownNote",
284 |         folder="source",
285 |         content="# Markdown Note\nThis is a markdown file.",
286 |     )
287 | 
288 |     # Try to move with .txt extension
289 |     result = await move_note.fn(
290 |         project=test_project.name,
291 |         identifier="source/markdown-note",
292 |         destination_path="target/renamed-note.txt",
293 |     )
294 | 
295 |     # Should return error about extension mismatch
296 |     assert isinstance(result, str)
297 |     assert "# Move Failed - File Extension Mismatch" in result
298 |     assert "does not match the source file extension" in result
299 |     assert ".md" in result
300 |     assert ".txt" in result
301 |     assert "renamed-note.md" in result  # Should suggest correct extension
302 | 
303 |     # Test that note still exists at original location with original extension
304 |     content = await read_note.fn("source/markdown-note", project=test_project.name)
305 |     assert "# Markdown Note" in content
306 |     assert "This is a markdown file" in content
307 | 
308 | 
309 | @pytest.mark.asyncio
310 | async def test_move_note_preserves_file_extension(client, test_project):
311 |     """Test that moving note with matching extension succeeds."""
312 |     # Create initial note with .md extension
313 |     await write_note.fn(
314 |         project=test_project.name,
315 |         title="PreserveExtension",
316 |         folder="source",
317 |         content="# Preserve Extension\nTesting that extension is preserved.",
318 |     )
319 | 
320 |     # Move with same .md extension
321 |     result = await move_note.fn(
322 |         project=test_project.name,
323 |         identifier="source/preserve-extension",
324 |         destination_path="target/preserved-note.md",
325 |     )
326 | 
327 |     # Should succeed
328 |     assert isinstance(result, str)
329 |     assert "✅ Note moved successfully" in result
330 | 
331 |     # Verify note exists at new location with same extension
332 |     content = await read_note.fn("target/preserved-note", project=test_project.name)
333 |     assert "# Preserve Extension" in content
334 |     assert "Testing that extension is preserved" in content
335 | 
336 |     # Verify old location no longer exists
337 |     try:
338 |         await read_note.fn("source/preserve-extension", project=test_project.name)
339 |         assert False, "Original note should not exist after move"
340 |     except Exception:
341 |         pass  # Expected
342 | 
343 | 
344 | @pytest.mark.asyncio
345 | async def test_move_note_destination_exists(client, test_project):
346 |     """Test moving note to existing destination."""
347 |     # Create source note
348 |     await write_note.fn(
349 |         project=test_project.name,
350 |         title="SourceNote",
351 |         folder="source",
352 |         content="# SourceNote\nSource content.",
353 |     )
354 | 
355 |     # Create destination note
356 |     await write_note.fn(
357 |         project=test_project.name,
358 |         title="DestinationNote",
359 |         folder="target",
360 |         content="# DestinationNote\nDestination content.",
361 |     )
362 | 
363 |     # Try to move source to existing destination
364 |     result = await move_note.fn(
365 |         project=test_project.name,
366 |         identifier="source/source-note",
367 |         destination_path="target/DestinationNote.md",
368 |     )
369 | 
370 |     # Should return user-friendly error message string
371 |     assert isinstance(result, str)
372 |     assert "# Move Failed" in result
373 |     assert "already exists" in result or "Destination" in result
374 | 
375 | 
376 | @pytest.mark.asyncio
377 | async def test_move_note_same_location(client, test_project):
378 |     """Test moving note to the same location."""
379 |     # Create initial note
380 |     await write_note.fn(
381 |         project=test_project.name,
382 |         title="SameLocationTest",
383 |         folder="test",
384 |         content="# SameLocationTest\nContent here.",
385 |     )
386 | 
387 |     # Try to move to same location
388 |     result = await move_note.fn(
389 |         project=test_project.name,
390 |         identifier="test/same-location-test",
391 |         destination_path="test/SameLocationTest.md",
392 |     )
393 | 
394 |     # Should return user-friendly error message string
395 |     assert isinstance(result, str)
396 |     assert "# Move Failed" in result
397 |     assert "already exists" in result or "same" in result or "Destination" in result
398 | 
399 | 
400 | @pytest.mark.asyncio
401 | async def test_move_note_rename_only(client, test_project):
402 |     """Test moving note within same folder (rename operation)."""
403 |     # Create initial note
404 |     await write_note.fn(
405 |         project=test_project.name,
406 |         title="OriginalName",
407 |         folder="test",
408 |         content="# OriginalName\nContent to rename.",
409 |     )
410 | 
411 |     # Rename within same folder
412 |     await move_note.fn(
413 |         project=test_project.name,
414 |         identifier="test/original-name",
415 |         destination_path="test/NewName.md",
416 |     )
417 | 
418 |     # Verify original is gone
419 |     try:
420 |         await read_note.fn("test/original-name", project=test_project.name)
421 |         assert False, "Original note should not exist after rename"
422 |     except Exception:
423 |         pass  # Expected
424 | 
425 |     # Verify new name exists with same content
426 |     content = await read_note.fn("test/new-name", project=test_project.name)
427 |     assert "# OriginalName" in content  # Title in content remains same
428 |     assert "Content to rename" in content
429 |     assert "permalink: test/new-name" in content
430 | 
431 | 
432 | @pytest.mark.asyncio
433 | async def test_move_note_complex_filename(client, test_project):
434 |     """Test moving note with spaces in filename."""
435 |     # Create note with spaces in name
436 |     await write_note.fn(
437 |         project=test_project.name,
438 |         title="Meeting Notes 2025",
439 |         folder="meetings",
440 |         content="# Meeting Notes 2025\nMeeting content with dates.",
441 |     )
442 | 
443 |     # Move to new location
444 |     result = await move_note.fn(
445 |         project=test_project.name,
446 |         identifier="meetings/meeting-notes-2025",
447 |         destination_path="archive/2025/meetings/Meeting Notes 2025.md",
448 |     )
449 | 
450 |     assert isinstance(result, str)
451 |     assert "✅ Note moved successfully" in result
452 | 
453 |     # Verify note exists at new location with correct content
454 |     content = await read_note.fn(
455 |         "archive/2025/meetings/meeting-notes-2025", project=test_project.name
456 |     )
457 |     assert "# Meeting Notes 2025" in content
458 |     assert "Meeting content with dates" in content
459 | 
460 | 
461 | @pytest.mark.asyncio
462 | async def test_move_note_with_tags(app, client, test_project):
463 |     """Test moving note with tags preserves tags."""
464 |     # Create note with tags
465 |     await write_note.fn(
466 |         project=test_project.name,
467 |         title="Tagged Note",
468 |         folder="source",
469 |         content="# Tagged Note\nContent with tags.",
470 |         tags=["important", "work", "project"],
471 |     )
472 | 
473 |     # Move note
474 |     result = await move_note.fn(
475 |         project=test_project.name,
476 |         identifier="source/tagged-note",
477 |         destination_path="target/MovedTaggedNote.md",
478 |     )
479 | 
480 |     assert isinstance(result, str)
481 |     assert "✅ Note moved successfully" in result
482 | 
483 |     # Verify tags are preserved in correct YAML format
484 |     content = await read_note.fn("target/moved-tagged-note", project=test_project.name)
485 |     assert "- important" in content
486 |     assert "- work" in content
487 |     assert "- project" in content
488 | 
489 | 
490 | @pytest.mark.asyncio
491 | async def test_move_note_empty_string_destination(client, test_project):
492 |     """Test moving note with empty destination path."""
493 |     # Create initial note
494 |     await write_note.fn(
495 |         project=test_project.name,
496 |         title="TestNote",
497 |         folder="source",
498 |         content="# TestNote\nTest content.",
499 |     )
500 | 
501 |     # Test empty destination path
502 |     result = await move_note.fn(
503 |         project=test_project.name,
504 |         identifier="source/test-note",
505 |         destination_path="",
506 |     )
507 | 
508 |     # Should return user-friendly error message string
509 |     assert isinstance(result, str)
510 |     assert "# Move Failed" in result
511 |     assert "empty" in result or "Invalid" in result or "path" in result
512 | 
513 | 
514 | @pytest.mark.asyncio
515 | async def test_move_note_parent_directory_path(client, test_project):
516 |     """Test moving note with parent directory in destination path."""
517 |     # Create initial note
518 |     await write_note.fn(
519 |         project=test_project.name,
520 |         title="TestNote",
521 |         folder="source",
522 |         content="# TestNote\nTest content.",
523 |     )
524 | 
525 |     # Test parent directory path
526 |     result = await move_note.fn(
527 |         project=test_project.name,
528 |         identifier="source/test-note",
529 |         destination_path="../parent/file.md",
530 |     )
531 | 
532 |     # Should return user-friendly error message string
533 |     assert isinstance(result, str)
534 |     assert "# Move Failed" in result
535 |     assert "parent" in result or "Invalid" in result or "path" in result or ".." in result
536 | 
537 | 
538 | @pytest.mark.asyncio
539 | async def test_move_note_identifier_variations(client, test_project):
540 |     """Test that various identifier formats work for moving."""
541 |     # Create a note to test different identifier formats
542 |     await write_note.fn(
543 |         project=test_project.name,
544 |         title="Test Document",
545 |         folder="docs",
546 |         content="# Test Document\nContent for testing identifiers.",
547 |     )
548 | 
549 |     # Test with permalink identifier
550 |     result = await move_note.fn(
551 |         project=test_project.name,
552 |         identifier="docs/test-document",
553 |         destination_path="moved/TestDocument.md",
554 |     )
555 | 
556 |     assert isinstance(result, str)
557 |     assert "✅ Note moved successfully" in result
558 | 
559 |     # Verify it moved correctly
560 |     content = await read_note.fn("moved/test-document", project=test_project.name)
561 |     assert "# Test Document" in content
562 |     assert "Content for testing identifiers" in content
563 | 
564 | 
565 | @pytest.mark.asyncio
566 | async def test_move_note_preserves_frontmatter(app, client, test_project):
567 |     """Test that moving preserves custom frontmatter."""
568 |     # Create note with custom frontmatter by first creating it normally
569 |     await write_note.fn(
570 |         project=test_project.name,
571 |         title="Custom Frontmatter Note",
572 |         folder="source",
573 |         content="# Custom Frontmatter Note\nContent with custom metadata.",
574 |     )
575 | 
576 |     # Move the note
577 |     result = await move_note.fn(
578 |         project=test_project.name,
579 |         identifier="source/custom-frontmatter-note",
580 |         destination_path="target/MovedCustomNote.md",
581 |     )
582 | 
583 |     assert isinstance(result, str)
584 |     assert "✅ Note moved successfully" in result
585 | 
586 |     # Verify the moved note has proper frontmatter structure
587 |     content = await read_note.fn("target/moved-custom-note", project=test_project.name)
588 |     assert "title: Custom Frontmatter Note" in content
589 |     assert "type: note" in content
590 |     assert "permalink: target/moved-custom-note" in content
591 |     assert "# Custom Frontmatter Note" in content
592 |     assert "Content with custom metadata" in content
593 | 
594 | 
595 | class TestMoveNoteErrorFormatting:
596 |     """Test move note error formatting for better user experience."""
597 | 
598 |     def test_format_move_error_invalid_path(self):
599 |         """Test formatting for invalid path errors."""
600 |         result = _format_move_error_response("invalid path format", "test-note", "/invalid/path.md")
601 | 
602 |         assert "# Move Failed - Invalid Destination Path" in result
603 |         assert "The destination path '/invalid/path.md' is not valid" in result
604 |         assert "Relative paths only" in result
605 |         assert "Include file extension" in result
606 | 
607 |     def test_format_move_error_permission_denied(self):
608 |         """Test formatting for permission errors."""
609 |         result = _format_move_error_response("permission denied", "test-note", "target/file.md")
610 | 
611 |         assert "# Move Failed - Permission Error" in result
612 |         assert "You don't have permission to move 'test-note'" in result
613 |         assert "Check file permissions" in result
614 |         assert "Check file locks" in result
615 | 
616 |     def test_format_move_error_source_missing(self):
617 |         """Test formatting for source file missing errors."""
618 |         result = _format_move_error_response("source file missing", "test-note", "target/file.md")
619 | 
620 |         assert "# Move Failed - Source File Missing" in result
621 |         assert "The source file for 'test-note' was not found on disk" in result
622 |         assert "database and filesystem are out of sync" in result
623 | 
624 |     def test_format_move_error_server_error(self):
625 |         """Test formatting for server errors."""
626 |         result = _format_move_error_response("server error occurred", "test-note", "target/file.md")
627 | 
628 |         assert "# Move Failed - System Error" in result
629 |         assert "A system error occurred while moving 'test-note'" in result
630 |         assert "Try again" in result
631 |         assert "Check disk space" in result
632 | 
633 | 
634 | class TestMoveNoteSecurityValidation:
635 |     """Test move note security validation features."""
636 | 
637 |     @pytest.mark.asyncio
638 |     async def test_move_note_blocks_path_traversal_unix(self, client, test_project):
639 |         """Test that Unix-style path traversal attacks are blocked."""
640 |         # Create initial note
641 |         await write_note.fn(
642 |             project=test_project.name,
643 |             title="Test Note",
644 |             folder="source",
645 |             content="# Test Note\nTest content for security testing.",
646 |         )
647 | 
648 |         # Test various Unix-style path traversal patterns
649 |         attack_paths = [
650 |             "../secrets.txt",
651 |             "../../etc/passwd",
652 |             "../../../root/.ssh/id_rsa",
653 |             "notes/../../../etc/shadow",
654 |             "folder/../../outside/file.md",
655 |             "../../../../etc/hosts",
656 |         ]
657 | 
658 |         for attack_path in attack_paths:
659 |             result = await move_note.fn(
660 |                 project=test_project.name,
661 |                 identifier="source/test-note",
662 |                 destination_path=attack_path,
663 |             )
664 | 
665 |             assert isinstance(result, str)
666 |             assert "# Move Failed - Security Validation Error" in result
667 |             assert "paths must stay within project boundaries" in result
668 |             assert attack_path in result
669 |             assert "Try again with a safe path" in result
670 | 
671 |     @pytest.mark.asyncio
672 |     async def test_move_note_blocks_path_traversal_windows(self, client, test_project):
673 |         """Test that Windows-style path traversal attacks are blocked."""
674 |         # Create initial note
675 |         await write_note.fn(
676 |             project=test_project.name,
677 |             title="Test Note",
678 |             folder="source",
679 |             content="# Test Note\nTest content for security testing.",
680 |         )
681 | 
682 |         # Test various Windows-style path traversal patterns
683 |         attack_paths = [
684 |             "..\\secrets.txt",
685 |             "..\\..\\Windows\\System32\\config\\SAM",
686 |             "notes\\..\\..\\..\\Windows\\System32",
687 |             "\\\\server\\share\\file.txt",
688 |             "..\\..\\Users\\user\\.env",
689 |             "\\\\..\\..\\Windows",
690 |         ]
691 | 
692 |         for attack_path in attack_paths:
693 |             result = await move_note.fn(
694 |                 project=test_project.name,
695 |                 identifier="source/test-note",
696 |                 destination_path=attack_path,
697 |             )
698 | 
699 |             assert isinstance(result, str)
700 |             assert "# Move Failed - Security Validation Error" in result
701 |             assert "paths must stay within project boundaries" in result
702 |             assert attack_path in result
703 | 
704 |     @pytest.mark.asyncio
705 |     async def test_move_note_blocks_absolute_paths(self, client, test_project):
706 |         """Test that absolute paths are blocked."""
707 |         # Create initial note
708 |         await write_note.fn(
709 |             project=test_project.name,
710 |             title="Test Note",
711 |             folder="source",
712 |             content="# Test Note\nTest content for security testing.",
713 |         )
714 | 
715 |         # Test various absolute path patterns
716 |         attack_paths = [
717 |             "/etc/passwd",
718 |             "/home/user/.env",
719 |             "/var/log/auth.log",
720 |             "/root/.ssh/id_rsa",
721 |             "C:\\Windows\\System32\\config\\SAM",
722 |             "C:\\Users\\user\\.env",
723 |             "D:\\secrets\\config.json",
724 |             "/tmp/malicious.txt",
725 |         ]
726 | 
727 |         for attack_path in attack_paths:
728 |             result = await move_note.fn(
729 |                 project=test_project.name,
730 |                 identifier="source/test-note",
731 |                 destination_path=attack_path,
732 |             )
733 | 
734 |             assert isinstance(result, str)
735 |             assert "# Move Failed - Security Validation Error" in result
736 |             assert "paths must stay within project boundaries" in result
737 |             assert attack_path in result
738 | 
739 |     @pytest.mark.asyncio
740 |     async def test_move_note_blocks_home_directory_access(self, client, test_project):
741 |         """Test that home directory access patterns are blocked."""
742 |         # Create initial note
743 |         await write_note.fn(
744 |             project=test_project.name,
745 |             title="Test Note",
746 |             folder="source",
747 |             content="# Test Note\nTest content for security testing.",
748 |         )
749 | 
750 |         # Test various home directory access patterns
751 |         attack_paths = [
752 |             "~/secrets.txt",
753 |             "~/.env",
754 |             "~/.ssh/id_rsa",
755 |             "~/Documents/passwords.txt",
756 |             "~\\AppData\\secrets",
757 |             "~\\Desktop\\config.ini",
758 |         ]
759 | 
760 |         for attack_path in attack_paths:
761 |             result = await move_note.fn(
762 |                 project=test_project.name,
763 |                 identifier="source/test-note",
764 |                 destination_path=attack_path,
765 |             )
766 | 
767 |             assert isinstance(result, str)
768 |             assert "# Move Failed - Security Validation Error" in result
769 |             assert "paths must stay within project boundaries" in result
770 |             assert attack_path in result
771 | 
772 |     @pytest.mark.asyncio
773 |     async def test_move_note_blocks_mixed_attack_patterns(self, client, test_project):
774 |         """Test that mixed legitimate/attack patterns are blocked."""
775 |         # Create initial note
776 |         await write_note.fn(
777 |             project=test_project.name,
778 |             title="Test Note",
779 |             folder="source",
780 |             content="# Test Note\nTest content for security testing.",
781 |         )
782 | 
783 |         # Test mixed patterns that start legitimate but contain attacks
784 |         attack_paths = [
785 |             "notes/../../../etc/passwd",
786 |             "docs/../../.env",
787 |             "legitimate/path/../../.ssh/id_rsa",
788 |             "project/folder/../../../Windows/System32",
789 |             "valid/folder/../../home/user/.bashrc",
790 |         ]
791 | 
792 |         for attack_path in attack_paths:
793 |             result = await move_note.fn(
794 |                 project=test_project.name,
795 |                 identifier="source/test-note",
796 |                 destination_path=attack_path,
797 |             )
798 | 
799 |             assert isinstance(result, str)
800 |             assert "# Move Failed - Security Validation Error" in result
801 |             assert "paths must stay within project boundaries" in result
802 | 
803 |     @pytest.mark.asyncio
804 |     async def test_move_note_allows_safe_paths(self, client, test_project):
805 |         """Test that legitimate paths are still allowed."""
806 |         # Create initial note
807 |         await write_note.fn(
808 |             project=test_project.name,
809 |             title="Test Note",
810 |             folder="source",
811 |             content="# Test Note\nTest content for security testing.",
812 |         )
813 | 
814 |         # Test various safe path patterns
815 |         safe_paths = [
816 |             "notes/meeting.md",
817 |             "docs/readme.txt",
818 |             "projects/2025/planning.md",
819 |             "archive/old-notes/backup.md",
820 |             "deep/nested/directory/structure/file.txt",
821 |             "folder/subfolder/document.md",
822 |         ]
823 | 
824 |         for safe_path in safe_paths:
825 |             result = await move_note.fn(
826 |                 project=test_project.name,
827 |                 identifier="source/test-note",
828 |                 destination_path=safe_path,
829 |             )
830 | 
831 |             # Should succeed or fail for legitimate reasons (not security)
832 |             assert isinstance(result, str)
833 |             # Should NOT contain security error message
834 |             assert "Security Validation Error" not in result
835 | 
836 |             # If it fails, it should be for other reasons like "already exists" or API errors
837 |             if "Move Failed" in result:
838 |                 assert "paths must stay within project boundaries" not in result
839 | 
840 |     @pytest.mark.asyncio
841 |     async def test_move_note_security_logging(self, client, test_project, caplog):
842 |         """Test that security violations are properly logged."""
843 |         # Create initial note
844 |         await write_note.fn(
845 |             project=test_project.name,
846 |             title="Test Note",
847 |             folder="source",
848 |             content="# Test Note\nTest content for security testing.",
849 |         )
850 | 
851 |         # Attempt path traversal attack
852 |         result = await move_note.fn(
853 |             project=test_project.name,
854 |             identifier="source/test-note",
855 |             destination_path="../../../etc/passwd",
856 |         )
857 | 
858 |         assert "# Move Failed - Security Validation Error" in result
859 | 
860 |         # Check that security violation was logged
861 |         # Note: This test may need adjustment based on the actual logging setup
862 |         # The security validation should generate a warning log entry
863 | 
864 |     @pytest.mark.asyncio
865 |     async def test_move_note_empty_path_security(self, client, test_project):
866 |         """Test that empty destination path is handled securely."""
867 |         # Create initial note
868 |         await write_note.fn(
869 |             project=test_project.name,
870 |             title="Test Note",
871 |             folder="source",
872 |             content="# Test Note\nTest content for security testing.",
873 |         )
874 | 
875 |         # Test empty destination path (not flagged by the security check; ordinary validation rejects it)
876 |         result = await move_note.fn(
877 |             project=test_project.name,
878 |             identifier="source/test-note",
879 |             destination_path="",
880 |         )
881 | 
882 |         assert isinstance(result, str)
883 |         # Empty path should not trigger security error (it's handled by pathlib validation)
884 |         # But may fail for other API-related reasons
885 | 
886 |     @pytest.mark.asyncio
887 |     async def test_move_note_current_directory_references_security(self, client, test_project):
888 |         """Test that current directory references are handled securely."""
889 |         # Create initial note
890 |         await write_note.fn(
891 |             project=test_project.name,
892 |             title="Test Note",
893 |             folder="source",
894 |             content="# Test Note\nTest content for security testing.",
895 |         )
896 | 
897 |         # Test current directory references (should be safe)
898 |         safe_paths = [
899 |             "./notes/file.md",
900 |             "folder/./file.md",
901 |             "./folder/subfolder/file.md",
902 |         ]
903 | 
904 |         for safe_path in safe_paths:
905 |             result = await move_note.fn(
906 |                 project=test_project.name,
907 |                 identifier="source/test-note",
908 |                 destination_path=safe_path,
909 |             )
910 | 
911 |             assert isinstance(result, str)
912 |             # Should NOT contain security error message
913 |             assert "Security Validation Error" not in result
914 | 
```
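
The security tests above all reduce to one predicate: a destination must be a relative path with no traversal, drive-letter, UNC, or home-directory components. A minimal standalone sketch of such a check, written only from the patterns the tests enumerate (hypothetical helper, not Basic Memory's actual validator):

```python
from pathlib import PurePosixPath, PureWindowsPath


def is_safe_destination(path: str) -> bool:
    """Return True if `path` is relative and cannot escape the project root."""
    if not path:
        return False  # empty paths are rejected by ordinary validation anyway
    # Windows-style separators, UNC prefixes, and home-directory references
    if "\\" in path or path.startswith("~"):
        return False
    # Drive-letter absolutes such as C:/Users/...
    if PureWindowsPath(path).drive:
        return False
    p = PurePosixPath(path)
    if p.is_absolute():
        return False
    # A ".." component anywhere lets the path climb out of the project boundary;
    # "." components are harmless and are dropped by PurePosixPath.parts.
    return ".." not in p.parts
```

A production check would additionally normalize the candidate and resolve it against the project root before comparing; this sketch captures only the string-level patterns the test cases exercise.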

--------------------------------------------------------------------------------
/tests/services/test_search_service.py:
--------------------------------------------------------------------------------

```python
  1 | """Tests for search service."""
  2 | 
  3 | from datetime import datetime
  4 | 
  5 | import pytest
  6 | from sqlalchemy import text
  7 | 
  8 | from basic_memory import db
  9 | from basic_memory.schemas.search import SearchQuery, SearchItemType
 10 | 
 11 | 
 12 | @pytest.mark.asyncio
 13 | async def test_search_permalink(search_service, test_graph):
 14 |     """Exact permalink"""
 15 |     results = await search_service.search(SearchQuery(permalink="test/root"))
 16 |     assert len(results) == 1
 17 | 
 18 |     for r in results:
 19 |         assert "test/root" in r.permalink
 20 | 
 21 | 
 22 | @pytest.mark.asyncio
 23 | async def test_search_limit_offset(search_service, test_graph):
 24 |     """Limit and offset pagination"""
 25 |     results = await search_service.search(SearchQuery(permalink_match="test/*"))
 26 |     assert len(results) > 1
 27 | 
 28 |     results = await search_service.search(SearchQuery(permalink_match="test/*"), limit=1)
 29 |     assert len(results) == 1
 30 | 
 31 |     results = await search_service.search(SearchQuery(permalink_match="test/*"), limit=100)
 32 |     num_results = len(results)
 33 | 
 34 |     # assert offset
 35 |     offset_results = await search_service.search(
 36 |         SearchQuery(permalink_match="test/*"), limit=100, offset=1
 37 |     )
 38 |     assert len(offset_results) == num_results - 1
 39 | 
 40 | 
 41 | @pytest.mark.asyncio
 42 | async def test_search_permalink_observations_wildcard(search_service, test_graph):
 43 |     """Pattern matching"""
 44 |     results = await search_service.search(SearchQuery(permalink_match="test/root/observations/*"))
 45 |     assert len(results) == 2
 46 |     permalinks = {r.permalink for r in results}
 47 |     assert "test/root/observations/note/root-note-1" in permalinks
 48 |     assert "test/root/observations/tech/root-tech-note" in permalinks
 49 | 
 50 | 
 51 | @pytest.mark.asyncio
 52 | async def test_search_permalink_relation_wildcard(search_service, test_graph):
 53 |     """Pattern matching"""
 54 |     results = await search_service.search(SearchQuery(permalink_match="test/root/connects-to/*"))
 55 |     assert len(results) == 1
 56 |     permalinks = {r.permalink for r in results}
 57 |     assert "test/root/connects-to/test/connected-entity-1" in permalinks
 58 | 
 59 | 
 60 | @pytest.mark.asyncio
 61 | async def test_search_permalink_wildcard2(search_service, test_graph):
 62 |     """Pattern matching"""
 63 |     results = await search_service.search(
 64 |         SearchQuery(
 65 |             permalink_match="test/connected*",
 66 |         )
 67 |     )
 68 |     assert len(results) >= 2
 69 |     permalinks = {r.permalink for r in results}
 70 |     assert "test/connected-entity-1" in permalinks
 71 |     assert "test/connected-entity-2" in permalinks
 72 | 
 73 | 
 74 | @pytest.mark.asyncio
 75 | async def test_search_text(search_service, test_graph):
 76 |     """Full-text search"""
 77 |     results = await search_service.search(
 78 |         SearchQuery(text="Root Entity", entity_types=[SearchItemType.ENTITY])
 79 |     )
 80 |     assert len(results) >= 1
 81 |     assert results[0].permalink == "test/root"
 82 | 
 83 | 
 84 | @pytest.mark.asyncio
 85 | async def test_search_title(search_service, test_graph):
 86 |     """Title only search"""
 87 |     results = await search_service.search(
 88 |         SearchQuery(title="Root", entity_types=[SearchItemType.ENTITY])
 89 |     )
 90 |     assert len(results) >= 1
 91 |     assert results[0].permalink == "test/root"
 92 | 
 93 | 
 94 | @pytest.mark.asyncio
 95 | async def test_text_search_case_insensitive(search_service, test_graph):
 96 |     """Text search is case-insensitive."""
 97 |     # Case insensitive
 98 |     results = await search_service.search(SearchQuery(text="ENTITY"))
 99 |     assert any("test/root" in r.permalink for r in results)
100 | 
101 | 
102 | @pytest.mark.asyncio
103 | async def test_text_search_content_word_match(search_service, test_graph):
104 |     """Text search matches words in note content."""
105 | 
106 |     # content word match
107 |     results = await search_service.search(SearchQuery(text="Connected"))
108 |     assert len(results) > 0
109 |     assert any(r.file_path == "test/Connected Entity 2.md" for r in results)
110 | 
111 | 
112 | @pytest.mark.asyncio
113 | async def test_text_search_multiple_terms(search_service, test_graph):
114 |     """Test matching multiple search terms."""
115 | 
116 |     # Multiple terms
117 |     results = await search_service.search(SearchQuery(text="root note"))
118 |     assert any("test/root" in r.permalink for r in results)
119 | 
120 | 
121 | @pytest.mark.asyncio
122 | async def test_pattern_matching(search_service, test_graph):
123 |     """Test pattern matching with various wildcards."""
124 |     # Test wildcards
125 |     results = await search_service.search(SearchQuery(permalink_match="test/*"))
126 |     for r in results:
127 |         assert "test/" in r.permalink
128 | 
129 |     # Test leading wildcards
130 |     results = await search_service.search(SearchQuery(permalink_match="*/observations"))
131 |     for r in results:
132 |         assert "/observations" in r.permalink
133 | 
134 |     # Test permalink partial match
135 |     results = await search_service.search(SearchQuery(permalink_match="test"))
136 |     for r in results:
137 |         assert "test/" in r.permalink
138 | 
139 | 
140 | @pytest.mark.asyncio
141 | async def test_filters(search_service, test_graph):
142 |     """Test search filters."""
143 |     # Combined filters
144 |     results = await search_service.search(
145 |         SearchQuery(text="Deep", entity_types=[SearchItemType.ENTITY], types=["deep"])
146 |     )
147 |     assert len(results) == 1
148 |     for r in results:
149 |         assert r.type == SearchItemType.ENTITY
150 |         assert r.metadata.get("entity_type") == "deep"
151 | 
152 | 
153 | @pytest.mark.asyncio
154 | async def test_after_date(search_service, test_graph):
155 |     """Test the after_date filter."""
156 | 
157 |     # Should find with past date
158 |     past_date = datetime(2020, 1, 1).astimezone()
159 |     results = await search_service.search(
160 |         SearchQuery(
161 |             text="entity",
162 |             after_date=past_date.isoformat(),
163 |         )
164 |     )
165 |     for r in results:
166 |         # Handle both string (SQLite) and datetime (Postgres) formats
167 |         created_at = (
168 |             r.created_at
169 |             if isinstance(r.created_at, datetime)
170 |             else datetime.fromisoformat(r.created_at)
171 |         )
172 |         assert created_at > past_date
173 | 
174 |     # Should not find with future date
175 |     future_date = datetime(2030, 1, 1).astimezone()
176 |     results = await search_service.search(
177 |         SearchQuery(
178 |             text="entity",
179 |             after_date=future_date.isoformat(),
180 |         )
181 |     )
182 |     assert len(results) == 0
183 | 
184 | 
185 | @pytest.mark.asyncio
186 | async def test_search_type(search_service, test_graph):
187 |     """Test the types filter."""
188 | 
189 |     # Should find only items of the given type
190 |     results = await search_service.search(SearchQuery(types=["test"]))
191 |     assert len(results) > 0
192 |     for r in results:
193 |         assert r.type == SearchItemType.ENTITY
194 | 
195 | 
196 | @pytest.mark.asyncio
197 | async def test_search_entity_type(search_service, test_graph):
198 |     """Test the entity_types filter."""
199 | 
200 |     # Should find only items of the given search item type
201 |     results = await search_service.search(SearchQuery(entity_types=[SearchItemType.ENTITY]))
202 |     assert len(results) > 0
203 |     for r in results:
204 |         assert r.type == SearchItemType.ENTITY
205 | 
206 | 
207 | @pytest.mark.asyncio
208 | async def test_extract_entity_tags_exception_handling(search_service):
209 |     """Test the _extract_entity_tags method exception handling (lines 147-151)."""
210 |     from basic_memory.models.knowledge import Entity
211 | 
212 |     # Create entity with string tags that will cause parsing to fail and fall back to single tag
213 |     entity_with_invalid_tags = Entity(
214 |         title="Test Entity",
215 |         entity_type="test",
216 |         entity_metadata={"tags": "just a string"},  # This will fail ast.literal_eval
217 |         content_type="text/markdown",
218 |         file_path="test/test-entity.md",
219 |         project_id=1,
220 |     )
221 | 
222 |     # This should trigger the except block on lines 147-149
223 |     result = search_service._extract_entity_tags(entity_with_invalid_tags)
224 |     assert result == ["just a string"]
225 | 
226 |     # Test with empty string (should return empty list) - covers line 149
227 |     entity_with_empty_tags = Entity(
228 |         title="Test Entity Empty",
229 |         entity_type="test",
230 |         entity_metadata={"tags": ""},
231 |         content_type="text/markdown",
232 |         file_path="test/test-entity-empty.md",
233 |         project_id=1,
234 |     )
235 | 
236 |     result = search_service._extract_entity_tags(entity_with_empty_tags)
237 |     assert result == []
238 | 
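# The fallback these assertions exercise can be sketched as follows. This is a
# hypothetical helper (not the service's actual _extract_entity_tags), assuming
# stringified lists are parsed with ast.literal_eval and unparseable strings
# fall back to a single raw tag:
import ast


def _extract_tags_sketch(metadata):
    """Sketch: mirror the tag-extraction behavior asserted above."""
    tags = (metadata or {}).get("tags")
    if not tags:
        return []
    if isinstance(tags, list):
        return [str(t) for t in tags]
    try:
        parsed = ast.literal_eval(tags)  # e.g. "['docs', 'tools']" -> list
    except (ValueError, SyntaxError):
        return [tags]  # a plain string becomes a single tag
    return [str(t) for t in parsed] if isinstance(parsed, (list, tuple)) else [str(parsed)]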
239 | 
240 | @pytest.mark.asyncio
241 | async def test_delete_entity_without_permalink(search_service, sample_entity):
242 |     """Test deleting an entity that has no permalink (edge case)."""
243 | 
244 |     # Set the entity permalink to None to trigger the else branch on line 355
245 |     sample_entity.permalink = None
246 | 
247 |     # This should trigger the delete_by_entity_id path (line 355) in handle_delete
248 |     await search_service.handle_delete(sample_entity)
249 | 
250 | 
251 | @pytest.mark.asyncio
252 | async def test_no_criteria(search_service, test_graph):
253 |     """Test search with no criteria returns empty list."""
254 |     results = await search_service.search(SearchQuery())
255 |     assert len(results) == 0
256 | 
257 | 
258 | @pytest.mark.asyncio
259 | async def test_init_search_index(search_service, session_maker, app_config):
260 |     """Test search index initialization."""
261 |     from basic_memory.config import DatabaseBackend
262 | 
263 |     async with db.scoped_session(session_maker) as session:
264 |         # Use database-specific query to check table existence
265 |         if app_config.database_backend == DatabaseBackend.POSTGRES:
266 |             result = await session.execute(
267 |                 text("SELECT tablename FROM pg_catalog.pg_tables WHERE tablename='search_index';")
268 |             )
269 |         else:
270 |             result = await session.execute(
271 |                 text("SELECT name FROM sqlite_master WHERE type='table' AND name='search_index';")
272 |             )
273 |         assert result.scalar() == "search_index"
274 | 
275 | 
276 | @pytest.mark.asyncio
277 | async def test_update_index(search_service, full_entity):
278 |     """Test updating indexed content."""
279 |     await search_service.index_entity(full_entity)
280 | 
281 |     # Update entity
282 |     full_entity.title = "OMG I AM UPDATED"
283 |     await search_service.index_entity(full_entity)
284 | 
285 |     # Search for new title
286 |     results = await search_service.search(SearchQuery(text="OMG I AM UPDATED"))
287 |     assert len(results) > 1
288 | 
289 | 
290 | @pytest.mark.asyncio
291 | async def test_boolean_and_search(search_service, test_graph):
292 |     """Test boolean AND search."""
293 |     # Create an entity with specific terms for testing
294 |     # This assumes the test_graph fixture already has entities with relevant terms
295 | 
296 |     # Test AND operator - both terms must be present
297 |     results = await search_service.search(SearchQuery(text="Root AND Entity"))
298 |     assert len(results) >= 1
299 | 
300 |     # Verify the result contains both terms
301 |     found = False
302 |     for result in results:
303 |         if (result.title and "Root" in result.title and "Entity" in result.title) or (
304 |             result.content_snippet
305 |             and "Root" in result.content_snippet
306 |             and "Entity" in result.content_snippet
307 |         ):
308 |             found = True
309 |             break
310 |     assert found, "Boolean AND search failed to find items containing both terms"
311 | 
312 |     # Verify that items with only one term are not returned
313 |     results = await search_service.search(SearchQuery(text="NonexistentTerm AND Root"))
314 |     assert len(results) == 0, "Boolean AND search returned results when it shouldn't have"
315 | 
316 | 
317 | @pytest.mark.asyncio
318 | async def test_boolean_or_search(search_service, test_graph):
319 |     """Test boolean OR search."""
320 |     # Test OR operator - either term can be present
321 |     results = await search_service.search(SearchQuery(text="Root OR Connected"))
322 | 
323 |     # Should find both "Root Entity" and "Connected Entity"
324 |     assert len(results) >= 2
325 | 
326 |     # Verify we find items with either term
327 |     root_found = False
328 |     connected_found = False
329 | 
330 |     for result in results:
331 |         if result.permalink == "test/root":
332 |             root_found = True
333 |         elif "connected" in result.permalink.lower():
334 |             connected_found = True
335 | 
336 |     assert root_found, "Boolean OR search failed to find 'Root' term"
337 |     assert connected_found, "Boolean OR search failed to find 'Connected' term"
338 | 
339 | 
340 | @pytest.mark.asyncio
341 | async def test_boolean_not_search(search_service, test_graph):
342 |     """Test boolean NOT search."""
343 |     # Test NOT operator - exclude certain terms
344 |     results = await search_service.search(SearchQuery(text="Entity NOT Connected"))
345 | 
346 |     # Should find "Root Entity" but not "Connected Entity"
347 |     for result in results:
348 |         assert "connected" not in result.permalink.lower(), (
349 |             "Boolean NOT search returned excluded term"
350 |         )
351 | 
352 | 
353 | @pytest.mark.asyncio
354 | async def test_boolean_group_search(search_service, test_graph):
355 |     """Test boolean grouping with parentheses."""
356 |     # Test grouping - (A OR B) AND C
357 |     results = await search_service.search(SearchQuery(title="(Root OR Connected) AND Entity"))
358 | 
359 |     # Should find both entities that contain "Entity" and either "Root" or "Connected"
360 |     assert len(results) >= 2
361 | 
362 |     for result in results:
363 |         # Each result should contain "Entity" and either "Root" or "Connected"
364 |         contains_entity = "entity" in result.title.lower()
365 |         contains_root_or_connected = (
366 |             "root" in result.title.lower() or "connected" in result.title.lower()
367 |         )
368 | 
369 |         assert contains_entity and contains_root_or_connected, (
370 |             "Boolean grouped search returned incorrect results"
371 |         )
372 | 
373 | 
374 | @pytest.mark.asyncio
375 | async def test_boolean_operators_detection(search_service):
376 |     """Test detection of boolean operators in query."""
377 |     # Test various queries that should be detected as boolean
378 |     boolean_queries = [
379 |         "term1 AND term2",
380 |         "term1 OR term2",
381 |         "term1 NOT term2",
382 |         "(term1 OR term2) AND term3",
383 |         "complex (nested OR grouping) AND term",
384 |     ]
385 | 
386 |     for query_text in boolean_queries:
387 |         query = SearchQuery(text=query_text)
388 |         assert query.has_boolean_operators(), f"Failed to detect boolean operators in: {query_text}"
389 | 
390 |     # Test queries that should not be detected as boolean
391 |     non_boolean_queries = [
392 |         "normal search query",
393 |         "brand name",  # Should not detect "AND" within "brand"
394 |         "understand this concept",  # Should not detect "AND" within "understand"
395 |         "command line",
396 |         "sandbox testing",
397 |     ]
398 | 
399 |     for query_text in non_boolean_queries:
400 |         query = SearchQuery(text=query_text)
401 |         assert not query.has_boolean_operators(), (
402 |             f"Incorrectly detected boolean operators in: {query_text}"
403 |         )
404 | 
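# Word-boundary matching is one way to get the behavior asserted above: the
# uppercase operators AND/OR/NOT are detected only as whole words, so "brand",
# "understand", or "command" never trigger boolean parsing. This is a
# hypothetical sketch, not the actual SearchQuery.has_boolean_operators code.
import re

_BOOLEAN_OPS_RE = re.compile(r"\b(AND|OR|NOT)\b")


def has_boolean_operators_sketch(text):
    return bool(_BOOLEAN_OPS_RE.search(text))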
405 | 
406 | # Tests for frontmatter tag search functionality
407 | 
408 | 
409 | @pytest.mark.asyncio
410 | async def test_extract_entity_tags_list_format(search_service, session_maker):
411 |     """Test tag extraction from list format in entity metadata."""
412 |     from basic_memory.models import Entity
413 | 
414 |     entity = Entity(
415 |         title="Test Entity",
416 |         entity_type="note",
417 |         entity_metadata={"tags": ["business", "strategy", "planning"]},
418 |         content_type="text/markdown",
419 |         file_path="test/business-strategy.md",
420 |         project_id=1,
421 |     )
422 | 
423 |     tags = search_service._extract_entity_tags(entity)
424 |     assert tags == ["business", "strategy", "planning"]
425 | 
426 | 
427 | @pytest.mark.asyncio
428 | async def test_extract_entity_tags_string_format(search_service, session_maker):
429 |     """Test tag extraction from string format in entity metadata."""
430 |     from basic_memory.models import Entity
431 | 
432 |     entity = Entity(
433 |         title="Test Entity",
434 |         entity_type="note",
435 |         entity_metadata={"tags": "['documentation', 'tools', 'best-practices']"},
436 |         content_type="text/markdown",
437 |         file_path="test/docs.md",
438 |         project_id=1,
439 |     )
440 | 
441 |     tags = search_service._extract_entity_tags(entity)
442 |     assert tags == ["documentation", "tools", "best-practices"]
443 | 
444 | 
445 | @pytest.mark.asyncio
446 | async def test_extract_entity_tags_empty_list(search_service, session_maker):
447 |     """Test tag extraction from empty list in entity metadata."""
448 |     from basic_memory.models import Entity
449 | 
450 |     entity = Entity(
451 |         title="Test Entity",
452 |         entity_type="note",
453 |         entity_metadata={"tags": []},
454 |         content_type="text/markdown",
455 |         file_path="test/empty-tags.md",
456 |         project_id=1,
457 |     )
458 | 
459 |     tags = search_service._extract_entity_tags(entity)
460 |     assert tags == []
461 | 
462 | 
463 | @pytest.mark.asyncio
464 | async def test_extract_entity_tags_empty_string(search_service, session_maker):
465 |     """Test tag extraction from empty string in entity metadata."""
466 |     from basic_memory.models import Entity
467 | 
468 |     entity = Entity(
469 |         title="Test Entity",
470 |         entity_type="note",
471 |         entity_metadata={"tags": "[]"},
472 |         content_type="text/markdown",
473 |         file_path="test/empty-string-tags.md",
474 |         project_id=1,
475 |     )
476 | 
477 |     tags = search_service._extract_entity_tags(entity)
478 |     assert tags == []
479 | 
480 | 
481 | @pytest.mark.asyncio
482 | async def test_extract_entity_tags_no_metadata(search_service, session_maker):
483 |     """Test tag extraction when entity has no metadata."""
484 |     from basic_memory.models import Entity
485 | 
486 |     entity = Entity(
487 |         title="Test Entity",
488 |         entity_type="note",
489 |         entity_metadata=None,
490 |         content_type="text/markdown",
491 |         file_path="test/no-metadata.md",
492 |         project_id=1,
493 |     )
494 | 
495 |     tags = search_service._extract_entity_tags(entity)
496 |     assert tags == []
497 | 
498 | 
499 | @pytest.mark.asyncio
500 | async def test_extract_entity_tags_no_tags_key(search_service, session_maker):
501 |     """Test tag extraction when metadata exists but has no tags key."""
502 |     from basic_memory.models import Entity
503 | 
504 |     entity = Entity(
505 |         title="Test Entity",
506 |         entity_type="note",
507 |         entity_metadata={"title": "Some Title", "type": "note"},
508 |         content_type="text/markdown",
509 |         file_path="test/no-tags-key.md",
510 |         project_id=1,
511 |     )
512 | 
513 |     tags = search_service._extract_entity_tags(entity)
514 |     assert tags == []
515 | 
516 | 
517 | @pytest.mark.asyncio
518 | async def test_search_by_frontmatter_tags(search_service, session_maker, test_project):
519 |     """Test that entities can be found by searching for their frontmatter tags."""
520 |     from basic_memory.repository import EntityRepository
521 | 
522 |     entity_repo = EntityRepository(session_maker, project_id=test_project.id)
523 | 
524 |     # Create entity with tags
525 |     from datetime import datetime
526 | 
527 |     entity_data = {
528 |         "title": "Business Strategy Guide",
529 |         "entity_type": "note",
530 |         "entity_metadata": {"tags": ["business", "strategy", "planning", "organization"]},
531 |         "content_type": "text/markdown",
532 |         "file_path": "guides/business-strategy.md",
533 |         "permalink": "guides/business-strategy",
534 |         "project_id": test_project.id,
535 |         "created_at": datetime.now(),
536 |         "updated_at": datetime.now(),
537 |     }
538 | 
539 |     entity = await entity_repo.create(entity_data)
540 | 
541 |     await search_service.index_entity(entity, content="")
542 | 
543 |     # Search for entities by tag
544 |     results = await search_service.search(SearchQuery(text="business"))
545 |     assert len(results) >= 1
546 | 
547 |     # Check that our entity is in the results
548 |     entity_found = False
549 |     for result in results:
550 |         if result.title == "Business Strategy Guide":
551 |             entity_found = True
552 |             break
553 |     assert entity_found, "Entity with 'business' tag should be found in search results"
554 | 
555 |     # Test searching by another tag
556 |     results = await search_service.search(SearchQuery(text="planning"))
557 |     assert len(results) >= 1
558 | 
559 |     entity_found = False
560 |     for result in results:
561 |         if result.title == "Business Strategy Guide":
562 |             entity_found = True
563 |             break
564 |     assert entity_found, "Entity with 'planning' tag should be found in search results"
565 | 
566 | 
567 | @pytest.mark.asyncio
568 | async def test_search_by_frontmatter_tags_string_format(
569 |     search_service, session_maker, test_project
570 | ):
571 |     """Test that entities with string format tags can be found in search."""
572 |     from basic_memory.repository import EntityRepository
573 | 
574 |     entity_repo = EntityRepository(session_maker, project_id=test_project.id)
575 | 
576 |     # Create entity with tags in string format
577 |     from datetime import datetime
578 | 
579 |     entity_data = {
580 |         "title": "Documentation Guidelines",
581 |         "entity_type": "note",
582 |         "entity_metadata": {"tags": "['documentation', 'tools', 'best-practices']"},
583 |         "content_type": "text/markdown",
584 |         "file_path": "guides/documentation.md",
585 |         "permalink": "guides/documentation",
586 |         "project_id": test_project.id,
587 |         "created_at": datetime.now(),
588 |         "updated_at": datetime.now(),
589 |     }
590 | 
591 |     entity = await entity_repo.create(entity_data)
592 | 
593 |     await search_service.index_entity(entity, content="")
594 | 
595 |     # Search for entities by tag
596 |     results = await search_service.search(SearchQuery(text="documentation"))
597 |     assert len(results) >= 1
598 | 
599 |     # Check that our entity is in the results
600 |     entity_found = False
601 |     for result in results:
602 |         if result.title == "Documentation Guidelines":
603 |             entity_found = True
604 |             break
605 |     assert entity_found, "Entity with 'documentation' tag should be found in search results"
606 | 
607 | 
608 | @pytest.mark.asyncio
609 | async def test_search_special_characters_in_title(search_service, session_maker, test_project):
610 |     """Test that entities with special characters in titles can be searched without FTS5 syntax errors."""
611 |     from basic_memory.repository import EntityRepository
612 | 
613 |     entity_repo = EntityRepository(session_maker, project_id=test_project.id)
614 | 
615 |     # Create entities with special characters that could cause FTS5 syntax errors
616 |     special_titles = [
617 |         "Note with spaces",
618 |         "Note-with-dashes",
619 |         "Note_with_underscores",
620 |         "Note (with parentheses)",  # This is the problematic one
621 |         "Note & Symbols!",
622 |         "Note [with brackets]",
623 |         "Note {with braces}",
624 |         'Note "with quotes"',
625 |         "Note 'with apostrophes'",
626 |     ]
627 | 
628 |     entities = []
629 |     for i, title in enumerate(special_titles):
630 |         from datetime import datetime
631 | 
632 |         entity_data = {
633 |             "title": title,
634 |             "entity_type": "note",
635 |             "entity_metadata": {"tags": ["special", "characters"]},
636 |             "content_type": "text/markdown",
637 |             "file_path": f"special/{title}.md",
638 |             "permalink": f"special/note-{i}",
639 |             "project_id": test_project.id,
640 |             "created_at": datetime.now(),
641 |             "updated_at": datetime.now(),
642 |         }
643 | 
644 |         entity = await entity_repo.create(entity_data)
645 |         entities.append(entity)
646 | 
647 |     # Index all entities
648 |     for entity in entities:
649 |         await search_service.index_entity(entity, content="")
650 | 
651 |     # Test searching for each title - this should not cause FTS5 syntax errors
652 |     for title in special_titles:
653 |         results = await search_service.search(SearchQuery(title=title))
654 | 
655 |         # Should find the entity without throwing FTS5 syntax errors
656 |         entity_found = False
657 |         for result in results:
658 |             if result.title == title:
659 |                 entity_found = True
660 |                 break
661 | 
662 |         assert entity_found, f"Entity with title '{title}' should be found in search results"
663 | 
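# A standard way to make titles like these safe for FTS5 is to pass them as a
# quoted string, doubling any embedded double quotes, so parentheses, brackets,
# and '&' lose their query-syntax meaning. A sketch of the idea (not
# necessarily what the search repository actually does):
def fts5_quote_sketch(term):
    return '"' + term.replace('"', '""') + '"'


# e.g. fts5_quote_sketch("Note (with parentheses)") yields a single quoted
# FTS5 string token instead of a malformed grouped expression.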
664 | 
665 | @pytest.mark.asyncio
666 | async def test_search_title_with_parentheses_specific(search_service, session_maker, test_project):
667 |     """Test searching specifically for title with parentheses to reproduce FTS5 error."""
668 |     from basic_memory.repository import EntityRepository
669 | 
670 |     entity_repo = EntityRepository(session_maker, project_id=test_project.id)
671 | 
672 |     # Create the problematic entity
673 |     from datetime import datetime
674 | 
675 |     entity_data = {
676 |         "title": "Note (with parentheses)",
677 |         "entity_type": "note",
678 |         "entity_metadata": {"tags": ["test"]},
679 |         "content_type": "text/markdown",
680 |         "file_path": "special/Note (with parentheses).md",
681 |         "permalink": "special/note-with-parentheses",
682 |         "project_id": test_project.id,
683 |         "created_at": datetime.now(),
684 |         "updated_at": datetime.now(),
685 |     }
686 | 
687 |     entity = await entity_repo.create(entity_data)
688 | 
689 |     # Index the entity
690 |     await search_service.index_entity(entity, content="")
691 | 
692 |     # Test searching for the title - this should not cause FTS5 syntax errors
693 |     search_query = SearchQuery(title="Note (with parentheses)")
694 |     results = await search_service.search(search_query)
695 | 
696 |     # Should find the entity without throwing FTS5 syntax errors
697 |     assert len(results) >= 1
698 |     assert any(result.title == "Note (with parentheses)" for result in results)
699 | 
700 | 
701 | @pytest.mark.asyncio
702 | async def test_search_title_via_repository_direct(search_service, session_maker, test_project):
703 |     """Test searching via search repository directly to isolate the FTS5 error."""
704 |     from basic_memory.repository import EntityRepository
705 | 
706 |     entity_repo = EntityRepository(session_maker, project_id=test_project.id)
707 | 
708 |     # Create the problematic entity
709 |     from datetime import datetime
710 | 
711 |     entity_data = {
712 |         "title": "Note (with parentheses)",
713 |         "entity_type": "note",
714 |         "entity_metadata": {"tags": ["test"]},
715 |         "content_type": "text/markdown",
716 |         "file_path": "special/Note (with parentheses).md",
717 |         "permalink": "special/note-with-parentheses",
718 |         "project_id": test_project.id,
719 |         "created_at": datetime.now(),
720 |         "updated_at": datetime.now(),
721 |     }
722 | 
723 |     entity = await entity_repo.create(entity_data)
724 | 
725 |     # Index the entity
726 |     await search_service.index_entity(entity, content="")
727 | 
728 |     # Test searching via repository directly - this reproduces the error path
729 |     results = await search_service.repository.search(
730 |         title="Note (with parentheses)",
731 |         limit=10,
732 |         offset=0,
733 |     )
734 | 
735 |     # Should find the entity without throwing FTS5 syntax errors
736 |     assert len(results) >= 1
737 |     assert any(result.title == "Note (with parentheses)" for result in results)
738 | 
739 | 
740 | # Tests for duplicate observation permalink deduplication
741 | 
742 | 
743 | @pytest.mark.asyncio
744 | async def test_index_entity_with_duplicate_observations(
745 |     search_service, session_maker, test_project
746 | ):
747 |     """Test that indexing an entity with duplicate observations doesn't cause unique constraint violations.
748 | 
749 |     Two observations with the same category and content generate identical permalinks,
750 |     which would violate the unique constraint on the search_index table.
751 |     """
752 |     from basic_memory.repository import EntityRepository, ObservationRepository
753 |     from datetime import datetime
754 | 
755 |     entity_repo = EntityRepository(session_maker, project_id=test_project.id)
756 |     obs_repo = ObservationRepository(session_maker, project_id=test_project.id)
757 | 
758 |     # Create entity
759 |     entity_data = {
760 |         "title": "Entity With Duplicate Observations",
761 |         "entity_type": "note",
762 |         "entity_metadata": {},
763 |         "content_type": "text/markdown",
764 |         "file_path": "test/duplicate-obs.md",
765 |         "permalink": "test/duplicate-obs",
766 |         "project_id": test_project.id,
767 |         "created_at": datetime.now(),
768 |         "updated_at": datetime.now(),
769 |     }
770 | 
771 |     entity = await entity_repo.create(entity_data)
772 | 
773 |     # Create duplicate observations - same category and content
774 |     duplicate_content = "This is a duplicated observation"
775 |     await obs_repo.create(
776 |         {"entity_id": entity.id, "category": "note", "content": duplicate_content}
777 |     )
778 |     await obs_repo.create(
779 |         {"entity_id": entity.id, "category": "note", "content": duplicate_content}
780 |     )
781 | 
782 |     # Reload entity with observations (get_by_permalink eagerly loads observations)
783 |     entity = await entity_repo.get_by_permalink("test/duplicate-obs")
784 | 
785 |     # Verify we have duplicate observations
786 |     assert len(entity.observations) == 2
787 |     assert entity.observations[0].permalink == entity.observations[1].permalink
788 | 
789 |     # This should not raise a unique constraint violation
790 |     await search_service.index_entity(entity, content="")
791 | 
792 |     # Verify entity is searchable
793 |     results = await search_service.search(SearchQuery(text="Duplicate Observations"))
794 |     assert len(results) >= 1
795 |     assert any(r.title == "Entity With Duplicate Observations" for r in results)
796 | 
797 | 
798 | @pytest.mark.asyncio
799 | async def test_index_entity_dedupes_observations_by_permalink(
800 |     search_service, session_maker, test_project
801 | ):
802 |     """Test that only unique observation permalinks are indexed.
803 | 
804 |     When an entity has observations with identical permalinks, only the first one
805 |     should be indexed to avoid unique constraint violations.
806 |     """
807 |     from basic_memory.repository import EntityRepository, ObservationRepository
808 |     from datetime import datetime
809 | 
810 |     entity_repo = EntityRepository(session_maker, project_id=test_project.id)
811 |     obs_repo = ObservationRepository(session_maker, project_id=test_project.id)
812 | 
813 |     # Create entity
814 |     entity_data = {
815 |         "title": "Dedupe Test Entity",
816 |         "entity_type": "note",
817 |         "entity_metadata": {},
818 |         "content_type": "text/markdown",
819 |         "file_path": "test/dedupe-test.md",
820 |         "permalink": "test/dedupe-test",
821 |         "project_id": test_project.id,
822 |         "created_at": datetime.now(),
823 |         "updated_at": datetime.now(),
824 |     }
825 | 
826 |     entity = await entity_repo.create(entity_data)
827 | 
828 |     # Create three observations: two duplicates and one unique
829 |     duplicate_content = "Duplicate observation content"
830 |     unique_content = "Unique observation content"
831 | 
832 |     await obs_repo.create(
833 |         {"entity_id": entity.id, "category": "note", "content": duplicate_content}
834 |     )
835 |     await obs_repo.create(
836 |         {"entity_id": entity.id, "category": "note", "content": duplicate_content}
837 |     )
838 |     await obs_repo.create({"entity_id": entity.id, "category": "note", "content": unique_content})
839 | 
840 |     # Reload entity with observations (get_by_permalink eagerly loads observations)
841 |     entity = await entity_repo.get_by_permalink("test/dedupe-test")
842 |     assert len(entity.observations) == 3
843 | 
844 |     # Index the entity
845 |     await search_service.index_entity(entity, content="")
846 | 
847 |     # Search for the unique observation - should find it
848 |     results = await search_service.search(SearchQuery(text="Unique observation"))
849 |     assert len(results) >= 1
850 | 
851 |     # Search for duplicate observation - should find it (only one indexed)
852 |     results = await search_service.search(SearchQuery(text="Duplicate observation"))
853 |     assert len(results) >= 1
854 | 
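# The first-wins deduplication described in the docstring above can be
# sketched as a seen-set filter keyed on permalink (hypothetical; the indexing
# code may implement this differently):
def dedupe_by_permalink_sketch(observations):
    seen = set()
    unique = []
    for obs in observations:
        permalink = obs["permalink"]
        if permalink not in seen:  # keep only the first occurrence
            seen.add(permalink)
            unique.append(obs)
    return unique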
855 | 
856 | @pytest.mark.asyncio
857 | async def test_index_entity_multiple_categories_same_content(
858 |     search_service, session_maker, test_project
859 | ):
860 |     """Test that observations with same content but different categories are not deduped.
861 | 
862 |     The permalink includes the category, so observations with different categories
863 |     but same content should have different permalinks and both be indexed.
864 |     """
865 |     from basic_memory.repository import EntityRepository, ObservationRepository
866 |     from datetime import datetime
867 | 
868 |     entity_repo = EntityRepository(session_maker, project_id=test_project.id)
869 |     obs_repo = ObservationRepository(session_maker, project_id=test_project.id)
870 | 
871 |     # Create entity
872 |     entity_data = {
873 |         "title": "Multi Category Entity",
874 |         "entity_type": "note",
875 |         "entity_metadata": {},
876 |         "content_type": "text/markdown",
877 |         "file_path": "test/multi-category.md",
878 |         "permalink": "test/multi-category",
879 |         "project_id": test_project.id,
880 |         "created_at": datetime.now(),
881 |         "updated_at": datetime.now(),
882 |     }
883 | 
884 |     entity = await entity_repo.create(entity_data)
885 | 
886 |     # Create observations with same content but different categories
887 |     shared_content = "Shared content across categories"
888 |     await obs_repo.create({"entity_id": entity.id, "category": "tech", "content": shared_content})
889 |     await obs_repo.create({"entity_id": entity.id, "category": "design", "content": shared_content})
890 | 
891 |     # Reload entity with observations (get_by_permalink eagerly loads observations)
892 |     entity = await entity_repo.get_by_permalink("test/multi-category")
893 |     assert len(entity.observations) == 2
894 | 
895 |     # Verify permalinks are different due to different categories
896 |     permalinks = {obs.permalink for obs in entity.observations}
897 |     assert len(permalinks) == 2  # Should be 2 unique permalinks
898 | 
899 |     # Index the entity - both should be indexed since permalinks differ
900 |     await search_service.index_entity(entity, content="")
901 | 
902 |     # Search for the shared content - should find both observations
903 |     results = await search_service.search(SearchQuery(text="Shared content"))
904 |     assert len(results) >= 2
905 | 
```

--------------------------------------------------------------------------------
/specs/SPEC-8 TigrisFS Integration.md:
--------------------------------------------------------------------------------

```markdown
  1 | ---
  2 | title: 'SPEC-8: TigrisFS Integration for Tenant API'
  3 | Date: September 22, 2025
  4 | Status: Phase 3.6 Complete - Tenant Mount API Endpoints Ready for CLI Implementation
  5 | Priority: High
  6 | Goal: Replace Fly volumes with Tigris bucket provisioning in production tenant API
  7 | permalink: spec-8-tigris-fs-integration
  8 | ---
  9 | 
 10 | ## Executive Summary
 11 | 
 12 | Based on SPEC-7 Phase 4 POC testing, this spec outlines productizing the TigrisFS/rclone implementation in the Basic Memory Cloud tenant API. 
 13 | We're moving from proof-of-concept to production integration, replacing Fly volume storage with Tigris bucket-per-tenant architecture.
 14 | 
 15 | ## Current Architecture (Fly Volumes)
 16 | 
 17 | ### Tenant Provisioning Flow
 18 | ```python
 19 | # apps/cloud/src/basic_memory_cloud/workflows/tenant_provisioning.py
 20 | async def provision_tenant_infrastructure(tenant_id: str):
 21 |     # 1. Create Fly app
 22 |     # 2. Create Fly volume  ← REPLACE THIS
 23 |     # 3. Deploy API container with volume mount
 24 |     # 4. Configure health checks
 25 | ```
 26 | 
 27 | ### Storage Implementation
 28 | - Each tenant gets dedicated Fly volume (1GB-10GB)
 29 | - Volume mounted at `/app/data` in API container
 30 | - Local filesystem storage with Basic Memory indexing
 31 | - No global caching or edge distribution
 32 | 
 33 | ## Proposed Architecture (Tigris Buckets)
 34 | 
 35 | ### New Tenant Provisioning Flow
 36 | ```python
 37 | async def provision_tenant_infrastructure(tenant_id: str):
 38 |     # 1. Create Fly app
 39 |     # 2. Create Tigris bucket with admin credentials  ← NEW
 40 |     # 3. Store bucket name in tenant record  ← NEW
 41 |     # 4. Deploy API container with TigrisFS mount using admin credentials
 42 |     # 5. Configure health checks
 43 | ```
 44 | 
 45 | ### Storage Implementation
 46 | - Each tenant gets dedicated Tigris bucket
 47 | - TigrisFS mounts bucket at `/app/data` in API container
 48 | - Global edge caching and distribution
 49 | - Configurable cache TTL for sync performance
 50 | 
 51 | ## Implementation Plan
 52 | 
 53 | ### Phase 1: Bucket Provisioning Service
 54 | 
 55 | **✅ IMPLEMENTED: StorageClient with Admin Credentials**
 56 | ```python
 57 | # apps/cloud/src/basic_memory_cloud/clients/storage_client.py
 58 | class StorageClient:
 59 |     async def create_tenant_bucket(self, tenant_id: UUID) -> TigrisBucketCredentials
 60 |     async def delete_tenant_bucket(self, tenant_id: UUID, bucket_name: str) -> bool
 61 |     async def list_buckets(self) -> list[TigrisBucketResponse]
 62 |     async def test_tenant_credentials(self, credentials: TigrisBucketCredentials) -> bool
 63 | ```
 64 | 
 65 | **Simplified Architecture Using Admin Credentials:**
 66 | - Single admin access key with full Tigris permissions (configured in console)
 67 | - No tenant-specific IAM user creation needed
 68 | - Bucket-per-tenant isolation for logical separation
 69 | - Admin credentials shared across all tenant operations
 70 | 
 71 | **Integrate with Provisioning workflow:**
 72 | ```python
 73 | # Update tenant_provisioning.py
 74 | async def provision_tenant_infrastructure(tenant_id: str):
 75 |     storage_client = StorageClient(settings.aws_access_key_id, settings.aws_secret_access_key)
 76 |     bucket_creds = await storage_client.create_tenant_bucket(tenant_id)
 77 |     await store_bucket_name(tenant_id, bucket_creds.bucket_name)
 78 |     await deploy_api_with_tigris(tenant_id, bucket_creds)
 79 | ```
 80 | 
 81 | ### Phase 2: Simplified Bucket Management
 82 | 
 83 | **✅ SIMPLIFIED: Admin Credentials + Bucket Names Only**
 84 | 
 85 | Since we use admin credentials for all operations, we only need to track bucket names per tenant:
 86 | 
 87 | 1. **Primary Storage (Fly Secrets)**
 88 |    ```bash
 89 |    flyctl secrets set -a basic-memory-{tenant_id} \
 90 |      AWS_ACCESS_KEY_ID="{admin_access_key}" \
 91 |      AWS_SECRET_ACCESS_KEY="{admin_secret_key}" \
 92 |      AWS_ENDPOINT_URL_S3="https://fly.storage.tigris.dev" \
 93 |      AWS_REGION="auto" \
 94 |      BUCKET_NAME="basic-memory-{tenant_id}"
 95 |    ```
 96 | 
 97 | 2. **Database Storage (Bucket Name Only)**
 98 |    ```python
 99 |    # apps/cloud/src/basic_memory_cloud/models/tenant.py
100 |    class Tenant(BaseModel):
101 |        # ... existing fields
102 |        tigris_bucket_name: Optional[str] = None  # Just store bucket name
103 |        tigris_region: str = "auto"
104 |        created_at: datetime
105 |    ```
106 | 
107 | **Benefits of Simplified Approach:**
108 | - No credential encryption/decryption needed
109 | - Admin credentials managed centrally in environment
110 | - Only bucket names stored in database (not sensitive)
111 | - Simplified backup/restore scenarios
112 | - Reduced security attack surface
113 | 
114 | ### Phase 3: API Container Updates
115 | 
116 | **Update API container configuration:**
117 | ```dockerfile
118 | # apps/api/Dockerfile
119 | # Add TigrisFS installation
120 | RUN curl -L https://github.com/tigrisdata/tigrisfs/releases/latest/download/tigrisfs-linux-amd64 \
121 |     -o /usr/local/bin/tigrisfs && chmod +x /usr/local/bin/tigrisfs
122 | ```
123 | 
124 | **Startup script integration:**
125 | ```bash
126 | # apps/api/tigrisfs-startup.sh (already exists)
127 | # Mount TigrisFS → Start Basic Memory API
128 | exec python -m basic_memory_cloud_api.main
129 | ```
130 | 
131 | **Fly.toml environment (optimized for < 5s startup):**
132 | ```toml
133 | # apps/api/fly.tigris-production.toml
134 | [env]
135 |   TIGRISFS_MEMORY_LIMIT = '1024'     # Reduced for faster init
136 |   TIGRISFS_MAX_FLUSHERS = '16'       # Fewer threads for faster startup
137 |   TIGRISFS_STAT_CACHE_TTL = '30s'    # Balance sync speed vs startup
138 |   TIGRISFS_LAZY_INIT = 'true'        # Enable lazy loading
139 |   BASIC_MEMORY_HOME = '/app/data'
140 | 
141 | # Suspend optimization for wake-on-network
142 | [machine]
143 |   auto_stop_machines = "suspend"     # Faster than full stop
144 |   auto_start_machines = true
145 |   min_machines_running = 0
146 | ```
147 | 
148 | ### Phase 4: Local Access Features
149 | 
150 | **CLI automation for local mounting:**
151 | ```python
152 | # New CLI command: basic-memory cloud mount
153 | async def setup_local_mount(tenant_id: str):
154 |     # 1. Fetch bucket credentials from cloud API
155 |     # 2. Configure rclone with scoped IAM policy
156 |     # 3. Mount via rclone nfsmount (macOS) or FUSE (Linux)
157 |     # 4. Start Basic Memory sync watcher
158 | ```
159 | 
160 | **Local mount configuration:**
161 | ```bash
162 | # rclone config for tenant
163 | rclone mount basic-memory-{tenant_id}: ~/basic-memory-{tenant_id} \
164 |   --nfs-mount \
165 |   --vfs-cache-mode writes \
166 |   --cache-dir ~/.cache/rclone/basic-memory-{tenant_id}
167 | ```
168 | 
169 | ### Phase 5: TigrisFS Cache Sync Solutions
170 | 
171 | **Problem**: When files are uploaded via CLI/bisync, the tenant API container doesn't see them immediately due to TigrisFS cache (30s TTL) and lack of inotify events on mounted filesystems.
172 | 
173 | **Multi-Layer Solution:**
174 | 
175 | **Layer 1: API Sync Endpoint** (Immediate)
176 | ```python
177 | # POST /sync - Force TigrisFS cache refresh
178 | # Callable by CLI after uploads
179 | subprocess.run(["sync", "--file-system", "/app/data"], check=True)  # flush the filesystem containing /app/data
180 | ```
181 | 
182 | **Layer 2: Tigris Webhook Integration** (Real-time)
183 | https://www.tigrisdata.com/docs/buckets/object-notifications/#webhook
184 | ```python
185 | # Webhook endpoint for bucket changes
186 | @app.post("/webhooks/tigris/{tenant_id}")
187 | async def handle_bucket_notification(tenant_id: str, event: TigrisEvent):
188 |     if event.eventName in ["OBJECT_CREATED_PUT", "OBJECT_DELETED"]:
189 |         await notify_container_sync(tenant_id, event.object.key)
190 | ```
191 | 
192 | **Layer 3: CLI Sync Notification** (User-triggered)
193 | ```bash
194 | # CLI calls container sync endpoint after successful bisync
195 | basic-memory cloud bisync  # Automatically notifies container
196 | curl -X POST https://basic-memory-{tenant-id}.fly.dev/sync
197 | ```
198 | 
199 | **Layer 4: Periodic Sync Fallback** (Safety net)
200 | ```python
201 | # Background task: fsync /app/data every 30s as fallback
202 | # Ensures eventual consistency even if other layers fail
203 | ```
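
A minimal sketch of this Layer 4 fallback, assuming an asyncio-based service (`periodic_sync`, `build_sync_command`, and the fixed 30s interval are illustrative, not the shipped implementation):

```python
import asyncio
import subprocess

SYNC_INTERVAL_SECONDS = 30  # matches the TigrisFS stat cache TTL


def build_sync_command(data_dir: str = "/app/data") -> list[str]:
    # `sync --file-system PATH` flushes the filesystem containing PATH (GNU coreutils)
    return ["sync", "--file-system", data_dir]


async def periodic_sync(data_dir: str = "/app/data") -> None:
    """Layer 4 safety net: flush the mounted bucket every interval, forever."""
    while True:
        await asyncio.sleep(SYNC_INTERVAL_SECONDS)
        # run the blocking call in a worker thread so the event loop stays responsive
        await asyncio.to_thread(subprocess.run, build_sync_command(data_dir), check=False)
```

In a FastAPI app this loop would typically be started from a lifespan/startup hook and cancelled on shutdown.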
204 | 
205 | **Implementation Priority:**
206 | 1. Layer 1 (API endpoint) - Quick testing capability
207 | 2. Layer 3 (CLI integration) - Improved UX
208 | 3. Layer 4 (Periodic fallback) - Safety net
209 | 4. Layer 2 (Webhooks) - Production real-time sync
210 | 
211 | 
212 | ## Performance Targets
213 | 
214 | ### Sync Latency
215 | - **Target**: < 5 seconds local→cloud→container
216 | - **Configuration**: `TIGRISFS_STAT_CACHE_TTL = '5s'`
217 | - **Monitoring**: Track sync metrics in production
218 | 
219 | ### Container Startup
220 | - **Target**: < 5 seconds including TigrisFS mount
221 | - **Fast retry**: 0.5s intervals for mount verification
222 | - **Fallback**: Container fails fast if mount fails
223 | 
224 | ### Memory Usage
225 | - **TigrisFS cache**: 2GB memory limit per container
226 | - **Concurrent uploads**: 32 flushers max
227 | - **VM sizing**: shared-cpu-2x (2048 MB) minimum
228 | 
229 | ## Security Considerations
230 | 
231 | ### Bucket Isolation
232 | - Each tenant has dedicated bucket
233 | - IAM policies prevent cross-tenant access
234 | - No shared bucket with subdirectories
235 | 
236 | ### Credential Security
237 | - Fly secrets for runtime access
238 | - Encrypted database backup for disaster recovery
239 | - Credential rotation capability
240 | 
241 | ### Data Residency
242 | - Tigris global edge caching
243 | - SOC2 Type II compliance
244 | - Encryption at rest and in transit
245 | 
246 | ## Operational Benefits
247 | 
248 | ### Scalability
249 | - Horizontal scaling with stateless API containers
250 | - Global edge distribution
251 | - Better resource utilization
252 | 
253 | ### Reliability
254 | - No cold starts between tenants
255 | - Built-in redundancy and caching
256 | - Simplified backup strategy
257 | 
258 | ### Cost Efficiency
259 | - Pay-per-use storage pricing
260 | - Shared infrastructure benefits
261 | - Reduced operational overhead
262 | 
263 | ## Risk Mitigation
264 | 
265 | ### Data Loss Prevention
266 | - Dual credential storage (Fly + database)
267 | - Automated backup workflows to R2/S3
268 | - Tigris built-in redundancy
269 | 
270 | ### Performance Degradation
271 | - Configurable cache settings per tenant
272 | - Monitoring and alerting on sync latency
273 | - Fallback to volume storage if needed
274 | 
275 | ### Security Vulnerabilities
276 | - Bucket-per-tenant isolation
277 | - Regular credential rotation
278 | - Security scanning and monitoring
279 | 
280 | ## Success Metrics
281 | 
282 | ### Technical Metrics
283 | - Sync latency P50 < 5 seconds
284 | - Container startup time < 5 seconds
285 | - Zero data loss incidents
286 | - 99.9% uptime per tenant
287 | 
288 | ### Business Metrics
289 | - Reduced infrastructure costs vs volumes
290 | - Improved user experience with faster sync
291 | - Enhanced enterprise security posture
292 | - Simplified operational overhead
293 | 
294 | ## Open Questions
295 | 
296 | 1. **Tigris rate limits**: What are the API limits for bucket creation?
297 | 2. **Cost analysis**: What's the break-even point vs Fly volumes?
298 | 3. **Regional preferences**: Should enterprise customers choose regions?
299 | 4. **Backup retention**: How long to keep automated backups?
300 | 
301 | ## Implementation Checklist
302 | 
303 | ### Phase 1: Bucket Provisioning Service ✅ COMPLETED
304 | - [x] **Research Tigris bucket API** - Document bucket creation and S3 API compatibility
305 | - [x] **Create StorageClient class** - Implemented with admin credentials and comprehensive integration tests
306 | - [x] **Test bucket creation** - Full test suite validates API integration with real Tigris environment
307 | - [x] **Add bucket provisioning to DBOS workflow** - Integrated StorageClient with tenant_provisioning.py
308 | 
309 | ### Phase 2: Simplified Bucket Management ✅ COMPLETED
310 | - [x] **Update Tenant model** with tigris_bucket_name field (replaced fly_volume_id)
311 | - [x] **Implement bucket name storage** - Database migration and model updates completed
312 | - [x] **Test bucket provisioning integration** - Full test suite validates workflow from tenant creation to bucket assignment
313 | - [x] **Remove volume logic from all tests** - Complete migration from volume-based to bucket-based architecture
314 | 
315 | ### Phase 3: API Container Integration ✅ COMPLETED
316 | - [x] **Update Dockerfile** to install TigrisFS binary in API container with configurable version
317 | - [x] **Optimize tigrisfs-startup.sh** with production-ready security and reliability improvements
318 | - [x] **Create production-ready container** with proper signal handling and mount validation
319 | - [x] **Implement security fixes** based on Claude code review (conditional debug, credential protection)
320 | - [x] **Add proper process supervision** with cleanup traps and error handling
321 | - [x] **Remove debug artifacts** - Cleaned up all debug Dockerfiles and test scripts
322 | 
323 | ### Phase 3.5: IAM Access Key Management ✅ COMPLETED
324 | - [x] **Research Tigris IAM API** - Documented create_policy, attach_user_policy, delete_access_key operations
325 | - [x] **Implement bucket-scoped credential generation** - StorageClient.create_tenant_access_keys() with IAM policies
326 | - [x] **Add comprehensive security test suite** - 5 security-focused integration tests covering all attack vectors
327 | - [x] **Verify cross-bucket access prevention** - Scoped credentials can ONLY access their designated bucket
328 | - [x] **Test credential lifecycle management** - Create, validate, delete, and revoke access keys
329 | - [x] **Validate admin vs scoped credential isolation** - Different access patterns and security boundaries
330 | - [x] **Test multi-tenant isolation** - Multiple tenants cannot access each other's buckets
331 | 
332 | ### Phase 3.6: Tenant Mount API Endpoints ✅ COMPLETED
333 | - [x] **Implement GET /tenant/mount/info** - Returns mount info without exposing credentials
334 | - [x] **Implement POST /tenant/mount/credentials** - Creates new bucket-scoped credentials for CLI mounting
335 | - [x] **Implement DELETE /tenant/mount/credentials/{cred_id}** - Revoke specific credentials with proper cleanup
336 | - [x] **Implement GET /tenant/mount/credentials** - List active credentials without exposing secrets
337 | - [x] **Add TenantMountCredentials database model** - Tracks credential metadata (no secret storage)
338 | - [x] **Create comprehensive test suite** - 28 tests covering all scenarios including multi-session support
339 | - [x] **Implement multi-session credential flow** - Multiple active credentials per tenant supported
340 | - [x] **Secure credential handling** - Secret keys never stored, returned once only for immediate use
341 | - [x] **Add dependency injection for StorageClient** - Clean integration with existing API architecture
342 | - [x] **Fix Tigris configuration for cloud service** - Added AWS environment variables to fly.template.toml
343 | - [x] **Update tenant machine configurations** - Include AWS credentials for TigrisFS mounting with clear credential strategy
344 | 
345 | **Security Test Results:**
346 | ```
347 | ✅ Cross-bucket access prevention - PASS
348 | ✅ Deleted credentials access revoked - PASS
349 | ✅ Invalid credentials rejected - PASS
350 | ✅ Admin vs scoped credential isolation - PASS
351 | ✅ Multiple scoped credentials isolation - PASS
352 | ```
353 | 
354 | **Implementation Details:**
355 | - Uses Tigris IAM managed policies (create_policy + attach_user_policy)
356 | - Bucket-scoped S3 policies with Actions: GetObject, PutObject, DeleteObject, ListBucket
357 | - Resource ARNs limited to specific bucket: `arn:aws:s3:::bucket-name` and `arn:aws:s3:::bucket-name/*`
358 | - Access keys follow Tigris format: `tid_` prefix with secure random suffix
359 | - Complete cleanup on deletion removes both access keys and associated policies
360 | 
361 | ### Phase 4: Local Access CLI
362 | - [x] **Design local mount CLI command** for automated rclone configuration
363 | - [x] **Implement credential fetching** from cloud API for local setup
364 | - [x] **Create rclone config automation** for tenant-specific bucket mounting
365 | - [x] **Test local→cloud→container sync** with optimized cache settings
366 | - [x] **Document local access setup** for beta users
367 | 
368 | ### Phase 5: Webhook Integration (Future)
369 | - [ ] **Research Tigris webhook API** for object notifications and payload format
370 | - [ ] **Design webhook endpoint** for real-time sync notifications
371 | - [ ] **Implement notification handling** to trigger Basic Memory sync events
372 | - [ ] **Test webhook delivery** and sync latency improvements
373 | 
374 | ## Success Metrics
375 | - [ ] **Container startup < 5 seconds** including TigrisFS mount and Basic Memory init
376 | - [ ] **Sync latency < 5 seconds** for local→cloud→container file changes
377 | - [ ] **Zero data loss** during bucket provisioning and credential management
378 | - [ ] **100% test coverage** for new TigrisBucketService and credential functions
379 | - [ ] **Beta deployment** with internal users validating local-cloud workflow
380 | 
381 | 
382 | 
383 | ## Implementation Notes
384 | 
385 | ## Phase 4.1: Bidirectional Sync with rclone bisync (NEW)
386 | 
387 | ### Problem Statement
388 | During testing, we discovered that some applications (particularly Obsidian) don't detect file changes over NFS mounts. Rather than building a custom sync daemon, we can leverage `rclone bisync` - rclone's built-in bidirectional synchronization feature.
389 | 
390 | ### Solution: rclone bisync
391 | Use rclone's proven bidirectional sync instead of custom implementation:
392 | 
393 | **Core Architecture:**
394 | ```bash
395 | # rclone bisync handles all the complexity
396 | rclone bisync ~/basic-memory-{tenant_id} basic-memory-{tenant_id}:{bucket_name} \
397 |   --create-empty-src-dirs \
398 |   --conflict-resolve newer \
399 |   --resilient \
400 |   --check-access
401 | ```
402 | 
403 | **Key Benefits:**
404 | - ✅ **Battle-tested**: Production-proven rclone functionality
405 | - ✅ **MIT licensed**: Open source with permissive licensing
406 | - ✅ **No custom code**: Zero maintenance burden for sync logic
407 | - ✅ **Built-in safety**: max-delete protection, conflict resolution
408 | - ✅ **Simple installation**: Works with Homebrew rclone (no FUSE needed)
409 | - ✅ **File watcher compatible**: Works with Obsidian and all applications
410 | - ✅ **Offline support**: Can work offline and sync when connected
411 | 
412 | ### bisync Conflict Resolution Options
413 | 
414 | **Built-in conflict strategies:**
415 | ```bash
416 | --conflict-resolve none     # Keep both files with .conflict suffixes (safest)
417 | --conflict-resolve newer    # Always pick the most recently modified file
418 | --conflict-resolve larger   # Choose based on file size
419 | --conflict-resolve path1    # Always prefer local changes
420 | --conflict-resolve path2    # Always prefer cloud changes
421 | ```
422 | 
423 | ### Sync Profiles Using bisync
424 | 
425 | **Profile configurations:**
426 | ```python
427 | BISYNC_PROFILES = {
428 |     "safe": {
429 |         "conflict_resolve": "none",      # Keep both versions
430 |         "max_delete": 10,                # Prevent mass deletion
431 |         "check_access": True,            # Verify sync integrity
432 |         "description": "Safe mode with conflict preservation"
433 |     },
434 |     "balanced": {
435 |         "conflict_resolve": "newer",     # Auto-resolve to newer file
436 |         "max_delete": 25,
437 |         "check_access": True,
438 |         "description": "Balanced mode (recommended default)"
439 |     },
440 |     "fast": {
441 |         "conflict_resolve": "newer",
442 |         "max_delete": 50,
443 |         "check_access": False,           # Skip verification for speed
444 |         "description": "Fast mode for rapid iteration"
445 |     }
446 | }
447 | ```
448 | 
449 | ### CLI Commands
450 | 
451 | **Manual sync commands:**
452 | ```bash
453 | basic-memory cloud bisync                    # Manual bidirectional sync
454 | basic-memory cloud bisync --dry-run          # Preview changes
455 | basic-memory cloud bisync --profile safe     # Use specific profile
456 | basic-memory cloud bisync --resync           # Force full baseline resync
457 | ```
458 | 
459 | **Watch mode (Step 1):**
460 | ```bash
461 | basic-memory cloud bisync --watch            # Long-running process, sync every 60s
462 | basic-memory cloud bisync --watch --interval 30s  # Custom interval
463 | ```
464 | 
465 | **System integration (Step 2 - Future):**
466 | ```bash
467 | basic-memory cloud bisync-service install    # Install as system service
468 | basic-memory cloud bisync-service start      # Start background service
469 | basic-memory cloud bisync-service status     # Check service status
470 | ```
471 | 
472 | ### Implementation Strategy
473 | 
474 | **Phase 4.1.1: Core bisync Implementation**
475 | - [ ] Implement `run_bisync()` function wrapping rclone bisync
476 | - [ ] Add profile-based configuration (safe/balanced/fast)
477 | - [ ] Create conflict resolution and safety options
478 | - [ ] Test with sample files and conflict scenarios
479 | 
480 | **Phase 4.1.2: Watch Mode**
481 | - [ ] Add `--watch` flag for continuous sync
482 | - [ ] Implement configurable sync intervals
483 | - [ ] Add graceful shutdown and signal handling
484 | - [ ] Create status monitoring and progress indicators
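
The watch-mode items above could be sketched as a single loop with signal-driven shutdown (a hypothetical `watch_loop`, not the final CLI implementation; `run_sync` stands in for one bisync pass):

```python
import signal
import threading
from typing import Callable, Optional


def watch_loop(
    run_sync: Callable[[], None],
    interval_seconds: float = 60.0,
    stop_event: Optional[threading.Event] = None,
) -> int:
    """Run run_sync, then wait interval_seconds, until SIGINT/SIGTERM fires.

    Returns the number of completed sync passes.
    """
    stop = stop_event or threading.Event()
    try:
        for sig in (signal.SIGINT, signal.SIGTERM):
            signal.signal(sig, lambda *_: stop.set())
    except ValueError:
        pass  # not on the main thread; rely on stop_event instead
    passes = 0
    while not stop.is_set():
        run_sync()
        passes += 1
        stop.wait(interval_seconds)  # returns early if stop is set
    return passes
```

`--interval 30s` would map to `interval_seconds=30`; graceful shutdown falls out of the event wait.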
485 | 
486 | **Phase 4.1.3: User Experience**
487 | - [ ] Add conflict reporting and resolution guidance
488 | - [ ] Implement dry-run preview functionality
489 | - [ ] Create troubleshooting and diagnostic commands
490 | - [ ] Add filtering configuration (.gitignore-style)
491 | 
492 | **Phase 4.1.4: System Integration (Future)**
493 | - [ ] Generate platform-specific service files (launchd/systemd)
494 | - [ ] Add service management commands
495 | - [ ] Implement automatic startup and recovery
496 | - [ ] Create monitoring and logging integration
497 | 
498 | ### Technical Implementation
499 | 
500 | **Core bisync wrapper:**
501 | ```python
502 | def run_bisync(
503 |     tenant_id: str,
504 |     bucket_name: str,
505 |     profile: str = "balanced",
506 |     dry_run: bool = False
507 | ) -> bool:
508 |     """Run rclone bisync with specified profile."""
509 | 
510 |     local_path = Path.home() / f"basic-memory-{tenant_id}"
511 |     remote_path = f"basic-memory-{tenant_id}:{bucket_name}"
512 |     profile_config = BISYNC_PROFILES[profile]
513 | 
514 |     cmd = [
515 |         "rclone", "bisync",
516 |         str(local_path), remote_path,
517 |         "--create-empty-src-dirs",
518 |         "--resilient",
519 |         f"--conflict-resolve={profile_config['conflict_resolve']}",
520 |         f"--max-delete={profile_config['max_delete']}",
521 |         "--filters-file", str(Path.home() / ".basic-memory" / "bisync-filters.txt"),  # ~ is not expanded by subprocess
522 |     ]
523 | 
524 |     if profile_config.get("check_access"):
525 |         cmd.append("--check-access")
526 | 
527 |     if dry_run:
528 |         cmd.append("--dry-run")
529 | 
530 |     return subprocess.run(cmd).returncode == 0  # check=True would raise instead of returning False
531 | ```
532 | 
533 | **Default filter file (~/.basic-memory/bisync-filters.txt):**
534 | ```
535 | - .DS_Store
536 | - .git/**
537 | - __pycache__/**
538 | - *.pyc
539 | - .pytest_cache/**
540 | - node_modules/**
541 | - .conflict-*
542 | - Thumbs.db
543 | - desktop.ini
544 | ```
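
A small helper could seed this file on first run (an illustrative sketch; `ensure_filter_file` is a hypothetical name, not an existing CLI function):

```python
from pathlib import Path
from typing import Optional

# Default exclusions, mirroring bisync-filters.txt above
DEFAULT_FILTERS = """\
- .DS_Store
- .git/**
- __pycache__/**
- *.pyc
- .pytest_cache/**
- node_modules/**
- .conflict-*
- Thumbs.db
- desktop.ini
"""


def ensure_filter_file(path: Optional[Path] = None) -> Path:
    """Create ~/.basic-memory/bisync-filters.txt with defaults if missing."""
    path = path or Path.home() / ".basic-memory" / "bisync-filters.txt"
    path.parent.mkdir(parents=True, exist_ok=True)
    if not path.exists():
        path.write_text(DEFAULT_FILTERS)
    return path
```

`run_bisync()` could call this before building the rclone command so `--filters-file` always points at a real file.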
545 | 
546 | **Advantages Over Custom Daemon:**
547 | - ✅ **Zero maintenance**: No custom sync logic to debug/maintain
548 | - ✅ **Production proven**: Used by thousands in production
549 | - ✅ **Safety features**: Built-in max-delete, conflict handling, recovery
550 | - ✅ **Filtering**: Advanced exclude patterns and rules
551 | - ✅ **Performance**: Optimized for various storage backends
552 | - ✅ **Community support**: Extensive documentation and community
553 | 
554 | ## Phase 4.2: NFS Mount Support (Direct Access)
555 | 
556 | ### Solution: rclone nfsmount
557 | Keep the existing NFS mount functionality for users who prefer direct file access:
558 | 
559 | **Core Architecture:**
560 | ```bash
561 | # rclone nfsmount provides transparent file access
562 | rclone nfsmount basic-memory-{tenant_id}:{bucket_name} ~/basic-memory-{tenant_id} \
563 |   --vfs-cache-mode writes \
564 |   --dir-cache-time 10s \
565 |   --daemon
566 | ```
567 | 
568 | **Key Benefits:**
569 | - ✅ **Real-time access**: Files appear immediately as they're created/modified
570 | - ✅ **Transparent**: Works with any application that reads/writes files
571 | - ✅ **Low latency**: Direct access without sync delays
572 | - ✅ **Simple**: No periodic sync commands needed
573 | - ✅ **Homebrew compatible**: Works with Homebrew rclone (no FUSE required)
574 | 
575 | **Limitations:**
576 | - ❌ **File watcher compatibility**: Some apps (Obsidian) don't detect changes over NFS
577 | - ❌ **Network dependency**: Requires active connection to cloud storage
578 | - ❌ **Potential conflicts**: Simultaneous edits from multiple locations can cause issues
579 | 
580 | ### Mount Profiles (Existing)
581 | 
582 | **Already implemented profiles from SPEC-7 testing:**
583 | ```python
584 | MOUNT_PROFILES = {
585 |     "fast": {
586 |         "cache_time": "5s",
587 |         "poll_interval": "3s",
588 |         "description": "Ultra-fast development (5s sync)"
589 |     },
590 |     "balanced": {
591 |         "cache_time": "10s",
592 |         "poll_interval": "5s",
593 |         "description": "Fast development (10-15s sync, recommended)"
594 |     },
595 |     "safe": {
596 |         "cache_time": "15s",
597 |         "poll_interval": "10s",
598 |         "description": "Conflict-aware mount with backup",
599 |         "extra_args": ["--conflict-suffix", ".conflict-{DateTimeExt}"]
600 |     }
601 | }
602 | ```
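
The profile table above maps to rclone mount flags roughly as follows (an illustrative sketch; `mount_args` is a hypothetical helper, and the exact flag set in `mount_commands.py` may differ):

```python
MOUNT_PROFILES = {
    "fast": {"cache_time": "5s", "poll_interval": "3s"},
    "balanced": {"cache_time": "10s", "poll_interval": "5s"},
    "safe": {
        "cache_time": "15s",
        "poll_interval": "10s",
        "extra_args": ["--conflict-suffix", ".conflict-{DateTimeExt}"],
    },
}


def mount_args(profile_name: str) -> list[str]:
    """Translate a mount profile into rclone nfsmount flags."""
    profile = MOUNT_PROFILES[profile_name]
    args = [
        "--vfs-cache-mode", "writes",
        "--dir-cache-time", profile["cache_time"],
        "--poll-interval", profile["poll_interval"],
        "--daemon",
    ]
    # "safe" adds conflict-suffix handling on top of the base flags
    args += profile.get("extra_args", [])
    return args
```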
603 | 
604 | ### CLI Commands (Existing)
605 | 
606 | **Mount commands already implemented:**
607 | ```bash
608 | basic-memory cloud mount                     # Mount with balanced profile
609 | basic-memory cloud mount --profile fast     # Ultra-fast caching
610 | basic-memory cloud mount --profile safe     # Conflict detection
611 | basic-memory cloud unmount                  # Clean unmount
612 | basic-memory cloud mount-status             # Show mount status
613 | ```
614 | 
615 | ## User Choice: Mount vs Bisync
616 | 
617 | ### When to Use Each Approach
618 | 
619 | | Use Case | Recommended Solution | Why |
620 | |----------|---------------------|-----|
621 | | **Obsidian users** | `bisync` | File watcher support for live preview |
622 | | **CLI/vim/emacs users** | `mount` | Direct file access, lower latency |
623 | | **Offline work** | `bisync` | Can work offline, sync when connected |
624 | | **Real-time collaboration** | `mount` | Immediate visibility of changes |
625 | | **Multiple machines** | `bisync` | Better conflict handling |
626 | | **Single machine** | `mount` | Simpler, more transparent |
627 | | **Development work** | Either | Both work well, user preference |
628 | | **Large files** | `mount` | Streaming access vs full download |
629 | 
630 | ### Installation Simplicity
631 | 
632 | **Both approaches now use simple Homebrew installation:**
633 | ```bash
634 | # Single installation command for both approaches
635 | brew install rclone
636 | 
637 | # No macFUSE, no system modifications needed
638 | # Works immediately with both mount and bisync
639 | ```
640 | 
641 | ### Implementation Status
642 | 
643 | **Phase 4.1: bisync** (NEW)
644 | - [ ] Implement bisync command wrapper
645 | - [ ] Add watch mode with configurable intervals
646 | - [ ] Create conflict resolution workflows
647 | - [ ] Add filtering and safety options
648 | 
649 | **Phase 4.2: mount** (EXISTING - ✅ IMPLEMENTED)
650 | - [x] NFS mount commands with profile support
651 | - [x] Mount management and cleanup
652 | - [x] Process monitoring and health checks
653 | - [x] Credential integration with cloud API
654 | 
655 | **Both approaches share:**
656 | - [x] Credential management via cloud API
657 | - [x] Secure rclone configuration
658 | - [x] Tenant isolation and bucket scoping
659 | - [x] Simple Homebrew rclone installation
660 | 
661 | 
662 | **Key Features:**
663 | 
664 | 1. Cross-Platform rclone Installation (rclone_installer.py):
665 | - macOS: Homebrew → official script fallback
666 | - Linux: snap → apt → official script fallback
667 | - Windows: winget → chocolatey → scoop fallback
668 | - Automatic version detection and verification
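
Version detection can be as simple as parsing `rclone version` output (a sketch under the assumption that `rclone_installer.py` does something similar; `rclone_version` is an illustrative name):

```python
import shutil
import subprocess
from typing import Optional


def rclone_version() -> Optional[str]:
    """Return the installed rclone version (e.g. 'v1.66.0'), or None if absent."""
    if shutil.which("rclone") is None:
        return None
    result = subprocess.run(["rclone", "version"], capture_output=True, text=True)
    first_line = result.stdout.splitlines()[0] if result.stdout else ""
    # The first output line looks like: "rclone v1.66.0"
    version = first_line.removeprefix("rclone ").strip()
    return version or None
```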
669 | 
670 | 2. Smart rclone Configuration (rclone_config.py):
671 | - Automatic tenant-specific config generation
672 | - Three optimized mount profiles from SPEC-7 testing:
673 |   - fast: 5s sync (ultra-performance)
674 |   - balanced: 10-15s sync (recommended default)
675 |   - safe: 15s sync + conflict detection
676 | - Backup existing configs before modification
677 | 
678 | 3. Robust Mount Management (mount_commands.py):
679 | - Automatic tenant credential generation
680 | - Mount path management (~/basic-memory-{tenant-id})
681 | - Process lifecycle management (prevent duplicate mounts)
682 | - Orphaned process cleanup
683 | - Mount verification and health checking
684 | 
685 | 4. Clean Architecture (api_client.py):
686 | - Separated API client to avoid circular imports
687 | - Reuses existing authentication infrastructure
688 | - Consistent error handling and logging
689 | 
690 | **User Experience:**
691 | 
692 | One-Command Setup:
693 | ```bash
694 | basic-memory cloud setup
695 | # 1. Installs rclone automatically
696 | # 2. Authenticates with existing login
697 | # 3. Generates secure credentials
698 | # 4. Configures rclone
699 | # 5. Performs initial mount
700 | ```
701 | 
702 | Profile-Based Mounting:
703 | ```bash
704 | basic-memory cloud mount --profile fast      # 5s sync
705 | basic-memory cloud mount --profile balanced  # 15s sync (default)
706 | basic-memory cloud mount --profile safe      # conflict detection
707 | ```
708 | Status Monitoring:
709 | ```bash
710 | basic-memory cloud mount-status  # Shows: tenant info, mount path, sync profile, rclone processes
711 | ```
712 | ### Local Mount API
713 | 
714 | Endpoint 1: Get tenant info for the user
715 | - Purpose: get tenant details for mounting
716 | - Client passes in a JWT
717 | - Service returns mount info

**✅ IMPLEMENTED API Specification:**

**Endpoint 1: GET /tenant/mount/info**
- Purpose: Get tenant mount information without exposing credentials
- Authentication: JWT token (tenant_id extracted from claims)

Request:
```
GET /tenant/mount/info
Authorization: Bearer {jwt_token}
```

Response:
```json
{
    "tenant_id": "434252dd-d83b-4b20-bf70-8a950ff875c4",
    "bucket_name": "basic-memory-434252dd",
    "has_credentials": true,
    "credentials_created_at": "2025-09-22T16:48:50.414694"
}
```
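A client can deserialize this response into a small typed structure. A minimal sketch, assuming the field set shown above (the `MountInfo` class is illustrative, not the CLI's actual model):

```python
import json
from dataclasses import dataclass
from typing import Optional

@dataclass
class MountInfo:
    tenant_id: str
    bucket_name: str
    has_credentials: bool
    credentials_created_at: Optional[str]

def parse_mount_info(body: str) -> MountInfo:
    """Parse a GET /tenant/mount/info response body."""
    data = json.loads(body)
    return MountInfo(
        tenant_id=data["tenant_id"],
        bucket_name=data["bucket_name"],
        has_credentials=data["has_credentials"],
        credentials_created_at=data.get("credentials_created_at"),
    )
```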

**Endpoint 2: POST /tenant/mount/credentials**
- Purpose: Generate NEW bucket-scoped S3 credentials for rclone mounting
- Authentication: JWT token (tenant_id extracted from claims)
- Multi-session: Creates new credentials without revoking existing ones

Request:
```
POST /tenant/mount/credentials
Authorization: Bearer {jwt_token}
Content-Type: application/json
```
*Note: No request body needed - tenant_id is extracted from the JWT*

Response:
```json
{
    "tenant_id": "434252dd-d83b-4b20-bf70-8a950ff875c4",
    "bucket_name": "basic-memory-434252dd",
    "access_key": "test_access_key_12345",
    "secret_key": "test_secret_key_abcdef",
    "endpoint_url": "https://fly.storage.tigris.dev",
    "region": "auto"
}
```

**🔒 Security Notes:**
- Secret key returned ONCE only - never stored in database
- Credentials are bucket-scoped (cannot access other tenants' buckets)
- Multiple active credentials supported per tenant (work laptop + personal machine)

**Implementation Notes:**

Security:
- Both endpoints require JWT authentication
- Extract tenant_id from JWT claims (not the request body)
- Generate scoped credentials (not admin credentials)
- Credentials should have bucket-specific access only

Integration Points:
- Use the existing StorageClient from the SPEC-8 implementation
- Leverage existing JWT middleware for tenant extraction
- Return the same credential format as the Tigris bucket provisioning

Error Handling:
- 401 if not authenticated
- 403 if the tenant doesn't exist
- 500 if credential generation fails
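That error mapping can be sketched as a small dispatch function; the exception names are hypothetical stand-ins for the service's real error types.

```python
# Sketch of the error handling above; exception classes are
# illustrative, not the service's actual types.
class NotAuthenticated(Exception):
    pass

class TenantNotFound(Exception):
    pass

def status_for(exc: Exception) -> int:
    """Map a credential-endpoint failure to an HTTP status code."""
    if isinstance(exc, NotAuthenticated):
        return 401  # not authenticated
    if isinstance(exc, TenantNotFound):
        return 403  # tenant doesn't exist
    return 500      # credential generation failed
```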

**🔄 Design Decisions:**

1. **Secure Credential Flow (No Secret Storage)**

Based on the CLI flow analysis, we follow security best practices:
- ✅ API generates both access_key + secret_key via Tigris IAM
- ✅ Returns both in the API response for immediate use
- ✅ CLI uses the credentials immediately to configure rclone
- ✅ Database stores only metadata (access_key + policy_arn for cleanup)
- ✅ rclone handles secure local credential storage
- ❌ **Never store the secret_key in the database (even encrypted)**

2. **CLI Credential Flow**
```bash
# CLI calls the API
POST /tenant/mount/credentials → {access_key, secret_key, ...}

# CLI immediately configures rclone
rclone config create basic-memory-{tenant_id} s3 \
  access_key_id={access_key} \
  secret_access_key={secret_key} \
  endpoint=https://fly.storage.tigris.dev

# Database tracks metadata only
INSERT INTO tenant_mount_credentials (tenant_id, access_key, policy_arn, ...)
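The same flow in Python form, a sketch of how the CLI might turn the credentials response into an rclone remote without persisting the secret anywhere else (the function name and argv layout are illustrative):

```python
def rclone_config_argv(creds: dict) -> list:
    """Build the `rclone config create` command for a credentials
    response. The secret key goes straight to rclone, which stores
    it in its own config; the database only tracks metadata."""
    remote = f"basic-memory-{creds['tenant_id']}"
    return [
        "rclone", "config", "create", remote, "s3",
        "access_key_id", creds["access_key"],
        "secret_access_key", creds["secret_key"],
        "endpoint", creds["endpoint_url"],
        "region", creds["region"],
    ]
```

`subprocess.run(rclone_config_argv(creds), check=True)` would then hand the pair to rclone once, after which the CLI discards it.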

3. **Multiple Sessions Supported**

- Users can have multiple active credential sets (work laptop, personal machine, etc.)
- Each credential generation creates a new Tigris access key
- List active credentials via API (shows access_key but never secret)

4. **Failure Handling & Cleanup**

- **Happy Path**: Credentials created → used immediately → rclone configured
- **Orphaned Credentials**: Background job revokes unused credentials
- **API Failure Recovery**: Retry Tigris deletion with the stored policy_arn
- **Status Tracking**: Track tigris_deletion_status (pending/completed/failed)
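The orphan-cleanup selection above might look like the following; the 24-hour TTL and the row shape are assumptions, since the spec does not fix a cutoff.

```python
from datetime import datetime, timedelta

ORPHAN_TTL = timedelta(hours=24)  # assumed cutoff; not specified in the spec

def find_orphaned(rows, now: datetime):
    """Return tenant_mount_credentials rows that were created but
    never used within the TTL and are not already revoked."""
    orphans = []
    for row in rows:
        if row.get("revoked_at") is not None:
            continue  # already revoked, nothing to clean up
        if row.get("last_used_at") is None and now - row["created_at"] > ORPHAN_TTL:
            orphans.append(row)
    return orphans
```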

5. **Event Sourcing & Audit**

- MountCredentialCreatedEvent
- MountCredentialRevokedEvent
- MountCredentialOrphanedEvent (for cleanup)
- Full audit trail for security compliance
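These events could be modeled as simple immutable records; a sketch with an assumed field set (note the secret key is never part of an event payload):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class MountCredentialEvent:
    tenant_id: str
    access_key: str        # never the secret_key
    occurred_at: datetime

@dataclass(frozen=True)
class MountCredentialCreatedEvent(MountCredentialEvent):
    pass

@dataclass(frozen=True)
class MountCredentialRevokedEvent(MountCredentialEvent):
    pass

@dataclass(frozen=True)
class MountCredentialOrphanedEvent(MountCredentialEvent):
    pass
```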

6. **Tenant/Bucket Validation**

- Verify tenant exists and has valid bucket before credential generation
- Use existing StorageClient to validate bucket access
- Prevent credential generation for inactive/invalid tenants

📋 **Implemented API Endpoints:**

```
✅ IMPLEMENTED:
GET    /tenant/mount/info                    # Get tenant/bucket info (no credentials exposed)
POST   /tenant/mount/credentials             # Generate new credentials (returns secret once)
GET    /tenant/mount/credentials             # List active credentials (no secrets)
DELETE /tenant/mount/credentials/{cred_id}   # Revoke specific credentials
```

**API Implementation Status:**
- ✅ **GET /tenant/mount/info**: Returns tenant_id, bucket_name, has_credentials, credentials_created_at
- ✅ **POST /tenant/mount/credentials**: Creates new bucket-scoped access keys, returns access_key + secret_key once
- ✅ **GET /tenant/mount/credentials**: Lists active credentials without exposing secret keys
- ✅ **DELETE /tenant/mount/credentials/{cred_id}**: Revokes specific credentials with proper Tigris IAM cleanup
- ✅ **Multi-session support**: Multiple active credentials per tenant (work laptop + personal machine)
- ✅ **Security**: Secret keys never stored in database, returned once only for immediate use
- ✅ **Comprehensive test suite**: 28 tests covering all scenarios including error handling and multi-session flows
- ✅ **Dependency injection**: Clean integration with existing FastAPI architecture
- ✅ **Production-ready configuration**: Tigris credentials properly configured for tenant machines

🗄️ **Secure Database Schema:**

```sql
CREATE TABLE tenant_mount_credentials (
  id UUID PRIMARY KEY,
  tenant_id UUID REFERENCES tenant(id),
  access_key VARCHAR(255) NOT NULL,
  -- secret_key REMOVED - never store secrets (security best practice)
  policy_arn VARCHAR(255) NOT NULL,           -- For Tigris IAM cleanup
  tigris_deletion_status VARCHAR(20) DEFAULT 'pending',  -- Track cleanup
  created_at TIMESTAMP DEFAULT NOW(),
  updated_at TIMESTAMP DEFAULT NOW(),
  revoked_at TIMESTAMP NULL,
  last_used_at TIMESTAMP NULL,               -- Track usage for orphan cleanup
  description VARCHAR(255) DEFAULT 'CLI mount credentials'
);
```

**Security Benefits:**
- ✅ Database breach cannot expose secrets
- ✅ Follows "secrets don't persist" security principle
- ✅ Meets compliance requirements (SOC2, etc.)
- ✅ Reduced attack surface
- ✅ CLI gets credentials once and stores securely via rclone

```