This is page 2 of 47. Use http://codebase.md/doobidoo/mcp-memory-service?lines=true&page={x} to view the full context.

# Directory Structure

```
├── .claude
│   ├── agents
│   │   ├── amp-bridge.md
│   │   ├── amp-pr-automator.md
│   │   ├── code-quality-guard.md
│   │   ├── gemini-pr-automator.md
│   │   └── github-release-manager.md
│   ├── settings.local.json.backup
│   └── settings.local.json.local
├── .commit-message
├── .dockerignore
├── .env.example
├── .env.sqlite.backup
├── .envnn#
├── .gitattributes
├── .github
│   ├── FUNDING.yml
│   ├── ISSUE_TEMPLATE
│   │   ├── bug_report.yml
│   │   ├── config.yml
│   │   ├── feature_request.yml
│   │   └── performance_issue.yml
│   ├── pull_request_template.md
│   └── workflows
│       ├── bridge-tests.yml
│       ├── CACHE_FIX.md
│       ├── claude-code-review.yml
│       ├── claude.yml
│       ├── cleanup-images.yml.disabled
│       ├── dev-setup-validation.yml
│       ├── docker-publish.yml
│       ├── LATEST_FIXES.md
│       ├── main-optimized.yml.disabled
│       ├── main.yml
│       ├── publish-and-test.yml
│       ├── README_OPTIMIZATION.md
│       ├── release-tag.yml.disabled
│       ├── release.yml
│       ├── roadmap-review-reminder.yml
│       ├── SECRET_CONDITIONAL_FIX.md
│       └── WORKFLOW_FIXES.md
├── .gitignore
├── .mcp.json.backup
├── .mcp.json.template
├── .pyscn
│   ├── .gitignore
│   └── reports
│       └── analyze_20251123_214224.html
├── AGENTS.md
├── archive
│   ├── deployment
│   │   ├── deploy_fastmcp_fixed.sh
│   │   ├── deploy_http_with_mcp.sh
│   │   └── deploy_mcp_v4.sh
│   ├── deployment-configs
│   │   ├── empty_config.yml
│   │   └── smithery.yaml
│   ├── development
│   │   └── test_fastmcp.py
│   ├── docs-removed-2025-08-23
│   │   ├── authentication.md
│   │   ├── claude_integration.md
│   │   ├── claude-code-compatibility.md
│   │   ├── claude-code-integration.md
│   │   ├── claude-code-quickstart.md
│   │   ├── claude-desktop-setup.md
│   │   ├── complete-setup-guide.md
│   │   ├── database-synchronization.md
│   │   ├── development
│   │   │   ├── autonomous-memory-consolidation.md
│   │   │   ├── CLEANUP_PLAN.md
│   │   │   ├── CLEANUP_README.md
│   │   │   ├── CLEANUP_SUMMARY.md
│   │   │   ├── dream-inspired-memory-consolidation.md
│   │   │   ├── hybrid-slm-memory-consolidation.md
│   │   │   ├── mcp-milestone.md
│   │   │   ├── multi-client-architecture.md
│   │   │   ├── test-results.md
│   │   │   └── TIMESTAMP_FIX_SUMMARY.md
│   │   ├── distributed-sync.md
│   │   ├── invocation_guide.md
│   │   ├── macos-intel.md
│   │   ├── master-guide.md
│   │   ├── mcp-client-configuration.md
│   │   ├── multi-client-server.md
│   │   ├── service-installation.md
│   │   ├── sessions
│   │   │   └── MCP_ENHANCEMENT_SESSION_MEMORY_v4.1.0.md
│   │   ├── UBUNTU_SETUP.md
│   │   ├── ubuntu.md
│   │   ├── windows-setup.md
│   │   └── windows.md
│   ├── docs-root-cleanup-2025-08-23
│   │   ├── AWESOME_LIST_SUBMISSION.md
│   │   ├── CLOUDFLARE_IMPLEMENTATION.md
│   │   ├── DOCUMENTATION_ANALYSIS.md
│   │   ├── DOCUMENTATION_CLEANUP_PLAN.md
│   │   ├── DOCUMENTATION_CONSOLIDATION_COMPLETE.md
│   │   ├── LITESTREAM_SETUP_GUIDE.md
│   │   ├── lm_studio_system_prompt.md
│   │   ├── PYTORCH_DOWNLOAD_FIX.md
│   │   └── README-ORIGINAL-BACKUP.md
│   ├── investigations
│   │   └── MACOS_HOOKS_INVESTIGATION.md
│   ├── litestream-configs-v6.3.0
│   │   ├── install_service.sh
│   │   ├── litestream_master_config_fixed.yml
│   │   ├── litestream_master_config.yml
│   │   ├── litestream_replica_config_fixed.yml
│   │   ├── litestream_replica_config.yml
│   │   ├── litestream_replica_simple.yml
│   │   ├── litestream-http.service
│   │   ├── litestream.service
│   │   └── requirements-cloudflare.txt
│   ├── release-notes
│   │   └── release-notes-v7.1.4.md
│   └── setup-development
│       ├── README.md
│       ├── setup_consolidation_mdns.sh
│       ├── STARTUP_SETUP_GUIDE.md
│       └── test_service.sh
├── CHANGELOG-HISTORIC.md
├── CHANGELOG.md
├── claude_commands
│   ├── memory-context.md
│   ├── memory-health.md
│   ├── memory-ingest-dir.md
│   ├── memory-ingest.md
│   ├── memory-recall.md
│   ├── memory-search.md
│   ├── memory-store.md
│   ├── README.md
│   └── session-start.md
├── claude-hooks
│   ├── config.json
│   ├── config.template.json
│   ├── CONFIGURATION.md
│   ├── core
│   │   ├── memory-retrieval.js
│   │   ├── mid-conversation.js
│   │   ├── session-end.js
│   │   ├── session-start.js
│   │   └── topic-change.js
│   ├── debug-pattern-test.js
│   ├── install_claude_hooks_windows.ps1
│   ├── install_hooks.py
│   ├── memory-mode-controller.js
│   ├── MIGRATION.md
│   ├── README-NATURAL-TRIGGERS.md
│   ├── README-phase2.md
│   ├── README.md
│   ├── simple-test.js
│   ├── statusline.sh
│   ├── test-adaptive-weights.js
│   ├── test-dual-protocol-hook.js
│   ├── test-mcp-hook.js
│   ├── test-natural-triggers.js
│   ├── test-recency-scoring.js
│   ├── tests
│   │   ├── integration-test.js
│   │   ├── phase2-integration-test.js
│   │   ├── test-code-execution.js
│   │   ├── test-cross-session.json
│   │   ├── test-session-tracking.json
│   │   └── test-threading.json
│   ├── utilities
│   │   ├── adaptive-pattern-detector.js
│   │   ├── context-formatter.js
│   │   ├── context-shift-detector.js
│   │   ├── conversation-analyzer.js
│   │   ├── dynamic-context-updater.js
│   │   ├── git-analyzer.js
│   │   ├── mcp-client.js
│   │   ├── memory-client.js
│   │   ├── memory-scorer.js
│   │   ├── performance-manager.js
│   │   ├── project-detector.js
│   │   ├── session-tracker.js
│   │   ├── tiered-conversation-monitor.js
│   │   └── version-checker.js
│   └── WINDOWS-SESSIONSTART-BUG.md
├── CLAUDE.md
├── CODE_OF_CONDUCT.md
├── CONTRIBUTING.md
├── Development-Sprint-November-2025.md
├── docs
│   ├── amp-cli-bridge.md
│   ├── api
│   │   ├── code-execution-interface.md
│   │   ├── memory-metadata-api.md
│   │   ├── PHASE1_IMPLEMENTATION_SUMMARY.md
│   │   ├── PHASE2_IMPLEMENTATION_SUMMARY.md
│   │   ├── PHASE2_REPORT.md
│   │   └── tag-standardization.md
│   ├── architecture
│   │   ├── search-enhancement-spec.md
│   │   └── search-examples.md
│   ├── architecture.md
│   ├── archive
│   │   └── obsolete-workflows
│   │       ├── load_memory_context.md
│   │       └── README.md
│   ├── assets
│   │   └── images
│   │       ├── dashboard-v3.3.0-preview.png
│   │       ├── memory-awareness-hooks-example.png
│   │       ├── project-infographic.svg
│   │       └── README.md
│   ├── CLAUDE_CODE_QUICK_REFERENCE.md
│   ├── cloudflare-setup.md
│   ├── deployment
│   │   ├── docker.md
│   │   ├── dual-service.md
│   │   ├── production-guide.md
│   │   └── systemd-service.md
│   ├── development
│   │   ├── ai-agent-instructions.md
│   │   ├── code-quality
│   │   │   ├── phase-2a-completion.md
│   │   │   ├── phase-2a-handle-get-prompt.md
│   │   │   ├── phase-2a-index.md
│   │   │   ├── phase-2a-install-package.md
│   │   │   └── phase-2b-session-summary.md
│   │   ├── code-quality-workflow.md
│   │   ├── dashboard-workflow.md
│   │   ├── issue-management.md
│   │   ├── pr-review-guide.md
│   │   ├── refactoring-notes.md
│   │   ├── release-checklist.md
│   │   └── todo-tracker.md
│   ├── docker-optimized-build.md
│   ├── document-ingestion.md
│   ├── DOCUMENTATION_AUDIT.md
│   ├── enhancement-roadmap-issue-14.md
│   ├── examples
│   │   ├── analysis-scripts.js
│   │   ├── maintenance-session-example.md
│   │   ├── memory-distribution-chart.jsx
│   │   └── tag-schema.json
│   ├── first-time-setup.md
│   ├── glama-deployment.md
│   ├── guides
│   │   ├── advanced-command-examples.md
│   │   ├── chromadb-migration.md
│   │   ├── commands-vs-mcp-server.md
│   │   ├── mcp-enhancements.md
│   │   ├── mdns-service-discovery.md
│   │   ├── memory-consolidation-guide.md
│   │   ├── migration.md
│   │   ├── scripts.md
│   │   └── STORAGE_BACKENDS.md
│   ├── HOOK_IMPROVEMENTS.md
│   ├── hooks
│   │   └── phase2-code-execution-migration.md
│   ├── http-server-management.md
│   ├── ide-compatability.md
│   ├── IMAGE_RETENTION_POLICY.md
│   ├── images
│   │   └── dashboard-placeholder.md
│   ├── implementation
│   │   ├── health_checks.md
│   │   └── performance.md
│   ├── IMPLEMENTATION_PLAN_HTTP_SSE.md
│   ├── integration
│   │   ├── homebrew.md
│   │   └── multi-client.md
│   ├── integrations
│   │   ├── gemini.md
│   │   ├── groq-bridge.md
│   │   ├── groq-integration-summary.md
│   │   └── groq-model-comparison.md
│   ├── integrations.md
│   ├── legacy
│   │   └── dual-protocol-hooks.md
│   ├── LM_STUDIO_COMPATIBILITY.md
│   ├── maintenance
│   │   └── memory-maintenance.md
│   ├── mastery
│   │   ├── api-reference.md
│   │   ├── architecture-overview.md
│   │   ├── configuration-guide.md
│   │   ├── local-setup-and-run.md
│   │   ├── testing-guide.md
│   │   └── troubleshooting.md
│   ├── migration
│   │   └── code-execution-api-quick-start.md
│   ├── natural-memory-triggers
│   │   ├── cli-reference.md
│   │   ├── installation-guide.md
│   │   └── performance-optimization.md
│   ├── oauth-setup.md
│   ├── pr-graphql-integration.md
│   ├── quick-setup-cloudflare-dual-environment.md
│   ├── README.md
│   ├── remote-configuration-wiki-section.md
│   ├── research
│   │   ├── code-execution-interface-implementation.md
│   │   └── code-execution-interface-summary.md
│   ├── ROADMAP.md
│   ├── sqlite-vec-backend.md
│   ├── statistics
│   │   ├── charts
│   │   │   ├── activity_patterns.png
│   │   │   ├── contributors.png
│   │   │   ├── growth_trajectory.png
│   │   │   ├── monthly_activity.png
│   │   │   └── october_sprint.png
│   │   ├── data
│   │   │   ├── activity_by_day.csv
│   │   │   ├── activity_by_hour.csv
│   │   │   ├── contributors.csv
│   │   │   └── monthly_activity.csv
│   │   ├── generate_charts.py
│   │   └── REPOSITORY_STATISTICS.md
│   ├── technical
│   │   ├── development.md
│   │   ├── memory-migration.md
│   │   ├── migration-log.md
│   │   ├── sqlite-vec-embedding-fixes.md
│   │   └── tag-storage.md
│   ├── testing
│   │   └── regression-tests.md
│   ├── testing-cloudflare-backend.md
│   ├── troubleshooting
│   │   ├── cloudflare-api-token-setup.md
│   │   ├── cloudflare-authentication.md
│   │   ├── general.md
│   │   ├── hooks-quick-reference.md
│   │   ├── pr162-schema-caching-issue.md
│   │   ├── session-end-hooks.md
│   │   └── sync-issues.md
│   └── tutorials
│       ├── advanced-techniques.md
│       ├── data-analysis.md
│       └── demo-session-walkthrough.md
├── examples
│   ├── claude_desktop_config_template.json
│   ├── claude_desktop_config_windows.json
│   ├── claude-desktop-http-config.json
│   ├── config
│   │   └── claude_desktop_config.json
│   ├── http-mcp-bridge.js
│   ├── memory_export_template.json
│   ├── README.md
│   ├── setup
│   │   └── setup_multi_client_complete.py
│   └── start_https_example.sh
├── install_service.py
├── install.py
├── LICENSE
├── NOTICE
├── pyproject.toml
├── pytest.ini
├── README.md
├── run_server.py
├── scripts
│   ├── .claude
│   │   └── settings.local.json
│   ├── archive
│   │   └── check_missing_timestamps.py
│   ├── backup
│   │   ├── backup_memories.py
│   │   ├── backup_sqlite_vec.sh
│   │   ├── export_distributable_memories.sh
│   │   └── restore_memories.py
│   ├── benchmarks
│   │   ├── benchmark_code_execution_api.py
│   │   ├── benchmark_hybrid_sync.py
│   │   └── benchmark_server_caching.py
│   ├── database
│   │   ├── analyze_sqlite_vec_db.py
│   │   ├── check_sqlite_vec_status.py
│   │   ├── db_health_check.py
│   │   └── simple_timestamp_check.py
│   ├── development
│   │   ├── debug_server_initialization.py
│   │   ├── find_orphaned_files.py
│   │   ├── fix_mdns.sh
│   │   ├── fix_sitecustomize.py
│   │   ├── remote_ingest.sh
│   │   ├── setup-git-merge-drivers.sh
│   │   ├── uv-lock-merge.sh
│   │   └── verify_hybrid_sync.py
│   ├── hooks
│   │   └── pre-commit
│   ├── installation
│   │   ├── install_linux_service.py
│   │   ├── install_macos_service.py
│   │   ├── install_uv.py
│   │   ├── install_windows_service.py
│   │   ├── install.py
│   │   ├── setup_backup_cron.sh
│   │   ├── setup_claude_mcp.sh
│   │   └── setup_cloudflare_resources.py
│   ├── linux
│   │   ├── service_status.sh
│   │   ├── start_service.sh
│   │   ├── stop_service.sh
│   │   ├── uninstall_service.sh
│   │   └── view_logs.sh
│   ├── maintenance
│   │   ├── assign_memory_types.py
│   │   ├── check_memory_types.py
│   │   ├── cleanup_corrupted_encoding.py
│   │   ├── cleanup_memories.py
│   │   ├── cleanup_organize.py
│   │   ├── consolidate_memory_types.py
│   │   ├── consolidation_mappings.json
│   │   ├── delete_orphaned_vectors_fixed.py
│   │   ├── fast_cleanup_duplicates_with_tracking.sh
│   │   ├── find_all_duplicates.py
│   │   ├── find_cloudflare_duplicates.py
│   │   ├── find_duplicates.py
│   │   ├── memory-types.md
│   │   ├── README.md
│   │   ├── recover_timestamps_from_cloudflare.py
│   │   ├── regenerate_embeddings.py
│   │   ├── repair_malformed_tags.py
│   │   ├── repair_memories.py
│   │   ├── repair_sqlite_vec_embeddings.py
│   │   ├── repair_zero_embeddings.py
│   │   ├── restore_from_json_export.py
│   │   └── scan_todos.sh
│   ├── migration
│   │   ├── cleanup_mcp_timestamps.py
│   │   ├── legacy
│   │   │   └── migrate_chroma_to_sqlite.py
│   │   ├── mcp-migration.py
│   │   ├── migrate_sqlite_vec_embeddings.py
│   │   ├── migrate_storage.py
│   │   ├── migrate_tags.py
│   │   ├── migrate_timestamps.py
│   │   ├── migrate_to_cloudflare.py
│   │   ├── migrate_to_sqlite_vec.py
│   │   ├── migrate_v5_enhanced.py
│   │   ├── TIMESTAMP_CLEANUP_README.md
│   │   └── verify_mcp_timestamps.py
│   ├── pr
│   │   ├── amp_collect_results.sh
│   │   ├── amp_detect_breaking_changes.sh
│   │   ├── amp_generate_tests.sh
│   │   ├── amp_pr_review.sh
│   │   ├── amp_quality_gate.sh
│   │   ├── amp_suggest_fixes.sh
│   │   ├── auto_review.sh
│   │   ├── detect_breaking_changes.sh
│   │   ├── generate_tests.sh
│   │   ├── lib
│   │   │   └── graphql_helpers.sh
│   │   ├── quality_gate.sh
│   │   ├── resolve_threads.sh
│   │   ├── run_pyscn_analysis.sh
│   │   ├── run_quality_checks.sh
│   │   ├── thread_status.sh
│   │   └── watch_reviews.sh
│   ├── quality
│   │   ├── fix_dead_code_install.sh
│   │   ├── phase1_dead_code_analysis.md
│   │   ├── phase2_complexity_analysis.md
│   │   ├── README_PHASE1.md
│   │   ├── README_PHASE2.md
│   │   ├── track_pyscn_metrics.sh
│   │   └── weekly_quality_review.sh
│   ├── README.md
│   ├── run
│   │   ├── run_mcp_memory.sh
│   │   ├── run-with-uv.sh
│   │   └── start_sqlite_vec.sh
│   ├── run_memory_server.py
│   ├── server
│   │   ├── check_http_server.py
│   │   ├── check_server_health.py
│   │   ├── memory_offline.py
│   │   ├── preload_models.py
│   │   ├── run_http_server.py
│   │   ├── run_memory_server.py
│   │   ├── start_http_server.bat
│   │   └── start_http_server.sh
│   ├── service
│   │   ├── deploy_dual_services.sh
│   │   ├── install_http_service.sh
│   │   ├── mcp-memory-http.service
│   │   ├── mcp-memory.service
│   │   ├── memory_service_manager.sh
│   │   ├── service_control.sh
│   │   ├── service_utils.py
│   │   └── update_service.sh
│   ├── sync
│   │   ├── check_drift.py
│   │   ├── claude_sync_commands.py
│   │   ├── export_memories.py
│   │   ├── import_memories.py
│   │   ├── litestream
│   │   │   ├── apply_local_changes.sh
│   │   │   ├── enhanced_memory_store.sh
│   │   │   ├── init_staging_db.sh
│   │   │   ├── io.litestream.replication.plist
│   │   │   ├── manual_sync.sh
│   │   │   ├── memory_sync.sh
│   │   │   ├── pull_remote_changes.sh
│   │   │   ├── push_to_remote.sh
│   │   │   ├── README.md
│   │   │   ├── resolve_conflicts.sh
│   │   │   ├── setup_local_litestream.sh
│   │   │   ├── setup_remote_litestream.sh
│   │   │   ├── staging_db_init.sql
│   │   │   ├── stash_local_changes.sh
│   │   │   ├── sync_from_remote_noconfig.sh
│   │   │   └── sync_from_remote.sh
│   │   ├── README.md
│   │   ├── safe_cloudflare_update.sh
│   │   ├── sync_memory_backends.py
│   │   └── sync_now.py
│   ├── testing
│   │   ├── run_complete_test.py
│   │   ├── run_memory_test.sh
│   │   ├── simple_test.py
│   │   ├── test_cleanup_logic.py
│   │   ├── test_cloudflare_backend.py
│   │   ├── test_docker_functionality.py
│   │   ├── test_installation.py
│   │   ├── test_mdns.py
│   │   ├── test_memory_api.py
│   │   ├── test_memory_simple.py
│   │   ├── test_migration.py
│   │   ├── test_search_api.py
│   │   ├── test_sqlite_vec_embeddings.py
│   │   ├── test_sse_events.py
│   │   ├── test-connection.py
│   │   └── test-hook.js
│   ├── utils
│   │   ├── claude_commands_utils.py
│   │   ├── generate_personalized_claude_md.sh
│   │   ├── groq
│   │   ├── groq_agent_bridge.py
│   │   ├── list-collections.py
│   │   ├── memory_wrapper_uv.py
│   │   ├── query_memories.py
│   │   ├── smithery_wrapper.py
│   │   ├── test_groq_bridge.sh
│   │   └── uv_wrapper.py
│   └── validation
│       ├── check_dev_setup.py
│       ├── check_documentation_links.py
│       ├── diagnose_backend_config.py
│       ├── validate_configuration_complete.py
│       ├── validate_memories.py
│       ├── validate_migration.py
│       ├── validate_timestamp_integrity.py
│       ├── verify_environment.py
│       ├── verify_pytorch_windows.py
│       └── verify_torch.py
├── SECURITY.md
├── selective_timestamp_recovery.py
├── SPONSORS.md
├── src
│   └── mcp_memory_service
│       ├── __init__.py
│       ├── api
│       │   ├── __init__.py
│       │   ├── client.py
│       │   ├── operations.py
│       │   ├── sync_wrapper.py
│       │   └── types.py
│       ├── backup
│       │   ├── __init__.py
│       │   └── scheduler.py
│       ├── cli
│       │   ├── __init__.py
│       │   ├── ingestion.py
│       │   ├── main.py
│       │   └── utils.py
│       ├── config.py
│       ├── consolidation
│       │   ├── __init__.py
│       │   ├── associations.py
│       │   ├── base.py
│       │   ├── clustering.py
│       │   ├── compression.py
│       │   ├── consolidator.py
│       │   ├── decay.py
│       │   ├── forgetting.py
│       │   ├── health.py
│       │   └── scheduler.py
│       ├── dependency_check.py
│       ├── discovery
│       │   ├── __init__.py
│       │   ├── client.py
│       │   └── mdns_service.py
│       ├── embeddings
│       │   ├── __init__.py
│       │   └── onnx_embeddings.py
│       ├── ingestion
│       │   ├── __init__.py
│       │   ├── base.py
│       │   ├── chunker.py
│       │   ├── csv_loader.py
│       │   ├── json_loader.py
│       │   ├── pdf_loader.py
│       │   ├── registry.py
│       │   ├── semtools_loader.py
│       │   └── text_loader.py
│       ├── lm_studio_compat.py
│       ├── mcp_server.py
│       ├── models
│       │   ├── __init__.py
│       │   └── memory.py
│       ├── server.py
│       ├── services
│       │   ├── __init__.py
│       │   └── memory_service.py
│       ├── storage
│       │   ├── __init__.py
│       │   ├── base.py
│       │   ├── cloudflare.py
│       │   ├── factory.py
│       │   ├── http_client.py
│       │   ├── hybrid.py
│       │   └── sqlite_vec.py
│       ├── sync
│       │   ├── __init__.py
│       │   ├── exporter.py
│       │   ├── importer.py
│       │   └── litestream_config.py
│       ├── utils
│       │   ├── __init__.py
│       │   ├── cache_manager.py
│       │   ├── content_splitter.py
│       │   ├── db_utils.py
│       │   ├── debug.py
│       │   ├── document_processing.py
│       │   ├── gpu_detection.py
│       │   ├── hashing.py
│       │   ├── http_server_manager.py
│       │   ├── port_detection.py
│       │   ├── system_detection.py
│       │   └── time_parser.py
│       └── web
│           ├── __init__.py
│           ├── api
│           │   ├── __init__.py
│           │   ├── analytics.py
│           │   ├── backup.py
│           │   ├── consolidation.py
│           │   ├── documents.py
│           │   ├── events.py
│           │   ├── health.py
│           │   ├── manage.py
│           │   ├── mcp.py
│           │   ├── memories.py
│           │   ├── search.py
│           │   └── sync.py
│           ├── app.py
│           ├── dependencies.py
│           ├── oauth
│           │   ├── __init__.py
│           │   ├── authorization.py
│           │   ├── discovery.py
│           │   ├── middleware.py
│           │   ├── models.py
│           │   ├── registration.py
│           │   └── storage.py
│           ├── sse.py
│           └── static
│               ├── app.js
│               ├── index.html
│               ├── README.md
│               ├── sse_test.html
│               └── style.css
├── start_http_debug.bat
├── start_http_server.sh
├── test_document.txt
├── test_version_checker.js
├── tests
│   ├── __init__.py
│   ├── api
│   │   ├── __init__.py
│   │   ├── test_compact_types.py
│   │   └── test_operations.py
│   ├── bridge
│   │   ├── mock_responses.js
│   │   ├── package-lock.json
│   │   ├── package.json
│   │   └── test_http_mcp_bridge.js
│   ├── conftest.py
│   ├── consolidation
│   │   ├── __init__.py
│   │   ├── conftest.py
│   │   ├── test_associations.py
│   │   ├── test_clustering.py
│   │   ├── test_compression.py
│   │   ├── test_consolidator.py
│   │   ├── test_decay.py
│   │   └── test_forgetting.py
│   ├── contracts
│   │   └── api-specification.yml
│   ├── integration
│   │   ├── package-lock.json
│   │   ├── package.json
│   │   ├── test_api_key_fallback.py
│   │   ├── test_api_memories_chronological.py
│   │   ├── test_api_tag_time_search.py
│   │   ├── test_api_with_memory_service.py
│   │   ├── test_bridge_integration.js
│   │   ├── test_cli_interfaces.py
│   │   ├── test_cloudflare_connection.py
│   │   ├── test_concurrent_clients.py
│   │   ├── test_data_serialization_consistency.py
│   │   ├── test_http_server_startup.py
│   │   ├── test_mcp_memory.py
│   │   ├── test_mdns_integration.py
│   │   ├── test_oauth_basic_auth.py
│   │   ├── test_oauth_flow.py
│   │   ├── test_server_handlers.py
│   │   └── test_store_memory.py
│   ├── performance
│   │   ├── test_background_sync.py
│   │   └── test_hybrid_live.py
│   ├── README.md
│   ├── smithery
│   │   └── test_smithery.py
│   ├── sqlite
│   │   └── simple_sqlite_vec_test.py
│   ├── test_client.py
│   ├── test_content_splitting.py
│   ├── test_database.py
│   ├── test_hybrid_cloudflare_limits.py
│   ├── test_hybrid_storage.py
│   ├── test_memory_ops.py
│   ├── test_semantic_search.py
│   ├── test_sqlite_vec_storage.py
│   ├── test_time_parser.py
│   ├── test_timestamp_preservation.py
│   ├── timestamp
│   │   ├── test_hook_vs_manual_storage.py
│   │   ├── test_issue99_final_validation.py
│   │   ├── test_search_retrieval_inconsistency.py
│   │   ├── test_timestamp_issue.py
│   │   └── test_timestamp_simple.py
│   └── unit
│       ├── conftest.py
│       ├── test_cloudflare_storage.py
│       ├── test_csv_loader.py
│       ├── test_fastapi_dependencies.py
│       ├── test_import.py
│       ├── test_json_loader.py
│       ├── test_mdns_simple.py
│       ├── test_mdns.py
│       ├── test_memory_service.py
│       ├── test_memory.py
│       ├── test_semtools_loader.py
│       ├── test_storage_interface_compatibility.py
│       └── test_tag_time_filtering.py
├── tools
│   ├── docker
│   │   ├── DEPRECATED.md
│   │   ├── docker-compose.http.yml
│   │   ├── docker-compose.pythonpath.yml
│   │   ├── docker-compose.standalone.yml
│   │   ├── docker-compose.uv.yml
│   │   ├── docker-compose.yml
│   │   ├── docker-entrypoint-persistent.sh
│   │   ├── docker-entrypoint-unified.sh
│   │   ├── docker-entrypoint.sh
│   │   ├── Dockerfile
│   │   ├── Dockerfile.glama
│   │   ├── Dockerfile.slim
│   │   ├── README.md
│   │   └── test-docker-modes.sh
│   └── README.md
└── uv.lock
```

# Files

--------------------------------------------------------------------------------
/CONTRIBUTING.md:
--------------------------------------------------------------------------------

```markdown
  1 | # Contributing to MCP Memory Service
  2 | 
  3 | Thank you for your interest in contributing to MCP Memory Service! 🎉
  4 | 
  5 | This project provides semantic memory and persistent storage for AI assistants through the Model Context Protocol. We welcome contributions of all kinds - from bug fixes and features to documentation and testing.
  6 | 
  7 | ## Table of Contents
  8 | 
  9 | - [Code of Conduct](#code-of-conduct)
 10 | - [Ways to Contribute](#ways-to-contribute)
 11 | - [Getting Started](#getting-started)
 12 | - [Development Process](#development-process)
 13 | - [Coding Standards](#coding-standards)
 14 | - [Testing Requirements](#testing-requirements)
 15 | - [Documentation](#documentation)
 16 | - [Submitting Changes](#submitting-changes)
 17 | - [Reporting Issues](#reporting-issues)
 18 | - [Community & Support](#community--support)
 19 | - [Recognition](#recognition)
 20 | 
 21 | ## Code of Conduct
 22 | 
 23 | We are committed to providing a welcoming and inclusive environment for all contributors. Please:
 24 | 
 25 | - Be respectful and considerate in all interactions
 26 | - Welcome newcomers and help them get started
 27 | - Focus on constructive criticism and collaborative problem-solving
 28 | - Respect differing viewpoints and experiences
 29 | - Avoid harassment, discrimination, or inappropriate behavior
 30 | 
 31 | ## Ways to Contribute
 32 | 
 33 | ### 🐛 Bug Reports
 34 | Help us identify and fix issues by reporting bugs with detailed information.
 35 | 
 36 | ### ✨ Feature Requests
 37 | Suggest new features or improvements to existing functionality.
 38 | 
 39 | ### 📝 Documentation
 40 | Improve README, Wiki pages, code comments, or API documentation.
 41 | 
 42 | ### 🧪 Testing
 43 | Write tests, improve test coverage, or help with manual testing.
 44 | 
 45 | ### 💻 Code Contributions
 46 | Fix bugs, implement features, or improve performance.
 47 | 
 48 | ### 🌍 Translations
 49 | Help make the project accessible to more users (future goal).
 50 | 
 51 | ### 💬 Community Support
 52 | Answer questions in Issues, Discussions, or help other users.
 53 | 
 54 | ## Getting Started
 55 | 
 56 | ### Prerequisites
 57 | 
 58 | - Python 3.10 or higher
 59 | - Git
 60 | - Platform-specific requirements:
 61 |   - **macOS**: Homebrew Python recommended for SQLite extension support
 62 |   - **Windows**: Visual Studio Build Tools for some dependencies
 63 |   - **Linux**: Build essentials package
 64 | 
 65 | ### Setting Up Your Development Environment
 66 | 
 67 | 1. **Fork the repository** on GitHub
 68 | 
 69 | 2. **Clone your fork**:
 70 |    ```bash
 71 |    git clone https://github.com/YOUR_USERNAME/mcp-memory-service.git
 72 |    cd mcp-memory-service
 73 |    ```
 74 | 
 75 | 3. **Install dependencies**:
 76 |    ```bash
 77 |    python install.py
 78 |    ```
 79 |    This will automatically detect your platform and install appropriate dependencies.
 80 | 
 81 | 4. **Verify installation**:
 82 |    ```bash
 83 |    python scripts/validation/verify_environment.py
 84 |    ```
 85 | 
 86 | 5. **Run the service**:
 87 |    ```bash
 88 |    uv run memory server
 89 |    ```
 90 | 
 91 | 6. **Test with MCP Inspector** (optional):
 92 |    ```bash
 93 |    npx @modelcontextprotocol/inspector uv run memory server
 94 |    ```
 95 | 
 96 | ### Alternative: Docker Setup
 97 | 
 98 | For a containerized environment:
 99 | ```bash
100 | docker-compose -f tools/docker/docker-compose.yml up -d  # For MCP mode
101 | docker-compose -f tools/docker/docker-compose.http.yml up -d  # For HTTP API mode
102 | ```
103 | 
104 | ## Development Process
105 | 
106 | ### 1. Create a Feature Branch
107 | 
108 | ```bash
109 | git checkout -b feature/your-feature-name
110 | # or
111 | git checkout -b fix/issue-description
112 | ```
113 | 
114 | Use descriptive branch names:
115 | - `feature/` for new features
116 | - `fix/` for bug fixes
117 | - `docs/` for documentation
118 | - `test/` for test improvements
119 | - `refactor/` for code refactoring
120 | 
121 | ### 2. Make Your Changes
122 | 
123 | - Write clean, readable code
124 | - Follow the coding standards (see below)
125 | - Add/update tests as needed
126 | - Update documentation if applicable
127 | - Keep commits focused and atomic
128 | 
129 | ### 3. Test Your Changes
130 | 
131 | ```bash
132 | # Run all tests
133 | pytest tests/
134 | 
135 | # Run specific test file
136 | pytest tests/test_memory_ops.py
137 | 
138 | # Run with coverage
139 | pytest --cov=mcp_memory_service tests/
140 | ```
141 | 
142 | ### 4. Commit Your Changes
143 | 
144 | Use semantic commit messages:
145 | ```bash
146 | git commit -m "feat: add memory export functionality"
147 | git commit -m "fix: resolve timezone handling in memory search"
148 | git commit -m "docs: update installation guide for Windows"
149 | git commit -m "test: add coverage for storage backends"
150 | ```
151 | 
152 | Format: `<type>: <description>`
153 | 
154 | Types:
155 | - `feat`: New feature
156 | - `fix`: Bug fix
157 | - `docs`: Documentation changes
158 | - `test`: Test additions or changes
159 | - `refactor`: Code refactoring
160 | - `perf`: Performance improvements
161 | - `chore`: Maintenance tasks
162 | 
163 | ### 5. Push to Your Fork
164 | 
165 | ```bash
166 | git push origin your-branch-name
167 | ```
168 | 
169 | ### 6. Create a Pull Request
170 | 
171 | Open a PR from your fork to the main repository with:
172 | - Clear title describing the change
173 | - Description of what and why
174 | - Reference to any related issues
175 | - Screenshots/examples if applicable
176 | 
177 | ## Coding Standards
178 | 
179 | ### Python Style Guide
180 | 
181 | - Follow PEP 8 with these modifications:
182 |   - Line length: 88 characters (Black formatter default)
183 |   - Use double quotes for strings
184 | - Use type hints for all function signatures
185 | - Write descriptive variable and function names
186 | - Add docstrings to all public functions/classes (Google style)
187 | 
188 | ### Code Organization
189 | 
190 | ```python
191 | # Import order
192 | import standard_library
193 | import third_party_libraries
194 | from mcp_memory_service import local_modules
195 | 
196 | # Type hints
197 | from typing import Optional, List, Dict, Any
198 | 
199 | # Async functions
200 | async def process_memory(content: str) -> Dict[str, Any]:
201 |     """Process and store memory content.
202 | 
203 |     Args:
204 |         content: The memory content to process
205 | 
206 |     Returns:
207 |         Dictionary containing memory metadata
208 |     """
209 |     # Implementation
210 | ```
211 | 
212 | ### Error Handling
213 | 
214 | - Use specific exception types
215 | - Provide helpful error messages
216 | - Log errors appropriately
217 | - Never silently fail
218 | 
219 | ```python
220 | try:
221 |     result = await storage.store(memory)
222 | except StorageError as e:
223 |     logger.error(f"Failed to store memory: {e}")
224 |     raise MemoryServiceError(f"Storage operation failed: {e}") from e
225 | ```
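
The pattern above wraps a backend failure in a service-level error while preserving the original cause. A minimal, self-contained sketch of how such an exception hierarchy might look — the class names follow the example above, but the exact definitions here are illustrative, not the project's actual code:

```python
class MemoryServiceError(Exception):
    """Base class for errors raised by the memory service."""

class StorageError(MemoryServiceError):
    """Raised when a storage backend operation fails."""

def store_with_context(storage, memory):
    """Wrap a low-level failure in a service-level error, keeping the cause.

    Using ``raise ... from e`` preserves the original exception on
    ``__cause__``, so tracebacks show both the backend failure and the
    service-level context.
    """
    try:
        return storage.store(memory)
    except OSError as e:
        raise StorageError(f"Storage operation failed: {e}") from e
```

Because the cause is chained rather than swallowed, callers can still inspect `e.__cause__` to distinguish, say, a full disk from a permissions problem.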
226 | 
227 | ## Testing Requirements
228 | 
229 | ### Writing Tests
230 | 
231 | - Place tests in `tests/` directory
232 | - Name test files with `test_` prefix
233 | - Use descriptive test names
234 | - Include both positive and negative test cases
235 | - Mock external dependencies
236 | 
237 | Example test:
238 | ```python
239 | import pytest
240 | from mcp_memory_service.storage import SqliteVecStorage
241 | 
242 | @pytest.mark.asyncio
243 | async def test_store_memory_success():
244 |     """Test successful memory storage."""
245 |     storage = SqliteVecStorage(":memory:")
246 |     result = await storage.store("test content", tags=["test"])
247 |     assert result is not None
248 |     assert "hash" in result
249 | ```
250 | 
251 | ### Test Coverage
252 | 
253 | - Aim for >80% code coverage
254 | - Focus on critical paths and edge cases
255 | - Test error handling scenarios
256 | - Include integration tests where appropriate
257 | 
258 | ## Documentation
259 | 
260 | ### Code Documentation
261 | 
262 | - Add docstrings to all public APIs
263 | - Include type hints
264 | - Provide usage examples in docstrings
265 | - Keep comments concise and relevant
266 | 
267 | ### Project Documentation
268 | 
269 | When adding features or making significant changes:
270 | 
271 | 1. Update README.md if needed
272 | 2. Add/update Wiki pages for detailed guides
273 | 3. Update CHANGELOG.md following Keep a Changelog format
274 | 4. Update AGENTS.md or CLAUDE.md if development workflow changes
275 | 
276 | **Advanced Workflow Automation**:
277 | - See [Context Provider Workflow Automation](https://github.com/doobidoo/mcp-memory-service/wiki/Context-Provider-Workflow-Automation) for automating development workflows with intelligent patterns
278 | 
279 | ### API Documentation
280 | 
281 | - Document new MCP tools in `docs/api/tools.md`
282 | - Include parameter descriptions and examples
283 | - Note any breaking changes
284 | 
285 | ## Submitting Changes
286 | 
287 | ### Pull Request Guidelines
288 | 
289 | 1. **PR Title**: Use semantic format (e.g., "feat: add batch memory operations")
290 | 
291 | 2. **PR Description Template**:
292 |    ```markdown
293 |    ## Description
294 |    Brief description of changes
295 | 
296 |    ## Motivation
297 |    Why these changes are needed
298 | 
299 |    ## Changes
300 |    - List of specific changes
301 |    - Breaking changes (if any)
302 | 
303 |    ## Testing
304 |    - How you tested the changes
305 |    - Test coverage added
306 | 
307 |    ## Screenshots
308 |    (if applicable)
309 | 
310 |    ## Related Issues
311 |    Fixes #123
312 |    ```
313 | 
314 | 3. **PR Checklist**:
315 |    - [ ] Tests pass locally
316 |    - [ ] Code follows style guidelines
317 |    - [ ] Documentation updated
318 |    - [ ] CHANGELOG.md updated
319 |    - [ ] No sensitive data exposed
320 | 
321 | ### Review Process
322 | 
323 | - PRs require at least one review
324 | - Address review feedback promptly
325 | - Keep discussions focused and constructive
326 | - Be patient - reviews may take a few days
327 | 
328 | ## Reporting Issues
329 | 
330 | ### Bug Reports
331 | 
332 | When reporting bugs, include:
333 | 
334 | 1. **Environment**:
335 |    - OS and version
336 |    - Python version
337 |    - MCP Memory Service version
338 |    - Installation method (pip, Docker, source)
339 | 
340 | 2. **Steps to Reproduce**:
341 |    - Minimal code example
342 |    - Exact commands run
343 |    - Configuration used
344 | 
345 | 3. **Expected vs Actual Behavior**:
346 |    - What you expected to happen
347 |    - What actually happened
348 |    - Error messages/stack traces
349 | 
350 | 4. **Additional Context**:
351 |    - Screenshots if applicable
352 |    - Relevant log output
353 |    - Related issues
354 | 
355 | ### Feature Requests
356 | 
357 | For feature requests, describe:
358 | 
359 | - The problem you're trying to solve
360 | - Your proposed solution
361 | - Alternative approaches considered
362 | - Potential impact on existing functionality
363 | 
364 | ## Community & Support
365 | 
366 | ### Getting Help
367 | 
368 | - **Documentation**: Check the [Wiki](https://github.com/doobidoo/mcp-memory-service/wiki) first
369 | - **Issues**: Search existing [issues](https://github.com/doobidoo/mcp-memory-service/issues) before creating new ones
370 | - **Discussions**: Use [GitHub Discussions](https://github.com/doobidoo/mcp-memory-service/discussions) for questions
371 | - **Response Time**: Maintainers typically respond within 2-3 days
372 | 
373 | ### Communication Channels
374 | 
375 | - **GitHub Issues**: Bug reports and feature requests
376 | - **GitHub Discussions**: General questions and community discussion
377 | - **Pull Requests**: Code contributions and reviews
378 | 
379 | ### For AI Agents
380 | 
381 | If you're an AI coding assistant, also check:
382 | - [AGENTS.md](AGENTS.md) - Generic AI agent instructions
383 | - [CLAUDE.md](CLAUDE.md) - Claude-specific guidelines
384 | - [Context Provider Workflow Automation](https://github.com/doobidoo/mcp-memory-service/wiki/Context-Provider-Workflow-Automation) - Automate development workflows with intelligent patterns
385 | 
386 | ## Recognition
387 | 
388 | We value all contributions! Contributors are:
389 | 
390 | - Listed in release notes for their contributions
391 | - Mentioned in CHANGELOG.md entries
392 | - Credited in commit messages when providing fixes/solutions
393 | - Welcome to add themselves to a CONTRIBUTORS file (future)
394 | 
395 | ### Types of Recognition
396 | 
397 | - 🐛 Bug reporters who provide detailed, reproducible issues
398 | - 💻 Code contributors who submit PRs
399 | - 📝 Documentation improvers
400 | - 🧪 Test writers and reviewers
401 | - 💬 Community helpers who support other users
402 | - 🎨 UI/UX improvers (for dashboard contributions)
403 | 
404 | ---
405 | 
406 | Thank you for contributing to MCP Memory Service! Your efforts help make AI assistants more capable and useful for everyone. 🚀
407 | 
408 | If you have questions not covered here, please open a [Discussion](https://github.com/doobidoo/mcp-memory-service/discussions) or check our [Wiki](https://github.com/doobidoo/mcp-memory-service/wiki).
```

--------------------------------------------------------------------------------
/CLAUDE.md:
--------------------------------------------------------------------------------

```markdown
  1 | # CLAUDE.md
  2 | 
  3 | This file provides guidance to Claude Code (claude.ai/code) when working with this MCP Memory Service repository.
  4 | 
  5 | > **📝 Personal Customizations**: You can create `CLAUDE.local.md` (gitignored) for personal notes, custom workflows, or environment-specific instructions. This file contains shared project conventions.
  6 | 
  7 | > **Note**: Comprehensive project context has been stored in memory with tags `claude-code-reference`. Use memory retrieval to access detailed information during development.
  8 | 
  9 | ## Overview
 10 | 
 11 | MCP Memory Service is a Model Context Protocol server providing semantic memory and persistent storage for Claude Desktop with SQLite-vec, Cloudflare, and Hybrid storage backends.
 12 | 
 13 | > **🆕 v8.42.0**: **Memory Awareness Enhancements** - Added visible memory injection display at session start (top 3 memories with relevance scores), raised session-end quality thresholds to prevent generic boilerplate (200 char min, 0.5 confidence), added optional LLM-powered session summarizer, cleaned 167 generic summaries from database (3352 → 3185 memories). Users now see what memories are being injected into their sessions. See [CHANGELOG.md](CHANGELOG.md) for full version history.
 14 | >
 15 | > **Note**: When releasing new versions, update this line with current version + brief description. Use `.claude/agents/github-release-manager.md` agent for complete release workflow.
 16 | 
 17 | ## Essential Commands
 18 | 
 19 | | Category | Command | Description |
 20 | |----------|---------|-------------|
 21 | | **Setup** | `python scripts/installation/install.py --storage-backend hybrid` | Install with hybrid backend (recommended) |
 22 | | | `uv run memory server` | Start server |
 23 | | | `pytest tests/` | Run tests |
 24 | | **Memory Ops** | `claude /memory-store "content"` | Store information |
 25 | | | `claude /memory-recall "query"` | Retrieve information |
 26 | | | `claude /memory-health` | Check service status |
 27 | | **Validation** | `python scripts/validation/validate_configuration_complete.py` | Comprehensive config validation |
 28 | | | `python scripts/validation/diagnose_backend_config.py` | Cloudflare diagnostics |
 29 | | **Maintenance** | `python scripts/maintenance/consolidate_memory_types.py --dry-run` | Preview type consolidation |
 30 | | | `python scripts/maintenance/find_all_duplicates.py` | Find duplicates |
 31 | | | `python scripts/sync/check_drift.py` | Check hybrid backend drift (v8.25.0+) |
 32 | | **Quality** | `bash scripts/pr/quality_gate.sh 123` | Run PR quality checks |
 33 | | | `bash scripts/pr/quality_gate.sh 123 --with-pyscn` | Comprehensive quality analysis (includes pyscn) |
 34 | | | `bash scripts/quality/track_pyscn_metrics.sh` | Track quality metrics over time |
 35 | | | `bash scripts/quality/weekly_quality_review.sh` | Generate weekly quality review |
 36 | | | `pyscn analyze .` | Run pyscn static analysis |
 37 | | **Consolidation** | `curl -X POST http://127.0.0.1:8000/api/consolidation/trigger -H "Content-Type: application/json" -d '{"time_horizon":"weekly"}'` | Trigger memory consolidation |
 38 | | | `curl http://127.0.0.1:8000/api/consolidation/status` | Check scheduler status |
 39 | | | `curl http://127.0.0.1:8000/api/consolidation/recommendations/weekly` | Get consolidation recommendations |
 40 | | **Backup** | `curl -X POST http://127.0.0.1:8000/api/backup/now` | Trigger manual backup (v8.29.0+) |
 41 | | | `curl http://127.0.0.1:8000/api/backup/status` | Check backup status and schedule |
 42 | | | `curl http://127.0.0.1:8000/api/backup/list` | List available backups |
 43 | | **Sync Controls** | `curl -X POST http://127.0.0.1:8000/api/sync/pause` | Pause hybrid backend sync (v8.29.0+) |
 44 | | | `curl -X POST http://127.0.0.1:8000/api/sync/resume` | Resume hybrid backend sync |
 45 | | **Service** | `systemctl --user status mcp-memory-http.service` | Check HTTP service status (Linux) |
 46 | | | `scripts/service/memory_service_manager.sh status` | Check service status |
 47 | | **Debug** | `curl http://127.0.0.1:8000/api/health` | Health check |
 48 | | | `npx @modelcontextprotocol/inspector uv run memory server` | MCP Inspector |
 49 | 
 50 | See [scripts/README.md](scripts/README.md) for complete command reference.
 51 | 
 52 | ## Architecture
 53 | 
 54 | **Core Components:**
 55 | - **Server Layer**: MCP protocol with async handlers, global caches (`src/mcp_memory_service/server.py:1`)
 56 | - **Storage Backends**: SQLite-Vec (5ms reads), Cloudflare (edge), Hybrid (local + cloud sync)
 57 | - **Web Interface**: FastAPI dashboard at `http://127.0.0.1:8000/` with REST API
 58 | - **Document Ingestion**: PDF, DOCX, PPTX loaders (see [docs/document-ingestion.md](docs/document-ingestion.md))
 59 | - **Memory Hooks**: Natural Memory Triggers v7.1.3+ with 85%+ accuracy (see below)
 60 | 
 61 | **Key Patterns:**
 62 | - Async/await for I/O, type safety (Python 3.10+), platform hardware optimization (CUDA/MPS/DirectML/ROCm)
 63 | 
 64 | ## Document Ingestion
 65 | 
 66 | Supports PDF, DOCX, PPTX, TXT/MD with optional [semtools](https://github.com/run-llama/semtools) for enhanced quality.
 67 | 
 68 | ```bash
 69 | claude /memory-ingest document.pdf --tags documentation
 70 | claude /memory-ingest-dir ./docs --tags knowledge-base
 71 | ```
 72 | 
 73 | See [docs/document-ingestion.md](docs/document-ingestion.md) for full configuration and usage.
 74 | 
 75 | ## Interactive Dashboard
 76 | 
 77 | Web interface at `http://127.0.0.1:8000/` with CRUD operations, semantic/tag/time search, real-time updates (SSE), mobile responsive. Performance: 25ms page load, <100ms search.
 78 | 
 79 | **API Endpoints:** `/api/search`, `/api/search/by-tag`, `/api/search/by-time`, `/api/events`
 80 | 
 81 | ## Memory Consolidation System 🆕
 82 | 
 83 | **Dream-inspired memory consolidation** with automatic scheduling and Code Execution API (v8.23.0+).
 84 | 
 85 | ### Architecture
 86 | 
 87 | **Consolidation Scheduler Location**: HTTP Server (v8.23.0+)
 88 | - Runs 24/7 with HTTP server (independent of MCP server/Claude Desktop)
 89 | - Uses APScheduler for automatic scheduling
 90 | - Accessible via both HTTP API and MCP tools
 91 | - **Benefits**: Persistent, reliable, no dependency on Claude Desktop being open
 92 | 
 93 | **Code Execution API** (token-efficient operations):
 94 | ```python
 95 | from mcp_memory_service.api import consolidate, scheduler_status
 96 | 
 97 | # Trigger consolidation (15 tokens vs 150 MCP tool - 90% reduction)
 98 | result = consolidate('weekly')
 99 | 
100 | # Check scheduler (10 tokens vs 125 - 92% reduction)
101 | status = scheduler_status()
102 | ```
103 | 
104 | ### HTTP API Endpoints
105 | 
106 | | Endpoint | Method | Description | Response Time |
107 | |----------|--------|-------------|---------------|
108 | | `/api/consolidation/trigger` | POST | Trigger consolidation | ~10-30s |
109 | | `/api/consolidation/status` | GET | Scheduler status | <5ms |
110 | | `/api/consolidation/recommendations/{horizon}` | GET | Get recommendations | ~50ms |
111 | 
112 | **Example Usage:**
113 | ```bash
114 | # Trigger weekly consolidation
115 | curl -X POST http://127.0.0.1:8000/api/consolidation/trigger \
116 |   -H "Content-Type: application/json" \
117 |   -d '{"time_horizon": "weekly"}'
118 | 
119 | # Check scheduler status
120 | curl http://127.0.0.1:8000/api/consolidation/status
121 | 
122 | # Get recommendations
123 | curl http://127.0.0.1:8000/api/consolidation/recommendations/weekly
124 | ```
125 | 
126 | ### Configuration
127 | 
128 | ```bash
129 | # Enable consolidation (default: true)
130 | export MCP_CONSOLIDATION_ENABLED=true
131 | 
132 | # Scheduler configuration (in config.py)
133 | CONSOLIDATION_SCHEDULE = {
134 |     'daily': '02:00',              # Daily at 2 AM
135 |     'weekly': 'SUN 03:00',         # Weekly on Sunday at 3 AM
136 |     'monthly': '01 04:00',         # Monthly on 1st at 4 AM
137 |     'quarterly': 'disabled',       # Disabled
138 |     'yearly': 'disabled'           # Disabled
139 | }
140 | ```
141 | 
142 | ### Features
143 | 
144 | - **Exponential decay scoring** - Prioritize recent, frequently accessed memories
145 | - **Creative association discovery** - Find semantic connections (0.3-0.7 similarity)
146 | - **Semantic clustering** - Group related memories (DBSCAN algorithm)
147 | - **Compression** - Summarize redundant information (preserves originals)
148 | - **Controlled forgetting** - Archive low-relevance memories (90+ days inactive)
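
The decay-scoring idea above can be sketched as follows. This is a hypothetical illustration only: `decay_score`, the 30-day half-life, and the log-frequency weighting are assumptions for the sketch, not the service's actual formula.

```python
import math
from datetime import datetime, timedelta

def decay_score(last_accessed: datetime, access_count: int,
                half_life_days: float = 30.0) -> float:
    """Hypothetical relevance score: recency decays exponentially,
    access frequency adds weight with diminishing returns."""
    age_days = (datetime.now() - last_accessed).total_seconds() / 86400
    recency = math.exp(-math.log(2) * age_days / half_life_days)  # halves every half_life_days
    return recency * (1.0 + math.log1p(access_count))

recent = decay_score(datetime.now() - timedelta(days=1), access_count=5)
stale = decay_score(datetime.now() - timedelta(days=120), access_count=5)
assert recent > stale  # recent, frequently accessed memories rank higher
```

The same shape explains controlled forgetting: a memory untouched for 90+ days scores near zero and becomes an archival candidate.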
149 | 
150 | ### Performance Expectations
151 | 
152 | **Real-world metrics** (based on v8.23.1 test with 2,495 memories):
153 | 
154 | | Backend | First Run | Subsequent Runs | Notes |
155 | |---------|-----------|----------------|-------|
156 | | **SQLite-Vec** | 5-25s | 5-25s | Fast, local-only |
157 | | **Cloudflare** | 2-4min | 1-3min | Network-dependent, cloud-only |
158 | | **Hybrid** | 4-6min | 2-4min | Slower but provides multi-device sync |
159 | 
160 | **Why Hybrid takes longer**: Local SQLite operations (~5ms) plus Cloudflare cloud sync (~150ms per update). The trade-off: longer processing time in exchange for data persistence across devices.
161 | 
162 | **Recommendation**: Hybrid backend is recommended for production despite longer consolidation time - multi-device sync capability is worth it.
163 | 
164 | **📖 See [Memory Consolidation Guide](docs/guides/memory-consolidation-guide.md)** for detailed operational guide, monitoring procedures, and troubleshooting. Wiki version will be available at: [Memory Consolidation System Guide](https://github.com/doobidoo/mcp-memory-service/wiki/Memory-Consolidation-System-Guide)
165 | 
166 | ### Migration from MCP-only Mode (v8.22.x → v8.23.0+)
167 | 
168 | **No action required** - Consolidation automatically runs in HTTP server if enabled.
169 | 
170 | **For users without HTTP server:**
171 | ```bash
172 | # Enable HTTP server in .env
173 | export MCP_HTTP_ENABLED=true
174 | 
175 | # Restart service
176 | systemctl --user restart mcp-memory-http.service
177 | ```
178 | 
179 | **MCP tools continue working** (backward compatible via internal API calls).
180 | 
181 | ## Environment Variables
182 | 
183 | **Essential Configuration:**
184 | ```bash
185 | # Storage Backend (Hybrid is RECOMMENDED for production)
186 | export MCP_MEMORY_STORAGE_BACKEND=hybrid  # hybrid|cloudflare|sqlite_vec
187 | 
188 | # Cloudflare Configuration (REQUIRED for hybrid/cloudflare backends)
189 | export CLOUDFLARE_API_TOKEN="your-token"      # Required for Cloudflare backend
190 | export CLOUDFLARE_ACCOUNT_ID="your-account"   # Required for Cloudflare backend
191 | export CLOUDFLARE_D1_DATABASE_ID="your-d1-id" # Required for Cloudflare backend
192 | export CLOUDFLARE_VECTORIZE_INDEX="mcp-memory-index" # Required for Cloudflare backend
193 | 
194 | # Web Interface (Optional)
195 | export MCP_HTTP_ENABLED=true                  # Enable HTTP server
196 | export MCP_HTTPS_ENABLED=true                 # Enable HTTPS (production)
197 | export MCP_API_KEY="$(openssl rand -base64 32)" # Generate secure API key
198 | ```
199 | 
200 | **Configuration Precedence:** Environment variables > .env file > Global Claude Config > defaults
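
A minimal sketch of how first-match-wins precedence works (the `resolve` helper is illustrative, not the service's actual config loader):

```python
import os

def resolve(key: str, dotenv: dict, global_config: dict, default=None):
    """First match wins: process env > .env file > global config > default."""
    for source in (os.environ, dotenv, global_config):
        if key in source:
            return source[key]
    return default

# Process environment beats the .env file for the same key
os.environ["MCP_MEMORY_STORAGE_BACKEND"] = "hybrid"
backend = resolve("MCP_MEMORY_STORAGE_BACKEND",
                  dotenv={"MCP_MEMORY_STORAGE_BACKEND": "sqlite_vec"},
                  global_config={}, default="sqlite_vec")
assert backend == "hybrid"
```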
201 | 
202 | **✅ Automatic Configuration Loading (v6.16.0+):** The service now automatically loads `.env` files and respects environment variable precedence. CLI defaults no longer override environment configuration.
203 | 
204 | **⚠️  Important:** When using hybrid or cloudflare backends, ensure Cloudflare credentials are properly configured. If health checks show "sqlite-vec" when you expect "cloudflare" or "hybrid", this indicates a configuration issue that needs to be resolved.
205 | 
206 | **Platform Support:** macOS (MPS/CPU), Windows (CUDA/DirectML/CPU), Linux (CUDA/ROCm/CPU)
207 | 
208 | ## Claude Code Hooks Configuration 🆕
209 | 
210 | > **🚨 CRITICAL - Windows Users**: SessionStart hooks with `matchers: ["*"]` cause Claude Code to hang indefinitely on Windows. This is a confirmed bug (#160). **Workaround**: Disable SessionStart hooks or use UserPromptSubmit hooks instead. See [Windows SessionStart Hook Issue](#windows-sessionstart-hook-issue) below.
211 | 
212 | ### Natural Memory Triggers v7.1.3 (Latest)
213 | 
214 | **Intelligent automatic memory retrieval** with advanced semantic analysis and multi-tier performance optimization:
215 | 
216 | ```bash
217 | # Installation (Zero-restart required)
218 | cd claude-hooks && python install_hooks.py --natural-triggers
219 | 
220 | # CLI Management
221 | node ~/.claude/hooks/memory-mode-controller.js status
222 | node ~/.claude/hooks/memory-mode-controller.js profile balanced
223 | node ~/.claude/hooks/memory-mode-controller.js sensitivity 0.6
224 | ```
225 | 
226 | **Key Features:**
227 | - ✅ **85%+ trigger accuracy** for memory-seeking pattern detection
228 | - ✅ **Multi-tier processing**: 50ms instant → 150ms fast → 500ms intensive
229 | - ✅ **CLI management system** for real-time configuration without restart
230 | - ✅ **Git-aware context** integration for enhanced memory relevance
231 | - ✅ **Adaptive learning** based on user preferences and usage patterns
232 | 
233 | **Configuration (`~/.claude/hooks/config.json`):**
234 | ```json
235 | {
236 |   "naturalTriggers": {
237 |     "enabled": true,
238 |     "triggerThreshold": 0.6,
239 |     "cooldownPeriod": 30000,
240 |     "maxMemoriesPerTrigger": 5
241 |   },
242 |   "performance": {
243 |     "defaultProfile": "balanced",
244 |     "enableMonitoring": true,
245 |     "autoAdjust": true
246 |   }
247 | }
248 | ```
249 | 
250 | **Performance Profiles:**
251 | - `speed_focused`: <100ms, instant tier only - minimal memory awareness for speed
252 | - `balanced`: <200ms, instant + fast tiers - optimal for general development (recommended)
253 | - `memory_aware`: <500ms, all tiers - maximum context awareness for complex work
254 | - `adaptive`: Dynamic adjustment based on usage patterns and user feedback
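
The tiered budgets above can be sketched as a budget-aware runner. This is a Python illustration of the idea only; the actual hooks are Node.js, and `run_tiers` is a hypothetical name:

```python
import time

def run_tiers(budget_ms: float, tiers):
    """Run analysis tiers in order, skipping slower tiers once the
    latency budget is exhausted (sketch of the profile mechanism)."""
    results, start = [], time.monotonic()
    for name, fn in tiers:
        if (time.monotonic() - start) * 1000 >= budget_ms:
            break  # budget spent: drop remaining (slower) tiers
        results.append((name, fn()))
    return results

tiers = [("instant", lambda: "cache hit"),
         ("fast", lambda: "pattern match"),
         ("intensive", lambda: "semantic analysis")]
assert len(run_tiers(200, tiers)) == 3  # balanced-style budget, cheap tiers all fit
assert run_tiers(0, tiers) == []        # zero budget runs nothing
```

A `speed_focused` profile corresponds to a budget that only ever admits the instant tier; `memory_aware` admits all three.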
255 | 
256 | ### Context-Provider Integration 🆕
257 | 
258 | **Rule-based context management** that complements Natural Memory Triggers with structured, project-specific patterns:
259 | 
260 | ```bash
261 | # Context-Provider Commands
262 | mcp context list                                # List available contexts
263 | mcp context status                             # Check session initialization status
264 | mcp context optimize                           # Get optimization suggestions
265 | ```
266 | 
267 | #### **Available Contexts:**
268 | 
269 | **1. Python MCP Memory Service Context** (`python_mcp_memory`)
270 | - Project-specific patterns for FastAPI, MCP protocol, and storage backends
271 | - Auto-store: MCP protocol changes, backend configs, performance optimizations
272 | - Auto-retrieve: Troubleshooting, setup queries, implementation examples
273 | - Smart tagging: Auto-detects tools (fastapi, cloudflare, sqlite-vec, hybrid, etc.)
274 | 
275 | **2. Release Workflow Context** 🆕 (`mcp_memory_release_workflow`)
276 | - **PR Review Cycle**: Iterative Gemini Code Assist workflow (Fix → Comment → /gemini review → Wait 1min → Repeat)
277 | - **Version Management**: Four-file procedure (__init__.py → pyproject.toml → README.md → uv lock)
278 | - **CHANGELOG Management**: Format guidelines, conflict resolution (combine PR entries)
279 | - **Documentation Matrix**: When to use CHANGELOG vs Wiki vs CLAUDE.md vs code comments
280 | - **Release Procedure**: Merge → Tag → Push → Verify workflows (Docker Publish, Publish and Test, HTTP-MCP Bridge)
281 | - **Issue Management** 🆕: Auto-tracking, post-release workflow, smart closing comments
282 |   - **Auto-Detection**: Tracks "fixes #", "closes #", "resolves #" patterns in PRs
283 |   - **Post-Release Workflow**: Retrieves issues from release, suggests closures with context
284 |   - **Smart Comments**: Auto-generates closing comments with PR links, CHANGELOG entries, wiki references
285 |   - **Triage Intelligence**: Auto-categorizes issues (bug, feature, docs, performance) based on patterns
286 | 
287 | **Auto-Store Patterns:**
288 | - **Technical**: `MCP protocol`, `tool handler`, `storage backend switch`, `25ms page load`, `embedding cache`
289 | - **Configuration**: `cloudflare configuration`, `hybrid backend setup`, `oauth integration`
290 | - **Release Workflow** 🆕: `merged PR`, `gemini review`, `created tag`, `CHANGELOG conflict`, `version bump`
291 | - **Documentation** 🆕: `updated CHANGELOG`, `wiki page created`, `CLAUDE.md updated`
292 | - **Issue Tracking** 🆕: `fixes #`, `closes #`, `resolves #`, `created issue`, `closed issue #`
293 | 
294 | **Auto-Retrieve Patterns:**
295 | - **Troubleshooting**: `cloudflare backend error`, `MCP client connection`, `storage backend failed`
296 | - **Setup**: `backend configuration`, `environment setup`, `claude desktop config`
297 | - **Development**: `MCP handler example`, `API endpoint pattern`, `async error handling`
298 | - **Release Workflow** 🆕: `how to release`, `PR workflow`, `gemini iteration`, `version bump procedure`, `where to document`
299 | - **Issue Management** 🆕: `review open issues`, `what issues fixed`, `can we close`, `issue status`, `which issues resolved`
300 | 
301 | **Documentation Decision Matrix:**
302 | | Change Type | CHANGELOG | CLAUDE.md | Wiki | Code Comments |
303 | |-------------|-----------|-----------|------|---------------|
304 | | Bug fix | ✅ Always | If affects workflow | If complex | ✅ Non-obvious |
305 | | New feature | ✅ Always | If adds commands | ✅ Major features | ✅ API changes |
306 | | Performance | ✅ Always | If measurable | If >20% improvement | Rationale |
307 | | Config change | ✅ Always | ✅ User-facing | If requires migration | Validation logic |
308 | | Troubleshooting | In notes | If common | ✅ Detailed guide | For maintainers |
309 | 
310 | **Integration Benefits:**
311 | - **Structured Memory Management**: Rule-based triggers complement AI-based Natural Memory Triggers
312 | - **Project-Specific Intelligence**: Captures MCP Memory Service-specific terminology and workflows
313 | - **Enhanced Git Workflow**: Automatic semantic commit formatting and branch naming conventions
314 | - **Release Automation** 🆕: Never miss version bumps, CHANGELOG updates, or workflow verification
315 | - **Knowledge Retention** 🆕: Capture what works/doesn't work in PR review cycles
316 | - **Intelligent Issue Management** 🆕: Auto-track issue-PR relationships, suggest closures after releases, generate smart closing comments
317 | - **Post-Release Efficiency** 🆕: Automated checklist retrieves related issues, suggests verification steps, includes all context
318 | - **Zero Performance Impact**: Lightweight rule processing with minimal overhead
319 | 
320 | **Legacy Hook Configuration**: See [docs/legacy/dual-protocol-hooks.md](docs/legacy/dual-protocol-hooks.md) for v7.0.0 dual protocol configuration (superseded by Natural Memory Triggers).
321 | 
322 | ## Storage Backends
323 | 
324 | | Backend | Performance | Use Case | Installation |
325 | |---------|-------------|----------|--------------|
326 | | **Hybrid** ⚡ | **Fast (5ms read)** | **🌟 Production (Recommended)** | `install.py --storage-backend hybrid` |
327 | | **Cloudflare** ☁️ | Network dependent | Cloud-only deployment | `install.py --storage-backend cloudflare` |
328 | | **SQLite-Vec** 🪶 | Fast (5ms read) | Development, single-user local | `install.py --storage-backend sqlite_vec` |
329 | 
330 | ### ⚠️ **Database Lock Prevention (v8.9.0+)**
331 | 
332 | **CRITICAL**: After adding `MCP_MEMORY_SQLITE_PRAGMAS` to `.env`, you **MUST restart all servers**:
333 | - HTTP server: `kill <PID>` then restart with `uv run python scripts/server/run_http_server.py`
334 | - MCP servers: Use `/mcp` in Claude Code to reconnect, or restart Claude Desktop
335 | - Verify: Check logs for `Custom pragma from env: busy_timeout=15000`
336 | 
337 | SQLite pragmas are **per-connection**, not global. Long-running servers (days/weeks old) won't pick up new `.env` settings automatically.
338 | 
339 | **Symptoms of missing pragmas**:
340 | - "database is locked" errors despite v8.9.0+ installation
341 | - `PRAGMA busy_timeout` returns `0` instead of `15000`
342 | - Concurrent HTTP + MCP access fails
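
Why restarts are required: pragmas apply per connection, as this standard-library sketch shows (illustrative only, not the service's connection code):

```python
import sqlite3

def open_with_pragmas(path: str, busy_timeout_ms: int = 15000) -> sqlite3.Connection:
    """Pragmas are per-connection: every new connection must set them again."""
    conn = sqlite3.connect(path)
    conn.execute(f"PRAGMA busy_timeout={busy_timeout_ms}")
    return conn

tuned = open_with_pragmas(":memory:")
assert tuned.execute("PRAGMA busy_timeout").fetchone()[0] == 15000

# A fresh connection does not inherit the pragma, which is why long-running
# servers must be restarted after changing MCP_MEMORY_SQLITE_PRAGMAS
plain = sqlite3.connect(":memory:")
assert plain.execute("PRAGMA busy_timeout").fetchone()[0] != 15000
```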
343 | 
344 | ### 🚀 **Hybrid Backend (v6.21.0+) - RECOMMENDED**
345 | 
346 | The **Hybrid backend** provides the best of both worlds - **SQLite-vec speed with Cloudflare persistence**:
347 | 
348 | ```bash
349 | # Enable hybrid backend
350 | export MCP_MEMORY_STORAGE_BACKEND=hybrid
351 | 
352 | # Hybrid-specific configuration
353 | export MCP_HYBRID_SYNC_INTERVAL=300    # Background sync every 5 minutes
354 | export MCP_HYBRID_BATCH_SIZE=50        # Sync 50 operations at a time
355 | export MCP_HYBRID_SYNC_ON_STARTUP=true # Initial sync on startup
356 | 
357 | # Drift detection configuration (v8.25.0+)
358 | export MCP_HYBRID_SYNC_UPDATES=true              # Enable metadata sync (default: true)
359 | export MCP_HYBRID_DRIFT_CHECK_INTERVAL=3600      # Seconds between drift checks (default: 1 hour)
360 | export MCP_HYBRID_DRIFT_BATCH_SIZE=100           # Memories to check per scan (default: 100)
361 | 
362 | # Requires Cloudflare credentials (same as cloudflare backend)
363 | export CLOUDFLARE_API_TOKEN="your-token"
364 | export CLOUDFLARE_ACCOUNT_ID="your-account"
365 | export CLOUDFLARE_D1_DATABASE_ID="your-d1-id"
366 | export CLOUDFLARE_VECTORIZE_INDEX="mcp-memory-index"
367 | ```
368 | 
369 | **Key Benefits:**
370 | - ✅ **5ms read/write performance** (SQLite-vec speed)
371 | - ✅ **Zero user-facing latency** - Cloud sync happens in background
372 | - ✅ **Multi-device synchronization** - Access memories everywhere
373 | - ✅ **Graceful offline operation** - Works without internet, syncs when available
374 | - ✅ **Automatic failover** - Falls back to SQLite-only if Cloudflare unavailable
375 | - ✅ **Drift detection (v8.25.0+)** - Automatic metadata sync prevents data loss across backends
376 | 
377 | **Architecture:**
378 | - **Primary Storage**: SQLite-vec (all user operations)
379 | - **Secondary Storage**: Cloudflare (background sync)
380 | - **Background Service**: Async queue with retry logic and health monitoring
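
The write path can be sketched as a fast local append plus a background drain. This is a hypothetical toy, not the real sync service, which adds retry logic, batching, and health monitoring:

```python
import queue
import threading

class BackgroundSync:
    """Sketch of the hybrid pattern: writes return after the fast local
    store; a worker thread drains a queue to the slow cloud store."""
    def __init__(self, local: list, cloud: list):
        self.local, self.cloud = local, cloud
        self.pending: "queue.Queue" = queue.Queue()
        threading.Thread(target=self._drain, daemon=True).start()

    def store(self, item):
        self.local.append(item)   # ~5ms path, user-facing
        self.pending.put(item)    # cloud sync happens in the background
        return item

    def _drain(self):
        while True:
            item = self.pending.get()
            try:
                self.cloud.append(item)  # ~150ms path in the real service
            finally:
                self.pending.task_done()

local, cloud = [], []
sync = BackgroundSync(local, cloud)
sync.store("memory-1")
sync.pending.join()  # demo only: wait for the background drain
assert local == ["memory-1"] and cloud == ["memory-1"]
```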
381 | 
382 | **v6.16.0+ Installer Enhancements:**
383 | - **Interactive backend selection** with usage-based recommendations
384 | - **Automatic Cloudflare credential setup** and `.env` file generation
385 | - **Connection testing** during installation to validate configuration
386 | - **Graceful fallbacks** from cloud to local backends if setup fails
387 | 
388 | ## Development Guidelines
389 | 
390 | ### 🔧 **Development Setup (CRITICAL)**
391 | 
392 | **⚠️ ALWAYS use editable install for development** to avoid stale package issues:
393 | 
394 | ```bash
395 | # REQUIRED for development - loads code from source, not site-packages
396 | pip install -e .
397 | 
398 | # Or with uv (preferred)
399 | uv pip install -e .
400 | 
401 | # Verify installation mode (CRITICAL CHECK)
402 | pip show mcp-memory-service | grep Location
403 | # Should show: Location: /path/to/mcp-memory-service/src
404 | # NOT: Location: /path/to/venv/lib/python3.x/site-packages
405 | ```
406 | 
407 | **Why This Matters:**
408 | - MCP servers load from `site-packages`, not source files
409 | - Without `-e`, source changes won't be reflected until reinstall
410 | - System restart won't help - it relaunches with stale package
411 | - **Common symptom**: Code shows v8.23.0 but server reports v8.5.3
412 | 
413 | **Development Workflow:**
414 | 1. Clone repo: `git clone https://github.com/doobidoo/mcp-memory-service.git`
415 | 2. Create venv: `python -m venv venv && source venv/bin/activate`
416 | 3. **Editable install**: `pip install -e .` ← CRITICAL STEP
417 | 4. Verify: `python -c "import mcp_memory_service; print(mcp_memory_service.__version__)"`
418 | 5. Start coding - changes take effect after server restart (no reinstall needed)
419 | 
420 | **Version Mismatch Detection:**
421 | ```bash
422 | # Quick check script - detects stale venv vs source code
423 | python scripts/validation/check_dev_setup.py
424 | 
425 | # Manual verification (both should match):
426 | grep '__version__' src/mcp_memory_service/__init__.py
427 | python -c "import mcp_memory_service; print(mcp_memory_service.__version__)"
428 | ```
429 | 
430 | **Fix Stale Installation:**
431 | ```bash
432 | # If you see version mismatch or non-editable install:
433 | pip uninstall mcp-memory-service
434 | pip install -e .
435 | 
436 | # Restart MCP servers (in Claude Code):
437 | # Run: /mcp
438 | ```
439 | 
440 | ### 🧠 **Memory & Documentation**
441 | - Use `claude /memory-store` to capture decisions during development
442 | - Memory operations handle duplicates via content hashing
443 | - Time parsing supports natural language ("yesterday", "last week")
444 | - Use semantic commit messages for version management
445 | 
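Content-hash deduplication can be sketched like this (hypothetical helpers `content_hash` and `store_memory`, not the service's storage code):

```python
import hashlib
from typing import List, Optional, Tuple

_store: dict = {}

def content_hash(content: str, tags: Optional[List[str]] = None) -> str:
    """Deterministic hash so re-storing identical content is detectable."""
    payload = content + "|" + ",".join(sorted(tags or []))
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def store_memory(content: str, tags: Optional[List[str]] = None) -> Tuple[str, bool]:
    h = content_hash(content, tags)
    if h in _store:
        return h, False  # duplicate content: skip the write
    _store[h] = content
    return h, True

h1, created = store_memory("decision: use hybrid backend", ["architecture"])
h2, dup = store_memory("decision: use hybrid backend", ["architecture"])
assert h1 == h2 and created and not dup
```

Sorting the tags makes the hash order-independent, so the same content with the same tags always deduplicates.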
446 | #### **Memory Type Taxonomy**
447 | Use the 23 core types: `note`, `reference`, `document`, `guide`, `session`, `implementation`, `analysis`, `troubleshooting`, `test`, `fix`, `feature`, `release`, `deployment`, `milestone`, `status`, `configuration`, `infrastructure`, `process`, `security`, `architecture`, `documentation`, `solution`, `achievement`. Avoid creating variations. See [scripts/maintenance/memory-types.md](scripts/maintenance/memory-types.md) for the full taxonomy and consolidation guidelines.
448 | 
449 | ### 🏗️ **Architecture & Testing**
450 | - Storage backends must implement abstract base class
451 | - All features require corresponding tests
452 | - **Comprehensive UI Testing**: Validate performance benchmarks (page load <2s, operations <1s)
453 | - **Security Validation**: Verify XSS protection, input validation, and OAuth integration
454 | - **Mobile Testing**: Confirm responsive design at 768px and 1024px breakpoints
455 | 
456 | ### 🚀 **Version Management**
457 | 
458 | **⚠️ CRITICAL**: **ALWAYS use the github-release-manager agent for ALL releases** (major, minor, patch, and hotfixes). Manual release workflows miss steps and are error-prone.
459 | 
460 | **Four-File Version Bump Procedure:**
461 | 1. Update `src/mcp_memory_service/__init__.py` (line 50: `__version__ = "X.Y.Z"`)
462 | 2. Update `pyproject.toml` (line 7: `version = "X.Y.Z"`)
463 | 3. Update `README.md` (line 19: Latest Release section)
464 | 4. Run `uv lock` to update dependency lock file
465 | 5. Commit all four files together
466 | 
467 | **Release Workflow:**
468 | - **ALWAYS** use `.claude/agents/github-release-manager.md` agent for complete release procedure
469 | - Agent ensures: README.md updates, GitHub Release creation, proper issue tracking
470 | - Manual workflows miss documentation steps (see v8.20.1 lesson learned)
471 | - Document milestones in CHANGELOG.md with performance metrics
472 | - Create descriptive git tags: `git tag -a vX.Y.Z -m "description"`
473 | - See [docs/development/release-checklist.md](docs/development/release-checklist.md) for full checklist
474 | 
475 | **Hotfix Workflow (Critical Bugs):**
476 | - **Speed target**: 8-10 minutes from bug report to release (achievable with AI assistance)
477 | - **Process**: Fix → Test → Four-file bump → Commit → github-release-manager agent
478 | - **Issue management**: Post detailed root cause analysis, don't close until user confirms fix works
479 | - **Example**: v8.20.1 (8 minutes: bug report → fix → release → user notification)
480 | 
481 | ### 🤖 **Agent-First Development**
482 | 
483 | **Principle**: Use agents for workflows, not manual steps. Manual workflows are error-prone and miss documentation updates.
484 | 
485 | **Agent Usage Matrix:**
486 | | Task | Agent | Why |
487 | |------|-------|-----|
488 | | **Any release** (major/minor/patch/hotfix) | github-release-manager | Ensures README.md, CHANGELOG.md, GitHub Release, issue tracking |
489 | | **Batch code fixes** | amp-bridge | Fast parallel execution, syntax validation |
490 | | **PR review automation** | gemini-pr-automator | Saves 10-30 min/PR, auto-resolves threads |
491 | | **Code quality checks** | code-quality-guard | Pre-commit complexity/security scanning |
492 | 
493 | **Manual vs Agent Comparison:**
494 | - ❌ Manual v8.20.1: Forgot README.md, incomplete GitHub Release
495 | - ✅ With agent v8.20.1: All files updated, proper release created
496 | - **Lesson**: Always use agents, even for "simple" hotfixes
497 | 
498 | ### 🔧 **Configuration & Deployment**
499 | - Run `python scripts/validation/validate_configuration_complete.py` when troubleshooting setup issues
500 | - Use sync utilities for hybrid Cloudflare/SQLite deployments
501 | - Test both OAuth enabled/disabled modes for web interface
502 | - Validate search endpoints: semantic (`/api/search`), tag (`/api/search/by-tag`), time (`/api/search/by-time`)
503 | 
504 | ## Code Quality Monitoring
505 | 
506 | ### Multi-Layer Quality Strategy
507 | 
508 | The QA workflow uses three complementary layers for comprehensive code quality assurance:
509 | 
510 | **Layer 1: Pre-commit (Fast - <5s)**
511 | - Groq/Gemini LLM complexity checks
512 | - Security scanning (SQL injection, XSS, command injection)
513 | - Dev environment validation
514 | - **Blocking**: Complexity >8, any security issues
515 | 
516 | **Layer 2: PR Quality Gate (Moderate - 10-60s)**
517 | - Standard checks: complexity, security, test coverage, breaking changes
518 | - Comprehensive checks (`--with-pyscn`): adds duplication, dead code, and architecture analysis
519 | - **Blocking**: Security issues, health score <50
520 | 
521 | **Layer 3: Periodic Review (Weekly)**
522 | - pyscn codebase-wide analysis
523 | - Trend tracking and regression detection
524 | - Refactoring sprint planning
525 | 
526 | ### pyscn Integration
527 | 
528 | [pyscn](https://github.com/ludo-technologies/pyscn) provides comprehensive static analysis:
529 | 
530 | **Capabilities:**
531 | - Cyclomatic complexity scoring
532 | - Dead code detection
533 | - Clone detection (duplication)
534 | - Coupling metrics (CBO)
535 | - Dependency graph analysis
536 | - Architecture violation detection
537 | 
538 | **Usage:**
539 | 
540 | ```bash
541 | # PR creation (automated)
542 | bash scripts/pr/quality_gate.sh 123 --with-pyscn
543 | 
544 | # Local pre-PR check
545 | pyscn analyze .
546 | open .pyscn/reports/analyze_*.html
547 | 
548 | # Track metrics over time
549 | bash scripts/quality/track_pyscn_metrics.sh
550 | 
551 | # Weekly review
552 | bash scripts/quality/weekly_quality_review.sh
553 | ```
554 | 
555 | ### Health Score Thresholds
556 | 
557 | | Score | Status | Action Required |
558 | |-------|--------|----------------|
559 | | **<50** | 🔴 **Release Blocker** | Cannot merge - immediate refactoring required |
560 | | **50-69** | 🟡 **Action Required** | Plan refactoring sprint within 2 weeks |
561 | | **70-84** | ✅ **Good** | Monitor trends, continue development |
562 | | **85+** | 🎯 **Excellent** | Maintain current standards |
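
The tiers above map directly to script logic. A minimal sketch (the `health_status` function is illustrative, not part of the quality scripts):

```bash
# Map a pyscn health score to the action tier from the table above
health_status() {
  local s=$1
  if   [ "$s" -lt 50 ]; then echo "release-blocker"
  elif [ "$s" -lt 70 ]; then echo "action-required"
  elif [ "$s" -lt 85 ]; then echo "good"
  else                       echo "excellent"
  fi
}

health_status 42   # release-blocker
health_status 72   # good
```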
563 | 
564 | ### Quality Standards
565 | 
566 | **Release Blockers** (Health Score <50):
567 | - ❌ Cannot merge to main
568 | - ❌ Cannot create release
569 | - 🔧 Required: Immediate refactoring
570 | 
571 | **Action Required** (Health Score 50-69):
572 | - ⚠️ Plan refactoring sprint within 2 weeks
573 | - 📊 Track on project board
574 | - 🎯 Focus on top 5 complexity offenders
575 | 
576 | **Acceptable** (Health Score ≥70):
577 | - ✅ Continue normal development
578 | - 📈 Monitor trends monthly
579 | - 🎯 Address new issues proactively
580 | 
581 | ### Tool Complementarity
582 | 
583 | | Tool | Speed | Scope | Use Case | Blocking |
584 | |------|-------|-------|----------|----------|
585 | | **Groq/Gemini (pre-commit)** | <5s | Changed files | Every commit | Yes (complexity >8) |
586 | | **quality_gate.sh** | 10-30s | PR files | PR creation | Yes (security) |
587 | | **pyscn (PR)** | 30-60s | Full codebase | PR + periodic | Yes (health <50) |
588 | | **code-quality-guard** | Manual | Targeted | Refactoring | No (advisory) |
589 | 
590 | **Integration Points:**
591 | - Pre-commit: Fast LLM checks (Groq primary, Gemini fallback)
592 | - PR Quality Gate: `--with-pyscn` flag for comprehensive analysis
593 | - Periodic: Weekly pyscn analysis with trend tracking
594 | 
595 | See [`.claude/agents/code-quality-guard.md`](.claude/agents/code-quality-guard.md) for detailed workflows and [docs/development/code-quality-workflow.md](docs/development/code-quality-workflow.md) for complete documentation.
596 | 
597 | ## Configuration Management
598 | 
599 | **Quick Validation:**
600 | ```bash
601 | python scripts/validation/validate_configuration_complete.py  # Comprehensive validation
602 | python scripts/validation/diagnose_backend_config.py          # Cloudflare diagnostics
603 | ```
604 | 
605 | **Configuration Hierarchy:**
606 | - Global: `~/.claude.json` (authoritative)
607 | - Project: `.env` file (Cloudflare credentials)
608 | - **Avoid**: Local `.mcp.json` overrides
609 | 
610 | **Common Issues & Quick Fixes:**
611 | 
612 | | Issue | Quick Fix |
613 | |-------|-----------|
614 | | Wrong backend showing | `python scripts/validation/diagnose_backend_config.py` |
615 | | Port mismatch (hooks timeout) | Verify same port in `~/.claude/hooks/config.json` and server (default: 8000) |
616 | | Schema validation errors after PR merge | Run `/mcp` in Claude Code to reconnect with new schema |
617 | | Accidental `data/memory.db` | Delete safely: `rm -rf data/` (gitignored) |
618 | 
619 | See [docs/troubleshooting/hooks-quick-reference.md](docs/troubleshooting/hooks-quick-reference.md) for comprehensive troubleshooting.
620 | 
621 | ## Hook Troubleshooting
622 | 
623 | **SessionEnd Hooks:**
624 | - Trigger on `/exit`, terminal close (NOT Ctrl+C)
625 | - Require 100+ characters, confidence > 0.1
626 | - Memory creation: topics, decisions, insights, code changes
627 | 
628 | **Windows SessionStart Issue (#160):**
629 | - CRITICAL: SessionStart hooks hang Claude Code on Windows
630 | - Workaround: Use `/session-start` slash command or UserPromptSubmit hooks
631 | 
632 | See [docs/troubleshooting/hooks-quick-reference.md](docs/troubleshooting/hooks-quick-reference.md) for full troubleshooting guide.
633 | 
634 | ## Agent Integrations
635 | 
636 | Workflow automation agents using Gemini CLI, Groq API, and Amp CLI. All agents live in the `.claude/agents/` directory.
637 | 
638 | | Agent | Tool | Purpose | Priority | Usage |
639 | |-------|------|---------|----------|-------|
640 | | **github-release-manager** | GitHub CLI | Complete release workflow | Production | Proactive on feature completion |
641 | | **amp-bridge** | Amp CLI | Research without Claude credits | Production | File-based prompts |
642 | | **code-quality-guard** | Gemini CLI / Groq API | Fast code quality analysis | Active | Pre-commit, pre-PR |
643 | | **gemini-pr-automator** | Gemini CLI | Automated PR review loops | Active | Post-PR creation |
644 | 
645 | **Groq Bridge** (RECOMMENDED): Ultra-fast inference for code-quality-guard agent (~10x faster than Gemini, 200-300ms vs 2-3s). Supports multiple models including Kimi K2 (256K context, excellent for agentic coding). **Pre-commit hooks now use Groq as primary LLM** with Gemini fallback, avoiding OAuth browser authentication interruptions. See `docs/integrations/groq-bridge.md` for setup.
646 | 
647 | ### GitHub Release Manager
648 | 
649 | Proactive release workflow automation with issue tracking, version management, and documentation updates.
650 | 
651 | ```bash
652 | # Proactive usage - the agent is invoked automatically on feature completion
653 | # Manual usage - invoke the agent explicitly when ready to release
654 | @agent github-release-manager "Check if we need a release"
655 | @agent github-release-manager "Create release for v8.20.0"
656 | ```
657 | 
658 | **Capabilities:**
659 | - **Version Management**: Four-file procedure (__init__.py → pyproject.toml → README.md → uv lock)
660 | - **CHANGELOG Management**: Format guidelines, conflict resolution (combine PR entries)
661 | - **Documentation Matrix**: Automatic CHANGELOG, CLAUDE.md, README.md updates
662 | - **Issue Tracking**: Auto-detects "fixes #", suggests closures with smart comments
663 | - **Release Procedure**: Merge → Tag → Push → Verify workflows (Docker Publish, HTTP-MCP Bridge)
664 | 
665 | **Post-Release Workflow:** Retrieves issues from release, suggests closures with PR links and CHANGELOG entries.
666 | 
667 | See [.claude/agents/github-release-manager.md](.claude/agents/github-release-manager.md) for complete workflows.
668 | 
669 | ### Code Quality Guard (Gemini CLI / Groq API)
670 | 
671 | Fast automated analysis for complexity scoring, security scanning, and refactoring suggestions.
672 | 
673 | ```bash
674 | # Complexity check (Gemini CLI - default)
675 | gemini "Complexity 1-10 per function, list high (>7) first: $(cat file.py)"
676 | 
677 | # Complexity check (Groq API - 10x faster, default model)
678 | ./scripts/utils/groq "Complexity 1-10 per function, list high (>7) first: $(cat file.py)"
679 | 
680 | # Complexity check (Kimi K2 - best for complex code analysis)
681 | ./scripts/utils/groq "Complexity 1-10 per function, list high (>7) first: $(cat file.py)" --model moonshotai/kimi-k2-instruct
682 | 
683 | # Security scan
684 | gemini "Security check (SQL injection, XSS, command injection): $(cat file.py)"
685 | 
686 | # TODO prioritization
687 | bash scripts/maintenance/scan_todos.sh
688 | 
689 | # Pre-commit hook (auto-install)
690 | ln -s ../../scripts/hooks/pre-commit .git/hooks/pre-commit
691 | 
692 | # Pre-commit hook setup (RECOMMENDED: Groq for fast, non-interactive checks)
693 | export GROQ_API_KEY="your-groq-api-key"  # Primary (200-300ms, no OAuth)
694 | # Falls back to Gemini CLI if Groq unavailable
695 | # Skips checks gracefully if neither available
696 | ```
697 | 
698 | **Pre-commit Hook LLM Priority:**
699 | 1. **Groq API** (Primary) - Fast (200-300ms), simple API key auth, no browser interruption
700 | 2. **Gemini CLI** (Fallback) - Slower (2-3s), OAuth browser flow may interrupt commits
701 | 3. **Skip checks** - If neither available, commit proceeds without quality gates
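
The fallback order can be sketched as follows. This is an illustrative reconstruction (the `pick_llm` function name is hypothetical; the actual logic lives in `scripts/hooks/pre-commit`):

```bash
# Choose the LLM backend for pre-commit quality checks
pick_llm() {
  if [ -n "$GROQ_API_KEY" ]; then
    echo "groq"     # primary: API-key auth, ~200-300ms, no browser flow
  elif command -v gemini >/dev/null 2>&1; then
    echo "gemini"   # fallback: OAuth browser flow, ~2-3s
  else
    echo "skip"     # neither available: commit proceeds without quality gates
  fi
}

GROQ_API_KEY="example-key" pick_llm   # groq
```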
702 | 
703 | See [.claude/agents/code-quality-guard.md](.claude/agents/code-quality-guard.md) for complete workflows and quality standards.
704 | 
705 | ### Gemini PR Automator
706 | 
707 | Eliminates manual "Wait 1min → /gemini review" cycles with fully automated review iteration.
708 | 
709 | ```bash
710 | # Full automated review (5 iterations, safe fixes enabled)
711 | bash scripts/pr/auto_review.sh <PR_NUMBER>
712 | 
713 | # Quality gate checks before review
714 | bash scripts/pr/quality_gate.sh <PR_NUMBER>
715 | 
716 | # Generate tests for new code
717 | bash scripts/pr/generate_tests.sh <PR_NUMBER>
718 | 
719 | # Breaking change detection
720 | bash scripts/pr/detect_breaking_changes.sh main <BRANCH>
721 | ```
722 | 
723 | **Time Savings:** ~10-30 minutes per PR vs manual iteration. See [.claude/agents/gemini-pr-automator.md](.claude/agents/gemini-pr-automator.md) for workflows.
724 | 
725 | ### Amp CLI Bridge
726 | 
727 | File-based workflow for external research without consuming Claude Code credits.
728 | 
729 | ```bash
730 | # Claude creates prompt → You run command → Amp writes response
731 | amp @.claude/amp/prompts/pending/{uuid}.json
732 | ```
733 | 
734 | **Use cases:** Web research, codebase analysis, documentation generation. See [docs/amp-cli-bridge.md](docs/amp-cli-bridge.md) for architecture.
735 | 
736 | > **For detailed troubleshooting, architecture, and deployment guides:**
737 | > - **Backend Configuration Issues**: See [Wiki Troubleshooting Guide](https://github.com/doobidoo/mcp-memory-service/wiki/07-TROUBLESHOOTING#backend-configuration-issues) for comprehensive solutions to missing memories, environment variable issues, Cloudflare auth, hooks timeouts, and more
738 | > - **Historical Context**: Retrieve memories tagged with `claude-code-reference`
739 | > - **Quick Diagnostic**: Run `python scripts/validation/diagnose_backend_config.py`
740 | 
```

--------------------------------------------------------------------------------
/tests/consolidation/__init__.py:
--------------------------------------------------------------------------------

```python
1 | # Consolidation tests module
```

--------------------------------------------------------------------------------
/archive/deployment-configs/empty_config.yml:
--------------------------------------------------------------------------------

```yaml
1 | # Empty Litestream config
2 | dbs: []
```

--------------------------------------------------------------------------------
/scripts/run/run-with-uv.sh:
--------------------------------------------------------------------------------

```bash
1 | #!/bin/bash
2 | echo "Running MCP Memory Service with UV..."
3 | python uv_wrapper.py "$@"
4 | 
```

--------------------------------------------------------------------------------
/scripts/linux/service_status.sh:
--------------------------------------------------------------------------------

```bash
1 | #!/bin/bash
2 | echo "MCP Memory Service Status:"
3 | echo "=========================="
4 | systemctl --user status mcp-memory
5 | 
```

--------------------------------------------------------------------------------
/scripts/linux/view_logs.sh:
--------------------------------------------------------------------------------

```bash
1 | #!/bin/bash
2 | echo "Viewing MCP Memory Service logs (press Ctrl+C to exit)..."
3 | journalctl -u mcp-memory -f
4 | 
```

--------------------------------------------------------------------------------
/scripts/.claude/settings.local.json:
--------------------------------------------------------------------------------

```json
1 | {
2 |   "permissions": {
3 |     "allow": [
4 |       "mcp__code-context__index_codebase"
5 |     ],
6 |     "deny": [],
7 |     "ask": []
8 |   }
9 | }
```

--------------------------------------------------------------------------------
/docs/statistics/data/activity_by_hour.csv:
--------------------------------------------------------------------------------

```
 1 | hour,commits
 2 | 00,22
 3 | 01,6
 4 | 06,19
 5 | 07,76
 6 | 08,90
 7 | 09,73
 8 | 10,43
 9 | 11,71
10 | 12,73
11 | 13,92
12 | 14,97
13 | 15,41
14 | 16,73
15 | 17,85
16 | 18,65
17 | 19,98
18 | 20,138
19 | 21,160
20 | 22,150
21 | 23,64
22 | 
```

--------------------------------------------------------------------------------
/docs/statistics/data/contributors.csv:
--------------------------------------------------------------------------------

```
1 | contributor,commits,percentage
2 | Heinrich Krupp,1418,94.8%
3 | zod,20,1.3%
4 | Salih Ergüt,16,1.1%
5 | 3dyuval,10,0.7%
6 | muxammadreza,8,0.5%
7 | Henry Mao,6,0.4%
8 | 
```

--------------------------------------------------------------------------------
/tests/__init__.py:
--------------------------------------------------------------------------------

```python
1 | """
2 | Test suite for MCP Memory Service.
3 | This package contains all test modules for verifying the functionality
4 | of the memory service components.
5 | """
```

--------------------------------------------------------------------------------
/docs/statistics/data/activity_by_day.csv:
--------------------------------------------------------------------------------

```
1 | day_of_week,commits,percentage
2 | Sunday,314,20.4%
3 | Saturday,285,18.6%
4 | Monday,271,17.6%
5 | Friday,231,15.0%
6 | Tuesday,177,11.5%
7 | Thursday,131,8.5%
8 | Wednesday,127,8.3%
9 | 
```

--------------------------------------------------------------------------------
/docs/statistics/data/monthly_activity.csv:
--------------------------------------------------------------------------------

```
 1 | month,commits,releases
 2 | 2024-12,55,1
 3 | 2025-01,34,0
 4 | 2025-02,2,0
 5 | 2025-03,66,0
 6 | 2025-04,102,0
 7 | 2025-05,4,0
 8 | 2025-06,36,0
 9 | 2025-07,351,9
10 | 2025-08,330,64
11 | 2025-09,246,34
12 | 2025-10,310,65
13 | 
```

--------------------------------------------------------------------------------
/scripts/linux/stop_service.sh:
--------------------------------------------------------------------------------

```bash
1 | #!/bin/bash
2 | echo "Stopping MCP Memory Service..."
3 | systemctl --user stop mcp-memory
4 | if [ $? -eq 0 ]; then
5 |     echo "✅ Service stopped successfully!"
6 | else
7 |     echo "❌ Failed to stop service"
8 | fi
9 | 
```

--------------------------------------------------------------------------------
/scripts/linux/start_service.sh:
--------------------------------------------------------------------------------

```bash
1 | #!/bin/bash
2 | echo "Starting MCP Memory Service..."
3 | systemctl --user start mcp-memory
4 | if [ $? -eq 0 ]; then
5 |     echo "✅ Service started successfully!"
6 | else
7 |     echo "❌ Failed to start service"
8 | fi
9 | 
```

--------------------------------------------------------------------------------
/pytest.ini:
--------------------------------------------------------------------------------

```
 1 | [pytest]
 2 | testpaths = tests
 3 | python_files = test_*.py
 4 | python_classes = Test*
 5 | python_functions = test_*
 6 | markers =
 7 |     unit: unit tests
 8 |     integration: integration tests
 9 |     performance: performance tests
10 |     asyncio: mark test as async
11 | 
```

--------------------------------------------------------------------------------
/archive/litestream-configs-v6.3.0/requirements-cloudflare.txt:
--------------------------------------------------------------------------------

```
1 | # Additional dependencies for Cloudflare backend support
2 | # These are installed automatically when using the cloudflare backend
3 | 
4 | # HTTP client for Cloudflare API calls
5 | httpx>=0.24.0
6 | 
7 | # Optional: Cloudflare Python SDK (if available)
8 | # cloudflare>=2.15.0
```

--------------------------------------------------------------------------------
/test_document.txt:
--------------------------------------------------------------------------------

```
 1 | This is a test document for MCP Memory Service document ingestion.
 2 | 
 3 | It contains some sample content to test the chunking and embedding functionality.
 4 | 
 5 | Features:
 6 | - Multiple paragraphs
 7 | - Some technical content
 8 | - Test data for verification
 9 | 
10 | End of document.
11 | 
```

--------------------------------------------------------------------------------
/archive/litestream-configs-v6.3.0/litestream_replica_simple.yml:
--------------------------------------------------------------------------------

```yaml
1 | # Simple Litestream replica configuration 
2 | # Note: Litestream replicas typically push TO destinations, not pull FROM them
3 | # For pulling from HTTP, we'll use restore commands instead
4 | dbs:
5 |   - path: /Users/hkr/Library/Application Support/mcp-memory/sqlite_vec.db
```

--------------------------------------------------------------------------------
/src/mcp_memory_service/services/__init__.py:
--------------------------------------------------------------------------------

```python
 1 | """
 2 | Services package for MCP Memory Service.
 3 | 
 4 | This package contains shared business logic services that provide
 5 | consistent behavior across different interfaces (API, MCP tools).
 6 | """
 7 | 
 8 | from .memory_service import MemoryService, MemoryResult
 9 | 
10 | __all__ = ["MemoryService", "MemoryResult"]
11 | 
```

--------------------------------------------------------------------------------
/examples/claude-desktop-http-config.json:
--------------------------------------------------------------------------------

```json
 1 | {
 2 |   "mcpServers": {
 3 |     "memory": {
 4 |       "command": "node",
 5 |       "args": ["/path/to/mcp-memory-service/examples/http-mcp-bridge.js"],
 6 |       "env": {
 7 |         "MCP_MEMORY_HTTP_ENDPOINT": "http://your-server:8000/api",
 8 |         "MCP_MEMORY_API_KEY": "your-secure-api-key"
 9 |       }
10 |     }
11 |   }
12 | }
```

--------------------------------------------------------------------------------
/archive/litestream-configs-v6.3.0/litestream_replica_config.yml:
--------------------------------------------------------------------------------

```yaml
1 | # Litestream Replica Configuration for local macOS machine
2 | # This configuration syncs from the remote master at narrowbox.local
3 | dbs:
4 |   - path: /Users/hkr/Library/Application Support/mcp-memory/sqlite_vec.db
5 |     replicas:
6 |       - type: file
7 |         url: http://10.0.1.30:8080/mcp-memory
8 |         sync-interval: 10s
```

--------------------------------------------------------------------------------
/src/mcp_memory_service/embeddings/__init__.py:
--------------------------------------------------------------------------------

```python
 1 | """Embedding generation modules for MCP Memory Service."""
 2 | 
 3 | from .onnx_embeddings import (
 4 |     ONNXEmbeddingModel,
 5 |     get_onnx_embedding_model,
 6 |     ONNX_AVAILABLE,
 7 |     TOKENIZERS_AVAILABLE
 8 | )
 9 | 
10 | __all__ = [
11 |     'ONNXEmbeddingModel',
12 |     'get_onnx_embedding_model',
13 |     'ONNX_AVAILABLE',
14 |     'TOKENIZERS_AVAILABLE'
15 | ]
```

--------------------------------------------------------------------------------
/examples/config/claude_desktop_config.json:
--------------------------------------------------------------------------------

```json
 1 | {
 2 |   "mcpServers": {
 3 |     "memory": {
 4 |       "command": "python",
 5 |       "args": [
 6 |         "-m",
 7 |         "mcp_memory_service.server"
 8 |       ],
 9 |       "env": {
10 |         "MCP_MEMORY_STORAGE_BACKEND": "sqlite_vec",
11 |         "MCP_MEMORY_BACKUPS_PATH": "C:\\Users\\heinrich.krupp\\AppData\\Local\\mcp-memory"
12 |       }
13 |     }
14 |   }
15 | }
```

--------------------------------------------------------------------------------
/archive/litestream-configs-v6.3.0/litestream_replica_config_fixed.yml:
--------------------------------------------------------------------------------

```yaml
1 | # Litestream Replica Configuration for local macOS machine (FIXED)
2 | # This configuration syncs from the remote master at 10.0.1.30:8080
3 | dbs:
4 |   - path: /Users/hkr/Library/Application Support/mcp-memory/sqlite_vec.db
5 |     replicas:
6 |       - name: "remote-master"
7 |         type: "http"
8 |         url: http://10.0.1.30:8080/mcp-memory
9 |         sync-interval: 10s
```

--------------------------------------------------------------------------------
/tests/conftest.py:
--------------------------------------------------------------------------------

```python
 1 | import pytest
 2 | import os
 3 | import sys
 4 | import tempfile
 5 | import shutil
 6 | 
 7 | # Add src directory to Python path
 8 | sys.path.insert(0, os.path.join(os.path.dirname(os.path.dirname(__file__)), 'src'))
 9 | 
10 | @pytest.fixture
11 | def temp_db_path():
12 |     '''Create a temporary directory for database testing.'''
13 |     temp_dir = tempfile.mkdtemp()
14 |     yield temp_dir
15 |     # Clean up after test
16 |     shutil.rmtree(temp_dir)
17 | 
```

--------------------------------------------------------------------------------
/scripts/testing/run_memory_test.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | set -e
 3 | 
 4 | # Activate virtual environment
 5 | source ./venv/bin/activate
 6 | 
 7 | # Set environment variables
 8 | export MCP_MEMORY_STORAGE_BACKEND="sqlite_vec"
 9 | export MCP_MEMORY_SQLITE_PATH="/Users/hkr/Library/Application Support/mcp-memory/sqlite_vec.db"
10 | export MCP_MEMORY_BACKUPS_PATH="/Users/hkr/Library/Application Support/mcp-memory/backups"
11 | export MCP_MEMORY_USE_ONNX="1"
12 | 
13 | # Run the memory server
14 | python -m mcp_memory_service.server
```

--------------------------------------------------------------------------------
/scripts/service/update_service.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | 
 3 | echo "Updating MCP Memory Service configuration..."
 4 | 
 5 | # Copy the updated service file
 6 | sudo cp mcp-memory.service /etc/systemd/system/
 7 | 
 8 | # Set proper permissions
 9 | sudo chmod 644 /etc/systemd/system/mcp-memory.service
10 | 
11 | # Reload systemd daemon
12 | sudo systemctl daemon-reload
13 | 
14 | echo "✅ Service updated successfully!"
15 | echo ""
16 | echo "Now try starting the service:"
17 | echo "  sudo systemctl start mcp-memory"
18 | echo "  sudo systemctl status mcp-memory"
```

--------------------------------------------------------------------------------
/scripts/pr/run_quality_checks.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | # scripts/pr/run_quality_checks.sh - Run quality checks on a PR
 3 | # Wrapper for quality_gate.sh to maintain consistent naming in workflows
 4 | #
 5 | # Usage: bash scripts/pr/run_quality_checks.sh <PR_NUMBER>
 6 | 
 7 | set -e
 8 | 
 9 | PR_NUMBER=$1
10 | 
11 | if [ -z "$PR_NUMBER" ]; then
12 |     echo "Usage: $0 <PR_NUMBER>"
13 |     exit 1
14 | fi
15 | 
16 | # Get script directory
17 | SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
18 | 
19 | # Run quality gate checks
20 | exec "$SCRIPT_DIR/quality_gate.sh" "$PR_NUMBER"
21 | 
```

--------------------------------------------------------------------------------
/tests/bridge/package.json:
--------------------------------------------------------------------------------

```json
 1 | {
 2 |   "name": "mcp-bridge-tests",
 3 |   "version": "1.0.0",
 4 |   "description": "Unit tests for HTTP-MCP bridge",
 5 |   "main": "test_http_mcp_bridge.js",
 6 |   "scripts": {
 7 |     "test": "mocha test_http_mcp_bridge.js --reporter spec",
 8 |     "test:watch": "mocha test_http_mcp_bridge.js --reporter spec --watch"
 9 |   },
10 |   "dependencies": {
11 |     "mocha": "^10.0.0",
12 |     "sinon": "^17.0.0"
13 |   },
14 |   "devDependencies": {},
15 |   "keywords": ["mcp", "bridge", "testing"],
16 |   "author": "",
17 |   "license": "Apache-2.0"
18 | }
```

--------------------------------------------------------------------------------
/scripts/development/setup-git-merge-drivers.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | # Setup script for git merge drivers
 3 | # Run this once after cloning the repository
 4 | 
 5 | echo "Setting up git merge drivers for uv.lock..."
 6 | 
 7 | # Configure the uv.lock merge driver
 8 | git config merge.uv-lock-merge.driver './scripts/uv-lock-merge.sh %O %A %B %L %P'
 9 | git config merge.uv-lock-merge.name 'UV lock file merge driver'
10 | 
11 | # Make the merge script executable
12 | chmod +x scripts/uv-lock-merge.sh
13 | 
14 | echo "✓ Git merge drivers configured successfully!"
15 | echo "  uv.lock conflicts will now be resolved automatically"
```

--------------------------------------------------------------------------------
/archive/litestream-configs-v6.3.0/litestream_master_config.yml:
--------------------------------------------------------------------------------

```yaml
 1 | # Litestream Master Configuration for narrowbox.local
 2 | # This configuration sets up the remote server as the master database
 3 | dbs:
 4 |   - path: /home/user/.local/share/mcp-memory/sqlite_vec.db
 5 |     replicas:
 6 |       # Local file replica for serving via HTTP
 7 |       - type: file
 8 |         path: /var/www/litestream/mcp-memory
 9 |         sync-interval: 10s
10 |       
11 |       # Local backup
12 |       - type: file
13 |         path: /backup/litestream/mcp-memory
14 |         sync-interval: 1m
15 |     
16 |     # Performance settings
17 |     checkpoint-interval: 30s
18 |     wal-retention: 10m
```

--------------------------------------------------------------------------------
/tests/integration/package.json:
--------------------------------------------------------------------------------

```json
 1 | {
 2 |   "name": "mcp-integration-tests",
 3 |   "version": "1.0.0", 
 4 |   "description": "Integration tests for HTTP-MCP bridge",
 5 |   "main": "test_bridge_integration.js",
 6 |   "scripts": {
 7 |     "test": "mocha test_bridge_integration.js --reporter spec --timeout 10000",
 8 |     "test:watch": "mocha test_bridge_integration.js --reporter spec --timeout 10000 --watch"
 9 |   },
10 |   "dependencies": {
11 |     "mocha": "^10.0.0",
12 |     "sinon": "^17.0.0"
13 |   },
14 |   "devDependencies": {},
15 |   "keywords": ["mcp", "bridge", "integration", "testing"],
16 |   "author": "",
17 |   "license": "Apache-2.0"
18 | }
```

--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE/config.yml:
--------------------------------------------------------------------------------

```yaml
 1 | blank_issues_enabled: false
 2 | contact_links:
 3 |   - name: 📚 Documentation & Wiki
 4 |     url: https://github.com/doobidoo/mcp-memory-service/wiki
 5 |     about: Check the Wiki for setup guides, troubleshooting, and advanced usage
 6 |   - name: 💬 GitHub Discussions
 7 |     url: https://github.com/doobidoo/mcp-memory-service/discussions
 8 |     about: Ask questions, share ideas, or discuss general topics with the community
 9 |   - name: 🔍 Search Existing Issues
10 |     url: https://github.com/doobidoo/mcp-memory-service/issues
11 |     about: Check if your issue has already been reported or solved
12 | 
```

--------------------------------------------------------------------------------
/scripts/linux/uninstall_service.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | echo "This will uninstall MCP Memory Service."
 3 | read -p "Are you sure? (y/N): " confirm
 4 | if [[ ! "$confirm" =~ ^[Yy]$ ]]; then
 5 |     exit 0
 6 | fi
 7 | 
 8 | echo "Stopping service..."
 9 | systemctl --user stop mcp-memory 2>/dev/null
10 | systemctl --user disable mcp-memory 2>/dev/null
11 | 
12 | echo "Removing service files..."
13 | if [ -f "$HOME/.config/systemd/user/mcp-memory.service" ]; then
14 |     rm -f "$HOME/.config/systemd/user/mcp-memory.service"
15 |     systemctl --user daemon-reload
16 | else
17 |     sudo rm -f /etc/systemd/system/mcp-memory.service
18 |     sudo systemctl daemon-reload
19 | fi
20 | 
21 | echo "✅ Service uninstalled"
22 | 
```

--------------------------------------------------------------------------------
/archive/litestream-configs-v6.3.0/litestream_master_config_fixed.yml:
--------------------------------------------------------------------------------

```yaml
 1 | # Litestream Master Configuration for narrowbox.local (FIXED)
 2 | # This configuration sets up the remote server as the master database
 3 | dbs:
 4 |   - path: /home/hkr/.local/share/mcp-memory/sqlite_vec.db
 5 |     replicas:
 6 |       # HTTP replica for serving to clients
 7 |       - name: "http-replica"
 8 |         type: file
 9 |         path: /var/www/litestream/mcp-memory
10 |         sync-interval: 10s
11 |       
12 |       # Local backup
13 |       - name: "backup-replica"
14 |         type: file
15 |         path: /backup/litestream/mcp-memory
16 |         sync-interval: 1m
17 |     
18 |     # Performance settings
19 |     checkpoint-interval: 30s
20 |     wal-retention: 10m
```

--------------------------------------------------------------------------------
/start_http_server.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | 
 3 | export MCP_MEMORY_STORAGE_BACKEND=hybrid
 4 | export MCP_MEMORY_SQLITE_PATH="/Users/hkr/Library/Application Support/mcp-memory/sqlite_vec.db"
 5 | export MCP_HTTP_ENABLED=true
 6 | export MCP_OAUTH_ENABLED=false
 7 | export CLOUDFLARE_API_TOKEN="<your-cloudflare-api-token>"        # redacted: never commit real credentials
 8 | export CLOUDFLARE_ACCOUNT_ID="<your-cloudflare-account-id>"
 9 | export CLOUDFLARE_D1_DATABASE_ID="<your-d1-database-id>"
10 | export CLOUDFLARE_VECTORIZE_INDEX="mcp-memory-index"
11 | 
12 | cd /Users/hkr/Documents/GitHub/mcp-memory-service
13 | python -m uvicorn mcp_memory_service.web.app:app --host 127.0.0.1 --port 8889 --reload
14 | 
```

--------------------------------------------------------------------------------
/tests/api/__init__.py:
--------------------------------------------------------------------------------

```python
 1 | # Copyright 2024 Heinrich Krupp
 2 | #
 3 | # Licensed under the Apache License, Version 2.0 (the "License");
 4 | # you may not use this file except in compliance with the License.
 5 | # You may obtain a copy of the License at
 6 | #
 7 | #     http://www.apache.org/licenses/LICENSE-2.0
 8 | #
 9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | 
15 | """Tests for code execution API."""
16 | 
```

--------------------------------------------------------------------------------
/src/mcp_memory_service/web/api/__init__.py:
--------------------------------------------------------------------------------

```python
 1 | # Copyright 2024 Heinrich Krupp
 2 | #
 3 | # Licensed under the Apache License, Version 2.0 (the "License");
 4 | # you may not use this file except in compliance with the License.
 5 | # You may obtain a copy of the License at
 6 | #
 7 | #     http://www.apache.org/licenses/LICENSE-2.0
 8 | #
 9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | 
15 | """
16 | API routes for the HTTP interface.
17 | """
```

--------------------------------------------------------------------------------
/tools/docker/docker-compose.pythonpath.yml:
--------------------------------------------------------------------------------

```yaml
 1 | services:
 2 |   memory-service:
 3 |     image: mcp-memory-service:local  # tag for the built image; reusing a base-image name here would let `build:` overwrite python:3.10-slim locally
 4 |     working_dir: /app
 5 |     stdin_open: true
 6 |     tty: true
 7 |     ports:
 8 |       - "8000:8000"
 9 |     volumes:
10 |       - .:/app
11 |       - ${CHROMA_DB_PATH:-$HOME/mcp-memory/chroma_db}:/app/chroma_db
12 |       - ${BACKUPS_PATH:-$HOME/mcp-memory/backups}:/app/backups
13 |     environment:
14 |       - MCP_MEMORY_CHROMA_PATH=/app/chroma_db
15 |       - MCP_MEMORY_BACKUPS_PATH=/app/backups
16 |       - LOG_LEVEL=INFO
17 |       - MAX_RESULTS_PER_QUERY=10
18 |       - SIMILARITY_THRESHOLD=0.7
19 |       - PYTHONPATH=/app/src:/app
20 |       - PYTHONUNBUFFERED=1
21 |     restart: unless-stopped
22 |     build:
23 |       context: .
24 |       dockerfile: Dockerfile
25 | 
```

--------------------------------------------------------------------------------
/src/mcp_memory_service/models/__init__.py:
--------------------------------------------------------------------------------

```python
 1 | # Copyright 2024 Heinrich Krupp
 2 | #
 3 | # Licensed under the Apache License, Version 2.0 (the "License");
 4 | # you may not use this file except in compliance with the License.
 5 | # You may obtain a copy of the License at
 6 | #
 7 | #     http://www.apache.org/licenses/LICENSE-2.0
 8 | #
 9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | 
15 | from .memory import Memory, MemoryQueryResult
16 | 
17 | __all__ = ['Memory', 'MemoryQueryResult']
```

--------------------------------------------------------------------------------
/tools/docker/docker-compose.uv.yml:
--------------------------------------------------------------------------------

```yaml
 1 | services:
 2 |   memory-service:
 3 |     image: mcp-memory-service:local  # tag for the built image; reusing a base-image name here would let `build:` overwrite python:3.10-slim locally
 4 |     working_dir: /app
 5 |     stdin_open: true
 6 |     tty: true
 7 |     ports:
 8 |       - "8000:8000"
 9 |     volumes:
10 |       - .:/app
11 |       - ${CHROMA_DB_PATH:-$HOME/mcp-memory/chroma_db}:/app/chroma_db
12 |       - ${BACKUPS_PATH:-$HOME/mcp-memory/backups}:/app/backups
13 |     environment:
14 |       - MCP_MEMORY_CHROMA_PATH=/app/chroma_db
15 |       - MCP_MEMORY_BACKUPS_PATH=/app/backups
16 |       - LOG_LEVEL=INFO
17 |       - MAX_RESULTS_PER_QUERY=10
18 |       - SIMILARITY_THRESHOLD=0.7
19 |       - PYTHONPATH=/app
20 |       - PYTHONUNBUFFERED=1
21 |       - UV_ACTIVE=1
22 |       - CHROMA_TELEMETRY_IMPL=none
23 |       - ANONYMIZED_TELEMETRY=false
24 |     restart: unless-stopped
25 |     build:
26 |       context: .
27 |       dockerfile: Dockerfile
28 | 
```

--------------------------------------------------------------------------------
/scripts/sync/litestream/init_staging_db.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | # Initialize staging database for offline memory changes
 3 | 
 4 | STAGING_DB="/Users/hkr/Library/Application Support/mcp-memory/sqlite_vec_staging.db"
 5 | INIT_SQL="$(dirname "$0")/deployment/staging_db_init.sql"
 6 | 
 7 | echo "$(date): Initializing staging database..."
 8 | 
 9 | # Create directory if it doesn't exist
10 | mkdir -p "$(dirname "$STAGING_DB")"
11 | 
12 | # Initialize database with schema
13 | sqlite3 "$STAGING_DB" < "$INIT_SQL"
14 | 
15 | if [ $? -eq 0 ]; then
16 |     echo "$(date): Staging database initialized at: $STAGING_DB"
17 |     echo "$(date): Database size: $(du -h "$STAGING_DB" | cut -f1)"
18 | else
19 |     echo "$(date): ERROR: Failed to initialize staging database"
20 |     exit 1
21 | fi
22 | 
23 | # Set permissions
24 | chmod 644 "$STAGING_DB"
25 | 
26 | echo "$(date): Staging database ready for use"
```
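
The script above pipes a schema file into `sqlite3` and checks the exit status. The same init-with-verification step can be sketched with Python's standard library; the `staged_memories` schema below is hypothetical, standing in for `deployment/staging_db_init.sql`:

```python
import os
import sqlite3
import tempfile

# Hypothetical schema standing in for deployment/staging_db_init.sql
SCHEMA = """
CREATE TABLE IF NOT EXISTS staged_memories (
    id INTEGER PRIMARY KEY,
    content TEXT NOT NULL,
    content_hash TEXT UNIQUE,
    created_at TEXT DEFAULT (datetime('now'))
);
"""

def init_staging_db(path: str) -> bool:
    """Create the staging database, returning True on success."""
    os.makedirs(os.path.dirname(path) or ".", exist_ok=True)
    try:
        with sqlite3.connect(path) as conn:
            conn.executescript(SCHEMA)  # idempotent thanks to IF NOT EXISTS
        return True
    except sqlite3.Error:
        return False

db_path = os.path.join(tempfile.mkdtemp(), "sqlite_vec_staging.db")
print(init_staging_db(db_path))  # → True
```

`executescript` mirrors `sqlite3 "$DB" < init.sql`, and the boolean return plays the role of the shell's `$?` check.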

--------------------------------------------------------------------------------
/src/mcp_memory_service/utils/__init__.py:
--------------------------------------------------------------------------------

```python
 1 | # Copyright 2024 Heinrich Krupp
 2 | #
 3 | # Licensed under the Apache License, Version 2.0 (the "License");
 4 | # you may not use this file except in compliance with the License.
 5 | # You may obtain a copy of the License at
 6 | #
 7 | #     http://www.apache.org/licenses/LICENSE-2.0
 8 | #
 9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | 
15 | from .hashing import generate_content_hash
16 | from .document_processing import create_memory_from_chunk, _process_and_store_chunk
17 | 
18 | __all__ = ['generate_content_hash', 'create_memory_from_chunk', '_process_and_store_chunk']
```

--------------------------------------------------------------------------------
/src/mcp_memory_service/backup/__init__.py:
--------------------------------------------------------------------------------

```python
 1 | # Copyright 2024 Heinrich Krupp
 2 | #
 3 | # Licensed under the Apache License, Version 2.0 (the "License");
 4 | # you may not use this file except in compliance with the License.
 5 | # You may obtain a copy of the License at
 6 | #
 7 | #     http://www.apache.org/licenses/LICENSE-2.0
 8 | #
 9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | 
15 | """
16 | Automatic backup module for MCP Memory Service.
17 | 
18 | Provides scheduled backups and backup management functionality.
19 | """
20 | 
21 | from .scheduler import BackupScheduler, BackupService
22 | 
23 | __all__ = ['BackupScheduler', 'BackupService']
24 | 
```

--------------------------------------------------------------------------------
/src/mcp_memory_service/cli/__init__.py:
--------------------------------------------------------------------------------

```python
 1 | # Copyright 2024 Heinrich Krupp
 2 | #
 3 | # Licensed under the Apache License, Version 2.0 (the "License");
 4 | # you may not use this file except in compliance with the License.
 5 | # You may obtain a copy of the License at
 6 | #
 7 | #     http://www.apache.org/licenses/LICENSE-2.0
 8 | #
 9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | 
15 | """
16 | Command Line Interface for MCP Memory Service
17 | 
18 | Provides CLI commands for document ingestion, memory management, and database operations.
19 | """
20 | 
21 | from .ingestion import add_ingestion_commands
22 | 
23 | __all__ = ['add_ingestion_commands']
```

--------------------------------------------------------------------------------
/src/mcp_memory_service/web/__init__.py:
--------------------------------------------------------------------------------

```python
 1 | # Copyright 2024 Heinrich Krupp
 2 | #
 3 | # Licensed under the Apache License, Version 2.0 (the "License");
 4 | # you may not use this file except in compliance with the License.
 5 | # You may obtain a copy of the License at
 6 | #
 7 | #     http://www.apache.org/licenses/LICENSE-2.0
 8 | #
 9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | 
15 | """
16 | Web interface for MCP Memory Service.
17 | 
18 | Provides HTTP REST API and Server-Sent Events (SSE) interface
19 | using FastAPI and SQLite-vec backend.
20 | """
21 | 
22 | # Import version from main package to maintain consistency
23 | from .. import __version__
```

--------------------------------------------------------------------------------
/tools/docker/docker-compose.standalone.yml:
--------------------------------------------------------------------------------

```yaml
 1 | services:
 2 |   memory-service:
 3 |     image: mcp-memory-service:local  # tag for the built image; reusing a base-image name here would let `build:` overwrite python:3.10-slim locally
 4 |     working_dir: /app
 5 |     stdin_open: true
 6 |     tty: true
 7 |     ports:
 8 |       - "8000:8000"
 9 |     volumes:
10 |       - .:/app
11 |       - ${CHROMA_DB_PATH:-$HOME/mcp-memory/chroma_db}:/app/chroma_db
12 |       - ${BACKUPS_PATH:-$HOME/mcp-memory/backups}:/app/backups
13 |     environment:
14 |       - MCP_MEMORY_CHROMA_PATH=/app/chroma_db
15 |       - MCP_MEMORY_BACKUPS_PATH=/app/backups
16 |       - LOG_LEVEL=INFO
17 |       - MAX_RESULTS_PER_QUERY=10
18 |       - SIMILARITY_THRESHOLD=0.7
19 |       - PYTHONPATH=/app
20 |       - PYTHONUNBUFFERED=1
21 |       - UV_ACTIVE=1
22 |       - MCP_STANDALONE_MODE=1
23 |       - CHROMA_TELEMETRY_IMPL=none
24 |       - ANONYMIZED_TELEMETRY=false
25 |     restart: unless-stopped
26 |     build:
27 |       context: .
28 |       dockerfile: Dockerfile
29 |     entrypoint: ["/usr/local/bin/docker-entrypoint-persistent.sh"]
```

--------------------------------------------------------------------------------
/.github/FUNDING.yml:
--------------------------------------------------------------------------------

```yaml
 1 | # These are supported funding model platforms
 2 | 
 3 | # github: doobidoo # Uncomment when enrolled in GitHub Sponsors
 4 | # patreon: # Replace with a single Patreon username
 5 | # open_collective: # Replace with a single Open Collective username
 6 | ko_fi: doobidoo # Replace with a single Ko-fi username
 7 | # tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel
 8 | # community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
 9 | # liberapay: # Replace with a single Liberapay username
10 | # issuehunt: # Replace with a single IssueHunt username
11 | # lfx_crowdfunding: # Replace with a single LFX Crowdfunding project-name e.g., cloud-foundry
12 | custom: ['https://www.buymeacoffee.com/doobidoo', 'https://paypal.me/heinrichkrupp1'] # Up to 4 custom sponsorship URLs, e.g. ['link1', 'link2']
14 | 
```

--------------------------------------------------------------------------------
/archive/deployment-configs/smithery.yaml:
--------------------------------------------------------------------------------

```yaml
 1 | # Smithery configuration file: https://smithery.ai/docs/config#smitheryyaml
 2 | 
 3 | startCommand:
 4 |   type: stdio
 5 |   configSchema:
 6 |     # JSON Schema defining the configuration options for the MCP.
 7 |     type: object
 8 |     required:
 9 |       - chromaDbPath
10 |       - backupsPath
11 |     properties:
12 |       chromaDbPath:
13 |         type: string
14 |         description: Path to ChromaDB storage.
15 |       backupsPath:
16 |         type: string
17 |         description: Path for backups.
18 |   # A function that produces the CLI command to start the MCP on stdio.
19 |   commandFunction: |-
20 |     (config) => ({
21 |       command: 'python',
22 |       args: ['-m', 'mcp_memory_service.server'],
23 |       env: {
24 |         MCP_MEMORY_CHROMA_PATH: config.chromaDbPath,
25 |         MCP_MEMORY_BACKUPS_PATH: config.backupsPath,
26 |         PYTHONUNBUFFERED: '1',
27 |         PYTORCH_ENABLE_MPS_FALLBACK: '1'
28 |       }
29 |     })
```

--------------------------------------------------------------------------------
/examples/claude_desktop_config_template.json:
--------------------------------------------------------------------------------

```json
 1 | {
 2 |   "mcpServers": {
 3 |     "memory": {
 4 |       "_comment": "Recommended: Use Python module approach (most stable, no path dependencies)",
 5 |       "command": "python",
 6 |       "args": [
 7 |         "-m",
 8 |         "mcp_memory_service.server"
 9 |       ],
10 |       "_alternative_approaches": [
11 |         "Option 1 (UV): command='uv', args=['--directory', '${PROJECT_PATH}', 'run', 'memory', 'server']",
12 |         "Option 2 (New script path): command='python', args=['${PROJECT_PATH}/scripts/server/run_memory_server.py']",
13 |         "Option 3 (Legacy, shows migration notice): command='python', args=['${PROJECT_PATH}/scripts/run_memory_server.py']"
14 |       ],
15 |       "env": {
16 |         "MCP_MEMORY_STORAGE_BACKEND": "sqlite_vec",
17 |         "MCP_MEMORY_BACKUPS_PATH": "${USER_DATA_PATH}/mcp-memory/backups",
18 |         "PYTORCH_ENABLE_MPS_FALLBACK": "1",
19 |         "PYTORCH_CUDA_ALLOC_CONF": "max_split_size_mb:128"
20 |       }
21 |     }
22 |   }
23 | }
```
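
The `${PROJECT_PATH}` / `${USER_DATA_PATH}` placeholders in this template must be expanded before the file is handed to Claude Desktop. One way to do that, using only the standard library (the path value below is an assumed example, not a real default):

```python
import json
from string import Template

# A fragment of the template above; ${USER_DATA_PATH} is the placeholder to fill
template_text = '''{
  "env": {
    "MCP_MEMORY_BACKUPS_PATH": "${USER_DATA_PATH}/mcp-memory/backups"
  }
}'''

expanded = Template(template_text).substitute(
    USER_DATA_PATH="/home/alice/.local/share"  # assumed example path
)
config = json.loads(expanded)  # confirm the result is still valid JSON
print(config["env"]["MCP_MEMORY_BACKUPS_PATH"])
# → /home/alice/.local/share/mcp-memory/backups
```

`string.Template` understands the same `${name}` syntax the template uses, and `substitute` raises `KeyError` if a placeholder is left unfilled, which catches typos early.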

--------------------------------------------------------------------------------
/scripts/server/start_http_server.bat:
--------------------------------------------------------------------------------

```
 1 | @echo off
 2 | REM Start the MCP Memory Service HTTP server in the background on Windows
 3 | 
 4 | echo Starting MCP Memory Service HTTP server...
 5 | 
 6 | REM Check if server is already running
 7 | uv run python scripts\server\check_http_server.py -q
 8 | if %errorlevel% == 0 (
 9 |     echo HTTP server is already running!
10 |     uv run python scripts\server\check_http_server.py
11 |     exit /b 0
12 | )
13 | 
14 | REM Start the server in a new window
15 | start "MCP Memory HTTP Server" uv run python scripts\server\run_http_server.py
16 | 
17 | REM Wait up to 5 seconds for the server to start
18 | FOR /L %%i IN (1,1,5) DO (
19 |     timeout /t 1 /nobreak >nul
20 |     uv run python scripts\server\check_http_server.py -q
21 |     if not errorlevel 1 (
22 |         echo.
23 |         echo [OK] HTTP server started successfully!
24 |         uv run python scripts\server\check_http_server.py
25 |         goto :eof
26 |     )
27 | )
28 | 
29 | echo.
30 | echo [WARN] Server did not start within 5 seconds. Check the server window for errors.
31 | 
```

--------------------------------------------------------------------------------
/scripts/sync/litestream/sync_from_remote.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | # Sync script to pull latest database from remote master
 3 | 
 4 | DB_PATH="/Users/hkr/Library/Application Support/mcp-memory/sqlite_vec.db"
 5 | REMOTE_URL="http://10.0.1.30:8080/mcp-memory"
 6 | BACKUP_PATH="/Users/hkr/Library/Application Support/mcp-memory/sqlite_vec.db.backup"
 7 | 
 8 | echo "$(date): Starting sync from remote master..."
 9 | 
10 | # Create backup of current database
11 | if [ -f "$DB_PATH" ]; then
12 |     cp "$DB_PATH" "$BACKUP_PATH"
13 |     echo "$(date): Created backup at $BACKUP_PATH"
14 | fi
15 | 
16 | # Restore from remote
17 | litestream restore -o "$DB_PATH" "$REMOTE_URL"
18 | 
19 | if [ $? -eq 0 ]; then
20 |     echo "$(date): Successfully synced database from remote master"
21 |     # Remove backup on success
22 |     rm -f "$BACKUP_PATH"
23 | else
24 |     echo "$(date): ERROR: Failed to sync from remote master"
25 |     # Restore backup on failure
26 |     if [ -f "$BACKUP_PATH" ]; then
27 |         mv "$BACKUP_PATH" "$DB_PATH"
28 |         echo "$(date): Restored backup"
29 |     fi
30 |     exit 1
31 | fi
```
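
The backup-then-restore-on-failure pattern in this script (copy aside, attempt the replacement, roll back if it fails, drop the backup on success) can be sketched generically in Python; `fetch` stands in for the `litestream restore` step:

```python
import os
import shutil
import tempfile

def replace_with_rollback(db_path: str, fetch) -> bool:
    """Back up db_path, run fetch(db_path), and restore the backup if fetch raises."""
    backup = db_path + ".backup"
    had_original = os.path.exists(db_path)
    if had_original:
        shutil.copy2(db_path, backup)
    try:
        fetch(db_path)  # e.g. a litestream restore or an HTTP download
    except Exception:
        if had_original:
            os.replace(backup, db_path)  # roll back to the saved copy
        return False
    if had_original:
        os.remove(backup)  # success: the backup is no longer needed
    return True

# Demo with a fetch that fails, leaving the original file intact
path = os.path.join(tempfile.mkdtemp(), "sqlite_vec.db")
with open(path, "w") as f:
    f.write("original")

def bad_fetch(p):
    raise RuntimeError("network down")

print(replace_with_rollback(path, bad_fetch))  # → False
with open(path) as f:
    print(f.read())  # → original
```

Unlike the shell version, exceptions rather than exit codes drive the rollback, but the sequence of filesystem operations is the same.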

--------------------------------------------------------------------------------
/src/mcp_memory_service/discovery/__init__.py:
--------------------------------------------------------------------------------

```python
 1 | # Copyright 2024 Heinrich Krupp
 2 | #
 3 | # Licensed under the Apache License, Version 2.0 (the "License");
 4 | # you may not use this file except in compliance with the License.
 5 | # You may obtain a copy of the License at
 6 | #
 7 | #     http://www.apache.org/licenses/LICENSE-2.0
 8 | #
 9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | 
15 | """
16 | mDNS service discovery module for MCP Memory Service.
17 | 
18 | This module provides mDNS service advertisement and discovery capabilities
19 | for the MCP Memory Service HTTP/HTTPS interface.
20 | """
21 | 
22 | from .mdns_service import ServiceAdvertiser, ServiceDiscovery
23 | from .client import DiscoveryClient
24 | 
25 | __all__ = ['ServiceAdvertiser', 'ServiceDiscovery', 'DiscoveryClient']
```

--------------------------------------------------------------------------------
/scripts/development/uv-lock-merge.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | # Git merge driver for uv.lock files
 3 | # Automatically resolves conflicts and regenerates the lock file
 4 | 
 5 | # Arguments from git:
 6 | # %O = ancestor's version
 7 | # %A = current version  
 8 | # %B = other version
 9 | # %L = conflict marker length
10 | # %P = path to file
11 | 
12 | ANCESTOR="$1"
13 | CURRENT="$2"
14 | OTHER="$3"
15 | MARKER_LENGTH="$4"
16 | FILE_PATH="$5"  # NB: not named PATH; overwriting PATH would break command lookup below
17 | 
18 | echo "Auto-resolving uv.lock conflict by regenerating lock file..."
19 | 
20 | # Accept the incoming version first (this resolves the conflict)
21 | cp "$OTHER" "$FILE_PATH"
22 | 
23 | # Check if uv is available
24 | if command -v uv >/dev/null 2>&1; then
25 |     echo "Running uv sync to regenerate lock file..."
26 |     # Regenerate the lock file based on pyproject.toml
27 |     uv sync --quiet
28 |     if [ $? -eq 0 ]; then
29 |         echo "✓ uv.lock regenerated successfully"
30 |         exit 0
31 |     else
32 |         echo "⚠ Warning: uv sync failed, using incoming version"
33 |         exit 0
34 |     fi
35 | else
36 |     echo "⚠ Warning: uv not found, using incoming version"
37 |     exit 0
38 | fi
```
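
A merge driver like this only runs if git is told about it. Registration typically pairs a `.gitattributes` entry with a merge-driver stanza; the driver name `uv-lock` below is illustrative, not taken from the repo:

```
# .gitattributes
uv.lock merge=uv-lock

# .git/config (or set via: git config merge.uv-lock.driver "…")
[merge "uv-lock"]
    name = uv.lock merge driver
    driver = scripts/development/uv-lock-merge.sh %O %A %B %L %P
```

Git substitutes the `%O %A %B %L %P` placeholders with the ancestor, current, and other versions, the conflict-marker length, and the file path, matching the positional arguments the script reads.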

--------------------------------------------------------------------------------
/archive/deployment/deploy_fastmcp_fixed.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | 
 3 | echo "🚀 Deploying Fixed FastMCP Server v4.0.0-alpha.1..."
 4 | 
 5 | # Stop current service
 6 | echo "⏹️ Stopping current service..."
 7 | sudo systemctl stop mcp-memory
 8 | 
 9 | # Install the fixed FastMCP service configuration
10 | echo "📝 Installing fixed FastMCP service configuration..."
11 | sudo cp /tmp/fastmcp-server-fixed.service /etc/systemd/system/mcp-memory.service
12 | 
13 | # Reload systemd daemon
14 | echo "🔄 Reloading systemd daemon..."
15 | sudo systemctl daemon-reload
16 | 
17 | # Start the FastMCP server
18 | echo "▶️ Starting FastMCP server..."
19 | sudo systemctl start mcp-memory
20 | 
21 | # Wait a moment for startup
22 | sleep 3
23 | 
24 | # Check status
25 | echo "🔍 Checking service status..."
26 | sudo systemctl status mcp-memory --no-pager
27 | 
28 | echo ""
29 | echo "📊 Service logs (last 10 lines):"
30 | sudo journalctl -u mcp-memory -n 10 --no-pager
31 | 
32 | echo ""
33 | echo "✅ FastMCP Server deployment complete!"
34 | echo "🔗 Native MCP Protocol should be available on port 8000"
35 | echo "📋 Monitor logs: sudo journalctl -u mcp-memory -f"
```

--------------------------------------------------------------------------------
/archive/development/test_fastmcp.py:
--------------------------------------------------------------------------------

```python
 1 | #!/usr/bin/env python3
 2 | """Simple test of FastMCP server structure for memory service."""
 3 | 
 4 | import sys
 5 | 
 6 | # Add src to path so the in-repo package is importable
 7 | sys.path.insert(0, 'src')
10 | 
11 | from mcp.server.fastmcp import FastMCP
12 | 
13 | # Create a simple FastMCP server for testing
14 | mcp = FastMCP("Test Memory Service")
15 | 
16 | @mcp.tool()
17 | def test_store_memory(content: str, tags: list[str] | None = None) -> dict:
18 |     """Test memory storage function."""
19 |     return {
20 |         "success": True,
21 |         "message": f"Stored: {content}",
22 |         "tags": tags or []
23 |     }
24 | 
25 | @mcp.tool() 
26 | def test_health() -> dict:
27 |     """Test health check."""
28 |     return {
29 |         "status": "healthy",
30 |         "version": "4.0.0-alpha.1"
31 |     }
32 | 
33 | if __name__ == "__main__":
34 |     print("FastMCP Memory Service Test")
35 |     print("Server configured with basic tools")
36 |     print("Available tools:")
37 |     print("- test_store_memory")
38 |     print("- test_health")
39 |     print("\nTo run server: mcp.run(transport='streamable-http')")
```
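
The `@mcp.tool()` registration style used above can be illustrated without the `mcp` package installed. A minimal sketch of a decorator-based tool registry (toy stand-in, not FastMCP's actual implementation):

```python
from typing import Callable

class MiniMCP:
    """Toy stand-in for FastMCP's tool registry (illustration only)."""

    def __init__(self, name: str):
        self.name = name
        self.tools: dict[str, Callable] = {}

    def tool(self):
        def register(fn: Callable) -> Callable:
            self.tools[fn.__name__] = fn  # record the tool under its function name
            return fn                     # leave the function usable as-is
        return register

    def call(self, tool_name: str, **kwargs):
        return self.tools[tool_name](**kwargs)

mcp = MiniMCP("Test Memory Service")

@mcp.tool()
def test_health() -> dict:
    return {"status": "healthy"}

print(sorted(mcp.tools))        # → ['test_health']
print(mcp.call("test_health"))  # → {'status': 'healthy'}
```

The double-call shape (`@mcp.tool()` rather than `@mcp.tool`) exists because `tool()` is a decorator factory, which is why the parentheses appear in the test script above.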

--------------------------------------------------------------------------------
/src/mcp_memory_service/sync/__init__.py:
--------------------------------------------------------------------------------

```python
 1 | # Copyright 2024 Heinrich Krupp
 2 | #
 3 | # Licensed under the Apache License, Version 2.0 (the "License");
 4 | # you may not use this file except in compliance with the License.
 5 | # You may obtain a copy of the License at
 6 | #
 7 | #     http://www.apache.org/licenses/LICENSE-2.0
 8 | #
 9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | 
15 | """
16 | Database synchronization module for MCP Memory Service.
17 | 
18 | This module provides tools for synchronizing SQLite-vec databases across
19 | multiple machines using JSON export/import and Litestream replication.
20 | """
21 | 
22 | from .exporter import MemoryExporter
23 | from .importer import MemoryImporter
24 | from .litestream_config import LitestreamManager
25 | 
26 | __all__ = ['MemoryExporter', 'MemoryImporter', 'LitestreamManager']
```

--------------------------------------------------------------------------------
/docs/images/dashboard-placeholder.md:
--------------------------------------------------------------------------------

```markdown
 1 | # Dashboard Screenshot Placeholder
 2 | 
 3 | This directory will contain screenshots of the MCP Memory Service dashboard.
 4 | 
 5 | ## v3.3.0 Dashboard Features
 6 | 
 7 | The new dashboard includes:
 8 | 
 9 | - **Modern Design**: Gradient backgrounds with professional card layout
10 | - **Live Statistics**: Real-time server metrics and memory counts
11 | - **Interactive Endpoints**: Organized API documentation with hover effects
12 | - **Tech Stack Badges**: Visual representation of FastAPI, SQLite-vec, PyTorch, etc.
13 | - **Responsive Layout**: Works on desktop and mobile devices
14 | - **Auto-Refresh**: Stats update every 30 seconds
15 | 
16 | ## Access URLs
17 | 
18 | - Dashboard: http://localhost:8000
19 | - mDNS: http://mcp-memory-service.local:8000  
20 | - API Docs: http://localhost:8000/api/docs
21 | - ReDoc: http://localhost:8000/api/redoc
22 | 
23 | ## Screenshot Instructions
24 | 
25 | To capture the dashboard:
26 | 
27 | 1. Ensure the HTTP server is running
28 | 2. Open browser to http://localhost:8000
29 | 3. Wait for stats to load (shows actual memory count)
30 | 4. Take full-page screenshot
31 | 5. Save as `dashboard-v3.3.0.png` in this directory
```

--------------------------------------------------------------------------------
/tests/unit/test_import.py:
--------------------------------------------------------------------------------

```python
 1 | #!/usr/bin/env python3
 2 | """
 3 | Test script to verify the memory service can be imported and run.
 4 | """
 5 | import sys
 6 | import os
 7 | 
 8 | # Add the src directory to the Python path (repo root is two levels up from tests/unit/)
 9 | script_dir = os.path.dirname(os.path.abspath(__file__))
10 | src_dir = os.path.abspath(os.path.join(script_dir, "..", "..", "src"))
11 | sys.path.insert(0, src_dir)
12 | 
13 | try:
14 |     from mcp_memory_service.server import main
15 |     print("SUCCESS: Successfully imported mcp_memory_service.server.main")
16 |     
17 |     # Test basic configuration
18 |     from mcp_memory_service.config import (
19 |         SERVER_NAME,
20 |         SERVER_VERSION,
21 |         STORAGE_BACKEND,
22 |         DATABASE_PATH
23 |     )
24 |     print(f"SUCCESS: Server name: {SERVER_NAME}")
25 |     print(f"SUCCESS: Server version: {SERVER_VERSION}")
26 |     print(f"SUCCESS: Storage backend: {STORAGE_BACKEND}")
27 |     print(f"SUCCESS: Database path: {DATABASE_PATH}")
28 |     
29 |     print("SUCCESS: All imports successful - the memory service is ready to use!")
30 |     
31 | except ImportError as e:
32 |     print(f"ERROR: Import failed: {e}")
33 |     sys.exit(1)
34 | except Exception as e:
35 |     print(f"ERROR: Error: {e}")
36 |     sys.exit(1)
37 | 
```

--------------------------------------------------------------------------------
/archive/docs-removed-2025-08-23/development/CLEANUP_README.md:
--------------------------------------------------------------------------------

```markdown
 1 | # MCP-MEMORY-SERVICE Cleanup and Organization
 2 | 
 3 | This branch contains cleanup and reorganization changes for the MCP-MEMORY-SERVICE project.
 4 | 
 5 | ## Changes Implemented
 6 | 
 7 | 1. **Code Organization**
 8 |    - Restructured test files into proper directories
 9 |    - Organized documentation into a docs/ directory
10 |    - Archived old backup files
11 | 
12 | 2. **Documentation Updates**
13 |    - Updated CHANGELOG.md with v1.2.0 entries
14 |    - Created comprehensive documentation structure
15 |    - Added READMEs for each directory
16 | 
17 | 3. **Test Infrastructure**
18 |    - Created proper pytest configuration
19 |    - Added fixtures for common test scenarios
20 |    - Organized tests by type (unit, integration, performance)
21 | 
22 | ## Running the Cleanup Script
23 | 
24 | To apply these changes, run:
25 | 
26 | ```bash
27 | cd C:\REPOSITORIES\mcp-memory-service
28 | python scripts/cleanup_organize.py
29 | ```
30 | 
31 | ## Testing on Different Hardware
32 | 
33 | After organization is complete, create a hardware testing branch:
34 | 
35 | ```bash
36 | git checkout -b test/hardware-validation
37 | ```
38 | 
39 | The changes have been tracked in the memory system with the tag `memory-driven-development`.
40 | 
```

--------------------------------------------------------------------------------
/scripts/server/start_http_server.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/usr/bin/env bash
 2 | # Start the MCP Memory Service HTTP server in the background on Unix/macOS
 3 | 
 4 | set -e
 5 | 
 6 | echo "Starting MCP Memory Service HTTP server..."
 7 | 
 8 | # Check if server is already running
 9 | if uv run python scripts/server/check_http_server.py -q; then
10 |     echo "✅ HTTP server is already running!"
11 |     uv run python scripts/server/check_http_server.py -v
12 |     exit 0
13 | fi
14 | 
15 | # Start the server in the background
16 | nohup uv run python scripts/server/run_http_server.py > /tmp/mcp-http-server.log 2>&1 &
17 | SERVER_PID=$!
18 | 
19 | echo "Server started with PID: $SERVER_PID"
20 | echo "Logs available at: /tmp/mcp-http-server.log"
21 | 
22 | # Wait up to 5 seconds for the server to start
23 | for i in {1..5}; do
24 |     if uv run python scripts/server/check_http_server.py -q; then
25 |         break
26 |     fi
27 |     sleep 1
28 | done
29 | 
30 | # Check if it started successfully
31 | if uv run python scripts/server/check_http_server.py -v; then
32 |     echo ""
33 |     echo "✅ HTTP server started successfully!"
34 |     echo "PID: $SERVER_PID"
35 | else
36 |     echo ""
37 |     echo "⚠️ Server may still be starting... Check logs at /tmp/mcp-http-server.log"
38 | fi
39 | 
```

--------------------------------------------------------------------------------
/claude-hooks/simple-test.js:
--------------------------------------------------------------------------------

```javascript
 1 | #!/usr/bin/env node
 2 | 
 3 | const { AdaptivePatternDetector } = require('./utilities/adaptive-pattern-detector');
 4 | 
 5 | async function simpleTest() {
 6 |     const detector = new AdaptivePatternDetector({ sensitivity: 0.7 });
 7 | 
 8 |     const testCases = [
 9 |         { message: "What did we decide about the authentication approach?", shouldTrigger: true },
10 |         { message: "Remind me how we handled user sessions", shouldTrigger: true },
11 |         { message: "Remember when we discussed the database schema?", shouldTrigger: true },
12 |         { message: "Just implementing a new feature", shouldTrigger: false }
13 |     ];
14 | 
15 |     for (const testCase of testCases) {
16 |         const result = await detector.detectPatterns(testCase.message);
17 |         const actualTrigger = result.triggerRecommendation;
18 | 
19 |         console.log(`Message: "${testCase.message}"`);
20 |         console.log(`Expected: ${testCase.shouldTrigger}, Actual: ${actualTrigger}`);
21 |         console.log(`Confidence: ${result.confidence}`);
22 |         console.log(`Matches: ${result.matches.length}`);
23 |         console.log('---');
24 |     }
25 | }
26 | 
27 | simpleTest().catch(console.error);
```

--------------------------------------------------------------------------------
/scripts/sync/litestream/sync_from_remote_noconfig.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | # Sync script to pull latest database from remote master (no config file)
 3 | 
 4 | DB_PATH="/Users/hkr/Library/Application Support/mcp-memory/sqlite_vec.db"
 5 | REMOTE_URL="http://10.0.1.30:8080/mcp-memory"
 6 | BACKUP_PATH="/Users/hkr/Library/Application Support/mcp-memory/sqlite_vec.db.backup"
 7 | 
 8 | echo "$(date): Starting sync from remote master..."
 9 | 
10 | # Create backup of current database
11 | if [ -f "$DB_PATH" ]; then
12 |     cp "$DB_PATH" "$BACKUP_PATH"
13 |     echo "$(date): Created backup at $BACKUP_PATH"
14 | fi
15 | 
16 | # Restore from remote (no config file)
17 | litestream restore -o "$DB_PATH" "$REMOTE_URL"
18 | 
19 | if [ $? -eq 0 ]; then
20 |     echo "$(date): Successfully synced database from remote master"
21 |     # Remove backup on success
22 |     rm -f "$BACKUP_PATH"
23 |     
24 |     # Show database info
25 |     echo "$(date): Database size: $(du -h "$DB_PATH" | cut -f1)"
26 |     echo "$(date): Database modified: $(stat -f "%Sm" "$DB_PATH")"
27 | else
28 |     echo "$(date): ERROR: Failed to sync from remote master"
29 |     # Restore backup on failure
30 |     if [ -f "$BACKUP_PATH" ]; then
31 |         mv "$BACKUP_PATH" "$DB_PATH"
32 |         echo "$(date): Restored backup"
33 |     fi
34 |     exit 1
35 | fi
```

--------------------------------------------------------------------------------
/scripts/development/fix_mdns.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | 
 3 | echo "=== Fixing mDNS Configuration ==="
 4 | 
 5 | echo "1. Stopping any conflicting processes..."
 6 | # Kill the old process that might be interfering
 7 | pkill -f "/home/hkr/repositories/mcp-memory-service/.venv/bin/memory"
 8 | 
 9 | echo "2. Stopping systemd service..."
10 | sudo systemctl stop mcp-memory
11 | 
12 | echo "3. Updating systemd service configuration..."
13 | sudo cp mcp-memory.service /etc/systemd/system/
14 | sudo chmod 644 /etc/systemd/system/mcp-memory.service
15 | 
16 | echo "4. Reloading systemd daemon..."
17 | sudo systemctl daemon-reload
18 | 
19 | echo "5. Starting service with new configuration..."
20 | sudo systemctl start mcp-memory
21 | 
22 | echo "6. Checking service status..."
23 | sudo systemctl status mcp-memory --no-pager -l
24 | 
25 | echo ""
26 | echo "7. Testing mDNS resolution..."
27 | sleep 3
28 | echo "Checking avahi browse:"
29 | avahi-browse -t _http._tcp | grep memory
30 | echo ""
31 | echo "Testing memory.local resolution:"
32 | avahi-resolve-host-name memory.local
33 | echo ""
34 | echo "Testing HTTPS access:"
35 | curl -k -s https://memory.local:8000/api/health --connect-timeout 5 || echo "HTTPS test failed"
36 | 
37 | echo ""
38 | echo "=== Fix Complete ==="
39 | echo "If memory.local resolves and HTTPS works, you're all set!"
```

--------------------------------------------------------------------------------
/archive/deployment/deploy_mcp_v4.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | 
 3 | # Deploy FastAPI MCP Server v4.0.0-alpha.1
 4 | echo "🚀 Deploying FastAPI MCP Server v4.0.0-alpha.1..."
 5 | 
 6 | # Stop current service
 7 | echo "⏹️  Stopping current HTTP API service..."
 8 | sudo systemctl stop mcp-memory
 9 | 
10 | # Update systemd service file
11 | echo "📝 Updating systemd service configuration..."
12 | sudo cp /tmp/mcp-memory-v4.service /etc/systemd/system/mcp-memory.service
13 | 
14 | # Reload systemd daemon
15 | echo "🔄 Reloading systemd daemon..."
16 | sudo systemctl daemon-reload
17 | 
18 | # Start the new MCP server
19 | echo "▶️  Starting FastAPI MCP server..."
20 | sudo systemctl start mcp-memory
21 | 
22 | # Check status
23 | echo "🔍 Checking service status..."
24 | sudo systemctl status mcp-memory --no-pager
25 | 
26 | echo ""
27 | echo "✅ FastAPI MCP Server v4.0.0-alpha.1 deployment complete!"
28 | echo ""
29 | echo "🌐 Service Access:"
30 | echo "   - MCP Protocol: Available on port 8000"
31 | echo "   - Health Check: curl http://localhost:8000/health"
32 | echo "   - Service Logs: sudo journalctl -u mcp-memory -f"
33 | echo ""
34 | echo "🔧 Service Management:"
35 | echo "   - Status: sudo systemctl status mcp-memory"
36 | echo "   - Stop:   sudo systemctl stop mcp-memory"
37 | echo "   - Start:  sudo systemctl start mcp-memory"
```

--------------------------------------------------------------------------------
/src/mcp_memory_service/storage/__init__.py:
--------------------------------------------------------------------------------

```python
 1 | # Copyright 2024 Heinrich Krupp
 2 | #
 3 | # Licensed under the Apache License, Version 2.0 (the "License");
 4 | # you may not use this file except in compliance with the License.
 5 | # You may obtain a copy of the License at
 6 | #
 7 | #     http://www.apache.org/licenses/LICENSE-2.0
 8 | #
 9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | 
15 | from .base import MemoryStorage
16 | 
17 | # Conditional imports based on available dependencies
18 | __all__ = ['MemoryStorage']
19 | 
20 | try:
21 |     from .sqlite_vec import SqliteVecMemoryStorage
22 |     __all__.append('SqliteVecMemoryStorage')
23 | except ImportError:
24 |     SqliteVecMemoryStorage = None
25 | 
26 | try:
27 |     from .cloudflare import CloudflareStorage
28 |     __all__.append('CloudflareStorage')
29 | except ImportError:
30 |     CloudflareStorage = None
31 | 
32 | try:
33 |     from .hybrid import HybridMemoryStorage
34 |     __all__.append('HybridMemoryStorage')
35 | except ImportError:
36 |     HybridMemoryStorage = None
```

--------------------------------------------------------------------------------
/scripts/backup/export_distributable_memories.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | 
 3 | # Export distributable reference memories for sharing across local network
 4 | # Usage: ./export_distributable_memories.sh [output_file]
 5 | 
 6 | OUTPUT_FILE="${1:-mcp_reference_memories_$(date +%Y%m%d).json}"
 7 | MCP_ENDPOINT="https://10.0.1.30:8443/mcp"
 8 | API_KEY="test-key-123"
 9 | 
10 | echo "Exporting distributable reference memories..."
11 | echo "Output file: $OUTPUT_FILE"
12 | 
13 | curl -k -s -X POST "$MCP_ENDPOINT" \
14 |   -H "Content-Type: application/json" \
15 |   -H "Authorization: Bearer $API_KEY" \
16 |   -d '{
17 |     "jsonrpc": "2.0", 
18 |     "id": 1, 
19 |     "method": "tools/call", 
20 |     "params": {
21 |       "name": "search_by_tag", 
22 |       "arguments": {
23 |         "tags": ["distributable-reference"]
24 |       }
25 |     }
26 |   }' | jq -r '.result.content[0].text' > "$OUTPUT_FILE"
27 | 
28 | if [ "${PIPESTATUS[0]}" -eq 0 ] && [ -s "$OUTPUT_FILE" ]; then  # curl succeeded and output is non-empty
29 |     echo "✅ Export completed: $OUTPUT_FILE"
30 |     echo "📊 Memory count: $(jq 'length' "$OUTPUT_FILE" 2>/dev/null || echo "Unknown")"
31 |     echo ""
32 |     echo "To import to another MCP Memory Service:"
33 |     echo "1. Copy $OUTPUT_FILE to target machine"
34 |     echo "2. Use store_memory calls for each entry"
35 |     echo "3. Update CLAUDE.md with new memory hashes"
36 | else
37 |     echo "❌ Export failed"
38 |     exit 1
39 | fi
```

--------------------------------------------------------------------------------
/.github/workflows/release.yml:
--------------------------------------------------------------------------------

```yaml
 1 | name: Release (Manual)
 2 | 
 3 | on:
 4 |   workflow_dispatch:
 5 | 
 6 | jobs:
 7 |   release:
 8 |     runs-on: ubuntu-latest
 9 |     concurrency: release
10 |     permissions:
11 |       id-token: write
12 |       contents: write
13 |       actions: write
14 |       pull-requests: write
15 |       repository-projects: write
16 | 
17 |     steps:
18 |     - uses: actions/checkout@v4
19 |       with:
20 |         fetch-depth: 0
21 |         token: ${{ secrets.GITHUB_TOKEN }}
22 | 
23 |     - name: Set up Python
24 |       uses: actions/setup-python@v4
25 |       with:
26 |         python-version: '3.9' # setup-python installs its own interpreter, separate from the one python-semantic-release runs under, which caused the earlier version mismatch
27 | 
28 |     - name: Install dependencies
29 |       run: |
30 |         python -m pip install --upgrade pip
31 |         python -m pip install build hatchling python-semantic-release
32 | 
33 |     - name: Verify build module installation
34 |       run: python -m pip show build
35 | 
36 |     - name: Build package
37 |       run: python -m build
38 | 
39 |     - name: Python Semantic Release
40 |       uses: python-semantic-release/[email protected]
41 |       with:
42 |         github_token: ${{ secrets.GITHUB_TOKEN }}
43 |         verbosity: 2
44 |       env:
45 |         GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
46 | 
```

--------------------------------------------------------------------------------
/scripts/installation/setup_backup_cron.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | # Setup automated backups for MCP Memory Service
 3 | # Creates cron jobs for regular SQLite-vec database backups
 4 | 
 5 | set -e
 6 | 
 7 | SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 8 | BACKUP_SCRIPT="$SCRIPT_DIR/backup_sqlite_vec.sh"
 9 | 
10 | # Check if backup script exists
11 | if [[ ! -f "$BACKUP_SCRIPT" ]]; then
12 |     echo "Error: Backup script not found at $BACKUP_SCRIPT"
13 |     exit 1
14 | fi
15 | 
16 | # Make sure backup script is executable
17 | chmod +x "$BACKUP_SCRIPT"
18 | 
19 | # Create cron job entry
20 | CRON_ENTRY="0 2 * * * $BACKUP_SCRIPT > /tmp/mcp-backup.log 2>&1"
21 | 
22 | # Check if cron job already exists
23 | if crontab -l 2>/dev/null | grep -q "$BACKUP_SCRIPT"; then
24 |     echo "Backup cron job already exists. Current crontab:"
25 |     crontab -l | grep "$BACKUP_SCRIPT"
26 | else
27 |     # Add cron job
28 |     (crontab -l 2>/dev/null || true; echo "$CRON_ENTRY") | crontab -
29 |     echo "Added daily backup cron job:"
30 |     echo "$CRON_ENTRY"
31 | fi
32 | 
33 | echo ""
34 | echo "Backup automation setup complete!"
35 | echo "- Daily backups at 2:00 AM"
36 | echo "- Backup script: $BACKUP_SCRIPT"
37 | echo "- Log file: /tmp/mcp-backup.log"
38 | echo ""
39 | echo "To check cron jobs: crontab -l"
40 | echo "To remove cron job: crontab -l | grep -v backup_sqlite_vec.sh | crontab -"
```

--------------------------------------------------------------------------------
/scripts/sync/litestream/setup_local_litestream.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | # Setup script for Litestream replica on local macOS machine
 3 | 
 4 | set -e
 5 | 
 6 | echo "🔧 Setting up Litestream replica on local macOS..."
 7 | 
 8 | # Copy configuration to system location
 9 | echo "⚙️ Installing Litestream configuration..."
10 | sudo mkdir -p /usr/local/etc
11 | sudo cp litestream_replica_config.yml /usr/local/etc/litestream.yml
12 | 
13 | # Create log directory
14 | sudo mkdir -p /var/log
15 | sudo touch /var/log/litestream.log
16 | sudo chmod 644 /var/log/litestream.log
17 | 
18 | # Install LaunchDaemon
19 | echo "🚀 Installing LaunchDaemon..."
20 | sudo cp deployment/io.litestream.replication.plist /Library/LaunchDaemons/
21 | 
22 | # Set permissions
23 | sudo chown root:wheel /Library/LaunchDaemons/io.litestream.replication.plist
24 | sudo chmod 644 /Library/LaunchDaemons/io.litestream.replication.plist
25 | 
26 | echo "✅ Local Litestream setup completed!"
27 | echo ""
28 | echo "Next steps:"
29 | echo "1. Load service: sudo launchctl load /Library/LaunchDaemons/io.litestream.replication.plist"
30 | echo "2. Start service: sudo launchctl start io.litestream.replication"
31 | echo "3. Check status: litestream replicas -config /usr/local/etc/litestream.yml"
32 | echo ""
33 | echo "⚠️  Before starting the replica service, make sure the master is running on narrowbox.local"
```

--------------------------------------------------------------------------------
/docs/technical/tag-storage.md:
--------------------------------------------------------------------------------

```markdown
 1 | # Tag Storage Procedure
 2 | 
 3 | ## File Structure Overview
 4 | ```
 5 | mcp_memory_service/
 6 | ├── tests/
 7 | │   └── test_tag_storage.py    # Integration tests
 8 | ├── scripts/
 9 | │   ├── validate_memories.py   # Validation script
10 | │   └── migrate_tags.py        # Migration script
11 | ```
12 | 
13 | ## Execution Steps
14 | 
15 | 1. **Run Initial Validation**
16 |    ```bash
17 |    python scripts/validate_memories.py
18 |    ```
19 |    - Generates validation report of current state
20 | 
21 | 2. **Run Integration Tests**
22 |    ```bash
23 |    python tests/test_tag_storage.py
24 |    ```
25 |    - Verifies functionality
26 | 
27 | 3. **Execute Migration**
28 |    ```bash
29 |    python scripts/migrate_tags.py
30 |    ```
31 |    The script will:
32 |    - Create a backup automatically
33 |    - Run validation check
34 |    - Ask for confirmation before proceeding
35 |    - Perform migration
36 |    - Verify the migration
37 | 
38 | 4. **Post-Migration Validation**
39 |    ```bash
40 |    python scripts/validate_memories.py
41 |    ```
42 |    - Confirms successful migration
43 | 
44 | ## Monitoring Requirements
45 | - Keep backup files for at least 7 days
46 | - Monitor logs for any tag-related errors
47 | - Run validation script daily for the first week
48 | - Check search functionality with various tag formats
49 | 
50 | ## Rollback Process
51 | If issues are detected, use:
52 | ```bash
53 | python scripts/migrate_tags.py --rollback
54 | ```
```
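
The four steps above are manual; a small wrapper that runs them in order and stops on the first non-zero exit keeps a partially failed migration from going unnoticed. A sketch, assuming the script paths shown in this document (adjust to your checkout; `migrate_tags.py` still prompts for confirmation interactively):

```python
# Sketch: chain the validation/test/migration steps and stop on first failure.
import subprocess
import sys

STEPS = [
    ["python", "scripts/validate_memories.py"],  # 1. pre-migration validation
    ["python", "tests/test_tag_storage.py"],     # 2. integration tests
    ["python", "scripts/migrate_tags.py"],       # 3. migration (asks for confirmation)
    ["python", "scripts/validate_memories.py"],  # 4. post-migration validation
]

def run_steps(steps):
    """Run each command in order; return the first non-zero exit code, else 0."""
    for cmd in steps:
        result = subprocess.run(cmd)
        if result.returncode != 0:
            print(f"Step failed: {' '.join(cmd)}", file=sys.stderr)
            return result.returncode
    return 0
```

Call `sys.exit(run_steps(STEPS))` from a `__main__` guard to use it as a one-shot migration driver.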

--------------------------------------------------------------------------------
/scripts/maintenance/check_memory_types.py:
--------------------------------------------------------------------------------

```python
 1 | #!/usr/bin/env python3
 2 | """Quick script to check memory types in local database."""
 3 | import sqlite3
 4 | from pathlib import Path
 5 | 
 6 | # Windows database path
 7 | db_path = Path.home() / "AppData/Local/mcp-memory/sqlite_vec.db"
 8 | 
 9 | if not db_path.exists():
10 |     print(f"❌ Database not found at: {db_path}")
11 |     exit(1)
12 | 
13 | conn = sqlite3.connect(db_path)
14 | cursor = conn.cursor()
15 | 
16 | # Get memory type distribution
17 | cursor.execute("""
18 |     SELECT memory_type, COUNT(*) as count
19 |     FROM memories
20 |     GROUP BY memory_type
21 |     ORDER BY count DESC
22 | """)
23 | 
24 | results = cursor.fetchall()
25 | total = sum(count for _, count in results)
26 | 
27 | print("\nMemory Type Distribution")
28 | print("=" * 60)
29 | print(f"Total memories: {total:,}")
30 | print(f"Unique types: {len(results)}\n")
31 | 
32 | print(f"{'Memory Type':<40} {'Count':>8} {'%':>6}")
33 | print("-" * 60)
34 | 
35 | for memory_type, count in results[:30]:  # Show top 30
36 |     pct = (count / total) * 100 if total > 0 else 0
37 |     type_display = memory_type if memory_type else "(empty/NULL)"
38 |     print(f"{type_display:<40} {count:>8,} {pct:>5.1f}%")
39 | 
40 | if len(results) > 30:
41 |     remaining = len(results) - 30
42 |     remaining_count = sum(count for _, count in results[30:])
43 |     print(f"\n... and {remaining} more types ({remaining_count:,} memories)")
44 | 
45 | conn.close()
46 | 
```

--------------------------------------------------------------------------------
/scripts/utils/list-collections.py:
--------------------------------------------------------------------------------

```python
 1 | # Copyright 2024 Heinrich Krupp
 2 | #
 3 | # Licensed under the Apache License, Version 2.0 (the "License");
 4 | # you may not use this file except in compliance with the License.
 5 | # You may obtain a copy of the License at
 6 | #
 7 | #     http://www.apache.org/licenses/LICENSE-2.0
 8 | #
 9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | 
15 | from chromadb import HttpClient
16 | 
17 | def list_collections():
18 |     try:
19 |         # Connect to local ChromaDB
20 |         client = HttpClient(host='localhost', port=8000)
21 |         
22 |         # List all collections
23 |         collections = client.list_collections()
24 |         
25 |         print("\nFound Collections:")
26 |         print("------------------")
27 |         for collection in collections:
28 |             print(f"Name: {collection.name}")
29 |             print(f"Metadata: {collection.metadata}")
30 |             print(f"Count: {collection.count()}")
31 |             print("------------------")
32 |             
33 |     except Exception as e:
34 |         print(f"Error connecting to local ChromaDB: {str(e)}")
35 | 
36 | if __name__ == "__main__":
37 |     list_collections()
38 | 
```

--------------------------------------------------------------------------------
/tests/unit/conftest.py:
--------------------------------------------------------------------------------

```python
 1 | """
 2 | Shared test fixtures and helpers for unit tests.
 3 | """
 4 | 
 5 | import tempfile
 6 | from pathlib import Path
 7 | from typing import List, Any, Optional
 8 | 
 9 | 
10 | async def extract_chunks_from_temp_file(
11 |     loader: Any,
12 |     filename: str,
13 |     content: str,
14 |     encoding: str = 'utf-8',
15 |     **extract_kwargs
16 | ) -> List[Any]:
17 |     """
18 |     Helper to extract chunks from a temporary file.
19 | 
20 |     Args:
21 |         loader: Loader instance (CSVLoader, JSONLoader, etc.)
22 |         filename: Name of the temporary file to create
23 |         content: Content to write to the file
24 |         encoding: File encoding (default: utf-8)
25 |         **extract_kwargs: Additional keyword arguments to pass to extract_chunks()
26 | 
27 |     Returns:
28 |         List of extracted chunks
29 | 
30 |     Example:
31 |         >>> loader = CSVLoader(chunk_size=1000, chunk_overlap=200)
32 |         >>> chunks = await extract_chunks_from_temp_file(
33 |         ...     loader,
34 |         ...     "test.csv",
35 |         ...     "name,age\\nJohn,25",
36 |         ...     delimiter=','
37 |         ... )
38 |     """
39 |     with tempfile.TemporaryDirectory() as tmpdir:
40 |         file_path = Path(tmpdir) / filename
41 |         file_path.write_text(content, encoding=encoding)
42 | 
43 |         chunks = []
44 |         async for chunk in loader.extract_chunks(file_path, **extract_kwargs):
45 |             chunks.append(chunk)
46 | 
47 |         return chunks
48 | 
```

--------------------------------------------------------------------------------
/test_version_checker.js:
--------------------------------------------------------------------------------

```javascript
 1 | #!/usr/bin/env node
 2 | 
 3 | /**
 4 |  * Test script for version-checker.js utility
 5 |  */
 6 | 
 7 | const { getVersionInfo, formatVersionDisplay } = require('./claude-hooks/utilities/version-checker');
 8 | 
 9 | const CONSOLE_COLORS = {
10 |     RESET: '\x1b[0m',
11 |     BRIGHT: '\x1b[1m',
12 |     DIM: '\x1b[2m',
13 |     CYAN: '\x1b[36m',
14 |     GREEN: '\x1b[32m',
15 |     YELLOW: '\x1b[33m',
16 |     GRAY: '\x1b[90m',
17 |     RED: '\x1b[31m'
18 | };
19 | 
20 | async function test() {
21 |     console.log('Testing version-checker utility...\n');
22 | 
23 |     const projectRoot = __dirname;
24 | 
25 |     // Test with PyPI check
26 |     console.log('1. Testing with PyPI check enabled:');
27 |     const versionInfo = await getVersionInfo(projectRoot, { checkPyPI: true, timeout: 3000 });
28 |     console.log('   Raw version info:', JSON.stringify(versionInfo, null, 2));
29 |     const display = formatVersionDisplay(versionInfo, CONSOLE_COLORS);
30 |     console.log('   Formatted:', display);
31 | 
32 |     console.log('\n2. Testing without PyPI check:');
33 |     const localOnly = await getVersionInfo(projectRoot, { checkPyPI: false });
34 |     console.log('   Raw version info:', JSON.stringify(localOnly, null, 2));
35 |     const localDisplay = formatVersionDisplay(localOnly, CONSOLE_COLORS);
36 |     console.log('   Formatted:', localDisplay);
37 | 
38 |     console.log('\n✅ Test completed!');
39 | }
40 | 
41 | test().catch(error => {
42 |     console.error('❌ Test failed:', error);
43 |     process.exit(1);
44 | });
45 | 
```

--------------------------------------------------------------------------------
/docs/deployment/production-guide.md:
--------------------------------------------------------------------------------

```markdown
 1 | # MCP Memory Service - Production Setup
 2 | 
 3 | ## 🚀 Quick Start
 4 | 
 5 | This MCP Memory Service is configured with **consolidation system**, **mDNS auto-discovery**, **HTTPS**, and **automatic startup**.
 6 | 
 7 | ### **Installation**
 8 | ```bash
 9 | # 1. Install the service
10 | bash install_service.sh
11 | 
12 | # 2. Update configuration (if needed)
13 | ./update_service.sh
14 | 
15 | # 3. Start the service
16 | sudo systemctl start mcp-memory
17 | ```
18 | 
19 | ### **Verification**
20 | ```bash
21 | # Check service status
22 | sudo systemctl status mcp-memory
23 | 
24 | # Test API health
25 | curl -k https://localhost:8000/api/health
26 | 
27 | # Verify mDNS discovery
28 | avahi-browse -t _mcp-memory._tcp
29 | ```
30 | 
31 | ## 📋 **Service Details**
32 | 
33 | - **Service Name**: `memory._mcp-memory._tcp.local.`
34 | - **HTTPS Address**: https://localhost:8000 
35 | - **API Key**: `mcp-0b1ccbde2197a08dcb12d41af4044be6`
36 | - **Auto-Startup**: ✅ Enabled
37 | - **Consolidation**: ✅ Active
38 | - **mDNS Discovery**: ✅ Working
39 | 
40 | ## 🛠️ **Management**
41 | 
42 | ```bash
43 | ./service_control.sh start     # Start service
44 | ./service_control.sh stop      # Stop service  
45 | ./service_control.sh status    # Show status
46 | ./service_control.sh logs      # View logs
47 | ./service_control.sh health    # Test API
48 | ```
49 | 
50 | ## 📖 **Documentation**
51 | 
52 | - **Complete Guide**: `COMPLETE_SETUP_GUIDE.md`
53 | - **Service Files**: `mcp-memory.service`, management scripts
54 | - **Archive**: `archive/setup-development/` (development files)
55 | 
56 | **✅ Ready for production use!**
```
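
The `curl -k` health check above can also be scripted in Python, which is handy for monitoring. A sketch that tolerates the self-signed certificate the same way `-k` does (URL and port assumed from the guide; relax certificate verification only for local self-signed endpoints like this one):

```python
# Sketch: probe /api/health over HTTPS with a self-signed certificate.
import json
import ssl
import urllib.request

def check_health(url="https://localhost:8000/api/health", timeout=5):
    """Return (HTTP status, parsed JSON body), or (None, error dict) on failure."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False       # accept the self-signed certificate,
    ctx.verify_mode = ssl.CERT_NONE  # equivalent to curl's -k flag
    try:
        with urllib.request.urlopen(url, timeout=timeout, context=ctx) as resp:
            return resp.status, json.loads(resp.read())
    except OSError as exc:  # covers URLError, SSL errors, timeouts
        return None, {"error": str(exc)}
```

A `(None, {...})` result means the service is unreachable, which maps cleanly onto an alerting check.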

--------------------------------------------------------------------------------
/claude-hooks/statusline.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | 
 3 | # Claude Code Status Line Script
 4 | # Displays session memory context in status line
 5 | # Format: 🧠 8 (5 recent) memories | 📊 12 commits
 6 | 
 7 | # Path to session cache file
 8 | CACHE_FILE="$HOME/.claude/hooks/utilities/session-cache.json"
 9 | 
10 | # ANSI color codes for styling
11 | CYAN='\033[36m'
12 | GREEN='\033[32m'
13 | GRAY='\033[90m'
14 | RESET='\033[0m'
15 | 
16 | # Check if cache file exists
17 | if [ ! -f "$CACHE_FILE" ]; then
18 |     # No cache file - session not started yet or hook failed
19 |     echo ""
20 |     exit 0
21 | fi
22 | 
23 | # Read cache file and extract data
24 | MEMORIES=$(jq -r '.memoriesLoaded // 0' "$CACHE_FILE" 2>/dev/null)
25 | RECENT=$(jq -r '.recentCount // 0' "$CACHE_FILE" 2>/dev/null)
26 | GIT_COMMITS=$(jq -r '.gitCommits // 0' "$CACHE_FILE" 2>/dev/null)
27 | 
28 | # Handle jq errors: $? only reflects the last assignment, so test for empty values
29 | if [ -z "$MEMORIES" ] || [ -z "$RECENT" ] || [ -z "$GIT_COMMITS" ]; then
30 |     echo ""
31 |     exit 0
32 | fi
33 | 
34 | # Build status line output
35 | STATUS=""
36 | 
37 | # Memory section
38 | if [ "$MEMORIES" -gt 0 ]; then
39 |     if [ "$RECENT" -gt 0 ]; then
40 |         STATUS="${CYAN}🧠 ${MEMORIES}${RESET} ${GREEN}(${RECENT} recent)${RESET} memories"
41 |     else
42 |         STATUS="${CYAN}🧠 ${MEMORIES}${RESET} memories"
43 |     fi
44 | fi
45 | 
46 | # Git section
47 | if [ "$GIT_COMMITS" -gt 0 ]; then
48 |     if [ -n "$STATUS" ]; then
49 |         STATUS="${STATUS} ${GRAY}|${RESET} ${CYAN}📊 ${GIT_COMMITS} commits${RESET}"
50 |     else
51 |         STATUS="${CYAN}📊 ${GIT_COMMITS} commits${RESET}"
52 |     fi
53 | fi
54 | 
55 | # Output first line becomes status line
56 | echo -e "$STATUS"
57 | 
```

--------------------------------------------------------------------------------
/src/mcp_memory_service/web/oauth/__init__.py:
--------------------------------------------------------------------------------

```python
 1 | # Copyright 2024 Heinrich Krupp
 2 | #
 3 | # Licensed under the Apache License, Version 2.0 (the "License");
 4 | # you may not use this file except in compliance with the License.
 5 | # You may obtain a copy of the License at
 6 | #
 7 | #     http://www.apache.org/licenses/LICENSE-2.0
 8 | #
 9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | 
15 | """
16 | OAuth 2.1 Dynamic Client Registration implementation for MCP Memory Service.
17 | 
18 | Provides OAuth 2.1 DCR endpoints to enable Claude Code HTTP transport integration.
19 | 
20 | This module implements:
21 | - RFC 8414: OAuth 2.0 Authorization Server Metadata
22 | - RFC 7591: OAuth 2.0 Dynamic Client Registration Protocol
23 | - OAuth 2.1 security requirements and best practices
24 | 
25 | Key features:
26 | - Dynamic client registration for automated OAuth client setup
27 | - JWT-based access tokens with proper validation
28 | - Authorization code flow with PKCE support
29 | - Client credentials flow for server-to-server authentication
30 | - Comprehensive scope-based authorization
31 | - Backward compatibility with existing API key authentication
32 | """
33 | 
34 | __all__ = [
35 |     "discovery",
36 |     "models",
37 |     "registration",
38 |     "authorization",
39 |     "middleware",
40 |     "storage"
41 | ]
```
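
The RFC 7591 registration flow described above boils down to a single JSON `POST`. A sketch of building such a request with the standard library; the endpoint path (`/oauth/register`) and the field values are assumptions for illustration, while the payload shape (`redirect_uris`, `grant_types`, `response_types`, `token_endpoint_auth_method`) follows RFC 7591:

```python
# Sketch: build an RFC 7591 dynamic client registration request.
import json
import urllib.request

def build_registration_request(base_url):
    """Return a urllib Request for POSTing client metadata to the DCR endpoint."""
    payload = {
        "client_name": "example-client",
        "redirect_uris": ["http://localhost:8123/callback"],
        "grant_types": ["authorization_code"],
        "response_types": ["code"],
        "token_endpoint_auth_method": "client_secret_basic",
    }
    return urllib.request.Request(
        f"{base_url}/oauth/register",  # assumed registration endpoint path
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_registration_request("https://localhost:8000")
```

On success the server responds with a `client_id` (and, for confidential clients, a `client_secret`) that the client then uses in the authorization code flow.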

--------------------------------------------------------------------------------
/scripts/sync/litestream/setup_remote_litestream.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | # Setup script for Litestream master on remote server (narrowbox.local)
 3 | 
 4 | set -e
 5 | 
 6 | echo "🔧 Setting up Litestream master on remote server..."
 7 | 
 8 | # Install Litestream
 9 | echo "📦 Installing Litestream..."
10 | curl -LsS https://github.com/benbjohnson/litestream/releases/latest/download/litestream-linux-amd64.tar.gz | tar -xzf -
11 | sudo mv litestream /usr/local/bin/
12 | sudo chmod +x /usr/local/bin/litestream
13 | 
14 | # Create directories
15 | echo "📁 Creating directories..."
16 | sudo mkdir -p /var/www/litestream/mcp-memory
17 | sudo mkdir -p /backup/litestream/mcp-memory
18 | 
19 | # Set permissions
20 | sudo chown -R www-data:www-data /var/www/litestream
21 | sudo chmod -R 755 /var/www/litestream
22 | 
23 | # Copy configuration
24 | echo "⚙️ Installing Litestream configuration..."
25 | sudo cp litestream_master_config.yml /etc/litestream.yml
26 | 
27 | # Install systemd services
28 | echo "🚀 Installing systemd services..."
29 | sudo cp litestream.service /etc/systemd/system/
30 | sudo cp litestream-http.service /etc/systemd/system/
31 | 
32 | # Reload systemd and enable services
33 | sudo systemctl daemon-reload
34 | sudo systemctl enable litestream.service
35 | sudo systemctl enable litestream-http.service
36 | 
37 | echo "✅ Remote Litestream setup completed!"
38 | echo ""
39 | echo "Next steps:"
40 | echo "1. Start services: sudo systemctl start litestream litestream-http"
41 | echo "2. Check status: sudo systemctl status litestream litestream-http"
42 | echo "3. Verify HTTP endpoint: curl http://localhost:8080/mcp-memory/"
```

--------------------------------------------------------------------------------
/tools/docker/docker-compose.yml:
--------------------------------------------------------------------------------

```yaml
 1 | version: '3.8'
 2 | 
 3 | # Docker Compose configuration for MCP protocol mode
 4 | # For use with MCP clients (Claude Desktop, VS Code extension, etc.)
 5 | # For HTTP/API mode, use docker-compose.http.yml instead
 6 | 
 7 | services:
 8 |   mcp-memory-service:
 9 |     build:
10 |       context: ../..
11 |       dockerfile: tools/docker/Dockerfile
12 |     
13 |     # Required for MCP protocol communication
14 |     stdin_open: true
15 |     tty: true
16 |     
17 |     volumes:
18 |       # Single data directory for all storage
19 |       - ./data:/app/data
20 | 
21 |       # Model cache (prevents re-downloading models on each restart)
22 |       # Uncomment the following line to persist Hugging Face models
23 |       # - ${HOME}/.cache/huggingface:/root/.cache/huggingface
24 |     
25 |     environment:
26 |       # Mode selection
27 |       - MCP_MODE=mcp
28 |       
29 |       # Storage configuration
30 |       - MCP_MEMORY_STORAGE_BACKEND=sqlite_vec
31 |       - MCP_MEMORY_SQLITE_PATH=/app/data/sqlite_vec.db
32 |       - MCP_MEMORY_BACKUPS_PATH=/app/data/backups
33 |       
34 |       # Performance tuning
35 |       - LOG_LEVEL=${LOG_LEVEL:-INFO}
36 |       - MAX_RESULTS_PER_QUERY=10
37 |       - SIMILARITY_THRESHOLD=0.7
38 |       
39 |       # Python configuration
40 |       - PYTHONUNBUFFERED=1
41 |       - PYTHONPATH=/app/src
42 | 
43 |       # Offline mode (uncomment if models are pre-cached and network is restricted)
44 |       # - HF_HUB_OFFLINE=1
45 |       # - TRANSFORMERS_OFFLINE=1
46 |     
47 |     # Use the unified entrypoint
48 |     entrypoint: ["/usr/local/bin/docker-entrypoint-unified.sh"]
49 |     
50 |     restart: unless-stopped
```

--------------------------------------------------------------------------------
/scripts/testing/test-connection.py:
--------------------------------------------------------------------------------

```python
 1 | # Copyright 2024 Heinrich Krupp
 2 | #
 3 | # Licensed under the Apache License, Version 2.0 (the "License");
 4 | # you may not use this file except in compliance with the License.
 5 | # You may obtain a copy of the License at
 6 | #
 7 | #     http://www.apache.org/licenses/LICENSE-2.0
 8 | #
 9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | 
15 | from chromadb import HttpClient
16 | 
17 | def test_connection(port=8000):
18 |     try:
19 |         # Try to connect to local ChromaDB
20 |         client = HttpClient(host='localhost', port=port)
21 |         # Try a simple operation
22 |         heartbeat = client.heartbeat()
23 |         print(f"Successfully connected to ChromaDB on port {port}")
24 |         print(f"Heartbeat: {heartbeat}")
25 |         
26 |         # List collections
27 |         collections = client.list_collections()
28 |         print("\nFound collections:")
29 |         for collection in collections:
30 |             print(f"- {collection.name} (count: {collection.count()})")
31 |         
32 |     except Exception as e:
33 |         print(f"Error connecting to ChromaDB on port {port}: {str(e)}")
34 | 
35 | if __name__ == "__main__":
36 |     # Try default port
37 |     test_connection()
38 |     
39 |     # If the above fails, you might want to try other common ports:
40 |     # test_connection(8080)
41 |     # test_connection(9000)
42 | 
```
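Before attempting a `HttpClient` connection on each candidate port, a cheap stdlib pre-check can tell you which ports are actually listening. A sketch using only `socket` (the candidate list mirrors the ports suggested in the script's comments):

```python
import socket

def find_listening_ports(host="localhost", candidates=(8000, 8080, 9000), timeout=0.25):
    """Return the subset of candidate ports that accept a TCP connection."""
    open_ports = []
    for port in candidates:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                open_ports.append(port)
        except OSError:
            pass  # closed, filtered, or unreachable
    return open_ports

print(find_listening_ports())
```

Only ports that pass this check are worth handing to `test_connection()`.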

--------------------------------------------------------------------------------
/docs/ROADMAP.md:
--------------------------------------------------------------------------------

```markdown
 1 | # Development Roadmap
 2 | 
 3 | **The official roadmap has moved to the Wiki for easier maintenance and community collaboration.**
 4 | 
 5 | 📖 **[View Development Roadmap on Wiki](https://github.com/doobidoo/mcp-memory-service/wiki/13-Development-Roadmap)**
 6 | 
 7 | The Wiki version includes:
 8 | - ✅ Completed milestones (v8.0-v8.38)
 9 | - 🎯 Current focus (v8.39-v9.0 - Q1 2026)
10 | - 🚀 Future enhancements (Q2 2026+)
11 | - 🎯 Medium term vision (Q3-Q4 2026)
12 | - 🌟 Long-term aspirations (2027+)
13 | - 📊 Success metrics and KPIs
14 | - 🤝 Community contribution opportunities
15 | 
16 | ## Why the Wiki?
17 | 
18 | The Wiki provides several advantages for roadmap documentation:
19 | - ✅ **Easier Updates**: No PR required for roadmap changes
20 | - ✅ **Better Navigation**: Integrated with other wiki guides
21 | - ✅ **Community Collaboration**: Lower barrier for community input
22 | - ✅ **Rich Formatting**: Enhanced markdown features
23 | - ✅ **Cleaner Repository**: Reduces noise in commit history
24 | 
25 | ## For Active Development Tracking
26 | 
27 | The roadmap on the Wiki tracks strategic direction. For day-to-day development:
28 | 
29 | - **[GitHub Projects](https://github.com/doobidoo/mcp-memory-service/projects)** - Sprint planning and task boards
30 | - **[Open Issues](https://github.com/doobidoo/mcp-memory-service/issues)** - Bug reports and feature requests
31 | - **[Pull Requests](https://github.com/doobidoo/mcp-memory-service/pulls)** - Active code changes
32 | - **[CHANGELOG.md](../CHANGELOG.md)** - Release history and completed features
33 | 
34 | ---
35 | 
36 | **Maintainer**: @doobidoo
37 | **Last Updated**: November 26, 2025
38 | 
```

--------------------------------------------------------------------------------
/scripts/installation/setup_claude_mcp.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | # Setup script for Claude Code MCP configuration
 3 | 
 4 | echo "🔧 Setting up MCP Memory Service for Claude Code..."
 5 | echo "=================================================="
 6 | 
 7 | # Get the absolute path to the repository
 8 | REPO_PATH="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"  # script lives in scripts/installation/
 8 | REPO_PATH="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"  # script lives in scripts/installation/
 9 | VENV_PYTHON="$REPO_PATH/venv/bin/python"
10 | 
11 | echo "Repository path: $REPO_PATH"
12 | echo "Python path: $VENV_PYTHON"
13 | 
14 | # Check if virtual environment exists
15 | if [ ! -f "$VENV_PYTHON" ]; then
16 |     echo "❌ Virtual environment not found at: $VENV_PYTHON"
17 |     echo "Please run: python -m venv venv && source venv/bin/activate && pip install -r requirements.txt"
18 |     exit 1
19 | fi
20 | 
21 | # Create MCP configuration
22 | cat > "$REPO_PATH/mcp_server_config.json" << EOF
23 | {
24 |   "mcpServers": {
25 |     "memory": {
26 |       "command": "$VENV_PYTHON",
27 |       "args": ["-m", "src.mcp_memory_service.server"],
28 |       "cwd": "$REPO_PATH",
29 |       "env": {
30 |         "MCP_MEMORY_STORAGE_BACKEND": "sqlite_vec",
31 |         "PYTHONPATH": "$REPO_PATH/src"
32 |       }
33 |     }
34 |   }
35 | }
36 | EOF
37 | 
38 | echo "✅ Created MCP configuration: $REPO_PATH/mcp_server_config.json"
39 | echo ""
40 | echo "📋 Manual Configuration Steps:"
41 | echo "1. Copy the configuration below"
42 | echo "2. Add it to your Claude Code MCP settings"
43 | echo ""
44 | echo "Configuration to add:"
45 | echo "====================="
46 | cat "$REPO_PATH/mcp_server_config.json"
47 | echo ""
48 | echo "🚀 Alternative: Start server manually and use Claude Code normally"
49 | echo "   cd $REPO_PATH"
50 | echo "   source venv/bin/activate"
51 | echo "   export MCP_MEMORY_STORAGE_BACKEND=sqlite_vec"
52 | echo "   python -m src.mcp_memory_service.server"
```
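The heredoc above interpolates `$REPO_PATH` directly into JSON, which breaks if the path contains characters that need escaping. A sketch of building the same structure with `json.dumps`, which escapes paths correctly (function name and sample paths are illustrative):

```python
import json

def build_mcp_config(python_path, repo_path):
    """Emit the same config the heredoc produces, with proper JSON escaping."""
    config = {
        "mcpServers": {
            "memory": {
                "command": python_path,
                "args": ["-m", "src.mcp_memory_service.server"],
                "cwd": repo_path,
                "env": {
                    "MCP_MEMORY_STORAGE_BACKEND": "sqlite_vec",
                    "PYTHONPATH": repo_path + "/src",
                },
            }
        }
    }
    return json.dumps(config, indent=2)

print(build_mcp_config("/repo/venv/bin/python", "/repo"))
```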

--------------------------------------------------------------------------------
/scripts/run_memory_server.py:
--------------------------------------------------------------------------------

```python
 1 | #!/usr/bin/env python3
 2 | """
 3 | Backward compatibility redirect to new location (v6.17.0+).
 4 | 
 5 | This stub ensures existing Claude Desktop configurations continue working
 6 | after the v6.17.0 script reorganization. The actual script has moved to
 7 | scripts/server/run_memory_server.py.
 8 | 
 9 | For best stability, consider using one of these approaches instead:
10 | 1. python -m mcp_memory_service.server (recommended)
11 | 2. uv run memory server
12 | 3. scripts/server/run_memory_server.py (direct path)
13 | """
14 | import sys
15 | import os
16 | 
17 | # Add informational notice (not a warning to avoid alarming users)
18 | print("[INFO] Note: scripts/run_memory_server.py has moved to scripts/server/run_memory_server.py", file=sys.stderr)
19 | print("[INFO] Consider using 'python -m mcp_memory_service.server' for better stability", file=sys.stderr)
20 | print("[INFO] See https://github.com/doobidoo/mcp-memory-service for migration guide", file=sys.stderr)
21 | 
22 | # Execute the relocated script
23 | script_dir = os.path.dirname(os.path.abspath(__file__))
24 | new_script = os.path.join(script_dir, "server", "run_memory_server.py")
25 | 
26 | if os.path.exists(new_script):
27 |     # Preserve the original __file__ context for the new script
28 |     global_vars = {
29 |         '__file__': new_script,
30 |         '__name__': '__main__',
31 |         'sys': sys,
32 |         'os': os
33 |     }
34 | 
35 |     with open(new_script, 'r', encoding='utf-8') as f:
36 |         exec(compile(f.read(), new_script, 'exec'), global_vars)
37 | else:
38 |     print(f"[ERROR] Could not find {new_script}", file=sys.stderr)
39 |     print("[ERROR] Please ensure you have the complete mcp-memory-service repository", file=sys.stderr)
40 |     sys.exit(1)
```
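The stub's open/compile/exec pattern can also be expressed with the stdlib `runpy` module, which handles `__file__` and `__name__` bookkeeping itself. An illustrative sketch of the same redirect idea (not the project's actual code):

```python
import os
import runpy
import tempfile

def run_relocated(script_path):
    """Execute a relocated script as __main__ and return its globals."""
    if not os.path.exists(script_path):
        raise FileNotFoundError(script_path)
    return runpy.run_path(script_path, run_name="__main__")

# Demonstrate with a throwaway script.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("RESULT = 6 * 7\n")
    path = f.name
globals_after = run_relocated(path)
print(globals_after["RESULT"])  # 42
os.unlink(path)
```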

--------------------------------------------------------------------------------
/src/mcp_memory_service/ingestion/__init__.py:
--------------------------------------------------------------------------------

```python
 1 | # Copyright 2024 Heinrich Krupp
 2 | #
 3 | # Licensed under the Apache License, Version 2.0 (the "License");
 4 | # you may not use this file except in compliance with the License.
 5 | # You may obtain a copy of the License at
 6 | #
 7 | #     http://www.apache.org/licenses/LICENSE-2.0
 8 | #
 9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | 
15 | """
16 | Document Ingestion Module
17 | 
18 | Provides functionality to side-load documents into the memory database,
19 | supporting multiple formats including PDF, text, and structured data.
20 | 
21 | This module enables users to pre-populate the vector database with
22 | documentation, knowledge bases, and other content for semantic retrieval.
23 | """
24 | 
25 | from .base import DocumentLoader, DocumentChunk, IngestionResult
26 | from .chunker import TextChunker
27 | from .registry import get_loader_for_file, register_loader, SUPPORTED_FORMATS, is_supported_file
28 | 
29 | # Import loaders to trigger registration
30 | # Order matters! Import SemtoolsLoader first, then specialized loaders
31 | # This allows specialized loaders to override if semtools is unavailable
32 | from . import text_loader
33 | from . import semtools_loader
34 | from . import pdf_loader
35 | from . import json_loader
36 | from . import csv_loader
37 | 
38 | __all__ = [
39 |     'DocumentLoader',
40 |     'DocumentChunk', 
41 |     'IngestionResult',
42 |     'TextChunker',
43 |     'get_loader_for_file',
44 |     'register_loader',
45 |     'SUPPORTED_FORMATS',
46 |     'is_supported_file'
47 | ]
```
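The import-order note above ("specialized loaders can override") implies a last-registration-wins registry keyed by file extension. A standalone sketch of that mechanism; the function names mirror the module's public API, but this toy version is illustrative, not the code in `ingestion/registry.py`:

```python
import os

_LOADERS = {}  # extension -> loader (factory, class, or placeholder)

def register_loader(extensions, loader):
    """Register a loader for each extension; later registrations win."""
    for ext in extensions:
        _LOADERS[ext.lower()] = loader

def get_loader_for_file(path):
    """Look up the loader for a file by its extension, or None."""
    ext = os.path.splitext(path)[1].lstrip(".").lower()
    return _LOADERS.get(ext)

# Broad loader first, specialized loader second (overrides for .txt):
register_loader(["md", "txt", "pdf"], "SemtoolsLoader")
register_loader(["txt"], "TextLoader")

print(get_loader_for_file("notes.txt"))  # TextLoader
print(get_loader_for_file("doc.pdf"))    # SemtoolsLoader
```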

--------------------------------------------------------------------------------
/scripts/run/start_sqlite_vec.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | # Quick start script for MCP Memory Service with SQLite-vec backend
 3 | 
 4 | echo "🚀 Starting MCP Memory Service with SQLite-vec backend..."
 5 | echo "=================================================="
 6 | 
 7 | # Check if virtual environment exists
 8 | if [ ! -d "venv" ]; then
 9 |     echo "❌ Virtual environment not found. Please run setup first."
10 |     exit 1
11 | fi
12 | 
13 | # Activate virtual environment
14 | echo "📦 Activating virtual environment..."
15 | source venv/bin/activate
16 | 
17 | # Set SQLite-vec backend
18 | echo "🔧 Configuring SQLite-vec backend..."
19 | export MCP_MEMORY_STORAGE_BACKEND=sqlite_vec
20 | 
21 | # Display configuration
22 | echo "✅ Configuration:"
23 | echo "   Backend: $MCP_MEMORY_STORAGE_BACKEND"
24 | echo "   Database: ~/.local/share/mcp-memory/sqlite_vec.db"
25 | echo "   Python: $(which python)"
26 | 
27 | # Check key dependencies
28 | echo ""
29 | echo "🧪 Checking dependencies..."
30 | python -c "
31 | for mod in ('sqlite_vec', 'sentence_transformers', 'mcp'):
32 |     try:
33 |         __import__(mod)
34 |         print('   ✅ ' + mod + ' available')
35 |     except ImportError:
36 |         print('   ❌ ' + mod + ' is MISSING')
37 | "
38 | 
39 | echo ""
40 | echo "🎯 Ready! The MCP Memory Service is configured for sqlite-vec."
41 | echo ""
42 | echo "To start the server:"
43 | echo "   python -m src.mcp_memory_service.server"
44 | echo ""
45 | echo "🧪 Testing server startup..."
46 | timeout 3 python -m src.mcp_memory_service.server 2>/dev/null; [ $? -eq 124 ] && echo "✅ Server can start successfully!" || echo "❌ Server exited early - check your configuration"
47 | echo ""
48 | echo "For Claude Code integration:"
49 | echo "   - The service will automatically use sqlite-vec"
50 | echo "   - Memory database: ~/.local/share/mcp-memory/sqlite_vec.db" 
51 | echo "   - 75% less memory usage vs ChromaDB"
52 | echo ""
53 | echo "To test the setup:"
54 | echo "   python simple_sqlite_vec_test.py"
```

--------------------------------------------------------------------------------
/claude-hooks/debug-pattern-test.js:
--------------------------------------------------------------------------------

```javascript
 1 | #!/usr/bin/env node
 2 | 
 3 | /**
 4 |  * Debug Pattern Detection
 5 |  */
 6 | 
 7 | const { AdaptivePatternDetector } = require('./utilities/adaptive-pattern-detector');
 8 | 
 9 | async function debugPatternDetection() {
10 |     console.log('🔍 Debugging Pattern Detection');
11 |     console.log('═'.repeat(50));
12 | 
13 |     const detector = new AdaptivePatternDetector({ sensitivity: 0.7 });
14 | 
15 |     const testMessage = "What did we decide about the authentication approach?";
16 |     console.log(`\nTesting message: "${testMessage}"`);
17 | 
18 |     const result = await detector.detectPatterns(testMessage);
19 | 
20 |     console.log('\nResults:');
21 |     console.log('- Matches found:', result.matches.length);
22 |     console.log('- Confidence:', result.confidence);
23 |     console.log('- Processing tier:', result.processingTier);
24 |     console.log('- Trigger recommendation:', result.triggerRecommendation);
25 | 
26 |     if (result.matches.length > 0) {
27 |         console.log('\nMatches:');
28 |         result.matches.forEach((match, i) => {
29 |             console.log(`  ${i + 1}. Category: ${match.category}`);
30 |             console.log(`     Pattern: ${match.pattern}`);
31 |             console.log(`     Confidence: ${match.confidence}`);
32 |             console.log(`     Type: ${match.type}`);
33 |         });
34 |     }
35 | 
36 |     // Test the instant patterns directly
37 |     console.log('\n🔍 Testing Instant Patterns Directly');
38 |     const instantMatches = detector.detectInstantPatterns(testMessage);
39 |     console.log('Instant matches:', instantMatches.length);
40 |     instantMatches.forEach((match, i) => {
41 |         console.log(`  ${i + 1}. ${match.category}: ${match.confidence}`);
42 |     });
43 | }
44 | 
45 | debugPatternDetection().catch(console.error);
```

--------------------------------------------------------------------------------
/docs/development/todo-tracker.md:
--------------------------------------------------------------------------------

```markdown
 1 | # TODO Tracker
 2 | 
 3 | **Last Updated:** 2025-11-08 10:25:25
 4 | **Scan Directory:** src
 5 | **Total TODOs:** 5
 6 | 
 7 | ## Summary
 8 | 
 9 | | Priority | Count | Description |
10 | |----------|-------|-------------|
11 | | CRITICAL (P0) | 1 | Security, data corruption, blocking bugs |
12 | | HIGH (P1) | 2 | Performance, user-facing, incomplete features |
13 | | MEDIUM (P2) | 2 | Code quality, optimizations, technical debt |
14 | | LOW (P3) | 0 | Documentation, cosmetic, nice-to-haves |
15 | 
16 | 
17 | ---
18 | 
19 | ## CRITICAL (P0)
20 | - `src/mcp_memory_service/web/api/analytics.py:625` - Period filtering is not implemented, leading to incorrect analytics data.
21 | 
22 | ## HIGH (P1)
23 | - `src/mcp_memory_service/storage/cloudflare.py:185` - Lack of a fallback for embedding generation makes the service vulnerable to external API failures.
24 | - `src/mcp_memory_service/web/api/manage.py:231` - Inefficient queries can cause significant performance bottlenecks, especially with large datasets.
25 | 
26 | ## MEDIUM (P2)
27 | - `src/mcp_memory_service/web/api/documents.py:592` - Using a deprecated FastAPI event handler; should be migrated to the modern `lifespan` context manager to reduce technical debt.
28 | - `src/mcp_memory_service/web/api/analytics.py:213` - The `storage.get_stats()` method is missing a data point, leading to API inconsistency.
29 | 
30 | ## LOW (P3)
31 | *(None in this list)*
32 | 
33 | ---
34 | 
35 | ## How to Address
36 | 
37 | 1. **CRITICAL**: Address immediately, block releases if necessary
38 | 2. **HIGH**: Schedule for current/next sprint
39 | 3. **MEDIUM**: Add to backlog, address in refactoring sprints
40 | 4. **LOW**: Address opportunistically or when touching related code
41 | 
42 | ## Updating This Tracker
43 | 
44 | Run: `bash scripts/maintenance/scan_todos.sh`
45 | 
```
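The tracker is regenerated by `scan_todos.sh`. A stdlib sketch of the kind of scan-and-classify pass such a script performs, bucketing `TODO` comments by a `Pn` marker (the marker convention and function name here are assumptions; the real script's rules may differ):

```python
import re

PRIORITY_MARKERS = {"P0": "CRITICAL", "P1": "HIGH", "P2": "MEDIUM", "P3": "LOW"}

def scan_todos(text, filename="<memory>"):
    """Collect TODO lines and bucket them by priority marker if present."""
    found = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        m = re.search(r"TODO(?:\((P[0-3])\))?:?\s*(.*)", line)
        if m:
            priority = PRIORITY_MARKERS.get(m.group(1), "UNCLASSIFIED")
            found.append((filename, lineno, priority, m.group(2).strip()))
    return found

sample = "x = 1  # TODO(P1): cache this\n# TODO tidy up\n"
for item in scan_todos(sample):
    print(item)
```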

--------------------------------------------------------------------------------
/scripts/backup/backup_sqlite_vec.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | # SQLite-vec Database Backup Script
 3 | # Creates timestamped backups of the SQLite-vec database
 4 | 
 5 | set -e
 6 | 
 7 | # Configuration
 8 | MEMORY_DIR="${MCP_MEMORY_BASE_DIR:-$HOME/.local/share/mcp-memory}"
 9 | BACKUP_DIR="$MEMORY_DIR/backups"
10 | DATABASE_FILE="$MEMORY_DIR/sqlite_vec.db"
11 | TIMESTAMP=$(date +%Y%m%d_%H%M%S)
12 | BACKUP_NAME="sqlite_backup_$TIMESTAMP"
13 | BACKUP_PATH="$BACKUP_DIR/$BACKUP_NAME"
14 | 
15 | # Check if database exists
16 | if [[ ! -f "$DATABASE_FILE" ]]; then
17 |     echo "Error: SQLite database not found at $DATABASE_FILE"
18 |     exit 1
19 | fi
20 | 
21 | # Create backup directory
22 | mkdir -p "$BACKUP_PATH"
23 | 
24 | # Copy database files (main, WAL, and SHM files)
25 | echo "Creating backup: $BACKUP_NAME"
26 | cp "$DATABASE_FILE" "$BACKUP_PATH/" 2>/dev/null || true
27 | cp "${DATABASE_FILE}-wal" "$BACKUP_PATH/" 2>/dev/null || true
28 | cp "${DATABASE_FILE}-shm" "$BACKUP_PATH/" 2>/dev/null || true
29 | 
30 | # Get backup size
31 | BACKUP_SIZE=$(du -sh "$BACKUP_PATH" | cut -f1)
32 | 
33 | # Count files backed up
34 | FILE_COUNT=$(find "$BACKUP_PATH" -type f | wc -l)
35 | 
36 | # Create backup metadata
37 | cat > "$BACKUP_PATH/backup_info.json" << EOF
38 | {
39 |   "backup_name": "$BACKUP_NAME",
40 |   "timestamp": "$TIMESTAMP",
41 |   "source_database": "$DATABASE_FILE",
42 |   "backup_path": "$BACKUP_PATH",
43 |   "backup_size": "$BACKUP_SIZE",
44 |   "files_count": $FILE_COUNT,
45 |   "backend": "sqlite_vec",
46 |   "created_at": "$(date -u +%Y-%m-%dT%H:%M:%SZ)"
47 | }
48 | EOF
49 | 
50 | echo "Backup completed successfully:"
51 | echo "  Name: $BACKUP_NAME"
52 | echo "  Path: $BACKUP_PATH"
53 | echo "  Size: $BACKUP_SIZE"
54 | echo "  Files: $FILE_COUNT"
55 | 
56 | # Cleanup old backups (keep last 7 days)
57 | find "$BACKUP_DIR" -name "sqlite_backup_*" -type d -mtime +7 -exec rm -rf {} \; 2>/dev/null || true
58 | 
59 | exit 0
```
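Copying the main, `-wal`, and `-shm` files works when the database is idle, but can race with a live writer. SQLite's online backup API produces a consistent snapshot even under concurrent writes; a sketch of a Python equivalent (path layout mirrors the shell script, but this is an illustrative alternative, not the project's backup tool):

```python
import os
import sqlite3
import tempfile
import time

def backup_sqlite(db_path, backup_dir):
    """Snapshot a SQLite database via the online backup API (WAL-safe)."""
    os.makedirs(backup_dir, exist_ok=True)
    stamp = time.strftime("%Y%m%d_%H%M%S")
    dest_path = os.path.join(backup_dir, f"sqlite_backup_{stamp}.db")
    src = sqlite3.connect(db_path)
    dest = sqlite3.connect(dest_path)
    try:
        src.backup(dest)  # consistent snapshot, WAL content included
    finally:
        src.close()
        dest.close()
    return dest_path

# Demo against a throwaway database.
with tempfile.TemporaryDirectory() as tmp:
    db = os.path.join(tmp, "sqlite_vec.db")
    con = sqlite3.connect(db)
    con.execute("CREATE TABLE memories (content TEXT)")
    con.execute("INSERT INTO memories VALUES ('hello')")
    con.commit()
    con.close()
    copy = backup_sqlite(db, os.path.join(tmp, "backups"))
    print(sqlite3.connect(copy).execute("SELECT content FROM memories").fetchall())  # [('hello',)]
```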

--------------------------------------------------------------------------------
/docs/legacy/dual-protocol-hooks.md:
--------------------------------------------------------------------------------

```markdown
 1 | # Dual Protocol Memory Hooks (Legacy)
 2 | 
 3 | > **Note**: This feature has been superseded by Natural Memory Triggers v7.1.3+. This documentation is kept for reference only.
 4 | 
 5 | **Dual Protocol Memory Hooks** (v7.0.0+) provide intelligent memory awareness with automatic protocol detection.
 6 | 
 7 | ## Configuration
 8 | 
 9 | ```json
10 | {
11 |   "memoryService": {
12 |     "protocol": "auto",
13 |     "preferredProtocol": "mcp",
14 |     "fallbackEnabled": true,
15 |     "http": {
16 |       "endpoint": "https://localhost:8443",
17 |       "apiKey": "your-api-key",
18 |       "healthCheckTimeout": 3000,
19 |       "useDetailedHealthCheck": true
20 |     },
21 |     "mcp": {
22 |       "serverCommand": ["uv", "run", "memory", "server", "-s", "cloudflare"],
23 |       "serverWorkingDir": "/Users/yourname/path/to/mcp-memory-service",
24 |       "connectionTimeout": 5000,
25 |       "toolCallTimeout": 10000
26 |     }
27 |   }
28 | }
29 | ```
30 | 
31 | ## Protocol Options
32 | 
33 | - `"auto"`: Smart detection (MCP → HTTP → Environment fallback)
34 | - `"http"`: HTTP-only mode (web server at localhost:8443)
35 | - `"mcp"`: MCP-only mode (direct server process)
36 | 
37 | ## Benefits
38 | 
39 | - **Reliability**: Multiple connection methods ensure hooks always work
40 | - **Performance**: MCP direct for speed, HTTP for stability
41 | - **Flexibility**: Works with local development or remote deployments
42 | - **Compatibility**: Full backward compatibility with existing configurations
43 | 
44 | ## Migration to Natural Memory Triggers
45 | 
46 | If you're using Dual Protocol Hooks, consider migrating to Natural Memory Triggers v7.1.3+ which offers:
47 | - 85%+ trigger accuracy
48 | - Multi-tier performance optimization
49 | - CLI management system
50 | - Git-aware context integration
51 | - Adaptive learning
52 | 
53 | See main CLAUDE.md for migration instructions.
54 | 
```
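The `"auto"` mode described above is a fallback chain: try MCP, then HTTP, then the environment fallback, returning the first connector that succeeds. A generic sketch of that chain (all names here are illustrative; the real hooks are JavaScript):

```python
def connect_with_fallback(protocol, connectors):
    """Try connection methods in order until one succeeds.

    `connectors` maps protocol name to a zero-arg callable that returns a
    client or raises on failure.
    """
    order = [protocol] if protocol != "auto" else ["mcp", "http", "environment"]
    errors = {}
    for name in order:
        try:
            return name, connectors[name]()
        except Exception as exc:  # collect the failure and keep trying
            errors[name] = exc
    raise ConnectionError(f"all protocols failed: {errors}")

def failing():
    raise OSError("server not running")

proto, client = connect_with_fallback(
    "auto", {"mcp": failing, "http": lambda: "http-client", "environment": lambda: "env-client"}
)
print(proto, client)  # http http-client
```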

--------------------------------------------------------------------------------
/tools/docker/docker-entrypoint-persistent.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | # Docker entrypoint script for MCP Memory Service - Persistent mode
 3 | # This script keeps the container running even when there's no active MCP client
 4 | 
 5 | set -e
 6 | 
 7 | echo "[INFO] Starting MCP Memory Service in Docker container (persistent mode)"
 8 | 
 9 | # Function to handle signals
10 | handle_signal() {
11 |     echo "[INFO] Received signal, shutting down..."
12 |     if [ -n "$SERVER_PID" ]; then
13 |         kill -TERM $SERVER_PID 2>/dev/null || true
14 |     fi
15 |     exit 0
16 | }
17 | 
18 | # Set up signal handlers
19 | trap handle_signal SIGTERM SIGINT
20 | 
21 | # Create named pipes for stdio communication
22 | FIFO_DIR="/tmp/mcp-memory-fifo"
23 | mkdir -p "$FIFO_DIR"
24 | STDIN_FIFO="$FIFO_DIR/stdin"
25 | STDOUT_FIFO="$FIFO_DIR/stdout"
26 | 
27 | # Remove old pipes if they exist
28 | rm -f "$STDIN_FIFO" "$STDOUT_FIFO"
29 | 
30 | # Create new named pipes
31 | mkfifo "$STDIN_FIFO"
32 | mkfifo "$STDOUT_FIFO"
33 | 
34 | echo "[INFO] Created named pipes for stdio communication"
35 | 
36 | # Start the server in the background with the named pipes
37 | if [ "${UV_ACTIVE}" = "1" ]; then
38 |     echo "[INFO] Running with UV wrapper (persistent mode)"
39 |     python -u uv_wrapper.py < "$STDIN_FIFO" > "$STDOUT_FIFO" 2>&1 &
40 | else
41 |     echo "[INFO] Running directly with Python (persistent mode)"
42 |     python -u -m mcp_memory_service.server < "$STDIN_FIFO" > "$STDOUT_FIFO" 2>&1 &
43 | fi
44 | 
45 | SERVER_PID=$!
46 | echo "[INFO] Server started with PID: $SERVER_PID"
47 | 
48 | # Keep the stdin pipe open to prevent the server from exiting
49 | exec 3> "$STDIN_FIFO"
50 | 
51 | # Monitor the server process
52 | while true; do
53 |     if ! kill -0 $SERVER_PID 2>/dev/null; then
54 |         echo "[ERROR] Server process exited unexpectedly"
55 |         exit 1
56 |     fi
57 |     
58 |     # Send a keep-alive message every 30 seconds
59 |     echo "" >&3
60 |     
61 |     sleep 30
62 | done
```

--------------------------------------------------------------------------------
/examples/claude_desktop_config_windows.json:
--------------------------------------------------------------------------------

```json
 1 | {
 2 |   "_comment": "Windows-specific MCP Memory Service configuration for Claude Desktop",
 3 |   "_instructions": [
 4 |     "Replace 'YOUR_USERNAME' with your actual Windows username",
 5 |     "Replace 'C:\\REPOSITORIES\\mcp-memory-service' with your actual repository path",
 6 |     "Supported backends: sqlite_vec, cloudflare, hybrid (ChromaDB removed in v8.0.0)"
 7 |   ],
 8 |   "mcpServers": {
 9 |     "memory": {
10 |       "command": "python", 
11 |       "args": [
12 |         "C:/REPOSITORIES/mcp-memory-service/scripts/memory_offline.py"
13 |       ],
14 |       "env": {
15 |         "PYTHONPATH": "C:/REPOSITORIES/mcp-memory-service",
16 |         "_comment_backend_choice": "Choose one of the backend configurations below",
17 |         
18 |         "_comment_sqlite_vec": "=== SQLite-vec Backend (Recommended for local storage) ===",
19 |         "MCP_MEMORY_STORAGE_BACKEND": "sqlite_vec",
20 |         "MCP_MEMORY_SQLITE_PATH": "C:\\Users\\YOUR_USERNAME\\AppData\\Local\\mcp-memory\\memory_migrated.db",
21 |         "MCP_MEMORY_BACKUPS_PATH": "C:\\Users\\YOUR_USERNAME\\AppData\\Local\\mcp-memory\\backups",
22 |         
23 |         "_comment_offline": "=== Offline Mode Configuration (prevents PyTorch downloads) ===",
24 |         "HF_HOME": "C:\\Users\\YOUR_USERNAME\\.cache\\huggingface",
25 |         "TRANSFORMERS_CACHE": "C:\\Users\\YOUR_USERNAME\\.cache\\huggingface\\transformers",
26 |         "SENTENCE_TRANSFORMERS_HOME": "C:\\Users\\YOUR_USERNAME\\.cache\\torch\\sentence_transformers",
27 |         "HF_HUB_OFFLINE": "1",
28 |         "TRANSFORMERS_OFFLINE": "1",
29 |         
30 |         "_comment_performance": "=== Performance Settings ===",
31 |         "PYTORCH_ENABLE_MPS_FALLBACK": "1",
32 |         "PYTORCH_CUDA_ALLOC_CONF": "max_split_size_mb:128"
33 |       }
34 |     }
35 |   }
36 | }
```
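The doubled backslashes in the config above are JSON escapes, and writing them by hand is error-prone. Generating the value with `json.dumps` produces the correct escaping automatically, as this sketch shows (the path is the placeholder from the example config):

```python
import json

path = r"C:\Users\YOUR_USERNAME\AppData\Local\mcp-memory\memory_migrated.db"
encoded = json.dumps({"MCP_MEMORY_SQLITE_PATH": path}, indent=2)
print(encoded)  # backslashes come out doubled, as the config file requires

# Round-trips back to the original single-backslash path.
assert json.loads(encoded)["MCP_MEMORY_SQLITE_PATH"] == path
```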

--------------------------------------------------------------------------------
/scripts/testing/simple_test.py:
--------------------------------------------------------------------------------

```python
 1 | #!/usr/bin/env python3
 2 | """
 3 | Simple test to use Homebrew Python's sentence-transformers
 4 | """
 5 | import os
 6 | import sys
 7 | import subprocess
 8 | 
 9 | # Set environment variables for testing
10 | os.environ["MCP_MEMORY_STORAGE_BACKEND"] = "sqlite_vec"
11 | os.environ["MCP_MEMORY_SQLITE_PATH"] = os.path.expanduser("~/Library/Application Support/mcp-memory/sqlite_vec.db")
12 | os.environ["MCP_MEMORY_BACKUPS_PATH"] = os.path.expanduser("~/Library/Application Support/mcp-memory/backups")
13 | os.environ["MCP_MEMORY_USE_ONNX"] = "1"
14 | 
15 | # Get the Homebrew Python path
16 | result = subprocess.run(
17 |     ['brew', '--prefix', 'pytorch'],
18 |     capture_output=True,
19 |     text=True,
20 |     check=True
21 | )
22 | pytorch_prefix = result.stdout.strip()
23 | homebrew_python_path = f"{pytorch_prefix}/libexec/bin/python3"
24 | 
25 | print(f"Using Homebrew Python: {homebrew_python_path}")
26 | 
27 | # Run a simple test with the Homebrew Python
28 | test_script = """
29 | import torch
30 | import sentence_transformers
31 | import sys
32 | 
33 | print(f"Python: {sys.version}")
34 | print(f"PyTorch: {torch.__version__}")
35 | print(f"sentence-transformers: {sentence_transformers.__version__}")
36 | 
37 | # Load a model
38 | model = sentence_transformers.SentenceTransformer('paraphrase-MiniLM-L3-v2')
39 | print(f"Model loaded: {model}")
40 | 
41 | # Encode a test sentence
42 | test_text = "This is a test sentence for encoding with Homebrew PyTorch"
43 | embedding = model.encode([test_text])
44 | print(f"Embedding shape: {embedding.shape}")
45 | print("Test successful!")
46 | """
47 | 
48 | # Run the test with Homebrew Python
49 | result = subprocess.run(
50 |     [homebrew_python_path, "-c", test_script],
51 |     capture_output=True,
52 |     text=True
53 | )
54 | 
55 | print("=== STDOUT ===")
56 | print(result.stdout)
57 | 
58 | if result.stderr:
59 |     print("=== STDERR ===")
60 |     print(result.stderr)
61 | 
```
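The script's core pattern is running a code snippet under a different interpreter via `subprocess` and capturing its output. A compact, self-contained sketch of that pattern, demonstrated against the current interpreter (the Homebrew path in the script would be substituted in practice):

```python
import subprocess
import sys

def run_under(python_executable, code):
    """Run a code snippet under a specific interpreter and capture output."""
    result = subprocess.run(
        [python_executable, "-c", code],
        capture_output=True, text=True, check=False,
    )
    return result.returncode, result.stdout, result.stderr

rc, out, err = run_under(sys.executable, "import sys; print(sys.version_info.major)")
print(rc, out.strip())  # 0 3
```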

--------------------------------------------------------------------------------
/scripts/utils/test_groq_bridge.sh:
--------------------------------------------------------------------------------

```bash
 1 | #!/bin/bash
 2 | # Test script for Groq bridge integration
 3 | # Demonstrates usage without requiring API key
 4 | 
 5 | set -e
 6 | 
 7 | echo "=== Groq Bridge Integration Test ==="
 8 | echo ""
 9 | 
10 | # Check if groq package is installed
11 | echo "1. Checking Python groq package..."
12 | if python3 -c "import groq" 2>/dev/null; then
13 |     echo "   ✓ groq package installed"
14 | else
15 |     echo "   ✗ groq package NOT installed"
16 |     echo ""
17 |     echo "To install: pip install groq"
18 |     echo "Or: uv pip install groq"
19 |     exit 1
20 | fi
21 | 
22 | # Check if API key is set
23 | echo ""
24 | echo "2. Checking GROQ_API_KEY environment variable..."
25 | if [ -z "$GROQ_API_KEY" ]; then
26 |     echo "   ✗ GROQ_API_KEY not set"
27 |     echo ""
28 |     echo "To set: export GROQ_API_KEY='your-api-key-here'"
29 |     echo "Get your API key from: https://console.groq.com/keys"
30 |     echo ""
31 |     echo "Skipping API test (would require valid key)"
32 | else
33 |     echo "   ✓ GROQ_API_KEY configured"
34 | 
35 |     # Test the bridge with a simple query
36 |     echo ""
37 |     echo "3. Testing Groq bridge with sample query..."
38 |     echo ""
39 | 
40 |     python3 scripts/utils/groq_agent_bridge.py \
41 |         "Rate the complexity of this Python function on a scale of 1-10: def add(a, b): return a + b" \
42 |         --json
43 | fi
44 | 
45 | echo ""
46 | echo "=== Integration Test Complete ==="
47 | echo ""
48 | echo "Usage examples:"
49 | echo ""
50 | echo "# Complexity analysis"
51 | echo "python scripts/utils/groq_agent_bridge.py \"Analyze complexity 1-10: \$(cat file.py)\""
52 | echo ""
53 | echo "# Security scan"
54 | echo "python scripts/utils/groq_agent_bridge.py \"Check for security issues: \$(cat file.py)\" --json"
55 | echo ""
56 | echo "# With custom model and temperature"
57 | echo "python scripts/utils/groq_agent_bridge.py \"Your prompt\" --model llama2-70b-4096 --temperature 0.3"
58 | 
```
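The shell probe `python3 -c "import groq"` imports the package just to check availability. `importlib.util.find_spec` performs the same check without executing any package code; a side-effect-free sketch (output formatting mirrors the script's checkmarks):

```python
import importlib.util

def check_dependencies(modules):
    """Report module availability without importing them."""
    return {name: importlib.util.find_spec(name) is not None for name in modules}

status = check_dependencies(["json", "groq"])
for name, ok in status.items():
    print(f"   {'✓' if ok else '✗'} {name} {'installed' if ok else 'NOT installed'}")
```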

--------------------------------------------------------------------------------
/tools/docker/DEPRECATED.md:
--------------------------------------------------------------------------------

```markdown
 1 | # Deprecated Docker Files
 2 | 
 3 | The following Docker files are deprecated as of v5.0.4 and will be removed in v6.0.0:
 4 | 
 5 | ## Deprecated Files
 6 | 
 7 | ### 1. `docker-compose.standalone.yml`
 8 | - **Replaced by**: `docker-compose.http.yml`
 9 | - **Reason**: Confusing name, mixed ChromaDB/SQLite configs, incorrect entrypoint for HTTP mode
10 | - **Migration**: Use `docker-compose.http.yml` for HTTP/API access
11 | 
12 | ### 2. `docker-compose.uv.yml`
13 | - **Replaced by**: UV is now built into the main Dockerfile
14 | - **Reason**: UV support should be in the image, not a separate compose file
15 | - **Migration**: UV is automatically available in all configurations
16 | 
17 | ### 3. `docker-compose.pythonpath.yml`
18 | - **Replaced by**: Fixed PYTHONPATH in main Dockerfile
19 | - **Reason**: PYTHONPATH fix belongs in Dockerfile, not compose variant
20 | - **Migration**: All compose files now have correct PYTHONPATH=/app/src
21 | 
22 | ### 4. `docker-entrypoint-persistent.sh`
23 | - **Replaced by**: `docker-entrypoint-unified.sh`
24 | - **Reason**: Overcomplicated, doesn't support HTTP mode, named pipes unnecessary
25 | - **Migration**: Use unified entrypoint with MCP_MODE environment variable
26 | 
27 | ## New Simplified Structure
28 | 
29 | Use one of these two configurations:
30 | 
31 | 1. **MCP Protocol Mode** (for Claude Desktop, VS Code):
32 |    ```bash
33 |    docker-compose up -d
34 |    ```
35 | 
36 | 2. **HTTP/API Mode** (for web access, REST API):
37 |    ```bash
38 |    docker-compose -f docker-compose.http.yml up -d
39 |    ```
40 | 
41 | ## Timeline
42 | 
43 | - **v5.0.4**: Files marked as deprecated, new structure introduced
44 | - **v5.1.0**: Warning messages added when using deprecated files
45 | - **v6.0.0**: Deprecated files removed
46 | 
47 | ## Credits
48 | 
49 | Thanks to Joe Esposito for identifying the Docker setup issues that led to this simplification.
```
Page 2/47