lancedb-opencode-pro

LanceDB-backed long-term memory provider for OpenCode


Welcome to lancedb-opencode-pro! This plugin empowers OpenCode with a durable, long-term memory system powered by LanceDB.

To help you find what you need quickly, select the guide that best fits your situation:

🗺️ Choose Your Path

⚠️ Experiencing Issues?

You see a "Memory store unavailable" error, or the plugin is not loading 👉 Read the Compatibility Guide (10 min)

  • Fix: rm -rf ~/.cache/opencode/node_modules/, then restart
  • OpenCode v1.4.3 is fully supported (the NAPI issue was a cache problem, not a loader bug)
  • Diagnosis checklist and solutions

🚀 First-Time Users

You are new to this project and want to get it running quickly. 👉 Read the Quick Start Guide (15 min)

  • Complete installation steps & examples
  • Basic memory operations
  • Troubleshooting common issues

⚙️ Advanced Users

You have it running and want to tune retrieval, use OpenAI, or share memory across projects. 👉 Read the Advanced Configuration (30 min)

  • Hybrid retrieval (RRF, recency boost, importance weighting)
  • Memory injection controls (budget/adaptive modes)
  • OpenAI Embedding setup
  • Cross-project memory sharing (global scope)

🛠️ Developers & Contributors

You want to understand the architecture, run tests, or contribute to the source code. 👉 Read the Development Workflow (20 min)

  • Local environment setup
  • OpenCode skills usage guide
  • Testing and validation processes
  • Release workflow

🎯 Quick Start (5 Minutes)

1. Registration

Register the plugin in ~/.config/opencode/opencode.json:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "plugin": [
    "lancedb-opencode-pro"
  ]
}
```

2. Configuration

Create ~/.config/opencode/lancedb-opencode-pro.json:

```json
{
  "provider": "lancedb-opencode-pro",
  "dbPath": "~/.opencode/memory/lancedb",
  "embedding": {
    "provider": "ollama",
    "model": "nomic-embed-text",
    "baseUrl": "http://127.0.0.1:11434"
  }
}
```

3. Start Ollama and Restart OpenCode

```bash
# Verify Ollama is accessible
curl http://127.0.0.1:11434/api/tags

# Restart OpenCode
```

That's it! OpenCode will now automatically capture key decisions and inject them into future conversations.
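
Under the hood, text is embedded through the Ollama endpoint configured above. A minimal sketch of how such an embedding request could be assembled (the endpoint path and payload fields follow Ollama's public /api/embeddings route; how the plugin itself issues the call is an assumption here):

```typescript
// Sketch: build an embedding request against the Ollama HTTP API.
// The /api/embeddings endpoint and {model, prompt} payload mirror
// Ollama's public API; the plugin's internal wiring may differ.
function embeddingRequest(
  baseUrl: string,
  model: string,
  prompt: string,
): { url: string; body: string } {
  return {
    // Strip a trailing slash so the path joins cleanly.
    url: `${baseUrl.replace(/\/$/, "")}/api/embeddings`,
    body: JSON.stringify({ model, prompt }),
  };
}
```

The returned URL and body could then be passed to any HTTP client (e.g. `fetch`) to obtain the embedding vector.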


✨ Core Features

Auto Memory Capture

  • Automatically extracts decisions, lessons, and patterns from assistant responses.
  • Configurable minimum capture length (default: 80 chars).
  • Auto-categorization into project or global scope.
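
The capture gate described above can be sketched as a simple filter. All helper names and the scope heuristic below are illustrative, not the plugin's actual logic; only the 80-char default comes from the documentation:

```typescript
// Hypothetical sketch of the auto-capture gate: reject short snippets,
// then categorize the rest into project or global scope.
type Scope = "project" | "global";

interface CaptureCandidate {
  text: string;
  mentionsProjectFile: boolean; // e.g. the response cites a path in the repo
}

// Mirrors the documented default minimum capture length of 80 chars.
function shouldCapture(c: CaptureCandidate, minCaptureLength = 80): boolean {
  return c.text.trim().length >= minCaptureLength;
}

function categorize(c: CaptureCandidate): Scope {
  // Illustrative heuristic: repo-specific content stays project-scoped,
  // everything else is shareable across projects.
  return c.mentionsProjectFile ? "project" : "global";
}
```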

Hybrid Retrieval (v0.1.4+)

  • Vector Search + BM25 Lexical Search
  • Reciprocal Rank Fusion (RRF)
  • Recency boost & Importance weighting
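
Reciprocal Rank Fusion merges the vector and BM25 result lists by summing 1/(k + rank) for each item across lists. A minimal sketch (k = 60 is the conventional RRF constant, not necessarily the plugin's default):

```typescript
// Minimal Reciprocal Rank Fusion over several ranked id lists.
// score(id) = sum over lists of 1 / (k + rank), with rank starting at 1.
function rrfFuse(rankings: string[][], k = 60): { id: string; score: number }[] {
  const scores = new Map<string, number>();
  for (const list of rankings) {
    list.forEach((id, i) => {
      scores.set(id, (scores.get(id) ?? 0) + 1 / (k + i + 1));
    });
  }
  return [...scores.entries()]
    .map(([id, score]) => ({ id, score }))
    .sort((a, b) => b.score - a.score);
}
```

An id that appears near the top of both the vector and BM25 lists outranks one that tops only a single list, which is why RRF is a robust way to combine rankers with incomparable raw scores.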

Memory Management Tools

| Tool | Description | Documentation |
| --- | --- | --- |
| memory_search | Hybrid search for long-term memory | Doc |
| memory_delete | Delete a single memory entry | Doc |
| memory_clear | Clear all memories in a specific scope | Doc |
| memory_stats | View memory statistics | Doc |
| memory_remember | Manually store a memory | Doc |
| memory_forget | Remove or disable a memory | Doc |
| memory_what_did_you_learn | Show recent learning summaries | Doc |
| memory_dashboard | Weekly learning dashboard with trends | Doc |
| memory_kpi | Learning effectiveness KPIs (retry-to-success, memory lift) | Doc |

(For full details on tools like Effectiveness Feedback, Cross-Project Sharing, Deduplication, Citations, and Episodic Learning, please refer to the Advanced Configuration.)


⚙️ Configuration Overview

Basic Setup

```json
{
  "provider": "lancedb-opencode-pro",
  "dbPath": "~/.opencode/memory/lancedb",
  "embedding": {
    "provider": "ollama",
    "model": "nomic-embed-text",
    "baseUrl": "http://127.0.0.1:11434"
  },
  "retrieval": {
    "mode": "hybrid",
    "vectorWeight": 0.7,
    "bm25Weight": 0.3,
    "minScore": 0.2
  }
}
```
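
One plausible reading of the retrieval weights above, assuming a normalized-score blend followed by a minScore cutoff (the plugin's exact formula may differ):

```typescript
// Hypothetical interpretation of vectorWeight/bm25Weight/minScore:
// blend per-source scores, drop weak hits, sort the rest.
interface Hit {
  id: string;
  vectorScore: number; // assumed normalized to [0, 1]
  bm25Score: number;   // assumed normalized to [0, 1]
}

function blend(
  hits: Hit[],
  cfg = { vectorWeight: 0.7, bm25Weight: 0.3, minScore: 0.2 },
): { id: string; score: number }[] {
  return hits
    .map((h) => ({
      id: h.id,
      score: cfg.vectorWeight * h.vectorScore + cfg.bm25Weight * h.bm25Score,
    }))
    .filter((h) => h.score >= cfg.minScore) // minScore prunes weak matches
    .sort((a, b) => b.score - a.score);
}
```

With the defaults above, a hit needs a reasonably strong vector score to survive the 0.2 cutoff, since the BM25 contribution alone caps out at 0.3.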

Full configuration options including injection control, deduplication, and OpenAI setup are available in docs/ADVANCED_CONFIG.md.

Priority Order

  1. Environment Variables (LANCEDB_OPENCODE_PRO_*)
  2. LANCEDB_OPENCODE_PRO_CONFIG_PATH
  3. Project sidecar: .opencode/lancedb-opencode-pro.json
  4. Global sidecar: ~/.config/opencode/lancedb-opencode-pro.json
  5. Legacy sidecar or built-in defaults
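
The priority order above amounts to first-match-wins resolution. A sketch under the assumption that each source shadows the ones below it wholesale (the real loader may merge per-key instead):

```typescript
// Sketch: first-match-wins config resolution over an ordered source list.
// Each source returns a config object, or null when that source is absent.
type Config = Record<string, unknown>;
type Source = { name: string; load: () => Config | null };

function resolveConfig(sources: Source[], defaults: Config): Config {
  for (const s of sources) {
    const cfg = s.load();
    if (cfg) return cfg; // earlier sources shadow later ones
  }
  return defaults; // built-in defaults as the final fallback
}
```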

🧪 Validation & Testing

Quick validation using Docker:

```bash
docker compose build --no-cache && docker compose up -d
docker compose exec opencode-dev npm run verify
```

Full pre-release validation:

```bash
docker compose exec opencode-dev npm run verify:full
```

Performance benchmark (optional):

```bash
# Mock mode (fast)
docker compose exec opencode-dev ./scripts/run-perf-benchmark.sh

# Real Ollama mode
docker compose exec opencode-dev ./scripts/run-perf-benchmark.sh --real
```

Check docs/memory-validation-checklist.md for more details.


📦 Installation Options

Primary method: the npm package (recommended). Alternatively, install from a .tgz release asset or build from source. See Install Options.


🗺️ Version History

  • v0.8.3: @opencode-ai/sdk and plugin upgrade to 1.4.7, Effect type mock fix, SDK compatibility docs
  • v0.8.2: Plugin version mismatch fix (internal)
  • v0.8.1: @opencode-ai/sdk and plugin upgrade to 1.4.3, bunfig.toml test isolation fix
  • v0.8.0: Structured Logging via client.app.log() per OpenCode best practices, Test Environment Isolation Fix
  • v0.7.0: OpenCode SDK v1.3.14 Compatibility, Node 22 memory_search Race Condition Fix
  • v0.6.3: Index Creation Guard (defer on empty/insufficient tables, fix #70), LanceDB 0.27.2
  • v0.6.2: Index Race Condition Fix (concurrent-process conflict handling, jitter backoff)
  • v0.6.1: Event TTL/Archival, Index Creation Resilience, Duplicate Consolidation Performance
  • v0.6.0: Learning Dashboard, KPI Pipeline, Feedback-Driven Ranking, Task-Type Aware Injection

Older versions omitted here; see CHANGELOG.md for the full history.


🤝 Contributing

  1. Read docs/DEVELOPMENT_WORKFLOW.md
  2. Understand OpenSpec specs in openspec/specs/
  3. Use backlog-to-openspec to create proposals

📞 Support & License

Last Updated: 2026-04-17. Latest Version: v0.8.3
