Conversation
- workflows/llm-compaction.ts: replaces mechanical compaction with LLM intelligence
- .gitignore: exclude .agent-relay/ metadata
Replaces mechanical keyword-based compaction with intelligent LLM summarization.

New compact/ module:
- provider.ts: OpenAI + Anthropic providers (raw fetch, no deps)
- serializer.ts: trajectory → LLM-readable text with token budgeting
- prompts.ts: system + user prompts for compaction
- parser.ts: parse LLM JSON output with fallbacks
- markdown.ts: generate readable .md summaries
- config.ts: env vars or .trajectories/config.json

CLI updates:
- trail compact now uses LLM by default (if API key present)
- --mechanical flag for old behavior
- --focus <areas> for targeted summaries
- --markdown flag (default: true) for .md output
- Dry-run shows prompt + cost estimate

Output includes:
- Narrative summary (what happened, how)
- Key decisions with reasoning and impact
- Extracted conventions/patterns for future work
- Synthesized lessons from challenges
- Open questions / unresolved issues

Backwards compatible: falls back to mechanical if no LLM provider.
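The token budgeting in serializer.ts presumably clamps the serialized trajectory to a character budget derived from a token limit. A minimal sketch of that idea (the function name `truncateText` appears in the review below; the implementation here is assumed, including the rough ~4 chars-per-token heuristic):

```typescript
// Rough sketch of budget-based truncation; the real serializer.ts may differ.
const CHARS_PER_TOKEN = 4; // crude heuristic, assumed for illustration

function truncateText(text: string, maxTokens: number): string {
  const maxChars = maxTokens * CHARS_PER_TOKEN;
  if (maxChars <= 0) {
    return ""; // a zero budget yields an empty document
  }
  if (text.length <= maxChars) {
    return text;
  }
  return text.slice(0, maxChars) + "\n[...truncated to fit token budget...]";
}
```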
```ts
if (typeof value !== "string") {
  return undefined;
}

const parsed = Number(value);
return Number.isFinite(parsed) ? parsed : undefined;
```
🔴 readNumber treats empty/whitespace-only env vars as 0 instead of unset
`readNumber("")` returns 0 because `Number("") === 0` and `Number.isFinite(0) === true`. This is inconsistent with the sibling `readString` function at src/compact/config.ts:96-103, which correctly returns `undefined` for empty strings. When a user clears an env var (e.g., `export TRAJECTORIES_LLM_MAX_INPUT_TOKENS=`), `readNumberEnv` returns 0 instead of `undefined`. Because `0 ?? DEFAULT` evaluates to 0 (nullish coalescing does not trigger for 0), this silently overrides the default.

The most severe downstream effect: `maxInputTokens = 0` causes `serializeForLLM` to produce an empty string (`truncateText(document, 0)` returns `""` at src/compact/serializer.ts:61), so the LLM receives no trajectory data and generates a useless compaction. Similarly, `maxOutputTokens = 0` would be sent as `max_tokens: 0` to the LLM API, likely causing an API error.
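A fix along the lines the comment suggests (sketch only; the function name is taken from the review, the surrounding code is assumed) trims the input and bails out on empty strings before parsing:

```typescript
// Sketch: treat empty/whitespace-only values as unset, mirroring readString.
function readNumber(value: unknown): number | undefined {
  if (typeof value !== "string") {
    return undefined;
  }
  const trimmed = value.trim();
  if (trimmed === "") {
    return undefined; // Number("") === 0 would otherwise slip through
  }
  const parsed = Number(trimmed);
  return Number.isFinite(parsed) ? parsed : undefined;
}
```

With this guard, `readNumber("") ?? DEFAULT` correctly falls back to the default, since `undefined` does trigger nullish coalescing.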
When no API keys (OPENAI_API_KEY / ANTHROPIC_API_KEY) are available, resolveProvider now falls back to locally installed CLI tools (claude, codex) detected via @agent-relay/sdk's resolveCli(). This removes the hard requirement for API keys when users already have a CLI installed. Resolution order: explicit API keys → CLI detection → mechanical fallback. Users can also force CLI with TRAJECTORIES_LLM_PROVIDER=cli. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
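The resolution order described above can be sketched roughly as follows (illustrative only; the actual `resolveProvider` signature and the CLI-lookup helper are assumptions, while the env var names come from the commit message):

```typescript
// Sketch of provider resolution: API keys → CLI detection → mechanical.
type ProviderChoice =
  | { kind: "openai" | "anthropic" }
  | { kind: "cli"; path: string }
  | { kind: "mechanical" };

function resolveProvider(
  env: Record<string, string | undefined>,
  findCli: (name: string) => string | null, // injected for testability
): ProviderChoice {
  // TRAJECTORIES_LLM_PROVIDER=cli forces CLI resolution even if keys exist.
  const forceCli = env.TRAJECTORIES_LLM_PROVIDER === "cli";
  if (!forceCli) {
    if (env.OPENAI_API_KEY) return { kind: "openai" };
    if (env.ANTHROPIC_API_KEY) return { kind: "anthropic" };
  }
  for (const name of ["claude", "codex"]) {
    const path = findCli(name);
    if (path) return { kind: "cli", path };
  }
  return { kind: "mechanical" }; // no keys, no CLI: mechanical fallback
}
```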
… workflow
- compact.ts: execSync → execFileSync to prevent shell injection (HIGH)
- compact.ts: restore process.env immediately after FileStorage constructor (HIGH)
- compact.ts: extend shared CompactedTrajectoryMetadata type from parser.ts (MEDIUM)
- compact.ts: only pass jsonMode for OpenAIProvider (MEDIUM)
- provider.ts: warn on non-default base URLs to mitigate SSRF (MEDIUM)
- provider.ts: trim API keys and use || instead of ?? (MEDIUM)
- provider.ts: throw on empty Anthropic conversation instead of fabricating (MEDIUM)
- provider.ts: add AbortController with 300s timeout to fetch calls (MEDIUM)
- provider.ts: update Anthropic API version to 2024-10-22 (MEDIUM)
- provider.ts: clarify Message type relationship with prompts.ts (MEDIUM)
- provider.ts: use stdin pipe via spawnWithStdin to avoid arg length limits (MEDIUM)
- provider.ts: remove explicit env spread from spawn (MEDIUM)
- provider.ts: redact response text from parseJson error messages (LOW)
- workflows/llm-compaction.ts: use process.cwd() instead of hardcoded path (MEDIUM)
- config.ts: document merge precedence in loadFileConfig (MEDIUM)
- package.json: move @agent-relay/sdk to optionalDependencies (MEDIUM)
- tests: add full LLM pipeline test with mocked provider (LOW)

Co-Authored-By: My Senior Dev <dev@myseniordev.com>
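On the first HIGH item: the point of execSync → execFileSync is that execFileSync never invokes a shell, so metacharacters in interpolated values stay inert. A minimal demonstration (not the repo's actual call site):

```typescript
import { execFileSync } from "node:child_process";

// With execFileSync the argument is passed directly to the binary; the
// shell never sees it, so command substitution and `;` chaining are inert.
function safeEcho(userInput: string): string {
  return execFileSync("echo", [userInput], { encoding: "utf8" });
}
```

Compare `execSync(`echo ${userInput}`)`, where an input like `$(rm -rf ~)` would be evaluated by the shell before echo ever runs.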
Cast the dynamic import result to an explicit type inline so TypeScript doesn't require module declarations for @agent-relay/sdk. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Use a top-level static import of resolveCli from @agent-relay/sdk instead of a dynamic import(). The SDK ships type declarations so TypeScript resolves it correctly. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
The bundler (esbuild via tsup) tried to resolve and inline the SDK. Mark it as external so the import is left as-is for runtime resolution. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
The entire SDK was too heavy just for resolveCli(). Replaced with an inlined findBinary() that does a `which` lookup + fallback to well-known install directories (~/.local/bin, ~/.claude/local, /usr/local/bin, /opt/homebrew/bin). Zero new dependencies. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
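An inlined findBinary of the kind described might look like this (a sketch under assumptions; the actual helper may differ in details, though the fallback directories are the ones named above):

```typescript
import { execFileSync } from "node:child_process";
import { existsSync } from "node:fs";
import { homedir } from "node:os";
import { join } from "node:path";

// Well-known install locations probed when `which` finds nothing.
const FALLBACK_DIRS = [
  join(homedir(), ".local", "bin"),
  join(homedir(), ".claude", "local"),
  "/usr/local/bin",
  "/opt/homebrew/bin",
];

function findBinary(name: string): string | null {
  try {
    // `which` exits non-zero when the binary is not on PATH.
    const out = execFileSync("which", [name], { encoding: "utf8" }).trim();
    if (out) return out;
  } catch {
    // fall through to directory probing
  }
  for (const dir of FALLBACK_DIRS) {
    const candidate = join(dir, name);
    if (existsSync(candidate)) return candidate;
  }
  return null;
}
```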
Replaces mechanical keyword-based compaction with intelligent LLM summarization.
New src/compact/ module:
- provider.ts: OpenAI + Anthropic providers (raw fetch, no deps)
- serializer.ts: trajectory → LLM-readable text with token budgeting
- prompts.ts: system + user prompts for compaction
- parser.ts: parse LLM JSON output with fallbacks
- markdown.ts: generate readable .md summaries
- config.ts: env vars or .trajectories/config.json

CLI updates:
- `trail compact` uses LLM by default (if API key present)
- `--mechanical` flag for old behavior
- `--focus <areas>` for targeted summaries
- `--markdown` flag for .md output alongside JSON
- `--dry-run` shows prompt + cost estimate

Output includes:
- Narrative summary (what happened, how)
- Key decisions with reasoning and impact
- Extracted conventions/patterns for future work
- Synthesized lessons from challenges
- Open questions / unresolved issues
Backwards compatible: falls back to mechanical compaction if no LLM provider configured.
No new npm dependencies.
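The "fallbacks" in parser.ts are not spelled out here; one plausible shape (assumed, not taken from the code) is strict `JSON.parse` first, then extracting the first fenced block from the model's reply before giving up:

````typescript
// Illustrative JSON-with-fallback parsing for LLM output; the real
// parser.ts may use different heuristics.
function parseLlmJson(raw: string): unknown | undefined {
  try {
    return JSON.parse(raw);
  } catch {
    // Models often wrap JSON in a markdown fence; try the first fenced block.
    const match = raw.match(/```(?:json)?\s*([\s\S]*?)```/);
    if (match) {
      try {
        return JSON.parse(match[1]);
      } catch {
        // fall through to undefined
      }
    }
  }
  return undefined; // caller can drop to mechanical compaction
}
````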