mirror of https://github.com/moltbot/moltbot.git (synced 2026-05-17 02:37:33 +00:00)
refactor: make sessions sqlite-only
@@ -467,7 +467,7 @@ openclaw doctor
 <Accordion title="Cron or heartbeat appears to prevent /new-style rollover">
 - Daily and idle reset freshness is not based on `updatedAt`; see [Session management](/concepts/session#session-lifecycle).
 - Cron wakeups, heartbeat runs, exec notifications, and gateway bookkeeping may update the session row for routing/status, but they do not extend `sessionStartedAt` or `lastInteractionAt`.
-- For legacy rows created before those fields existed, OpenClaw can recover `sessionStartedAt` from the transcript JSONL session header when the file is still available. Legacy idle rows without `lastInteractionAt` use that recovered start time as their idle baseline.
+- For legacy rows created before those fields existed, OpenClaw can recover `sessionStartedAt` from the SQLite transcript session header after doctor migration. Legacy idle rows without `lastInteractionAt` use that recovered start time as their idle baseline.

 </Accordion>
 <Accordion title="Timezone gotchas">
@@ -114,15 +114,14 @@ openclaw sessions cleanup --json

 `openclaw sessions cleanup` uses `session.maintenance` settings from config:

-- Scope note: `openclaw sessions cleanup` maintains session stores, transcripts, and trajectory sidecars. It does not prune cron run logs (`cron/runs/<jobId>.jsonl`), which are managed by `cron.runLog.maxBytes` and `cron.runLog.keepLines` in [Cron configuration](/automation/cron-jobs#configuration) and explained in [Cron maintenance](/automation/cron-jobs#maintenance).
-- Cleanup also prunes unreferenced primary transcripts, compaction checkpoints, and trajectory sidecars older than `session.maintenance.pruneAfter`; files still referenced by the session store are preserved.
+- Scope note: `openclaw sessions cleanup` maintains SQLite session rows only. It does not prune transcript files, trajectory sidecars, or cron run logs (`cron/runs/<jobId>.jsonl`), which are managed by their owning runtimes.
+- Legacy JSON import belongs to `openclaw doctor --fix`; cleanup no longer imports or rewrites `sessions.json`.

 - `--dry-run`: preview how many entries would be pruned/capped without writing.
   - In text mode, dry-run prints a per-session action table (`Action`, `Key`, `Age`, `Model`, `Flags`) so you can see what would be kept vs removed.
 - `--enforce`: apply maintenance even when `session.maintenance.mode` is `warn`.
-- `--fix-missing`: remove entries whose transcript files are missing, even if they would not normally age/count out yet.
-- `--fix-dm-scope`: when `session.dmScope` is `main`, retire stale peer-keyed direct-DM rows left behind by earlier `per-peer`, `per-channel-peer`, or `per-account-channel-peer` routing. Use `--dry-run` first; applying the cleanup removes those rows from `sessions.json` and preserves their transcripts as deleted archives.
-- `--active-key <key>`: protect a specific active key from disk-budget eviction. Durable external conversation pointers, such as group sessions and thread-scoped chat sessions, are also kept by age/count/disk-budget maintenance.
+- `--fix-missing`: remove entries whose SQLite transcript events are missing, even if they would not normally age/count out yet.
+- `--active-key <key>`: protect a specific active key from enforce-mode age/count retention.
 - `--agent <id>`: run cleanup for one configured agent store.
 - `--all-agents`: run cleanup for all configured agent stores.
 - `--store <path>`: run against a specific `sessions.json` file.
@@ -130,8 +129,7 @@ openclaw sessions cleanup --json

 When a Gateway is reachable, non-dry-run cleanup for configured agent stores is
 sent through the Gateway so it shares the same session-store writer as runtime
-traffic. Legacy JSON import belongs to `openclaw doctor --fix`; cleanup no
-longer acts as the migration path for `sessions.json`.
+traffic.

 `openclaw sessions cleanup --all-agents --dry-run --json`:
@@ -102,14 +102,14 @@ The session store keeps separate lifecycle timestamps:
 - `updatedAt`: last store-row mutation; useful for listing and pruning, but not
   authoritative for daily/idle reset freshness.

-Older rows without `sessionStartedAt` are resolved from the transcript JSONL
+Older rows without `sessionStartedAt` are resolved from the SQLite transcript
 session header when available. If an older row also lacks `lastInteractionAt`,
 idle freshness falls back to that session start time, not to later bookkeeping
 writes.

 ## Session maintenance

-OpenClaw bounds session storage through explicit maintenance. By default, it
+OpenClaw bounds SQLite session rows through explicit maintenance. By default, it
 runs in `warn` mode (reports what would be cleaned). Set
 `session.maintenance.mode` to `"enforce"` and run `openclaw sessions cleanup`
 when you want cleanup to apply:
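The freshness fallback described in the hunk above (`lastInteractionAt`, then `sessionStartedAt` or a start time recovered from the transcript session header, never `updatedAt`) can be sketched as follows. This is an illustrative sketch only; `StoredRow` and `idleBaseline` are hypothetical names, not OpenClaw's actual API.

```typescript
// Illustrative sketch of the documented fallback order; the names here are
// hypothetical, not OpenClaw's real session-store API.
type StoredRow = {
  sessionStartedAt?: number;  // session start, when the row has it
  lastInteractionAt?: number; // real user interaction
  updatedAt: number;          // bookkeeping writes land here; never used for freshness
};

// Returns the timestamp idle-reset freshness should be measured from.
// `recoveredStart` stands in for a start time recovered from the transcript
// session header for legacy rows.
function idleBaseline(row: StoredRow, recoveredStart?: number): number | undefined {
  if (row.lastInteractionAt !== undefined) {
    return row.lastInteractionAt;
  }
  // Legacy rows: fall back to the session start (stored or recovered),
  // never to later bookkeeping writes (`updatedAt`).
  return row.sessionStartedAt ?? recoveredStart;
}
```

Note that `updatedAt` is deliberately never consulted, matching the "not authoritative for daily/idle reset freshness" rule above.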
@@ -126,7 +126,7 @@ when you want cleanup to apply:
 }
 ```

-Gateway runtime writes do not prune, cap, or import session rows. Session store reads also do not prune or cap entries during Gateway startup. This avoids running full store cleanup on every startup or isolated cron session. `openclaw sessions cleanup --enforce` applies the cap immediately.
+Gateway runtime writes do not prune, cap, or import session rows. Session store reads also do not prune or cap entries during Gateway startup. This avoids running cleanup on every startup or isolated cron session. `openclaw sessions cleanup --enforce` applies row age/count retention explicitly.

 Maintenance preserves durable external conversation pointers, including group
 sessions and thread-scoped chat sessions, while still allowing synthetic cron,
@@ -1184,8 +1184,6 @@ See [Multi-Agent Sandbox & Tools](/tools/multi-agent-sandbox-tools) for preceden
       mode: "warn", // warn | enforce
       pruneAfter: "30d",
       maxEntries: 500,
-      maxDiskBytes: "500mb", // optional hard budget
-      highWaterBytes: "400mb", // optional cleanup target
     },
     threadBindings: {
       enabled: true,
@@ -1219,13 +1217,13 @@ See [Multi-Agent Sandbox & Tools](/tools/multi-agent-sandbox-tools) for preceden
 - **`agentToAgent.maxPingPongTurns`**: maximum reply-back turns between agents during agent-to-agent exchanges (integer, range: `0`–`5`). `0` disables ping-pong chaining.
 - **`sendPolicy`**: match by `channel`, `chatType` (`direct|group|channel`, with legacy `dm` alias), `keyPrefix`, or `rawKeyPrefix`. First deny wins.
 - **`store`**: optional legacy/custom JSON store path. When omitted, the canonical per-agent session store uses `~/.openclaw/state/openclaw.sqlite`.
-- **`maintenance`**: session-store cleanup + retention controls.
+- **`maintenance`**: explicit SQLite session-row cleanup + retention controls.
   - `mode`: `warn` emits warnings only; `enforce` applies cleanup.
   - `pruneAfter`: age cutoff for stale entries (default `30d`).
   - `maxEntries`: maximum number of entries in the session store (default `500`). Runtime writes do not prune or cap entries; `openclaw sessions cleanup --enforce` applies the cap immediately.
   - `rotateBytes`: deprecated and ignored; `openclaw doctor --fix` removes it from older configs.
-  - `maxDiskBytes`: optional sessions-directory disk budget. In `warn` mode it logs warnings; in `enforce` mode it removes oldest artifacts/sessions first.
-  - `highWaterBytes`: optional target after budget cleanup. Defaults to `80%` of `maxDiskBytes`.
+  - `maxDiskBytes`: deprecated and ignored; session transcripts are stored in SQLite.
+  - `highWaterBytes`: deprecated and ignored with `maxDiskBytes`.
 - **`threadBindings`**: global defaults for thread-bound session features.
   - `enabled`: master default switch (providers can override; Discord uses `channels.discord.threadBindings.enabled`)
   - `idleHours`: default inactivity auto-unfocus in hours (`0` disables; providers can override)
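The `maintenance` keys above reduce to a simple age/count plan. A minimal sketch, assuming hypothetical names (`SessionRow`, `planRetention`) and millisecond timestamps; the real row shape and eviction code are OpenClaw internals not shown here:

```typescript
// Hypothetical sketch of pruneAfter/maxEntries retention; not OpenClaw's code.
type MaintenanceConfig = {
  mode: "warn" | "enforce";
  pruneAfterMs: number; // parsed from e.g. "30d"
  maxEntries: number;   // e.g. 500
};

type SessionRow = {
  key: string;
  updatedAt: number;       // last store-row mutation, used for listing/pruning
  durablePointer: boolean; // group / thread-scoped sessions are preserved
};

// Returns the keys that age/count retention would remove.
function planRetention(rows: SessionRow[], cfg: MaintenanceConfig, now: number): string[] {
  const removable = rows.filter((r) => !r.durablePointer);
  // 1. Age: rows older than pruneAfter are stale.
  const stale = removable.filter((r) => now - r.updatedAt > cfg.pruneAfterMs);
  const staleKeys = new Set(stale.map((r) => r.key));
  // 2. Count: cap the surviving rows at maxEntries, evicting oldest first,
  //    while still skipping durable external conversation pointers.
  const survivors = rows
    .filter((r) => !staleKeys.has(r.key))
    .sort((a, b) => b.updatedAt - a.updatedAt);
  const overCap = survivors.slice(cfg.maxEntries).filter((r) => !r.durablePointer);
  return [...staleKeys, ...overCap.map((r) => r.key)];
}
```

In `warn` mode a plan like this would only be reported; applying it is what `enforce` mode (or `openclaw sessions cleanup --enforce`) does.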
@@ -166,8 +166,6 @@ Save to `~/.openclaw/openclaw.json` and you can DM the bot from that number.
       mode: "warn",
       pruneAfter: "30d",
       maxEntries: 500,
-      maxDiskBytes: "500mb", // optional
-      highWaterBytes: "400mb", // optional (defaults to 80% of maxDiskBytes)
     },
     typingIntervalSeconds: 5,
     sendPolicy: {
@@ -50,21 +50,21 @@ This plan has started landing in slices:
   fix mode imports legacy `sessions.json` indexes into SQLite and removes the
   JSON index after import, instead of keeping a startup migration or parallel
   compatibility/export store. Runtime session reads and writes normalize and
-  persist only: no JSON import, pruning, capping, archive cleanup, or
+  persist only: no JSON import, row pruning, capping, archive cleanup, or
   disk-budget cleanup runs on the hot path. The old maintenance write options
   have been removed from the session-store API; doctor owns legacy import and
-  `openclaw sessions cleanup` owns explicit cleanup. Status and discovery now
-  use the primary session-store loader instead of a duplicated read-only JSON
-  parser, and SQLite-backed agent session directories remain discoverable after
-  doctor deletes the legacy `sessions.json` file. The legacy JSON session-store
-  object/serialized cache is gone; JSON fallback reads now parse directly while
-  canonical SQLite stores avoid that path. The cron timer no longer runs a
-  dedicated session reaper; cron run sessions are maintained through the same
-  explicit session cleanup path as other rows.
+  `openclaw sessions cleanup` owns explicit SQLite row cleanup only. Status and
+  discovery now use the primary session-store loader instead of a duplicated
+  read-only JSON parser, and SQLite-backed agent session directories remain
+  discoverable after doctor deletes the legacy `sessions.json` file. The legacy
+  JSON session-store object/serialized cache is gone; JSON fallback reads now
+  parse directly while canonical SQLite stores avoid that path. The cron timer
+  no longer runs a dedicated session reaper; cron run sessions are maintained
+  through the same explicit row cleanup path as other rows.
 - Transcript events are SQLite-primary. OpenClaw-owned append paths require
   agent/session scope and write `transcript_events` directly; `*.jsonl` is no
   longer a runtime mirror for those paths. JSONL is now an explicit
-  import/export/debug shape only. The OpenClaw transcript session manager,
+  import/export/debug boundary shape only. The OpenClaw transcript session manager,
   Gateway-injected assistant messages, CLI transcript persistence, Codex
   app-server mirroring, compaction successor transcripts, manual compaction
   boundary rewrites, and reset/header creation all persist through SQLite.
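The "SQLite-primary, JSONL only at the boundary" rule in the hunk above can be pictured with a toy store. This in-memory sketch only illustrates the shape; the names (`TranscriptStore`, `append`, `exportJsonl`) are hypothetical, and the real implementation writes a SQLite `transcript_events` table keyed by agent/session scope, not a Map:

```typescript
// Hypothetical in-memory stand-in for the SQLite-primary transcript store.
type TranscriptScope = { agentId: string; sessionId: string };

class TranscriptStore {
  private events = new Map<string, unknown[]>();

  private key(scope: TranscriptScope): string {
    return `${scope.agentId}/${scope.sessionId}`;
  }

  // The single runtime write path: append one event for a scope.
  append(scope: TranscriptScope, event: unknown): void {
    const key = this.key(scope);
    const list = this.events.get(key) ?? [];
    list.push(event);
    this.events.set(key, list);
  }

  // JSONL exists only as an explicit export/debug shape, never a runtime mirror.
  exportJsonl(scope: TranscriptScope): string {
    return (this.events.get(this.key(scope)) ?? [])
      .map((event) => JSON.stringify(event))
      .join("\n");
  }
}
```

The point of the shape is that nothing in the runtime reads or writes `*.jsonl`; the JSONL serialization happens only when a caller explicitly asks for an export.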
@@ -85,36 +85,30 @@ OpenClaw resolves these via `src/config/sessions/*`.

 ---

-## Store maintenance and disk controls
+## Store Maintenance

-Session persistence has explicit maintenance controls (`session.maintenance`) for session entries and trajectory sidecars:
+Session persistence has explicit maintenance controls (`session.maintenance`) for SQLite session rows:

 - `mode`: `warn` (default) or `enforce`
 - `pruneAfter`: stale-entry age cutoff (default `30d`)
 - `maxEntries`: cap entries in the session store (default `500`)
-- `maxDiskBytes`: optional sessions-directory budget
-- `highWaterBytes`: optional target after cleanup (default `80%` of `maxDiskBytes`)
+- `maxDiskBytes`: deprecated and ignored
+- `highWaterBytes`: deprecated and ignored

-Normal Gateway writes flow through a per-store session writer that serializes in-process mutations. SQLite is the canonical per-agent backend; `sessions.json` is a legacy doctor-import input, not a parallel export/debug store. Runtime code should prefer `updateSessionStore(...)` or `updateSessionStoreEntry(...)`. Runtime writes normalize and persist only; they do not prune, cap, import, archive, or run disk-budget cleanup. When a Gateway is reachable, non-dry-run `openclaw sessions cleanup` and `openclaw agents delete` delegate store mutations to the Gateway so cleanup joins the same writer queue. Session store reads do not import, prune, or cap entries during Gateway startup; use `openclaw doctor --fix` for legacy JSON import and `openclaw sessions cleanup --enforce` for cleanup. `openclaw sessions cleanup --enforce` applies the configured cap immediately and prunes old unreferenced trajectory artifacts even when no disk budget is configured. Compaction checkpoint cleanup removes SQLite snapshot rows, not file artifacts.
+Normal Gateway writes flow through a per-store session writer that serializes in-process mutations. SQLite is the canonical per-agent backend; `sessions.json` is a legacy doctor-import input, not a parallel export/debug store. Runtime code should prefer `updateSessionStore(...)` or `updateSessionStoreEntry(...)`. Runtime writes normalize and persist only; they do not prune, cap, import, archive, or run disk-budget cleanup. When a Gateway is reachable, non-dry-run `openclaw sessions cleanup` and `openclaw agents delete` delegate store mutations to the Gateway so cleanup joins the same writer queue. Session store reads do not import, prune, or cap entries during Gateway startup; use `openclaw doctor --fix` for legacy JSON import and `openclaw sessions cleanup --enforce` for row cleanup. `openclaw sessions cleanup --enforce` applies the configured age/count policy immediately. Compaction checkpoint cleanup removes SQLite snapshot rows, not file artifacts.

 Maintenance keeps durable external conversation pointers such as group sessions
 and thread-scoped chat sessions, but synthetic runtime entries for cron, hooks,
 heartbeat, ACP, and sub-agents can still be removed when they exceed the
-configured age, count, or disk budget.
+configured age or count.

-OpenClaw no longer creates automatic `sessions.json.bak.*` rotation backups during Gateway writes. The legacy `session.maintenance.rotateBytes` key is ignored and `openclaw doctor --fix` removes it from older configs.
+OpenClaw no longer creates automatic `sessions.json.bak.*` rotation backups during Gateway writes. The legacy `session.maintenance.rotateBytes`, `maxDiskBytes`, and `highWaterBytes` keys are ignored and `openclaw doctor --fix` removes them from older configs.

 Transcript mutations are serialized through SQLite transactions plus the
 per-session append queue. The legacy `session.writeLock.acquireTimeoutMs`
 setting remains for older import/debug paths that still touch JSONL files.

-Enforcement order for disk budget cleanup (`mode: "enforce"`):
-
-1. Remove oldest orphan trajectory artifacts first.
-2. If still above the target, evict oldest session entries and their trajectory sidecars.
-3. Keep going until usage is at or below `highWaterBytes`.
-
-In `mode: "warn"`, OpenClaw reports potential evictions but does not mutate the store/files.
+In `mode: "warn"`, OpenClaw reports potential row pruning/capping but does not mutate the store.

 Run maintenance on demand:
@@ -156,10 +156,9 @@ export OPENCLAW_TRAJECTORY_DIR=/var/lib/openclaw/trajectories
 When this variable is set, OpenClaw writes one JSONL file per session id in that
 directory.

-Session maintenance removes trajectory sidecars when their owning session entry
-is pruned, capped, or evicted by the sessions disk budget. Runtime files outside
-the sessions directory are removed only when the pointer target still proves it
-belongs to that session.
+Session maintenance no longer prunes trajectory sidecars. Runtime-owned
+trajectory cleanup removes run-scoped artifacts when the owning runtime finishes
+or when a doctor migration imports legacy transcript files into SQLite.

 ## Disable capture
@@ -1,8 +1,10 @@
 import crypto from "node:crypto";
-import fsSync from "node:fs";
 import fs from "node:fs/promises";
 import path from "node:path";
-import * as readline from "node:readline";
+import {
+  loadSqliteSessionTranscriptEvents,
+  resolveSqliteSessionTranscriptScopeForPath,
+} from "openclaw/plugin-sdk/agent-harness-runtime";
 import {
   DEFAULT_PROVIDER,
   parseModelRef,
@@ -1640,43 +1642,24 @@ async function streamBoundedTranscriptJsonl(params: {
   onRecord: (record: unknown) => boolean | void;
 }): Promise<void> {
   const limits = resolveTranscriptReadLimits(params.limits);
-  try {
-    const stats = await fs.stat(params.sessionFile);
-    if (!stats.isFile() || stats.size > limits.maxBytes) {
-      return;
-    }
-  } catch {
-    return;
-  }
-  const stream = fsSync.createReadStream(params.sessionFile, {
-    encoding: "utf8",
-  });
-  const rl = readline.createInterface({
-    input: stream,
-    crlfDelay: Infinity,
-  });
-  let seenLines = 0;
-  try {
-    for await (const line of rl) {
-      seenLines += 1;
-      if (seenLines > limits.maxLines) {
-        break;
-      }
-      const trimmed = line.trim();
-      if (!trimmed) {
-        continue;
-      }
-      try {
-        if (params.onRecord(JSON.parse(trimmed) as unknown)) {
-          break;
-        }
-      } catch {}
-    }
-  } catch {
-    // Treat transcript recovery as best-effort on timeout/abort paths.
-  } finally {
-    rl.close();
-    stream.destroy();
-  }
+  const scope = resolveSqliteSessionTranscriptScopeForPath({
+    transcriptPath: params.sessionFile,
+  });
+  if (!scope) {
+    return;
+  }
+  const events = loadSqliteSessionTranscriptEvents(scope);
+  if (JSON.stringify(events.map((entry) => entry.event)).length > limits.maxBytes) {
+    return;
+  }
+  for (const { event } of events.slice(0, limits.maxLines)) {
+    if (params.onRecord(event)) {
+      break;
+    }
+  }
 }
@@ -5,8 +5,6 @@ import path from "node:path";
 import {
   buildSessionEntry,
   listSessionFilesForAgent,
-  loadSessionTranscriptClassificationForAgent,
-  normalizeSessionTranscriptPathForComparison,
   parseUsageCountedSessionIdFromFileName,
   sessionPathForFile,
 } from "openclaw/plugin-sdk/memory-core-host-engine-qmd";
@@ -751,30 +749,17 @@ async function collectSessionIngestionBatches(params: {
   const sessionFiles: Array<{
     agentId: string;
     absolutePath: string;
-    generatedByDreamingNarrative: boolean;
-    generatedByCronRun: boolean;
     sessionPath: string;
   }> = [];
   for (const agentId of agentIds) {
     const files = await listSessionFilesForAgent(agentId);
-    const transcriptClassification =
-      files.length > 0
-        ? loadSessionTranscriptClassificationForAgent(agentId)
-        : {
-            dreamingNarrativeTranscriptPaths: new Set<string>(),
-            cronRunTranscriptPaths: new Set<string>(),
-          };
     for (const absolutePath of files) {
       if (isCheckpointSessionTranscriptPath(absolutePath)) {
         continue;
       }
-      const normalizedPath = normalizeSessionTranscriptPathForComparison(absolutePath);
       sessionFiles.push({
         agentId,
         absolutePath,
-        generatedByDreamingNarrative:
-          transcriptClassification.dreamingNarrativeTranscriptPaths.has(normalizedPath),
-        generatedByCronRun: transcriptClassification.cronRunTranscriptPaths.has(normalizedPath),
         sessionPath: sessionPathForFile(absolutePath),
       });
     }
@@ -803,21 +788,16 @@ async function collectSessionIngestionBatches(params: {
     }
     const stateKey = buildSessionStateKey(file.agentId, file.absolutePath);
     const previous = params.state.files[stateKey];
-    const stat = await fs.stat(file.absolutePath).catch((err: unknown) => {
-      if ((err as NodeJS.ErrnoException)?.code === "ENOENT") {
-        return null;
-      }
-      throw err;
-    });
-    if (!stat) {
+    const entry = await buildSessionEntry(file.absolutePath);
+    if (!entry) {
       if (previous) {
         changed = true;
       }
       continue;
     }
     const fingerprint = {
-      mtimeMs: Math.floor(Math.max(0, stat.mtimeMs)),
-      size: Math.floor(Math.max(0, stat.size)),
+      mtimeMs: Math.floor(Math.max(0, entry.mtimeMs)),
+      size: Math.floor(Math.max(0, entry.size)),
     };
     const cursorAtEnd = previous !== undefined && previous.lastContentLine >= previous.lineCount;
     const unchanged =
@@ -831,13 +811,6 @@ async function collectSessionIngestionBatches(params: {
       continue;
     }

-    const entry = await buildSessionEntry(file.absolutePath, {
-      generatedByDreamingNarrative: file.generatedByDreamingNarrative,
-      generatedByCronRun: file.generatedByCronRun,
-    });
-    if (!entry) {
-      continue;
-    }
     if (entry.generatedByDreamingNarrative || entry.generatedByCronRun) {
       nextFiles[stateKey] = {
         mtimeMs: fingerprint.mtimeMs,
@@ -9,8 +9,11 @@ export {
   isHeartbeatUserMessage,
   isSilentReplyPayloadText,
   isUsageCountedSessionTranscriptFileName,
+  listSqliteSessionTranscripts,
+  loadSqliteSessionTranscriptEvents,
   onSessionTranscriptUpdate,
   parseUsageCountedSessionIdFromFileName,
+  resolveSqliteSessionTranscriptScopeForPath,
   resolveSessionTranscriptsDirForAgent,
   stripInboundMetadata,
   stripInternalRuntimeContext,
@@ -54,6 +54,11 @@ export {
   parseUsageCountedSessionIdFromFileName,
 } from "../../../../src/config/sessions/artifacts.js";
 export { resolveSessionTranscriptsDirForAgent } from "../../../../src/config/sessions/paths.js";
+export {
+  listSqliteSessionTranscripts,
+  loadSqliteSessionTranscriptEvents,
+  resolveSqliteSessionTranscriptScopeForPath,
+} from "../../../../src/config/sessions/transcript-store.sqlite.js";
 export type { SessionSendPolicyConfig } from "../../../../src/config/types.base.js";
 export type {
   MemoryBackend,
@@ -2,6 +2,8 @@ import fsSync from "node:fs";
 import os from "node:os";
 import path from "node:path";
 import { afterAll, afterEach, beforeAll, beforeEach, describe, expect, it } from "vitest";
+import { replaceSqliteSessionTranscriptEvents } from "../../../../src/config/sessions/transcript-store.sqlite.js";
+import { closeOpenClawStateDatabaseForTest } from "../../../../src/state/openclaw-state-db.js";
 import {
   buildSessionEntry,
   listSessionFilesForAgent,
@@ -30,6 +32,7 @@ beforeEach(() => {
 });

 afterEach(() => {
+  closeOpenClawStateDatabaseForTest();
   if (originalStateDir === undefined) {
     delete process.env.OPENCLAW_STATE_DIR;
   } else {
@@ -44,25 +47,42 @@ function requireSessionEntry(entry: SessionFileEntry | null): SessionFileEntry {
   return entry;
 }

+function seedTranscript(params: {
+  agentId?: string;
+  sessionId: string;
+  transcriptPath?: string;
+  events: unknown[];
+  now?: number;
+}): string {
+  const agentId = params.agentId ?? "main";
+  const transcriptPath =
+    params.transcriptPath ??
+    path.join(tmpDir, "agents", agentId, "sessions", `${params.sessionId}.jsonl`);
+  replaceSqliteSessionTranscriptEvents({
+    agentId,
+    sessionId: params.sessionId,
+    transcriptPath,
+    events: params.events,
+    now: () => params.now ?? 1_770_000_000_000,
+  });
+  return transcriptPath;
+}
+
 describe("listSessionFilesForAgent", () => {
-  it("includes primary transcripts in session file listing", async () => {
-    const sessionsDir = path.join(tmpDir, "agents", "main", "sessions");
-    fsSync.mkdirSync(path.join(sessionsDir, "archive"), { recursive: true });
-
-    const included = ["active.jsonl"];
-    const excluded = ["active.jsonl.bak.2026-02-16T22-28-33.000Z", "sessions.json", "notes.md"];
-    excluded.push("active.checkpoint.11111111-1111-4111-8111-111111111111.jsonl");
-
-    for (const fileName of [...included, ...excluded]) {
-      fsSync.writeFileSync(path.join(sessionsDir, fileName), "");
-    }
-    fsSync.writeFileSync(path.join(sessionsDir, "archive", "nested.jsonl"), "");
+  it("lists SQLite transcript handles for an agent", async () => {
+    const includedPath = seedTranscript({
+      sessionId: "active",
+      events: [{ type: "session", id: "active" }],
+    });
+    seedTranscript({
+      agentId: "other",
+      sessionId: "other-active",
+      events: [{ type: "session", id: "other-active" }],
+    });

     const files = await listSessionFilesForAgent("main");

-    expect(files.map((filePath) => path.basename(filePath)).toSorted()).toEqual(
-      included.toSorted(),
-    );
+    expect(files).toEqual([includedPath]);
   });
 });
@@ -82,26 +102,25 @@ describe("sessionPathForFile", () => {

 describe("buildSessionEntry", () => {
   it("returns lineMap tracking original JSONL line numbers", async () => {
-    // Simulate a real session JSONL file with metadata records interspersed
+    // Simulate a real transcript event stream with metadata records interspersed
     // Lines 1-3: non-message metadata records
     // Line 4: user message
     // Line 5: metadata
     // Line 6: assistant message
    // Line 7: user message
-    const jsonlLines = [
-      JSON.stringify({ type: "custom", customType: "model-snapshot", data: {} }),
-      JSON.stringify({ type: "custom", customType: "openclaw.cache-ttl", data: {} }),
-      JSON.stringify({ type: "session-meta", agentId: "test" }),
-      JSON.stringify({ type: "message", message: { role: "user", content: "Hello world" } }),
-      JSON.stringify({ type: "custom", customType: "tool-result", data: {} }),
-      JSON.stringify({
+    const events = [
+      { type: "custom", customType: "model-snapshot", data: {} },
+      { type: "custom", customType: "openclaw.cache-ttl", data: {} },
+      { type: "session-meta", agentId: "test" },
+      { type: "message", message: { role: "user", content: "Hello world" } },
+      { type: "custom", customType: "tool-result", data: {} },
+      {
         type: "message",
         message: { role: "assistant", content: "Hi there, how can I help?" },
-      }),
-      JSON.stringify({ type: "message", message: { role: "user", content: "Tell me a joke" } }),
+      },
+      { type: "message", message: { role: "user", content: "Tell me a joke" } },
     ];
-    const filePath = path.join(tmpDir, "session.jsonl");
-    fsSync.writeFileSync(filePath, jsonlLines.join("\n"));
+    const filePath = seedTranscript({ sessionId: "session", events });

     const entry = requireSessionEntry(await buildSessionEntry(filePath));
     // The content should have 3 lines (3 message records)
@@ -119,12 +138,13 @@ describe("buildSessionEntry", () => {
   });

   it("returns empty lineMap when no messages are found", async () => {
-    const jsonlLines = [
-      JSON.stringify({ type: "custom", customType: "model-snapshot", data: {} }),
-      JSON.stringify({ type: "session-meta", agentId: "test" }),
-    ];
-    const filePath = path.join(tmpDir, "empty-session.jsonl");
-    fsSync.writeFileSync(filePath, jsonlLines.join("\n"));
+    const filePath = seedTranscript({
+      sessionId: "empty-session",
+      events: [
+        { type: "custom", customType: "model-snapshot", data: {} },
+        { type: "session-meta", agentId: "test" },
+      ],
+    });

     const entry = requireSessionEntry(await buildSessionEntry(filePath));
     expect(entry.content).toBe("");
@@ -134,13 +154,21 @@ describe("buildSessionEntry", () => {
   it("skips checkpoint artifacts so snapshots do not double-index session content", async () => {
     const checkpointPath = path.join(
       tmpDir,
       "agents",
       "main",
       "sessions",
       "ordinary.checkpoint.11111111-1111-4111-8111-111111111111.jsonl",
     );
-    const content = JSON.stringify({
-      type: "message",
-      message: { role: "user", content: "Archived hello" },
+    seedTranscript({
+      sessionId: "ordinary.checkpoint.11111111-1111-4111-8111-111111111111",
+      transcriptPath: checkpointPath,
+      events: [
+        {
+          type: "message",
+          message: { role: "user", content: "Archived hello" },
+        },
+      ],
     });
-    fsSync.writeFileSync(checkpointPath, content);

     const checkpointEntry = requireSessionEntry(await buildSessionEntry(checkpointPath));
@@ -193,71 +221,73 @@
     expect(entry.generatedByCronRun).toBe(true);
   });

-  it("skips blank lines and invalid JSON without breaking lineMap", async () => {
-    const jsonlLines = [
-      "",
-      "not valid json",
-      JSON.stringify({ type: "message", message: { role: "user", content: "First" } }),
-      "",
-      JSON.stringify({ type: "message", message: { role: "assistant", content: "Second" } }),
-    ];
-    const filePath = path.join(tmpDir, "gaps.jsonl");
-    fsSync.writeFileSync(filePath, jsonlLines.join("\n"));
+  it("skips non-message events without breaking lineMap", async () => {
+    const filePath = seedTranscript({
+      sessionId: "gaps",
+      events: [
+        { type: "custom", customType: "ignored" },
+        { type: "message", message: { role: "user", content: "First" } },
+        { type: "custom", customType: "ignored-again" },
+        { type: "message", message: { role: "assistant", content: "Second" } },
+      ],
+    });

     const entry = requireSessionEntry(await buildSessionEntry(filePath));
-    expect(entry.lineMap).toEqual([3, 5]);
+    expect(entry.lineMap).toEqual([2, 4]);
   });

   it("strips inbound metadata when a user envelope is split across text blocks", async () => {
-    const jsonlLines = [
-      JSON.stringify({
-        type: "message",
-        message: {
-          role: "user",
-          content: [
-            { type: "text", text: "Conversation info (untrusted metadata):" },
-            { type: "text", text: "```json" },
-            { type: "text", text: '{"message_id":"msg-100","chat_id":"-100123"}' },
-            { type: "text", text: "```" },
-            { type: "text", text: "" },
-            { type: "text", text: "Sender (untrusted metadata):" },
-            { type: "text", text: "```json" },
-            { type: "text", text: '{"label":"Chris","id":"42"}' },
-            { type: "text", text: "```" },
-            { type: "text", text: "" },
-            { type: "text", text: "Actual user text" },
-          ],
+    const filePath = seedTranscript({
+      sessionId: "enveloped-session-array",
+      events: [
+        {
+          type: "message",
+          message: {
+            role: "user",
+            content: [
+              { type: "text", text: "Conversation info (untrusted metadata):" },
+              { type: "text", text: "```json" },
+              { type: "text", text: '{"message_id":"msg-100","chat_id":"-100123"}' },
+              { type: "text", text: "```" },
+              { type: "text", text: "" },
+              { type: "text", text: "Sender (untrusted metadata):" },
+              { type: "text", text: "```json" },
+              { type: "text", text: '{"label":"Chris","id":"42"}' },
+              { type: "text", text: "```" },
+              { type: "text", text: "" },
+              { type: "text", text: "Actual user text" },
+            ],
+          },
         },
-      }),
-    ];
-    const filePath = path.join(tmpDir, "enveloped-session-array.jsonl");
-    fsSync.writeFileSync(filePath, jsonlLines.join("\n"));
+      ],
+    });

     const entry = requireSessionEntry(await buildSessionEntry(filePath));
     expect(entry.content).toBe("User: Actual user text");
   });

   it("skips inter-session user messages", async () => {
-    const jsonlLines = [
-      JSON.stringify({
-        type: "message",
-        message: {
-          role: "user",
-          content: "A background task completed. Internal relay text.",
-          provenance: { kind: "inter_session", sourceTool: "subagent_announce" },
+    const filePath = seedTranscript({
+      sessionId: "inter-session-session",
+      events: [
+        {
+          type: "message",
+          message: {
+            role: "user",
+            content: "A background task completed. Internal relay text.",
+            provenance: { kind: "inter_session", sourceTool: "subagent_announce" },
+          },
         },
-      }),
-      JSON.stringify({
-        type: "message",
-        message: { role: "assistant", content: "User-facing summary." },
-      }),
-      JSON.stringify({
-        type: "message",
-        message: { role: "user", content: "Actual user follow-up." },
-      }),
-    ];
-    const filePath = path.join(tmpDir, "inter-session-session.jsonl");
-    fsSync.writeFileSync(filePath, jsonlLines.join("\n"));
+        {
+          type: "message",
+          message: { role: "assistant", content: "User-facing summary." }
|
||||
},
|
||||
{
|
||||
type: "message",
|
||||
message: { role: "user", content: "Actual user follow-up." },
|
||||
},
|
||||
],
|
||||
});
|
||||
|
||||
const entry = requireSessionEntry(await buildSessionEntry(filePath));
|
||||
expect(entry.content).toBe("Assistant: User-facing summary.\nUser: Actual user follow-up.");
|
||||
|
||||
@@ -1,7 +1,4 @@
import fsSync from "node:fs";
import fs from "node:fs/promises";
import path from "node:path";
import { readRegularFile, statRegularFile } from "./fs-utils.js";
import { hashText } from "./hash.js";
import { createSubsystemLogger, redactSensitiveText } from "./openclaw-runtime-io.js";
import {
@@ -13,7 +10,9 @@ import {
isExecCompletionEvent,
isHeartbeatUserMessage,
isSilentReplyPayloadText,
isUsageCountedSessionTranscriptFileName,
listSqliteSessionTranscripts,
loadSqliteSessionTranscriptEvents,
resolveSqliteSessionTranscriptScopeForPath,
parseUsageCountedSessionIdFromFileName,
resolveSessionTranscriptsDirForAgent,
stripInboundMetadata,
@@ -57,11 +56,6 @@ export type SessionTranscriptClassification = {
cronRunTranscriptPaths: ReadonlySet<string>;
};

type SessionTranscriptStoreEntry = {
sessionFile?: unknown;
sessionId?: unknown;
};

function shouldSkipTranscriptFileForDreaming(absPath: string): boolean {
const fileName = path.basename(absPath);
// Compaction checkpoints are always skipped: they are derived snapshots of an
@@ -126,20 +120,6 @@ function isDreamingNarrativeGeneratedRecord(record: unknown): boolean {
return hasDreamingNarrativeRunId(nested.runId) || hasDreamingNarrativeRunId(nested.sessionKey);
}

function isDreamingNarrativeSessionStoreKey(sessionKey: string): boolean {
const trimmed = sessionKey.trim();
if (!trimmed) {
return false;
}
const firstSeparator = trimmed.indexOf(":");
if (firstSeparator < 0) {
return trimmed.startsWith(DREAMING_NARRATIVE_RUN_PREFIX);
}
const secondSeparator = trimmed.indexOf(":", firstSeparator + 1);
const sessionSegment = secondSeparator < 0 ? trimmed : trimmed.slice(secondSeparator + 1);
return sessionSegment.startsWith(DREAMING_NARRATIVE_RUN_PREFIX);
}

function hasCronRunSessionKey(value: unknown): boolean {
return typeof value === "string" && isCronRunSessionKey(value);
}
@@ -173,69 +153,23 @@ export function normalizeSessionTranscriptPathForComparison(pathname: string): s
return normalizeComparablePath(pathname);
}

function resolveSessionStoreTranscriptPath(
sessionsDir: string,
entry: { sessionFile?: unknown; sessionId?: unknown } | undefined,
): string | null {
if (typeof entry?.sessionFile === "string" && entry.sessionFile.trim().length > 0) {
const sessionFile = entry.sessionFile.trim();
const resolved = path.isAbsolute(sessionFile)
? sessionFile
: path.resolve(sessionsDir, sessionFile);
return normalizeComparablePath(resolved);
}
if (typeof entry?.sessionId === "string" && entry.sessionId.trim().length > 0) {
return normalizeComparablePath(path.join(sessionsDir, `${entry.sessionId.trim()}.jsonl`));
}
return null;
}

export function loadDreamingNarrativeTranscriptPathSetForSessionsDir(
sessionsDir: string,
): ReadonlySet<string> {
return loadSessionTranscriptClassificationForSessionsDir(sessionsDir)
.dreamingNarrativeTranscriptPaths;
void sessionsDir;
return new Set<string>();
}

export function loadSessionTranscriptClassificationForSessionsDir(
sessionsDir: string,
): SessionTranscriptClassification {
const storePath = path.join(sessionsDir, "sessions.json");
const store = readSessionTranscriptClassificationStore(storePath);
const dreamingTranscriptPaths = new Set<string>();
const cronRunTranscriptPaths = new Set<string>();
for (const [sessionKey, entry] of Object.entries(store)) {
const transcriptPath = resolveSessionStoreTranscriptPath(sessionsDir, entry);
if (!transcriptPath) {
continue;
}
if (isDreamingNarrativeSessionStoreKey(sessionKey)) {
dreamingTranscriptPaths.add(transcriptPath);
}
if (isCronRunSessionKey(sessionKey)) {
cronRunTranscriptPaths.add(transcriptPath);
}
}
void sessionsDir;
return {
dreamingNarrativeTranscriptPaths: dreamingTranscriptPaths,
cronRunTranscriptPaths,
dreamingNarrativeTranscriptPaths: new Set<string>(),
cronRunTranscriptPaths: new Set<string>(),
};
}

function readSessionTranscriptClassificationStore(
storePath: string,
): Record<string, SessionTranscriptStoreEntry> {
try {
const parsed = JSON.parse(fsSync.readFileSync(storePath, "utf-8")) as unknown;
if (!parsed || typeof parsed !== "object" || Array.isArray(parsed)) {
return {};
}
return parsed as Record<string, SessionTranscriptStoreEntry>;
} catch {
return {};
}
}

export function loadDreamingNarrativeTranscriptPathSetForAgent(
agentId: string,
): ReadonlySet<string> {
@@ -245,39 +179,18 @@ export function loadDreamingNarrativeTranscriptPathSetForAgent(
export function loadSessionTranscriptClassificationForAgent(
agentId: string,
): SessionTranscriptClassification {
return loadSessionTranscriptClassificationForSessionsDir(
resolveSessionTranscriptsDirForAgent(agentId),
);
}

function classifySessionTranscriptFromSessionStore(absPath: string): {
generatedByDreamingNarrative: boolean;
generatedByCronRun: boolean;
} {
const sessionsDir = path.dirname(absPath);
const normalizedAbsPath = normalizeComparablePath(absPath);
const classification = loadSessionTranscriptClassificationForSessionsDir(sessionsDir);
const hasClassifiedPath = (paths: ReadonlySet<string>) => paths.has(normalizedAbsPath);
void agentId;
return {
generatedByDreamingNarrative: hasClassifiedPath(
classification.dreamingNarrativeTranscriptPaths,
),
generatedByCronRun: hasClassifiedPath(classification.cronRunTranscriptPaths),
dreamingNarrativeTranscriptPaths: new Set<string>(),
cronRunTranscriptPaths: new Set<string>(),
};
}

export async function listSessionFilesForAgent(agentId: string): Promise<string[]> {
const dir = resolveSessionTranscriptsDirForAgent(agentId);
try {
const entries = await fs.readdir(dir, { withFileTypes: true });
return entries
.filter((entry) => entry.isFile())
.map((entry) => entry.name)
.filter((name) => isUsageCountedSessionTranscriptFileName(name))
.map((name) => path.join(dir, name));
} catch {
return [];
}
return listSqliteSessionTranscripts({ agentId }).map(
(transcript) => transcript.path ?? path.join(dir, `${transcript.sessionId}.jsonl`),
);
}

function extractAgentIdFromSessionPath(absPath: string): string | null {
@@ -289,6 +202,10 @@ function extractAgentIdFromSessionPath(absPath: string): string | null {
return parts[sessionsIndex - 1] || null;
}

function resolveSessionIdFromTranscriptPath(absPath: string): string | null {
return parseUsageCountedSessionIdFromFileName(path.basename(absPath));
}

export function sessionPathForFile(absPath: string): string {
const agentId = extractAgentIdFromSessionPath(absPath);
return path
@@ -497,52 +414,48 @@ export async function buildSessionEntry(
opts: BuildSessionEntryOptions = {},
): Promise<SessionFileEntry | null> {
try {
const regularFile = await statRegularFile(absPath);
if (regularFile.missing) {
const transcriptPath = path.resolve(absPath);
const rememberedScope = resolveSqliteSessionTranscriptScopeForPath({ transcriptPath });
const agentId = rememberedScope?.agentId ?? extractAgentIdFromSessionPath(transcriptPath);
const sessionId =
rememberedScope?.sessionId ?? resolveSessionIdFromTranscriptPath(transcriptPath);
if (!agentId || !sessionId) {
return null;
}
const stat = regularFile.stat;
const transcriptEvents = loadSqliteSessionTranscriptEvents({ agentId, sessionId });
if (transcriptEvents.length === 0) {
return null;
}
const mtimeMs = Math.max(0, ...transcriptEvents.map((entry) => entry.createdAt));
const size = transcriptEvents.reduce(
(total, entry) => total + JSON.stringify(entry.event).length + 1,
0,
);
if (shouldSkipTranscriptFileForDreaming(absPath)) {
return {
path: sessionPathForFile(absPath),
absPath,
mtimeMs: stat.mtimeMs,
size: stat.size,
mtimeMs,
size,
hash: hashText("\n\n"),
content: "",
lineMap: [],
messageTimestampsMs: [],
};
}
const raw = (await readRegularFile({ filePath: absPath })).buffer.toString("utf-8");
const lines = raw.split("\n");
const collected: string[] = [];
const lineMap: number[] = [];
const messageTimestampsMs: number[] = [];
const sessionStoreClassification =
opts.generatedByDreamingNarrative === undefined || opts.generatedByCronRun === undefined
? classifySessionTranscriptFromSessionStore(absPath)
: null;
let generatedByDreamingNarrative =
opts.generatedByDreamingNarrative ??
sessionStoreClassification?.generatedByDreamingNarrative ??
false;
let generatedByCronRun =
opts.generatedByCronRun ?? sessionStoreClassification?.generatedByCronRun ?? false;
for (let jsonlIdx = 0; jsonlIdx < lines.length; jsonlIdx++) {
const line = lines[jsonlIdx];
if (!line.trim()) {
continue;
}
let record: unknown;
try {
record = JSON.parse(line);
} catch {
continue;
}
let generatedByDreamingNarrative = opts.generatedByDreamingNarrative ?? false;
let generatedByCronRun = opts.generatedByCronRun ?? false;
for (const transcriptEvent of transcriptEvents) {
const record = transcriptEvent.event;
if (!generatedByDreamingNarrative && isDreamingNarrativeGeneratedRecord(record)) {
generatedByDreamingNarrative = true;
}
if (!generatedByCronRun && isCronRunGeneratedRecord(record)) {
generatedByCronRun = true;
}
if (
!record ||
typeof record !== "object" ||
@@ -588,15 +501,15 @@ export async function buildSessionEntry(
message as { timestamp?: unknown },
);
collected.push(...renderedLines);
lineMap.push(...renderedLines.map(() => jsonlIdx + 1));
lineMap.push(...renderedLines.map(() => transcriptEvent.seq + 1));
messageTimestampsMs.push(...renderedLines.map(() => timestampMs));
}
const content = collected.join("\n");
return {
path: sessionPathForFile(absPath),
absPath,
mtimeMs: stat.mtimeMs,
size: stat.size,
mtimeMs,
size,
hash: hashText(content + "\n" + lineMap.join(",") + "\n" + messageTimestampsMs.join(",")),
content,
lineMap,

@@ -1,14 +1,17 @@
import { readFile } from "node:fs/promises";
import {
resolveSessionFilePath,
resolveSessionFilePathOptions,
type SessionEntry as StoredSessionEntry,
} from "../config/sessions.js";
import {
loadSqliteSessionTranscriptEvents,
resolveSqliteSessionTranscriptScope,
} from "../config/sessions/transcript-store.sqlite.js";
import { diagnosticLogger as diag } from "../logging/diagnostic.js";
import {
buildSessionContext,
migrateSessionEntries,
parseSessionEntries,
type FileEntry,
type SessionEntry as PiSessionEntry,
} from "./transcript/session-transcript-contract.js";

@@ -105,7 +108,16 @@ export async function readBtwTranscriptMessages(params: {
snapshotLeafId?: string | null;
}): Promise<unknown[]> {
try {
const entries = parseSessionEntries(await readFile(params.sessionFile, "utf-8"));
const scope = resolveSqliteSessionTranscriptScope({
sessionId: params.sessionId,
transcriptPath: params.sessionFile,
});
if (!scope) {
return [];
}
const entries = loadSqliteSessionTranscriptEvents(scope)
.map((entry) => entry.event)
.filter((entry): entry is FileEntry => Boolean(entry && typeof entry === "object"));
migrateSessionEntries(entries);
const sessionEntries = entries.filter(
(entry): entry is PiSessionEntry => entry.type !== "session",

@@ -38,6 +38,16 @@ vi.mock("node:fs/promises", () => ({
readFile: (...args: unknown[]) => readFileMock(...args),
}));

vi.mock("../config/sessions/transcript-store.sqlite.js", () => ({
resolveSqliteSessionTranscriptScope: () => ({ agentId: "main", sessionId: "session-1" }),
loadSqliteSessionTranscriptEvents: () =>
(parseSessionEntriesMock() as unknown[]).map((event, seq) => ({
seq,
event,
createdAt: seq + 1,
})),
}));

vi.mock("./transcript/session-transcript-contract.js", () => ({
buildSessionContext: (...args: unknown[]) => buildSessionContextMock(...args),
CURRENT_SESSION_VERSION: 3,

@@ -2,6 +2,8 @@ import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it, vi } from "vitest";
import { replaceSqliteSessionTranscriptEvents } from "../../config/sessions/transcript-store.sqlite.js";
import { closeOpenClawStateDatabaseForTest } from "../../state/openclaw-state-db.js";
import { CURRENT_SESSION_VERSION } from "../transcript/session-transcript-contract.js";
import {
buildCliSessionHistoryPrompt,
@@ -27,35 +29,35 @@ function createSessionTranscript(params: {
"sessions",
`${params.sessionId}.jsonl`,
);
fs.mkdirSync(path.dirname(sessionFile), { recursive: true });
fs.writeFileSync(
sessionFile,
`${JSON.stringify({
const events: unknown[] = [
{
type: "session",
version: CURRENT_SESSION_VERSION,
id: params.sessionId,
timestamp: new Date(0).toISOString(),
cwd: params.rootDir,
})}\n`,
"utf-8",
);
},
];
for (const [index, message] of (params.messages ?? []).entries()) {
fs.appendFileSync(
sessionFile,
`${JSON.stringify({
type: "message",
id: `msg-${index}`,
parentId: index > 0 ? `msg-${index - 1}` : null,
timestamp: new Date(index + 1).toISOString(),
message: {
role: "user",
content: message,
timestamp: index + 1,
},
})}\n`,
"utf-8",
);
events.push({
type: "message",
id: `msg-${index}`,
parentId: index > 0 ? `msg-${index - 1}` : null,
timestamp: new Date(index + 1).toISOString(),
message: {
role: "user",
content: message,
timestamp: index + 1,
},
});
}
replaceSqliteSessionTranscriptEvents({
agentId: params.agentId ?? "main",
sessionId: params.sessionId,
transcriptPath: sessionFile,
events,
now: () => 1_770_000_000_000,
});
return sessionFile;
}

@@ -80,8 +82,51 @@ function expectCompactionSummary(value: unknown, summary: string) {
expect(message.summary).toBe(summary);
}

function appendSessionTranscriptEvents(params: {
sessionId: string;
sessionFile: string;
agentId?: string;
events: unknown[];
}): void {
replaceSqliteSessionTranscriptEvents({
agentId: params.agentId ?? "main",
sessionId: params.sessionId,
transcriptPath: params.sessionFile,
events: params.events,
now: () => 1_770_000_000_000,
});
}

function createSessionTranscriptEvents(params: {
rootDir: string;
sessionId: string;
messages?: string[];
}) {
return [
{
type: "session",
version: CURRENT_SESSION_VERSION,
id: params.sessionId,
timestamp: new Date(0).toISOString(),
cwd: params.rootDir,
},
...(params.messages ?? []).map((message, index) => ({
type: "message",
id: `msg-${index}`,
parentId: index > 0 ? `msg-${index - 1}` : null,
timestamp: new Date(index + 1).toISOString(),
message: {
role: "user",
content: message,
timestamp: index + 1,
},
})),
];
}

describe("loadCliSessionHistoryMessages", () => {
afterEach(() => {
closeOpenClawStateDatabaseForTest();
vi.unstubAllEnvs();
});

@@ -94,12 +139,7 @@ describe("loadCliSessionHistoryMessages", () => {
sessionId: "session-test",
messages: ["expected history"],
});
const outsideFile = createSessionTranscript({
rootDir: outsideDir,
sessionId: "session-test",
filePath: path.join(outsideDir, "stolen.jsonl"),
messages: ["stolen history"],
});
const outsideFile = path.join(outsideDir, "stolen.jsonl");

try {
const history = await loadCliSessionHistoryMessages({
@@ -157,15 +197,13 @@ describe("loadCliSessionHistoryMessages", () => {
"sessions",
"session-symlink.jsonl",
);
const outsideFile = createSessionTranscript({
createSessionTranscript({
rootDir: outsideDir,
sessionId: "session-symlink",
agentId: "other",
filePath: path.join(outsideDir, "outside.jsonl"),
messages: ["stolen history"],
});
fs.mkdirSync(path.dirname(canonicalSessionFile), { recursive: true });
fs.symlinkSync(outsideFile, canonicalSessionFile);

try {
expect(
await loadCliSessionHistoryMessages({
@@ -191,8 +229,12 @@ describe("loadCliSessionHistoryMessages", () => {
"sessions",
"session-oversized.jsonl",
);
fs.mkdirSync(path.dirname(sessionFile), { recursive: true });
fs.writeFileSync(sessionFile, "x".repeat(MAX_CLI_SESSION_HISTORY_FILE_BYTES + 1), "utf-8");
createSessionTranscript({
rootDir: stateDir,
sessionId: "session-oversized",
filePath: sessionFile,
messages: ["x".repeat(MAX_CLI_SESSION_HISTORY_FILE_BYTES + 1)],
});

try {
expect(
@@ -213,7 +255,6 @@ describe("loadCliSessionHistoryMessages", () => {
const customStoreDir = fs.mkdtempSync(path.join(os.tmpdir(), "openclaw-cli-store-"));
vi.stubEnv("OPENCLAW_STATE_DIR", stateDir);
const storePath = path.join(customStoreDir, "sessions.json");
fs.writeFileSync(storePath, "{}", "utf-8");
const sessionFile = createSessionTranscript({
rootDir: customStoreDir,
sessionId: "session-custom-store",
@@ -244,6 +285,7 @@ describe("loadCliSessionHistoryMessages", () => {

describe("loadCliSessionReseedMessages", () => {
afterEach(() => {
closeOpenClawStateDatabaseForTest();
vi.unstubAllEnvs();
});

@@ -348,34 +390,37 @@ describe("loadCliSessionReseedMessages", () => {
sessionId: "session-compacted",
messages: ["pre-compaction raw history"],
});
fs.appendFileSync(
appendSessionTranscriptEvents({
sessionId: "session-compacted",
sessionFile,
`${JSON.stringify({
type: "compaction",
id: "compaction-1",
parentId: "msg-0",
timestamp: new Date(2).toISOString(),
summary: "safe compacted summary",
firstKeptEntryId: "msg-0",
tokensBefore: 10_000,
})}\n`,
"utf-8",
);
fs.appendFileSync(
sessionFile,
`${JSON.stringify({
type: "message",
id: "msg-1",
parentId: "compaction-1",
timestamp: new Date(3).toISOString(),
message: {
role: "user",
content: "post-compaction ask",
timestamp: 3,
events: [
...createSessionTranscriptEvents({
rootDir: stateDir,
sessionId: "session-compacted",
messages: ["pre-compaction raw history"],
}),
{
type: "compaction",
id: "compaction-1",
parentId: "msg-0",
timestamp: new Date(2).toISOString(),
summary: "safe compacted summary",
firstKeptEntryId: "msg-0",
tokensBefore: 10_000,
},
})}\n`,
"utf-8",
);
{
type: "message",
id: "msg-1",
parentId: "compaction-1",
timestamp: new Date(3).toISOString(),
message: {
role: "user",
content: "post-compaction ask",
timestamp: 3,
},
},
],
});

try {
const reseed = await loadCliSessionReseedMessages({

@@ -1,11 +1,13 @@
import fsp from "node:fs/promises";
import path from "node:path";
import {
resolveSessionFilePath,
resolveSessionFilePathOptions,
} from "../../config/sessions/paths.js";
import {
loadSqliteSessionTranscriptEvents,
resolveSqliteSessionTranscriptScope,
} from "../../config/sessions/transcript-store.sqlite.js";
import type { OpenClawConfig } from "../../config/types.openclaw.js";
import { isPathInside } from "../../infra/path-guards.js";
import { resolveSessionAgentIds } from "../agent-scope.js";
import {
limitAgentHookHistoryMessages,
@@ -13,7 +15,7 @@ import {
} from "../harness/hook-history.js";
import {
migrateSessionEntries,
parseSessionEntries,
type FileEntry,
} from "../transcript/session-transcript-contract.js";

export const MAX_CLI_SESSION_HISTORY_FILE_BYTES = 5 * 1024 * 1024;
@@ -119,14 +121,6 @@ export function buildCliSessionHistoryPrompt(params: {
].join("\n");
}

async function safeRealpath(filePath: string): Promise<string | undefined> {
try {
return await fsp.realpath(filePath);
} catch {
return undefined;
}
}

function resolveSafeCliSessionFile(params: {
sessionId: string;
sessionFile: string;
@@ -162,25 +156,21 @@ async function loadCliSessionEntries(params: {
config?: OpenClawConfig;
}): Promise<unknown[]> {
try {
const { sessionFile, sessionsDir } = resolveSafeCliSessionFile(params);
const entryStat = await fsp.lstat(sessionFile);
if (!entryStat.isFile() || entryStat.isSymbolicLink()) {
const { sessionFile } = resolveSafeCliSessionFile(params);
const scope = resolveSqliteSessionTranscriptScope({
agentId: params.agentId,
sessionId: params.sessionId,
transcriptPath: sessionFile,
});
if (!scope) {
return [];
}
const realSessionsDir = (await safeRealpath(sessionsDir)) ?? path.resolve(sessionsDir);
const realSessionFile = await safeRealpath(sessionFile);
if (
!realSessionFile ||
realSessionFile === realSessionsDir ||
!isPathInside(realSessionsDir, realSessionFile)
) {
const entries = loadSqliteSessionTranscriptEvents(scope)
.map((entry) => entry.event)
.filter((entry): entry is FileEntry => Boolean(entry && typeof entry === "object"));
if (JSON.stringify(entries).length > MAX_CLI_SESSION_HISTORY_FILE_BYTES) {
return [];
}
const stat = await fsp.stat(realSessionFile);
if (!stat.isFile() || stat.size > MAX_CLI_SESSION_HISTORY_FILE_BYTES) {
return [];
}
const entries = parseSessionEntries(await fsp.readFile(realSessionFile, "utf-8"));
migrateSessionEntries(entries);
return entries.filter((entry) => entry.type !== "session");
} catch {

@@ -1,5 +1,4 @@
import { randomUUID } from "node:crypto";
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import {
@@ -56,10 +55,6 @@ function resolveDefaultSessionDir(cwd: string): string {
return path.join(os.homedir(), ".openclaw", "sessions", encodeSessionCwd(cwd));
}

function ensureDirSync(dir: string): void {
fs.mkdirSync(dir, { recursive: true, mode: 0o700 });
}

function resolveAgentIdFromSessionPath(sessionFile: string): string {
const resolved = path.resolve(sessionFile);
const sessionsDir = path.dirname(resolved);
@@ -128,12 +123,6 @@ function loadTranscriptState(params: { sessionFile: string; sessionId?: string;
return { state: createTranscriptStateFromEvents(sqliteEvents), scope };
}

if (fs.existsSync(params.sessionFile) && fs.statSync(params.sessionFile).size > 0) {
throw new Error(
`Legacy transcript has not been imported into SQLite: ${params.sessionFile}. Run "openclaw doctor --fix" to build the session database.`,
);
}

const header = createSessionHeader({
id: params.sessionId,
cwd: params.cwd ?? process.cwd(),
@@ -317,7 +306,6 @@ export class TranscriptSessionManager implements SessionManager {

static create(cwd: string, sessionDir?: string): TranscriptSessionManager {
const dir = path.resolve(sessionDir ?? resolveDefaultSessionDir(cwd));
ensureDirSync(dir);
const header = createSessionHeader({ cwd });
const sessionFile = path.join(dir, createSessionFileName(header));
const sqliteScope = {
@@ -348,7 +336,6 @@ export class TranscriptSessionManager implements SessionManager {

static continueRecent(cwd: string, sessionDir?: string): TranscriptSessionManager {
const dir = path.resolve(sessionDir ?? resolveDefaultSessionDir(cwd));
ensureDirSync(dir);
const newestSqlite = listSqliteSessionTranscriptFiles()
.filter((entry) => path.dirname(path.resolve(entry.path)) === dir)
.toSorted((a, b) => b.updatedAt - a.updatedAt)[0];
@@ -374,7 +361,6 @@ export class TranscriptSessionManager implements SessionManager {
loadSqliteSessionTranscriptEvents(sourceScope).map((entry) => entry.event),
);
const dir = path.resolve(sessionDir ?? resolveDefaultSessionDir(targetCwd));
ensureDirSync(dir);
const header = createSessionHeader({
cwd: targetCwd,
parentSession: sourceFile,

@@ -107,6 +107,7 @@ export async function resetReplyRunSession(params: {
sourceAgentId: agentId,
sourceSessionId: prevEntry.sessionId,
sourceTranscript: prevEntry.sessionFile,
targetAgentId: agentId,
targetTranscript: nextSessionFile,
newSessionId: nextSessionId,
});

@@ -2,13 +2,19 @@ import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import {
loadSqliteSessionTranscriptEvents,
replaceSqliteSessionTranscriptEvents,
} from "../../config/sessions/transcript-store.sqlite.js";
import type { SessionEntry } from "../../config/sessions/types.js";
import { closeOpenClawStateDatabaseForTest } from "../../state/openclaw-state-db.js";
import {
forkSessionFromParentRuntime,
resolveParentForkTokenCountRuntime,
} from "./session-fork.runtime.js";

const roots: string[] = [];
let originalStateDir: string | undefined;

async function makeRoot(prefix: string): Promise<string> {
const root = await fs.mkdtemp(path.join(os.tmpdir(), prefix));
@@ -17,37 +23,69 @@ async function makeRoot(prefix: string): Promise<string> {
}

afterEach(async () => {
closeOpenClawStateDatabaseForTest();
if (originalStateDir === undefined) {
delete process.env.OPENCLAW_STATE_DIR;
} else {
process.env.OPENCLAW_STATE_DIR = originalStateDir;
}
originalStateDir = undefined;
await Promise.all(roots.splice(0).map((root) => fs.rm(root, { recursive: true, force: true })));
});

function useStateRoot(root: string): void {
originalStateDir ??= process.env.OPENCLAW_STATE_DIR;
process.env.OPENCLAW_STATE_DIR = root;
}

function seedTranscript(params: {
agentId?: string;
sessionId: string;
transcriptPath: string;
events: unknown[];
}): void {
replaceSqliteSessionTranscriptEvents({
agentId: params.agentId ?? "main",
sessionId: params.sessionId,
transcriptPath: params.transcriptPath,
events: params.events,
now: () => 1_770_000_000_000,
});
}

function readTranscript(agentId: string, sessionId: string): unknown[] {
return loadSqliteSessionTranscriptEvents({ agentId, sessionId }).map((entry) => entry.event);
}

describe("resolveParentForkTokenCountRuntime", () => {
it("falls back to recent transcript usage when cached totals are stale", async () => {
const root = await makeRoot("openclaw-parent-fork-token-estimate-");
useStateRoot(root);
const sessionsDir = path.join(root, "sessions");
await fs.mkdir(sessionsDir);

const sessionId = "parent-overflow-transcript";
const sessionFile = path.join(sessionsDir, "parent.jsonl");
const lines = [
JSON.stringify({
const sessionFile = path.join(sessionsDir, `${sessionId}.jsonl`);
const events: unknown[] = [
{
type: "session",
version: 3,
id: sessionId,
timestamp: new Date().toISOString(),
cwd: process.cwd(),
}),
},
];
for (let index = 0; index < 40; index += 1) {
const body = `turn-${index} ${"x".repeat(200)}`;
lines.push(
JSON.stringify({
events.push(
{
type: "message",
id: `u${index}`,
parentId: index === 0 ? null : `a${index - 1}`,
timestamp: new Date().toISOString(),
message: { role: "user", content: body },
}),
JSON.stringify({
},
{
type: "message",
id: `a${index}`,
parentId: `u${index}`,
@@ -57,10 +95,10 @@ describe("resolveParentForkTokenCountRuntime", () => {
content: body,
usage: index === 39 ? { input: 90_000, output: 20_000 } : undefined,
},
}),
},
);
}
await fs.writeFile(sessionFile, `${lines.join("\n")}\n`, "utf-8");
seedTranscript({ sessionId, transcriptPath: sessionFile, events });

const entry: SessionEntry = {
sessionId,
@@ -72,7 +110,7 @@ describe("resolveParentForkTokenCountRuntime", () => {

const tokens = await resolveParentForkTokenCountRuntime({
parentEntry: entry,
storePath: path.join(root, "sessions.json"),
|
||||
storePath: path.join(sessionsDir, "sessions.json"),
|
||||
});
|
||||
|
||||
expect(tokens).toBe(110_000);
|
||||
@@ -80,32 +118,31 @@ describe("resolveParentForkTokenCountRuntime", () => {
|
||||
|
||||
it("falls back to a conservative byte estimate when stale parent transcript has no usage", async () => {
|
||||
const root = await makeRoot("openclaw-parent-fork-byte-estimate-");
|
||||
useStateRoot(root);
|
||||
const sessionsDir = path.join(root, "sessions");
|
||||
await fs.mkdir(sessionsDir);
|
||||
|
||||
const sessionId = "parent-no-usage-transcript";
|
||||
const sessionFile = path.join(sessionsDir, "parent.jsonl");
|
||||
const lines = [
|
||||
JSON.stringify({
|
||||
const sessionFile = path.join(sessionsDir, `${sessionId}.jsonl`);
|
||||
const events: unknown[] = [
|
||||
{
|
||||
type: "session",
|
||||
version: 3,
|
||||
id: sessionId,
|
||||
timestamp: new Date().toISOString(),
|
||||
cwd: process.cwd(),
|
||||
}),
|
||||
},
|
||||
];
|
||||
for (let index = 0; index < 24; index += 1) {
|
||||
lines.push(
|
||||
JSON.stringify({
|
||||
type: "message",
|
||||
id: `u${index}`,
|
||||
parentId: index === 0 ? null : `a${index - 1}`,
|
||||
timestamp: new Date().toISOString(),
|
||||
message: { role: "user", content: `turn-${index} ${"x".repeat(24_000)}` },
|
||||
}),
|
||||
);
|
||||
events.push({
|
||||
type: "message",
|
||||
id: `u${index}`,
|
||||
parentId: index === 0 ? null : `a${index - 1}`,
|
||||
timestamp: new Date().toISOString(),
|
||||
message: { role: "user", content: `turn-${index} ${"x".repeat(24_000)}` },
|
||||
});
|
||||
}
|
||||
await fs.writeFile(sessionFile, `${lines.join("\n")}\n`, "utf-8");
|
||||
seedTranscript({ sessionId, transcriptPath: sessionFile, events });
|
||||
|
||||
const entry: SessionEntry = {
|
||||
sessionId,
|
||||
@@ -116,7 +153,7 @@ describe("resolveParentForkTokenCountRuntime", () => {
|
||||
|
||||
const tokens = await resolveParentForkTokenCountRuntime({
|
||||
parentEntry: entry,
|
||||
storePath: path.join(root, "sessions.json"),
|
||||
storePath: path.join(sessionsDir, "sessions.json"),
|
||||
});
|
||||
|
||||
expect(tokens).toBeGreaterThan(100_000);
|
||||
@@ -124,38 +161,39 @@ describe("resolveParentForkTokenCountRuntime", () => {
|
||||
|
||||
it("uses the latest usage snapshot instead of tail aggregates for parent fork checks", async () => {
|
||||
const root = await makeRoot("openclaw-parent-fork-latest-usage-");
|
||||
useStateRoot(root);
|
||||
const sessionsDir = path.join(root, "sessions");
|
||||
await fs.mkdir(sessionsDir);
|
||||
|
||||
const sessionId = "parent-multiple-usage-transcript";
|
||||
const sessionFile = path.join(sessionsDir, "parent.jsonl");
|
||||
await fs.writeFile(
|
||||
sessionFile,
|
||||
[
|
||||
JSON.stringify({
|
||||
const sessionFile = path.join(sessionsDir, `${sessionId}.jsonl`);
|
||||
seedTranscript({
|
||||
sessionId,
|
||||
transcriptPath: sessionFile,
|
||||
events: [
|
||||
{
|
||||
type: "session",
|
||||
version: 3,
|
||||
id: sessionId,
|
||||
timestamp: new Date().toISOString(),
|
||||
cwd: process.cwd(),
|
||||
}),
|
||||
JSON.stringify({
|
||||
},
|
||||
{
|
||||
message: {
|
||||
role: "assistant",
|
||||
content: "older",
|
||||
usage: { input: 60_000, output: 5_000 },
|
||||
},
|
||||
}),
|
||||
JSON.stringify({
|
||||
},
|
||||
{
|
||||
message: {
|
||||
role: "assistant",
|
||||
content: "latest",
|
||||
usage: { input: 70_000, output: 8_000 },
|
||||
},
|
||||
}),
|
||||
].join("\n"),
|
||||
"utf-8",
|
||||
);
|
||||
},
|
||||
],
|
||||
});
|
||||
|
||||
const entry: SessionEntry = {
|
||||
sessionId,
|
||||
@@ -166,7 +204,7 @@ describe("resolveParentForkTokenCountRuntime", () => {
|
||||
|
||||
const tokens = await resolveParentForkTokenCountRuntime({
|
||||
parentEntry: entry,
|
||||
storePath: path.join(root, "sessions.json"),
|
||||
storePath: path.join(sessionsDir, "sessions.json"),
|
||||
});
|
||||
|
||||
expect(tokens).toBe(78_000);
|
||||
@@ -174,37 +212,38 @@ describe("resolveParentForkTokenCountRuntime", () => {
|
||||
|
||||
it("keeps parent fork checks conservative for content appended after latest usage", async () => {
|
||||
const root = await makeRoot("openclaw-parent-fork-post-usage-tail-");
|
||||
useStateRoot(root);
|
||||
const sessionsDir = path.join(root, "sessions");
|
||||
await fs.mkdir(sessionsDir);
|
||||
|
||||
const sessionId = "parent-post-usage-tail";
|
||||
const sessionFile = path.join(sessionsDir, "parent.jsonl");
|
||||
await fs.writeFile(
|
||||
sessionFile,
|
||||
[
|
||||
JSON.stringify({
|
||||
const sessionFile = path.join(sessionsDir, `${sessionId}.jsonl`);
|
||||
seedTranscript({
|
||||
sessionId,
|
||||
transcriptPath: sessionFile,
|
||||
events: [
|
||||
{
|
||||
type: "session",
|
||||
version: 3,
|
||||
id: sessionId,
|
||||
timestamp: new Date().toISOString(),
|
||||
cwd: process.cwd(),
|
||||
}),
|
||||
JSON.stringify({
|
||||
},
|
||||
{
|
||||
message: {
|
||||
role: "assistant",
|
||||
content: "latest model call",
|
||||
usage: { input: 40_000, output: 2_000 },
|
||||
},
|
||||
}),
|
||||
JSON.stringify({
|
||||
},
|
||||
{
|
||||
message: {
|
||||
role: "tool",
|
||||
content: `large appended tool result ${"x".repeat(450_000)}`,
|
||||
},
|
||||
}),
|
||||
].join("\n"),
|
||||
"utf-8",
|
||||
);
|
||||
},
|
||||
],
|
||||
});
|
||||
|
||||
const entry: SessionEntry = {
|
||||
sessionId,
|
||||
@@ -215,7 +254,7 @@ describe("resolveParentForkTokenCountRuntime", () => {
|
||||
|
||||
const tokens = await resolveParentForkTokenCountRuntime({
|
||||
parentEntry: entry,
|
||||
storePath: path.join(root, "sessions.json"),
|
||||
storePath: path.join(sessionsDir, "sessions.json"),
|
||||
});
|
||||
|
||||
expect(tokens).toBeGreaterThan(100_000);
|
||||
@@ -225,13 +264,14 @@ describe("resolveParentForkTokenCountRuntime", () => {
|
||||
describe("forkSessionFromParentRuntime", () => {
|
||||
it("forks the active branch without synchronously opening the session manager", async () => {
|
||||
const root = await makeRoot("openclaw-parent-fork-");
|
||||
useStateRoot(root);
|
||||
const sessionsDir = path.join(root, "sessions");
|
||||
await fs.mkdir(sessionsDir);
|
||||
const parentSessionFile = path.join(sessionsDir, "parent.jsonl");
|
||||
const cwd = path.join(root, "workspace");
|
||||
await fs.mkdir(cwd);
|
||||
const parentSessionId = "parent-session";
|
||||
const lines = [
|
||||
const parentSessionFile = path.join(sessionsDir, `${parentSessionId}.jsonl`);
|
||||
const events = [
|
||||
{
|
||||
type: "session",
|
||||
version: 3,
|
||||
@@ -270,11 +310,7 @@ describe("forkSessionFromParentRuntime", () => {
|
||||
label: "start",
|
||||
},
|
||||
];
|
||||
await fs.writeFile(
|
||||
parentSessionFile,
|
||||
`${lines.map((entry) => JSON.stringify(entry)).join("\n")}\n`,
|
||||
"utf-8",
|
||||
);
|
||||
seedTranscript({ sessionId: parentSessionId, transcriptPath: parentSessionFile, events });
|
||||
|
||||
const fork = await forkSessionFromParentRuntime({
|
||||
parentEntry: {
|
||||
@@ -291,12 +327,11 @@ describe("forkSessionFromParentRuntime", () => {
|
||||
}
|
||||
expect(fork.sessionFile).toContain(sessionsDir);
|
||||
expect(fork.sessionId).not.toBe(parentSessionId);
|
||||
const raw = await fs.readFile(fork.sessionFile, "utf-8");
|
||||
const forkedEntries = raw
|
||||
.trim()
|
||||
.split(/\r?\n/u)
|
||||
.map((line) => JSON.parse(line) as Record<string, unknown>);
|
||||
const resolvedParentSessionFile = await fs.realpath(parentSessionFile);
|
||||
const forkedEntries = readTranscript("main", fork.sessionId) as Array<Record<string, unknown>>;
|
||||
const resolvedParentSessionFile = path.join(
|
||||
await fs.realpath(sessionsDir),
|
||||
`${parentSessionId}.jsonl`,
|
||||
);
|
||||
expect(forkedEntries[0]).toMatchObject({
|
||||
type: "session",
|
||||
id: fork.sessionId,
|
||||
@@ -318,21 +353,24 @@ describe("forkSessionFromParentRuntime", () => {
|
||||
|
||||
it("creates a header-only child when the parent has no entries", async () => {
|
||||
const root = await makeRoot("openclaw-parent-fork-empty-");
|
||||
useStateRoot(root);
|
||||
const sessionsDir = path.join(root, "sessions");
|
||||
await fs.mkdir(sessionsDir);
|
||||
const parentSessionFile = path.join(sessionsDir, "parent.jsonl");
|
||||
const parentSessionId = "parent-empty";
|
||||
await fs.writeFile(
|
||||
parentSessionFile,
|
||||
`${JSON.stringify({
|
||||
type: "session",
|
||||
version: 3,
|
||||
id: parentSessionId,
|
||||
timestamp: "2026-05-01T00:00:00.000Z",
|
||||
cwd: root,
|
||||
})}\n`,
|
||||
"utf-8",
|
||||
);
|
||||
const parentSessionFile = path.join(sessionsDir, `${parentSessionId}.jsonl`);
|
||||
seedTranscript({
|
||||
sessionId: parentSessionId,
|
||||
transcriptPath: parentSessionFile,
|
||||
events: [
|
||||
{
|
||||
type: "session",
|
||||
version: 3,
|
||||
id: parentSessionId,
|
||||
timestamp: "2026-05-01T00:00:00.000Z",
|
||||
cwd: root,
|
||||
},
|
||||
],
|
||||
});
|
||||
|
||||
const fork = await forkSessionFromParentRuntime({
|
||||
parentEntry: {
|
||||
@@ -347,11 +385,13 @@ describe("forkSessionFromParentRuntime", () => {
|
||||
if (!fork) {
|
||||
throw new Error("expected forked session entry");
|
||||
}
|
||||
const raw = await fs.readFile(fork.sessionFile, "utf-8");
|
||||
const lines = raw.trim().split(/\r?\n/u);
|
||||
expect(lines).toHaveLength(1);
|
||||
const resolvedParentSessionFile = await fs.realpath(parentSessionFile);
|
||||
expect(JSON.parse(lines[0] ?? "{}")).toMatchObject({
|
||||
const entries = readTranscript("main", fork.sessionId) as Array<Record<string, unknown>>;
|
||||
expect(entries).toHaveLength(1);
|
||||
const resolvedParentSessionFile = path.join(
|
||||
await fs.realpath(sessionsDir),
|
||||
`${parentSessionId}.jsonl`,
|
||||
);
|
||||
expect(entries[0]).toMatchObject({
|
||||
type: "session",
|
||||
id: fork.sessionId,
|
||||
parentSession: resolvedParentSessionFile,
|
||||
|
@@ -1,10 +1,8 @@
import crypto from "node:crypto";
import fs from "node:fs/promises";
import path from "node:path";
import {
  CURRENT_SESSION_VERSION,
  migrateSessionEntries,
  parseSessionEntries,
  type FileEntry,
  type SessionEntry as PiSessionEntry,
  type SessionHeader,
@@ -15,18 +13,18 @@ import {
  resolveSessionFilePathOptions,
} from "../../config/sessions/paths.js";
import {
  loadSqliteSessionTranscriptEvents,
  replaceSqliteSessionTranscriptEvents,
  resolveSqliteSessionTranscriptScopeForPath,
  resolveSqliteSessionTranscriptScope,
} from "../../config/sessions/transcript-store.sqlite.js";
import {
  resolveFreshSessionTotalTokens,
  type SessionEntry as StoreSessionEntry,
} from "../../config/sessions/types.js";
import { readLatestRecentSessionUsageFromTranscriptAsync } from "../../gateway/session-utils.fs.js";
import { readRegularFile } from "../../infra/fs-safe.js";
import { DEFAULT_AGENT_ID } from "../../routing/session-key.js";

type ForkSourceTranscript = {
  agentId: string;
  cwd: string;
  sessionDir: string;
  leafId: string | null;
@@ -53,7 +51,7 @@ function maxPositiveTokenCount(...values: Array<number | undefined>): number | u
  return max;
}

async function estimateParentTranscriptTokensFromBytes(params: {
async function estimateParentTranscriptTokensFromSqlite(params: {
  parentEntry: StoreSessionEntry;
  storePath: string;
}): Promise<number | undefined> {
@@ -63,8 +61,18 @@ async function estimateParentTranscriptTokensFromBytes(params: {
      params.parentEntry,
      resolveSessionFilePathOptions({ storePath: params.storePath }),
    );
    const stat = await fs.stat(filePath);
    return resolvePositiveTokenCount(Math.ceil(stat.size / FALLBACK_TRANSCRIPT_BYTES_PER_TOKEN));
    const scope = resolveSqliteSessionTranscriptScope({
      sessionId: params.parentEntry.sessionId,
      transcriptPath: filePath,
    });
    if (!scope) {
      return undefined;
    }
    const size = loadSqliteSessionTranscriptEvents(scope).reduce(
      (total, entry) => total + JSON.stringify(entry.event).length + 1,
      0,
    );
    return resolvePositiveTokenCount(Math.ceil(size / FALLBACK_TRANSCRIPT_BYTES_PER_TOKEN));
  } catch {
    return undefined;
  }
@@ -80,7 +88,7 @@ export async function resolveParentForkTokenCountRuntime(params: {
  }

  const cachedTokens = resolvePositiveTokenCount(params.parentEntry.totalTokens);
  const byteEstimateTokens = await estimateParentTranscriptTokensFromBytes(params);
  const byteEstimateTokens = await estimateParentTranscriptTokensFromSqlite(params);
  try {
    const usage = await readLatestRecentSessionUsageFromTranscriptAsync(
      params.parentEntry.sessionId,
@@ -172,11 +180,18 @@ function collectBranchLabels(params: {
  return labelsToWrite;
}

async function readForkSourceTranscript(
  parentSessionFile: string,
): Promise<ForkSourceTranscript | null> {
  const raw = (await readRegularFile({ filePath: parentSessionFile })).buffer.toString("utf-8");
  const fileEntries = parseSessionEntries(raw);
async function readForkSourceTranscript(params: {
  parentSessionFile: string;
  agentId: string;
  sessionId: string;
}): Promise<ForkSourceTranscript | null> {
  const fileEntries = loadSqliteSessionTranscriptEvents({
    agentId: params.agentId,
    sessionId: params.sessionId,
  }).map((entry) => entry.event as FileEntry);
  if (fileEntries.length === 0) {
    return null;
  }
  migrateSessionEntries(fileEntries);
  const header =
    fileEntries.find((entry): entry is SessionHeader => entry.type === "session") ?? null;
@@ -188,8 +203,9 @@ async function readForkSourceTranscript(
    branchEntries.filter((entry) => entry.type !== "label").map((entry) => entry.id),
  );
  return {
    agentId: params.agentId,
    cwd: header?.cwd ?? process.cwd(),
    sessionDir: path.dirname(parentSessionFile),
    sessionDir: path.dirname(params.parentSessionFile),
    leafId,
    branchEntries,
    labelsToWrite: collectBranchLabels({ allEntries: entries, pathEntryIds }),
@@ -221,6 +237,7 @@ function buildBranchLabelEntries(params: {

async function writeForkHeaderOnly(params: {
  parentSessionFile: string;
  agentId: string;
  sessionDir: string;
  cwd: string;
}): Promise<{ sessionId: string; sessionFile: string }> {
@@ -236,11 +253,8 @@ async function writeForkHeaderOnly(params: {
    cwd: params.cwd,
    parentSession: params.parentSessionFile,
  } satisfies SessionHeader;
  const parentScope = resolveSqliteSessionTranscriptScopeForPath({
    transcriptPath: params.parentSessionFile,
  });
  replaceSqliteSessionTranscriptEvents({
    agentId: parentScope?.agentId ?? DEFAULT_AGENT_ID,
    agentId: params.agentId,
    sessionId,
    transcriptPath: sessionFile,
    events: [header],
@@ -276,11 +290,8 @@ async function writeBranchedSession(params: {
    (entry) => entry.type === "message" && entry.message.role === "assistant",
  );
  if (hasAssistant) {
    const parentScope = resolveSqliteSessionTranscriptScopeForPath({
      transcriptPath: params.parentSessionFile,
    });
    replaceSqliteSessionTranscriptEvents({
      agentId: parentScope?.agentId ?? DEFAULT_AGENT_ID,
      agentId: params.source.agentId,
      sessionId,
      transcriptPath: sessionFile,
      events: entries,
@@ -303,7 +314,11 @@ export async function forkSessionFromParentRuntime(params: {
    return null;
  }
  try {
    const source = await readForkSourceTranscript(parentSessionFile);
    const source = await readForkSourceTranscript({
      parentSessionFile,
      agentId: params.agentId,
      sessionId: params.parentEntry.sessionId,
    });
    if (!source) {
      return null;
    }
@@ -311,6 +326,7 @@ export async function forkSessionFromParentRuntime(params: {
      ? await writeBranchedSession({ parentSessionFile, source })
      : await writeForkHeaderOnly({
          parentSessionFile,
          agentId: source.agentId,
          sessionDir: source.sessionDir,
          cwd: source.cwd,
        });

@@ -1,138 +1,140 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";

const sqliteTranscriptMocks = vi.hoisted(() => ({
  exportSqliteSessionTranscriptJsonl: vi.fn(() => ""),
  hasSqliteSessionTranscriptEvents: vi.fn(() => false),
}));

vi.mock("../../config/sessions/transcript-store.sqlite.js", () => ({
  exportSqliteSessionTranscriptJsonl: sqliteTranscriptMocks.exportSqliteSessionTranscriptJsonl,
  hasSqliteSessionTranscriptEvents: sqliteTranscriptMocks.hasSqliteSessionTranscriptEvents,
}));

const { DEFAULT_REPLAY_MAX_MESSAGES, replayRecentUserAssistantMessages } =
  await import("./session-transcript-replay.js");

const j = (obj: unknown): string => `${JSON.stringify(obj)}\n`;

type ReplayRecord = {
  type?: string;
  id?: string;
  message?: {
    role?: string;
    content?: string;
  };
};

async function readJsonlRecords(filePath: string): Promise<ReplayRecord[]> {
  const records: ReplayRecord[] = [];
  const raw = await fs.readFile(filePath, "utf8");
  for (const line of raw.split(/\r?\n/)) {
    if (line.trim().length === 0) {
      continue;
    }
    records.push(JSON.parse(line) as ReplayRecord);
  }
  return records;
}

async function expectPathMissing(targetPath: string): Promise<void> {
  await expect(fs.stat(targetPath)).rejects.toMatchObject({ code: "ENOENT" });
}
import { afterEach, beforeEach, describe, expect, it } from "vitest";
import {
  loadSqliteSessionTranscriptEvents,
  replaceSqliteSessionTranscriptEvents,
} from "../../config/sessions/transcript-store.sqlite.js";
import { closeOpenClawStateDatabaseForTest } from "../../state/openclaw-state-db.js";
import {
  DEFAULT_REPLAY_MAX_MESSAGES,
  replayRecentUserAssistantMessages,
} from "./session-transcript-replay.js";

describe("replayRecentUserAssistantMessages", () => {
  let root = "";
  let originalStateDir: string | undefined;

  beforeEach(async () => {
    root = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-replay-"));
    sqliteTranscriptMocks.exportSqliteSessionTranscriptJsonl.mockReturnValue("");
    sqliteTranscriptMocks.hasSqliteSessionTranscriptEvents.mockReturnValue(false);
    originalStateDir = process.env.OPENCLAW_STATE_DIR;
    process.env.OPENCLAW_STATE_DIR = root;
  });

  afterEach(async () => {
    closeOpenClawStateDatabaseForTest();
    if (originalStateDir === undefined) {
      delete process.env.OPENCLAW_STATE_DIR;
    } else {
      process.env.OPENCLAW_STATE_DIR = originalStateDir;
    }
    await fs.rm(root, { recursive: true, force: true });
  });
  const call = (source: string, target: string): Promise<number> =>

  function seedTranscript(params: {
    agentId?: string;
    sessionId: string;
    transcriptPath?: string;
    events: unknown[];
  }): string {
    const agentId = params.agentId ?? "main";
    const transcriptPath =
      params.transcriptPath ??
      path.join(root, "agents", agentId, "sessions", `${params.sessionId}.jsonl`);
    replaceSqliteSessionTranscriptEvents({
      agentId,
      sessionId: params.sessionId,
      transcriptPath,
      events: params.events,
      now: () => 1_770_000_000_000,
    });
    return transcriptPath;
  }

  function readEvents(agentId = "main", sessionId = "new-session"): unknown[] {
    return loadSqliteSessionTranscriptEvents({ agentId, sessionId }).map((entry) => entry.event);
  }

  const call = (sourceTranscript: string, targetTranscript: string): Promise<number> =>
    replayRecentUserAssistantMessages({
      sourceTranscript: source,
      targetTranscript: target,
      sourceTranscript,
      targetTranscript,
      newSessionId: "new-session",
    });

  it("replays only the user/assistant tail and skips tool/system/malformed records", async () => {
    const source = path.join(root, "prev.jsonl");
    const target = path.join(root, "next.jsonl");
    const lines: string[] = [j({ type: "session", id: "old" })];
    for (let i = 0; i < DEFAULT_REPLAY_MAX_MESSAGES + 4; i += 1) {
      lines.push(j({ message: { role: i % 2 === 0 ? "user" : "assistant", content: `m${i}` } }));
    }
    lines.push(j({ message: { role: "tool" } }));
    lines.push(j({ type: "compaction", timestamp: new Date().toISOString() }));
    lines.push("not-json-line\n");
    await fs.writeFile(source, lines.join(""), "utf8");
  it("replays only the user/assistant tail and skips tool/system records", async () => {
    const source = seedTranscript({
      sessionId: "prev",
      events: [
        { type: "session", id: "old" },
        ...Array.from({ length: DEFAULT_REPLAY_MAX_MESSAGES + 4 }, (_, i) => ({
          message: { role: i % 2 === 0 ? "user" : "assistant", content: `m${i}` },
        })),
        { message: { role: "tool" } },
        { type: "compaction", timestamp: new Date().toISOString() },
      ],
    });
    const target = path.join(root, "agents", "main", "sessions", "next.jsonl");

    expect(await call(source, target)).toBe(DEFAULT_REPLAY_MAX_MESSAGES);
    const records = await readJsonlRecords(target);
    const records = readEvents();
    expect(records[0]).toMatchObject({ type: "session", id: "new-session" });
    expect(records).toHaveLength(1 + DEFAULT_REPLAY_MAX_MESSAGES);
    for (const r of records.slice(1)) {
      expect(["user", "assistant"]).toContain(r.message?.role);
    for (const r of records.slice(1) as Array<{ message: { role: string } }>) {
      expect(["user", "assistant"]).toContain(r.message.role);
    }
    expect(await call(path.join(root, "missing.jsonl"), path.join(root, "out.jsonl"))).toBe(0);
    expect(await call(path.join(root, "missing.jsonl"), target)).toBe(0);

    const assistantSource = path.join(root, "all-assistant.jsonl");
    const assistantTarget = path.join(root, "all-assistant-out.jsonl");
    const onlyAssistants = Array.from({ length: 3 }, () =>
      j({ message: { role: "assistant", content: "x" } }),
    ).join("");
    await fs.writeFile(assistantSource, onlyAssistants, "utf8");
    expect(await call(assistantSource, assistantTarget)).toBe(0);
    await expectPathMissing(assistantTarget);
    const assistantSource = seedTranscript({
      sessionId: "all-assistant",
      events: Array.from({ length: 3 }, () => ({
        message: { role: "assistant", content: "x" },
      })),
    });
    expect(
      await call(assistantSource, path.join(root, "agents", "main", "sessions", "out.jsonl")),
    ).toBe(0);
    expect(readEvents("main", "new-session")).toHaveLength(1 + DEFAULT_REPLAY_MAX_MESSAGES);
  });

  it("skips header for pre-existing targets and aligns the tail to a user turn", async () => {
    const source = path.join(root, "prev.jsonl");
    const target = path.join(root, "next.jsonl");
    await fs.writeFile(target, j({ type: "session", id: "existing" }), "utf8");
    const lines: string[] = [];
    for (let i = 0; i < DEFAULT_REPLAY_MAX_MESSAGES + 1; i += 1) {
      lines.push(j({ message: { role: i % 2 === 0 ? "user" : "assistant", content: `m${i}` } }));
    }
    await fs.writeFile(source, lines.join(""), "utf8");
  it("keeps a pre-existing target header and aligns the tail to a user turn", async () => {
    const target = seedTranscript({
      sessionId: "new-session",
      events: [{ type: "session", id: "existing" }],
    });
    const source = seedTranscript({
      sessionId: "prev",
      events: Array.from({ length: DEFAULT_REPLAY_MAX_MESSAGES + 1 }, (_, i) => ({
        message: { role: i % 2 === 0 ? "user" : "assistant", content: `m${i}` },
      })),
    });

    expect(await call(source, target)).toBe(DEFAULT_REPLAY_MAX_MESSAGES - 1);
    const records = await readJsonlRecords(target);
    expect(records.reduce((count, r) => count + (r.type === "session" ? 1 : 0), 0)).toBe(1);
    const records = readEvents();
    expect(records.filter((r) => (r as { type?: unknown }).type === "session")).toHaveLength(1);
    expect(records[0]).toMatchObject({ id: "existing" });
    expect(records[1].message?.role).toBe("user");
    expect((records[1] as { message: { role: string } }).message.role).toBe("user");
  });

  it("coalesces same-role runs so replayed records strictly alternate", async () => {
    const source = path.join(root, "prev.jsonl");
    const target = path.join(root, "next.jsonl");
    await fs.writeFile(
      source,
      [
        j({ message: { role: "user", content: "older user" } }),
        j({ message: { role: "user", content: "latest user" } }),
        j({ message: { role: "assistant", content: "older assistant" } }),
        j({ message: { role: "assistant", content: "latest assistant" } }),
        j({ message: { role: "user", content: "follow-up" } }),
        j({ message: { role: "assistant", content: "answer" } }),
      ].join(""),
      "utf8",
    );
    const source = seedTranscript({
      sessionId: "prev",
      events: [
        { message: { role: "user", content: "older user" } },
        { message: { role: "user", content: "latest user" } },
        { message: { role: "assistant", content: "older assistant" } },
        { message: { role: "assistant", content: "latest assistant" } },
        { message: { role: "user", content: "follow-up" } },
        { message: { role: "assistant", content: "answer" } },
      ],
    });
    const target = path.join(root, "agents", "main", "sessions", "next.jsonl");

    expect(await call(source, target)).toBe(4);
    const records = await readJsonlRecords(target);
    expect(records.slice(1).map((r) => r.message?.role)).toEqual([
      "user",
      "assistant",
      "user",
      "assistant",
    ]);
    expect(records.slice(1).map((r) => r.message?.content)).toEqual([
    const records = readEvents().slice(1) as Array<{ message: { role: string; content: string } }>;
    expect(records.map((r) => r.message.role)).toEqual(["user", "assistant", "user", "assistant"]);
    expect(records.map((r) => r.message.content)).toEqual([
      "latest user",
      "latest assistant",
      "follow-up",
@@ -140,44 +142,34 @@ describe("replayRecentUserAssistantMessages", () => {
    ]);
  });

  it("replays from scoped SQLite transcript events when source JSONL is missing", async () => {
    sqliteTranscriptMocks.hasSqliteSessionTranscriptEvents.mockReturnValue(true);
    sqliteTranscriptMocks.exportSqliteSessionTranscriptJsonl.mockReturnValue(
      [
        j({ type: "session", id: "old-session" }),
        j({ message: { role: "user", content: "sqlite user" } }),
        j({ message: { role: "tool", content: "skip me" } }),
        j({ message: { role: "assistant", content: "sqlite assistant" } }),
      ].join(""),
    );
    const target = path.join(root, "next.jsonl");
  it("replays from explicit scoped SQLite transcript events", async () => {
    seedTranscript({
      agentId: "target",
      sessionId: "old-session",
      events: [
        { type: "session", id: "old-session" },
        { message: { role: "user", content: "sqlite user" } },
        { message: { role: "tool", content: "skip me" } },
        { message: { role: "assistant", content: "sqlite assistant" } },
      ],
    });
    const target = path.join(root, "agents", "target", "sessions", "next.jsonl");

    expect(
      await replayRecentUserAssistantMessages({
        sourceAgentId: "target",
        sourceSessionId: "old-session",
        sourceTranscript: path.join(root, "missing.jsonl"),
        targetAgentId: "target",
        targetTranscript: target,
        newSessionId: "new-session",
      }),
    ).toBe(2);

    expect(sqliteTranscriptMocks.hasSqliteSessionTranscriptEvents).toHaveBeenCalledWith({
      agentId: "target",
      sessionId: "old-session",
    });
    expect(sqliteTranscriptMocks.exportSqliteSessionTranscriptJsonl).toHaveBeenCalledWith({
      agentId: "target",
      sessionId: "old-session",
    });
    const records = (await fs.readFile(target, "utf8"))
      .split(/\r?\n/)
      .filter((line) => line.trim().length > 0)
      .map((line) => JSON.parse(line));
    const records = readEvents("target");
    expect(records[0]).toMatchObject({ type: "session", id: "new-session" });
    expect(records.slice(1).map((r) => r.message.content)).toEqual([
      "sqlite user",
      "sqlite assistant",
    ]);
    expect(
      (records.slice(1) as Array<{ message: { content: string } }>).map((r) => r.message.content),
    ).toEqual(["sqlite user", "sqlite assistant"]);
  });
});

||||
@@ -1,17 +1,18 @@
import fs from "node:fs";
import fsp from "node:fs/promises";
import path from "node:path";
import { CURRENT_SESSION_VERSION } from "../../agents/transcript/session-transcript-contract.js";
import {
  exportSqliteSessionTranscriptJsonl,
  hasSqliteSessionTranscriptEvents,
  loadSqliteSessionTranscriptEvents,
  replaceSqliteSessionTranscriptEvents,
  resolveSqliteSessionTranscriptScopeForPath,
} from "../../config/sessions/transcript-store.sqlite.js";
import { DEFAULT_AGENT_ID } from "../../routing/session-key.js";

/** Tail kept so DM continuity survives silent session rotations. */
export const DEFAULT_REPLAY_MAX_MESSAGES = 6;

type SessionRecord = { message?: { role?: unknown } };
type KeptRecord = { role: "user" | "assistant"; line: string };
type KeptRecord = { role: "user" | "assistant"; event: unknown };

/**
 * Copy the tail of user/assistant JSONL records from a prior transcript into a
@@ -25,6 +26,7 @@ export async function replayRecentUserAssistantMessages(params: {
  sourceAgentId?: string;
  sourceSessionId?: string;
  sourceTranscript?: string;
  targetAgentId?: string;
  targetTranscript: string;
  newSessionId: string;
  maxMessages?: number;
@@ -34,22 +36,15 @@ export async function replayRecentUserAssistantMessages(params: {
    return 0;
  }
  try {
    const sourceLines = await loadReplaySourceLines(params);
    if (!sourceLines) {
    const sourceEvents = loadReplaySourceEvents(params);
    if (!sourceEvents) {
      return 0;
    }
    const kept: KeptRecord[] = [];
    for (const line of sourceLines) {
      if (!line.trim()) {
        continue;
      }
      try {
        const role = (JSON.parse(line) as SessionRecord | null)?.message?.role;
        if (role === "user" || role === "assistant") {
          kept.push({ role, line });
        }
      } catch {
        // Skip malformed lines.
    for (const event of sourceEvents) {
      const role = (event as SessionRecord | null)?.message?.role;
      if (role === "user" || role === "assistant") {
        kept.push({ role, event });
      }
    }
    if (kept.length === 0) {
@@ -64,48 +59,75 @@ export async function replayRecentUserAssistantMessages(params: {
      // role-ordering hazard this reset path is recovering from.
      return 0;
    }
    const tail = coalesceAlternatingReplayTail(kept.slice(startIdx)).map((entry) => entry.line);
    if (!fs.existsSync(params.targetTranscript)) {
      await fsp.mkdir(path.dirname(params.targetTranscript), { recursive: true });
      const header = JSON.stringify({
        type: "session",
        version: CURRENT_SESSION_VERSION,
        id: params.newSessionId,
        timestamp: new Date().toISOString(),
        cwd: process.cwd(),
      });
      await fsp.writeFile(params.targetTranscript, `${header}\n`, {
        encoding: "utf-8",
        mode: 0o600,
      });
    }
    await fsp.appendFile(params.targetTranscript, `${tail.join("\n")}\n`, "utf-8");
    const tail = coalesceAlternatingReplayTail(kept.slice(startIdx)).map((entry) => entry.event);
    const targetAgentId =
      params.targetAgentId ??
      params.sourceAgentId ??
      resolveAgentIdFromSessionPath(params.targetTranscript);
    const existingTargetEvents = loadSqliteSessionTranscriptEvents({
      agentId: targetAgentId,
      sessionId: params.newSessionId,
    }).map((entry) => entry.event);
    const targetEvents =
      existingTargetEvents.length > 0
        ? [...existingTargetEvents, ...tail]
        : [
            {
              type: "session",
              version: CURRENT_SESSION_VERSION,
              id: params.newSessionId,
              timestamp: new Date().toISOString(),
              cwd: process.cwd(),
            },
            ...tail,
          ];
    replaceSqliteSessionTranscriptEvents({
      agentId: targetAgentId,
      sessionId: params.newSessionId,
      transcriptPath: path.resolve(params.targetTranscript),
      events: targetEvents,
    });
    return tail.length;
  } catch {
    return 0;
  }
}

async function loadReplaySourceLines(params: {
function resolveAgentIdFromSessionPath(sessionFile: string): string {
  const resolved = path.resolve(sessionFile);
  const sessionsDir = path.dirname(resolved);
  const agentDir = path.dirname(sessionsDir);
  const agentsDir = path.dirname(agentDir);
  if (path.basename(sessionsDir) === "sessions" && path.basename(agentsDir) === "agents") {
    return path.basename(agentDir);
  }
  return DEFAULT_AGENT_ID;
}

function loadReplaySourceEvents(params: {
  sourceAgentId?: string;
  sourceSessionId?: string;
  sourceTranscript?: string;
}): Promise<string[] | undefined> {
  const scopedJsonl = loadScopedReplaySourceJsonl(params);
  if (scopedJsonl !== undefined) {
    return scopedJsonl.split(/\r?\n/);
}): unknown[] | undefined {
  const scopedEvents = loadScopedReplaySourceEvents(params);
  if (scopedEvents !== undefined) {
    return scopedEvents;
  }
  const src = params.sourceTranscript;
  if (!src || !fs.existsSync(src)) {
  if (!src) {
    return undefined;
  }
  return (await fsp.readFile(src, "utf-8")).split(/\r?\n/);
  const sourceScope = resolveSqliteSessionTranscriptScopeForPath({ transcriptPath: src });
  if (!sourceScope) {
    return undefined;
  }
  return loadSqliteSessionTranscriptEvents(sourceScope).map((entry) => entry.event);
}

function loadScopedReplaySourceJsonl(params: {
function loadScopedReplaySourceEvents(params: {
  sourceAgentId?: string;
  sourceSessionId?: string;
}): string | undefined {
}): unknown[] | undefined {
  if (!params.sourceAgentId?.trim() || !params.sourceSessionId?.trim()) {
    return undefined;
  }
@@ -115,15 +137,14 @@ function loadScopedReplaySourceJsonl(params: {
      sessionId: params.sourceSessionId,
    };
    return hasSqliteSessionTranscriptEvents(scope)
      ? exportSqliteSessionTranscriptJsonl(scope)
      ? loadSqliteSessionTranscriptEvents(scope).map((entry) => entry.event)
      : undefined;
  } catch {
    return undefined;
  }
}

// Keep the newest record from each same-role run, preserving original JSONL bytes
// for replay while ensuring strict provider alternation.
// Keep the newest record from each same-role run while ensuring strict provider alternation.
function coalesceAlternatingReplayTail(entries: KeptRecord[]): KeptRecord[] {
  const tail: KeptRecord[] = [];
  for (const entry of entries) {

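The coalescing step above collapses same-role runs so replayed records strictly alternate between user and assistant. A standalone sketch of that logic (a hypothetical `coalesceTail` helper, not the repo's exact body, which the diff truncates): within a run of consecutive same-role records, only the newest survives.

```typescript
// Minimal sketch of role-run coalescing, assuming records shaped like the
// KeptRecord type above ({ role, event }). Hypothetical standalone helper.
type Kept = { role: "user" | "assistant"; event: unknown };

function coalesceTail(entries: Kept[]): Kept[] {
  const tail: Kept[] = [];
  for (const entry of entries) {
    const last = tail[tail.length - 1];
    if (last && last.role === entry.role) {
      // Same-role run: keep only the newest record.
      tail[tail.length - 1] = entry;
    } else {
      tail.push(entry);
    }
  }
  return tail;
}

const out = coalesceTail([
  { role: "user", event: "u1" },
  { role: "user", event: "u2" },
  { role: "assistant", event: "a1" },
]);
console.log(out.map((e) => e.event)); // → ["u2", "a1"]
```

Strict alternation matters here because the replay tail is re-fed to providers that reject two consecutive messages with the same role.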
@@ -171,23 +171,18 @@ export function registerStatusHealthSessionsCommands(program: Command) {

  sessionsCmd
    .command("cleanup")
    .description("Run session-store maintenance now")
    .option("--store <path>", "Path to session store (default: resolved from config)")
    .description("Run SQLite session-row maintenance now")
    .option("--store <path>", "Legacy/custom session store override (default: config)")
    .option("--agent <id>", "Agent id to maintain (default: configured default agent)")
    .option("--all-agents", "Run maintenance across all configured agents", false)
    .option("--dry-run", "Preview maintenance actions without writing", false)
    .option("--enforce", "Apply maintenance even when configured mode is warn", false)
    .option(
      "--fix-missing",
      "Remove store entries whose transcript files are missing (bypasses age/count retention)",
      "Remove store entries whose SQLite transcript events are missing (bypasses age/count retention)",
      false,
    )
    .option(
      "--fix-dm-scope",
      "Retire stale direct-DM session rows that no longer match session.dmScope=main",
      false,
    )
    .option("--active-key <key>", "Protect this session key from budget-eviction")
    .option("--active-key <key>", "Protect this session key from enforce-mode retention")
    .option("--json", "Output JSON", false)
    .addHelpText(
      "after",
@@ -196,7 +191,7 @@ export function registerStatusHealthSessionsCommands(program: Command) {
        ["openclaw sessions cleanup --dry-run", "Preview stale/cap cleanup."],
        [
          "openclaw sessions cleanup --dry-run --fix-missing",
          "Also preview pruning entries with missing transcript files.",
          "Also preview pruning entries with missing SQLite transcript events.",
        ],
        [
          "openclaw sessions cleanup --dry-run --fix-dm-scope",
@@ -207,7 +202,7 @@ export function registerStatusHealthSessionsCommands(program: Command) {
        ["openclaw sessions cleanup --all-agents --dry-run", "Preview all agent stores."],
        [
          "openclaw sessions cleanup --enforce --store ./tmp/sessions.json",
          "Use a specific store.",
          "Use a legacy/custom store override.",
        ],
      ])}`,
    )

@@ -47,6 +47,29 @@ describe("legacy session maintenance migrate", () => {
    expect(res.changes).toContain("Removed deprecated session.maintenance.rotateBytes.");
  });

  it("removes deprecated session.maintenance disk budget settings", () => {
    const res = migrateLegacyConfigForTest({
      session: {
        maintenance: {
          mode: "enforce",
          pruneAfter: "30d",
          maxEntries: 500,
          maxDiskBytes: "500mb",
          highWaterBytes: "400mb",
        },
      },
    });

    expect(res.config?.session?.maintenance).toEqual({
      mode: "enforce",
      pruneAfter: "30d",
      maxEntries: 500,
    });
    expect(res.changes).toContain(
      "Removed deprecated session.maintenance.maxDiskBytes/highWaterBytes; session transcripts are stored in SQLite.",
    );
  });

  it("removes legacy session.maintenance.resetArchiveRetention", () => {
    const res = migrateLegacyConfigForTest({
      session: {

@@ -10,6 +10,15 @@ function hasLegacyRotateBytes(value: unknown): boolean {
  return Boolean(maintenance && Object.prototype.hasOwnProperty.call(maintenance, "rotateBytes"));
}

function hasLegacyDiskBudget(value: unknown): boolean {
  const maintenance = getRecord(value);
  return Boolean(
    maintenance &&
      (Object.prototype.hasOwnProperty.call(maintenance, "maxDiskBytes") ||
        Object.prototype.hasOwnProperty.call(maintenance, "highWaterBytes")),
  );
}

function hasLegacyResetArchiveRetention(value: unknown): boolean {
  const maintenance = getRecord(value);
  return Boolean(
@@ -29,6 +38,13 @@ const LEGACY_SESSION_MAINTENANCE_ROTATE_BYTES_RULE: LegacyConfigRule = {
  match: hasLegacyRotateBytes,
};

const LEGACY_SESSION_MAINTENANCE_DISK_BUDGET_RULE: LegacyConfigRule = {
  path: ["session", "maintenance"],
  message:
    'session.maintenance.maxDiskBytes/highWaterBytes are deprecated and ignored; run "openclaw doctor --fix" to remove them.',
  match: hasLegacyDiskBudget,
};

const LEGACY_SESSION_MAINTENANCE_RESET_ARCHIVE_RETENTION_RULE: LegacyConfigRule = {
  path: ["session", "maintenance"],
  message:
@@ -57,6 +73,31 @@ export const LEGACY_CONFIG_MIGRATIONS_RUNTIME_SESSION: LegacyConfigMigrationSpec
      changes.push("Removed deprecated session.maintenance.rotateBytes.");
    },
  }),
  defineLegacyConfigMigration({
    id: "session.maintenance.diskBudget",
    describe: "Remove deprecated session.maintenance disk budget settings",
    legacyRules: [LEGACY_SESSION_MAINTENANCE_DISK_BUDGET_RULE],
    apply: (raw, changes) => {
      const maintenance = getRecord(getRecord(raw.session)?.maintenance);
      if (!maintenance) {
        return;
      }
      let removed = false;
      if (Object.prototype.hasOwnProperty.call(maintenance, "maxDiskBytes")) {
        delete maintenance.maxDiskBytes;
        removed = true;
      }
      if (Object.prototype.hasOwnProperty.call(maintenance, "highWaterBytes")) {
        delete maintenance.highWaterBytes;
        removed = true;
      }
      if (removed) {
        changes.push(
          "Removed deprecated session.maintenance.maxDiskBytes/highWaterBytes; session transcripts are stored in SQLite.",
        );
      }
    },
  }),
  defineLegacyConfigMigration({
    id: "session.maintenance.resetArchiveRetention",
    describe: "Remove legacy session.maintenance.resetArchiveRetention",

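The migration added above follows a simple pattern: delete each deprecated key if present, and log one change message when anything was removed. A self-contained sketch of that shape (the function name `removeDeprecatedDiskBudget` and the `RawConfig` type are illustrative assumptions, not the repo's API):

```typescript
// Sketch of the disk-budget migration shape: a migration receives the raw
// config object and a changes log, mutates the config in place, and records
// a single human-readable change message. Names here are hypothetical.
type RawConfig = { session?: { maintenance?: Record<string, unknown> } };

function removeDeprecatedDiskBudget(raw: RawConfig, changes: string[]): void {
  const maintenance = raw.session?.maintenance;
  if (!maintenance) return;
  let removed = false;
  for (const key of ["maxDiskBytes", "highWaterBytes"]) {
    if (Object.prototype.hasOwnProperty.call(maintenance, key)) {
      delete maintenance[key];
      removed = true;
    }
  }
  if (removed) {
    changes.push(
      "Removed deprecated session.maintenance.maxDiskBytes/highWaterBytes; session transcripts are stored in SQLite.",
    );
  }
}
```

Logging exactly one message per migration (rather than one per key) keeps `openclaw doctor --fix` output stable for the test assertions above.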
@@ -13,7 +13,6 @@ const mocks = vi.hoisted(() => ({
  pruneStaleEntries: vi.fn(),
  capEntryCount: vi.fn(),
  updateSessionStore: vi.fn(),
  enforceSessionDiskBudget: vi.fn(),
  resolveSessionCleanupAction: vi.fn(),
  runSessionsCleanup: vi.fn(),
  serializeSessionCleanupResult: vi.fn(),
@@ -39,7 +38,6 @@ vi.mock("../config/sessions.js", () => ({
  pruneStaleEntries: mocks.pruneStaleEntries,
  capEntryCount: mocks.capEntryCount,
  updateSessionStore: mocks.updateSessionStore,
  enforceSessionDiskBudget: mocks.enforceSessionDiskBudget,
  resolveSessionCleanupAction: mocks.resolveSessionCleanupAction,
  runSessionsCleanup: mocks.runSessionsCleanup,
  serializeSessionCleanupResult: mocks.serializeSessionCleanupResult,
@@ -91,8 +89,6 @@ describe("sessionsCleanupCommand", () => {
      mode: "warn",
      pruneAfterMs: 7 * 24 * 60 * 60 * 1000,
      maxEntries: 500,
      maxDiskBytes: null,
      highWaterBytes: null,
    });
    mocks.pruneStaleEntries.mockImplementation(
      (
@@ -122,8 +118,6 @@ describe("sessionsCleanupCommand", () => {
        missingKeys: Set<string>;
        staleKeys: Set<string>;
        cappedKeys: Set<string>;
        budgetEvictedKeys: Set<string>;
        dmScopeRetiredKeys: Set<string>;
      }) => {
        if (params.dmScopeRetiredKeys.has(params.key)) {
          return "retire-dm-scope";
@@ -137,9 +131,6 @@ describe("sessionsCleanupCommand", () => {
        if (params.cappedKeys.has(params.key)) {
          return "cap-overflow";
        }
        if (params.budgetEvictedKeys.has(params.key)) {
          return "evict-budget";
        }
        return "keep";
      },
    );
@@ -161,16 +152,6 @@ describe("sessionsCleanupCommand", () => {
      previewResults: [],
      appliedSummaries: [],
    });
    mocks.enforceSessionDiskBudget.mockResolvedValue({
      totalBytesBefore: 1000,
      totalBytesAfter: 700,
      removedFiles: 1,
      removedEntries: 1,
      freedBytes: 300,
      maxBytes: 900,
      highWaterBytes: 700,
      overBudget: true,
    });
  });

  it("emits a single JSON object for non-dry runs and applies maintenance", async () => {
@@ -192,16 +173,6 @@ describe("sessionsCleanupCommand", () => {
        dmScopeRetired: 0,
        pruned: 0,
        capped: 2,
        diskBudget: {
          totalBytesBefore: 1200,
          totalBytesAfter: 800,
          removedFiles: 0,
          removedEntries: 0,
          freedBytes: 400,
          maxBytes: 1000,
          highWaterBytes: 800,
          overBudget: true,
        },
        wouldMutate: true,
        applied: true,
        appliedCount: 1,
@@ -227,17 +198,16 @@ describe("sessionsCleanupCommand", () => {
    expect(payload.appliedCount).toBe(1);
    expect(payload.pruned).toBe(0);
    expect(payload.capped).toBe(2);
    const diskBudget = payload.diskBudget as Record<string, unknown>;
    expect(diskBudget.removedFiles).toBe(0);
    expect(diskBudget.removedEntries).toBe(0);
    expect(mocks.runSessionsCleanup).toHaveBeenCalledOnce();
    const cleanupCall = mocks.runSessionsCleanup.mock.calls[0]?.[0];
    expect(cleanupCall?.cfg).toEqual({ session: { store: "/cfg/sessions.json" } });
    expect(cleanupCall?.opts.enforce).toBe(true);
    expect(cleanupCall?.opts.activeKey).toBe("agent:main:main");
    expect(cleanupCall?.targets).toEqual([
      { agentId: "main", storePath: "/resolved/sessions.json" },
    ]);
    expect(mocks.runSessionsCleanup).toHaveBeenCalledWith(
      expect.objectContaining({
        cfg: { session: { store: "/cfg/sessions.json" } },
        opts: expect.objectContaining({
          enforce: true,
          activeKey: "agent:main:main",
        }),
        targets: [{ agentId: "main", storePath: "/resolved/sessions.json" }],
      }),
    );
  });

  it("delegates non-store enforcing cleanup through the Gateway writer when reachable", async () => {
@@ -252,7 +222,6 @@ describe("sessionsCleanupCommand", () => {
        dmScopeRetired: 0,
        pruned: 2,
        capped: 0,
        diskBudget: null,
        wouldMutate: true,
        applied: true,
        appliedCount: 1,
@@ -292,24 +261,12 @@ describe("sessionsCleanupCommand", () => {
            dmScopeRetired: 0,
            pruned: 1,
            capped: 0,
            diskBudget: {
              totalBytesBefore: 1000,
              totalBytesAfter: 700,
              removedFiles: 1,
              removedEntries: 1,
              freedBytes: 300,
              maxBytes: 900,
              highWaterBytes: 700,
              overBudget: true,
            },
            wouldMutate: true,
          },
          beforeStore: {},
          missingKeys: new Set<string>(),
          staleKeys: new Set<string>(),
          cappedKeys: new Set<string>(),
          budgetEvictedKeys: new Set<string>(),
          dmScopeRetiredKeys: new Set<string>(),
        },
      ],
      appliedSummaries: [],
@@ -330,13 +287,9 @@ describe("sessionsCleanupCommand", () => {
    expect(payload.applied).toBeUndefined();
    expect(mocks.runSessionsCleanup).toHaveBeenCalled();
    expect(mocks.updateSessionStore).not.toHaveBeenCalled();
    const diskBudget = payload.diskBudget as Record<string, unknown>;
    expect(diskBudget.removedFiles).toBe(1);
    expect(diskBudget.removedEntries).toBe(1);
  });

  it("counts missing transcript entries when --fix-missing is enabled in dry-run", async () => {
    mocks.enforceSessionDiskBudget.mockResolvedValue(null);
    mocks.runSessionsCleanup.mockResolvedValue({
      mode: "warn",
      previewResults: [
@@ -352,15 +305,12 @@ describe("sessionsCleanupCommand", () => {
            dmScopeRetired: 0,
            pruned: 0,
            capped: 0,
            diskBudget: null,
            wouldMutate: true,
          },
          beforeStore: {},
          missingKeys: new Set(["missing"]),
          staleKeys: new Set<string>(),
          cappedKeys: new Set<string>(),
          budgetEvictedKeys: new Set<string>(),
          dmScopeRetiredKeys: new Set<string>(),
        },
      ],
      appliedSummaries: [],
@@ -384,7 +334,6 @@ describe("sessionsCleanupCommand", () => {
  });

  it("renders a dry-run action table with keep/prune actions", async () => {
    mocks.enforceSessionDiskBudget.mockResolvedValue(null);
    mocks.runSessionsCleanup.mockResolvedValue({
      mode: "warn",
      previewResults: [
@@ -400,13 +349,6 @@ describe("sessionsCleanupCommand", () => {
            dmScopeRetired: 0,
            pruned: 1,
            capped: 0,
            unreferencedArtifacts: {
              scannedFiles: 5,
              removedFiles: 2,
              freedBytes: 128,
              olderThanMs: 604800000,
            },
            diskBudget: null,
            wouldMutate: true,
          },
          beforeStore: {
@@ -416,8 +358,6 @@ describe("sessionsCleanupCommand", () => {
          missingKeys: new Set<string>(),
          staleKeys: new Set(["stale"]),
          cappedKeys: new Set<string>(),
          budgetEvictedKeys: new Set<string>(),
          dmScopeRetiredKeys: new Set<string>(),
        },
      ],
      appliedSummaries: [],
@@ -432,7 +372,6 @@ describe("sessionsCleanupCommand", () => {
    );

    expectLogsToInclude(logs, "Planned session actions:");
    expectLogsToInclude(logs, "Would prune unreferenced artifacts: 2");
    const tableHeaderLines = logs.filter((line) => line.includes("Action") && line.includes("Key"));
    expect(tableHeaderLines.length).toBeGreaterThan(0);
    const freshKeepLines = logs.filter((line) => line.includes("fresh") && line.includes("keep"));
@@ -448,7 +387,6 @@ describe("sessionsCleanupCommand", () => {
      { agentId: "main", storePath: "/resolved/main-sessions.json" },
      { agentId: "work", storePath: "/resolved/work-sessions.json" },
    ]);
    mocks.enforceSessionDiskBudget.mockResolvedValue(null);
    mocks.runSessionsCleanup.mockResolvedValue({
      mode: "warn",
      previewResults: [
@@ -464,15 +402,12 @@ describe("sessionsCleanupCommand", () => {
            dmScopeRetired: 0,
            pruned: 1,
            capped: 0,
            diskBudget: null,
            wouldMutate: true,
          },
          beforeStore: {},
          missingKeys: new Set<string>(),
          staleKeys: new Set(["stale"]),
          cappedKeys: new Set<string>(),
          budgetEvictedKeys: new Set<string>(),
          dmScopeRetiredKeys: new Set<string>(),
        },
        {
          summary: {
@@ -486,15 +421,12 @@ describe("sessionsCleanupCommand", () => {
            dmScopeRetired: 0,
            pruned: 1,
            capped: 0,
            diskBudget: null,
            wouldMutate: true,
          },
          beforeStore: {},
          missingKeys: new Set<string>(),
          staleKeys: new Set(["stale"]),
          cappedKeys: new Set<string>(),
          budgetEvictedKeys: new Set<string>(),
          dmScopeRetiredKeys: new Set<string>(),
        },
      ],
      appliedSummaries: [],

@@ -62,8 +62,6 @@ function buildActionRows(params: {
  missingKeys: Set<string>;
  staleKeys: Set<string>;
  cappedKeys: Set<string>;
  budgetEvictedKeys: Set<string>;
  dmScopeRetiredKeys: Set<string>;
}): SessionCleanupActionRow[] {
  return toSessionDisplayRows(params.beforeStore).map((row) =>
    Object.assign({}, row, {
@@ -72,8 +70,6 @@ function buildActionRows(params: {
        missingKeys: params.missingKeys,
        staleKeys: params.staleKeys,
        cappedKeys: params.cappedKeys,
        budgetEvictedKeys: params.budgetEvictedKeys,
        dmScopeRetiredKeys: params.dmScopeRetiredKeys,
      }),
    }),
  );
@@ -99,16 +95,6 @@ function renderStoreDryRunPlan(params: {
  params.runtime.log(`Would retire stale direct DM sessions: ${params.summary.dmScopeRetired}`);
  params.runtime.log(`Would prune stale: ${params.summary.pruned}`);
  params.runtime.log(`Would cap overflow: ${params.summary.capped}`);
  if (params.summary.unreferencedArtifacts?.scannedFiles) {
    params.runtime.log(
      `Would prune unreferenced artifacts: ${params.summary.unreferencedArtifacts.removedFiles}`,
    );
  }
  if (params.summary.diskBudget) {
    params.runtime.log(
      `Would enforce disk budget: ${params.summary.diskBudget.totalBytesBefore} -> ${params.summary.diskBudget.totalBytesAfter} bytes (files ${params.summary.diskBudget.removedFiles}, entries ${params.summary.diskBudget.removedEntries})`,
    );
  }
  if (params.actionRows.length === 0) {
    return;
  }
@@ -152,11 +138,6 @@ function renderAppliedSummaries(params: {
    }
    params.runtime.log(`Session store: ${summary.storePath}`);
    params.runtime.log(`Applied maintenance. Current entries: ${summary.appliedCount ?? 0}`);
    if (summary.unreferencedArtifacts?.removedFiles) {
      params.runtime.log(
        `Pruned unreferenced artifacts: ${summary.unreferencedArtifacts.removedFiles}`,
      );
    }
  }
}

@@ -705,7 +705,7 @@ describe("config help copy quality", () => {
    expect(/transcript|lock/i.test(acquireTimeout)).toBe(true);
  });

  it("documents session maintenance duration/size examples and deprecations", () => {
  it("documents session maintenance duration examples and deprecations", () => {
    const pruneAfter = FIELD_HELP["session.maintenance.pruneAfter"];
    expect(pruneAfter.includes("30d")).toBe(true);
    expect(pruneAfter.includes("12h")).toBe(true);
@@ -719,10 +719,11 @@ describe("config help copy quality", () => {
    expect(deprecated.includes("session.maintenance.pruneAfter")).toBe(true);

    const maxDisk = FIELD_HELP["session.maintenance.maxDiskBytes"];
    expect(maxDisk.includes("500mb")).toBe(true);
    expect(/deprecated|ignored/i.test(maxDisk)).toBe(true);
    expect(maxDisk.includes("doctor --fix")).toBe(true);

    const highWater = FIELD_HELP["session.maintenance.highWaterBytes"];
    expect(highWater.includes("80%")).toBe(true);
    expect(/deprecated|ignored/i.test(highWater)).toBe(true);
  });

  it("documents cron run-log retention controls", () => {

@@ -1523,7 +1523,7 @@ export const FIELD_HELP: Record<string, string> = {
  "session.threadBindings.defaultSpawnContext":
    'Default native subagent context for thread-bound spawns. Use "fork" to start from the requester transcript or "isolated" for a clean child. Default: "fork".',
  "session.maintenance":
    "Automatic session-store maintenance controls for pruning age, entry caps, and disk budget cleanup. Start in warn mode to observe impact, then enforce once thresholds are tuned.",
    "Explicit SQLite session-row maintenance controls for age and entry-count retention. Start in warn mode to observe impact, then enforce once thresholds are tuned.",
  "session.maintenance.mode":
    'Determines whether maintenance policies are only reported ("warn") or actively applied ("enforce"). Keep "warn" during rollout and switch to "enforce" after validating safe thresholds.',
  "session.maintenance.pruneAfter":
@@ -1535,9 +1535,9 @@ export const FIELD_HELP: Record<string, string> = {
  "session.maintenance.rotateBytes":
    'Deprecated and ignored. Do not use for `sessions.json` growth control; OpenClaw no longer creates automatic rotation backups, and "openclaw doctor --fix" removes this key.',
  "session.maintenance.maxDiskBytes":
    "Optional per-agent sessions-directory disk budget (for example `500mb`). Use this to cap session storage per agent; when exceeded, warn mode reports pressure and enforce mode performs oldest-first cleanup.",
    "Deprecated and ignored. Session transcripts now live in SQLite; use openclaw doctor --fix to remove legacy disk-budget settings.",
  "session.maintenance.highWaterBytes":
    "Target size after disk-budget cleanup (high-water mark). Defaults to 80% of maxDiskBytes; set explicitly for tighter reclaim behavior on constrained disks.",
    "Deprecated and ignored with session.maintenance.maxDiskBytes. Use openclaw doctor --fix to remove it from legacy configs.",
  cron: "Global scheduler settings for stored cron jobs, run concurrency, delivery fallback, and run-session retention. Keep defaults unless you are scaling job volume or integrating external webhook receivers.",
  "cron.enabled":
    "Enables cron job execution for stored schedules managed by the gateway. Keep enabled for normal reminder/automation flows, and disable only to pause all cron execution without deleting jobs.",

@@ -746,8 +746,8 @@ export const FIELD_LABELS: Record<string, string> = {
  "session.maintenance.pruneDays": "Session Prune Days (Deprecated)",
  "session.maintenance.maxEntries": "Session Max Entries",
  "session.maintenance.rotateBytes": "Deprecated Session Rotate Size",
  "session.maintenance.maxDiskBytes": "Session Max Disk Budget",
  "session.maintenance.highWaterBytes": "Session Disk High-water Target",
  "session.maintenance.maxDiskBytes": "Deprecated Session Max Disk Budget",
  "session.maintenance.highWaterBytes": "Deprecated Session Disk High-water Target",
  cron: "Cron",
  "cron.enabled": "Cron Enabled",
  "cron.store": "Cron Store Path",

@@ -13,6 +13,5 @@ export * from "./sessions/types.js";
export * from "./sessions/transcript.js";
export * from "./sessions/session-file.js";
export * from "./sessions/delivery-info.js";
export * from "./sessions/disk-budget.js";
export * from "./sessions/targets.js";
export * from "./sessions/cleanup-service.js";

@@ -1,15 +1,8 @@
import path from "node:path";
import { resolveDefaultAgentId } from "../../agents/agent-scope.js";
import { resolveStoredSessionOwnerAgentId } from "../../gateway/session-store-key.js";
import { getLogger } from "../../logging/logger.js";
import { normalizeAgentId, parseAgentSessionKey } from "../../routing/session-key.js";
import type { OpenClawConfig } from "../types.openclaw.js";
import {
  enforceSessionDiskBudget,
  pruneUnreferencedSessionArtifacts,
  resolveSessionArtifactCanonicalPathsForEntry,
  type SessionUnreferencedArtifactSweepResult,
} from "./disk-budget.js";
import { resolveStorePath } from "./paths.js";
import { resolveMaintenanceConfig } from "./store-maintenance-runtime.js";
import {
@@ -35,13 +28,7 @@ export type SessionsCleanupOptions = SessionStoreSelectionOptions & {
  fixDmScope?: boolean;
};

export type SessionCleanupAction =
  | "keep"
  | "prune-missing"
  | "prune-stale"
  | "cap-overflow"
  | "evict-budget"
  | "retire-dm-scope";
export type SessionCleanupAction = "keep" | "prune-missing" | "prune-stale" | "cap-overflow";

export type SessionCleanupSummary = {
  agentId: string;
@@ -54,8 +41,6 @@ export type SessionCleanupSummary = {
  dmScopeRetired: number;
  pruned: number;
  capped: number;
  unreferencedArtifacts: SessionUnreferencedArtifactSweepResult;
  diskBudget: Awaited<ReturnType<typeof enforceSessionDiskBudget>>;
  wouldMutate: boolean;
  applied?: true;
  appliedCount?: number;
@@ -78,8 +63,6 @@ export type SessionsCleanupRunResult = {
    missingKeys: Set<string>;
    staleKeys: Set<string>;
    cappedKeys: Set<string>;
    budgetEvictedKeys: Set<string>;
    dmScopeRetiredKeys: Set<string>;
  }>;
  appliedSummaries: SessionCleanupSummary[];
};
@@ -90,7 +73,6 @@ type AppliedSessionCleanupReport = {
  afterCount: number;
  pruned: number;
  capped: number;
  diskBudget: Awaited<ReturnType<typeof enforceSessionDiskBudget>>;
};

export function resolveSessionCleanupAction(params: {
@@ -98,8 +80,6 @@ export function resolveSessionCleanupAction(params: {
  missingKeys: Set<string>;
  staleKeys: Set<string>;
  cappedKeys: Set<string>;
  budgetEvictedKeys: Set<string>;
  dmScopeRetiredKeys: Set<string>;
}): SessionCleanupAction {
  if (params.dmScopeRetiredKeys.has(params.key)) {
    return "retire-dm-scope";
@@ -113,9 +93,6 @@ export function resolveSessionCleanupAction(params: {
  if (params.cappedKeys.has(params.key)) {
    return "cap-overflow";
  }
  if (params.budgetEvictedKeys.has(params.key)) {
    return "evict-budget";
  }
  return "keep";
}

@@ -217,27 +194,6 @@ function pruneMissingTranscriptEntries(params: {
  return removed;
}

function addEntryArtifactPathsToSet(params: {
  paths: Set<string>;
  store: Record<string, SessionEntry>;
  storePath: string;
  keys: ReadonlySet<string>;
}): void {
  const sessionsDir = path.dirname(params.storePath);
  for (const key of params.keys) {
    const entry = params.store[key];
    if (!entry) {
      continue;
    }
    for (const artifactPath of resolveSessionArtifactCanonicalPathsForEntry({
      sessionsDir,
      entry,
    })) {
      params.paths.add(artifactPath);
    }
  }
}

async function previewStoreCleanup(params: {
  cfg: OpenClawConfig;
  target: SessionStoreTarget;
@@ -289,55 +245,9 @@ async function previewStoreCleanup(params: {
      cappedKeys.add(key);
    },
  });
  const entryCleanupArtifactPaths = new Set<string>();
  addEntryArtifactPathsToSet({
    paths: entryCleanupArtifactPaths,
    store: beforeStore,
    storePath: params.target.storePath,
    keys: staleKeys,
  });
  addEntryArtifactPathsToSet({
    paths: entryCleanupArtifactPaths,
    store: beforeStore,
    storePath: params.target.storePath,
    keys: cappedKeys,
  });
  const beforeBudgetStore = structuredClone(previewStore);
  const budgetRemovedFilePaths = new Set<string>();
  const diskBudget = await enforceSessionDiskBudget({
    store: previewStore,
    storePath: params.target.storePath,
    activeSessionKey: params.activeKey,
    maintenance: params.maintenance,
    warnOnly: false,
    dryRun: true,
    onRemoveFile: (canonicalPath) => {
      budgetRemovedFilePaths.add(canonicalPath);
    },
  });
  const unreferencedArtifacts = await pruneUnreferencedSessionArtifacts({
    store: previewStore,
    storePath: params.target.storePath,
    olderThanMs: params.maintenance.pruneAfterMs,
    dryRun: true,
    excludeCanonicalPaths: new Set([...budgetRemovedFilePaths, ...entryCleanupArtifactPaths]),
  });
  const budgetEvictedKeys = new Set<string>();
  for (const key of Object.keys(beforeBudgetStore)) {
    if (!Object.hasOwn(previewStore, key)) {
      budgetEvictedKeys.add(key);
    }
  }
  const beforeCount = Object.keys(beforeStore).length;
  const afterPreviewCount = Object.keys(previewStore).length;
  const wouldMutate =
    missing > 0 ||
dmScopeRetired > 0 ||
|
||||
pruned > 0 ||
|
||||
capped > 0 ||
|
||||
unreferencedArtifacts.removedFiles > 0 ||
|
||||
(diskBudget?.removedEntries ?? 0) > 0 ||
|
||||
(diskBudget?.removedFiles ?? 0) > 0;
|
||||
const wouldMutate = missing > 0 || pruned > 0 || capped > 0;
|
||||
|
||||
const summary: SessionCleanupSummary = {
|
||||
agentId: params.target.agentId,
|
||||
@@ -350,8 +260,6 @@ async function previewStoreCleanup(params: {
|
||||
dmScopeRetired,
|
||||
pruned,
|
||||
capped,
|
||||
unreferencedArtifacts,
|
||||
diskBudget,
|
||||
wouldMutate,
|
||||
};
|
||||
|
||||
@@ -361,8 +269,6 @@ async function previewStoreCleanup(params: {
|
||||
missingKeys,
|
||||
staleKeys,
|
||||
cappedKeys,
|
||||
budgetEvictedKeys,
|
||||
dmScopeRetiredKeys,
|
||||
};
|
||||
}
|
||||
|
||||
@@ -414,16 +320,7 @@ export async function runSessionsCleanup(params: {
|
||||
: 0;
|
||||
let pruned = 0;
|
||||
let capped = 0;
|
||||
let diskBudget: AppliedSessionCleanupReport["diskBudget"] = null;
|
||||
if (mode === "warn") {
|
||||
diskBudget = await enforceSessionDiskBudget({
|
||||
store,
|
||||
storePath: target.storePath,
|
||||
activeSessionKey: opts.activeKey,
|
||||
maintenance,
|
||||
warnOnly: true,
|
||||
});
|
||||
} else {
|
||||
if (mode === "enforce") {
|
||||
const preserveKeys = opts.activeKey ? new Set([opts.activeKey]) : undefined;
|
||||
pruned = pruneStaleEntries(store, maintenance.pruneAfterMs, {
|
||||
preserveKeys,
|
||||
@@ -431,13 +328,6 @@ export async function runSessionsCleanup(params: {
|
||||
capped = capEntryCount(store, maintenance.maxEntries, {
|
||||
preserveKeys,
|
||||
});
|
||||
diskBudget = await enforceSessionDiskBudget({
|
||||
store,
|
||||
storePath: target.storePath,
|
||||
activeSessionKey: opts.activeKey,
|
||||
maintenance,
|
||||
warnOnly: false,
|
||||
});
|
||||
}
|
||||
appliedReportRef.current = {
|
||||
mode,
|
||||
@@ -445,25 +335,10 @@ export async function runSessionsCleanup(params: {
|
||||
afterCount: Object.keys(store).length,
|
||||
pruned,
|
||||
capped,
|
||||
diskBudget,
|
||||
};
|
||||
return missing;
|
||||
});
|
||||
const afterStore = loadSessionStore(target.storePath);
|
||||
const unreferencedArtifacts =
|
||||
mode === "warn"
|
||||
? {
|
||||
scannedFiles: 0,
|
||||
removedFiles: 0,
|
||||
freedBytes: 0,
|
||||
olderThanMs: maintenance.pruneAfterMs,
|
||||
}
|
||||
: await pruneUnreferencedSessionArtifacts({
|
||||
store: afterStore,
|
||||
storePath: target.storePath,
|
||||
olderThanMs: maintenance.pruneAfterMs,
|
||||
dryRun: false,
|
||||
});
|
||||
const preview = previewResults.find(
|
||||
(result) => result.summary.storePath === target.storePath,
|
||||
);
|
||||
@@ -482,14 +357,10 @@ export async function runSessionsCleanup(params: {
|
||||
dmScopeRetired: 0,
|
||||
pruned: 0,
|
||||
capped: 0,
|
||||
unreferencedArtifacts,
|
||||
diskBudget: null,
|
||||
wouldMutate: false,
|
||||
}),
|
||||
dryRun: false,
|
||||
unreferencedArtifacts,
|
||||
wouldMutate:
|
||||
(preview?.summary.wouldMutate ?? false) || unreferencedArtifacts.removedFiles > 0,
|
||||
wouldMutate: preview?.summary.wouldMutate ?? false,
|
||||
applied: true,
|
||||
appliedCount: Object.keys(afterStore).length,
|
||||
}
|
||||
@@ -504,16 +375,8 @@ export async function runSessionsCleanup(params: {
|
||||
dmScopeRetired: dmScopeRetiredApplied,
|
||||
pruned: appliedReport.pruned,
|
||||
capped: appliedReport.capped,
|
||||
unreferencedArtifacts,
|
||||
diskBudget: appliedReport.diskBudget,
|
||||
wouldMutate:
|
||||
missingApplied > 0 ||
|
||||
dmScopeRetiredApplied > 0 ||
|
||||
appliedReport.pruned > 0 ||
|
||||
appliedReport.capped > 0 ||
|
||||
unreferencedArtifacts.removedFiles > 0 ||
|
||||
(appliedReport.diskBudget?.removedEntries ?? 0) > 0 ||
|
||||
(appliedReport.diskBudget?.removedFiles ?? 0) > 0,
|
||||
missingApplied > 0 || appliedReport.pruned > 0 || appliedReport.capped > 0,
|
||||
applied: true,
|
||||
appliedCount: Object.keys(afterStore).length,
|
||||
};
|
||||
|
||||
@@ -1,114 +0,0 @@
|
||||
import fs from "node:fs/promises";
|
||||
import path from "node:path";
|
||||
import { describe, expect, it } from "vitest";
|
||||
import { withTempDir } from "../../test-helpers/temp-dir.js";
|
||||
import {
|
||||
resolveTrajectoryFilePath,
|
||||
resolveTrajectoryPointerFilePath,
|
||||
} from "../../trajectory/paths.js";
|
||||
import { enforceSessionDiskBudget } from "./disk-budget.js";
|
||||
import type { SessionEntry } from "./types.js";
|
||||
|
||||
describe("enforceSessionDiskBudget", () => {
|
||||
it("removes unreferenced trajectory sidecars while preserving referenced ones", async () => {
|
||||
await withTempDir({ prefix: "openclaw-disk-budget-" }, async (dir) => {
|
||||
const storePath = path.join(dir, "sessions.json");
|
||||
const sessionId = "keep";
|
||||
const transcriptPath = path.join(dir, `${sessionId}.jsonl`);
|
||||
const referencedRuntime = resolveTrajectoryFilePath({
|
||||
env: {},
|
||||
sessionFile: transcriptPath,
|
||||
sessionId,
|
||||
});
|
||||
const referencedPointer = resolveTrajectoryPointerFilePath(transcriptPath);
|
||||
const orphanRuntime = path.join(dir, "old.trajectory.jsonl");
|
||||
const orphanPointer = path.join(dir, "old.trajectory-path.json");
|
||||
const store: Record<string, SessionEntry> = {
|
||||
"agent:main:main": {
|
||||
sessionId,
|
||||
sessionFile: transcriptPath,
|
||||
updatedAt: Date.now(),
|
||||
},
|
||||
};
|
||||
await fs.writeFile(referencedRuntime, "r".repeat(80), "utf-8");
|
||||
await fs.writeFile(referencedPointer, "p".repeat(80), "utf-8");
|
||||
await fs.writeFile(orphanRuntime, "o".repeat(5000), "utf-8");
|
||||
await fs.writeFile(orphanPointer, "q".repeat(5000), "utf-8");
|
||||
|
||||
const result = await enforceSessionDiskBudget({
|
||||
store,
|
||||
storePath,
|
||||
maintenance: {
|
||||
maxDiskBytes: 7000,
|
||||
highWaterBytes: 2000,
|
||||
},
|
||||
warnOnly: false,
|
||||
});
|
||||
|
||||
await expect(fs.stat(referencedRuntime)).resolves.toBeDefined();
|
||||
await expect(fs.stat(referencedPointer)).resolves.toBeDefined();
|
||||
await expect(fs.stat(orphanRuntime)).rejects.toThrow();
|
||||
await expect(fs.stat(orphanPointer)).rejects.toThrow();
|
||||
expect(result).toEqual(
|
||||
expect.objectContaining({
|
||||
removedFiles: 2,
|
||||
removedEntries: 0,
|
||||
}),
|
||||
);
|
||||
});
|
||||
});
|
||||
|
||||
it("does not evict protected thread session entries under store pressure", async () => {
|
||||
await withTempDir({ prefix: "openclaw-disk-budget-" }, async (dir) => {
|
||||
const storePath = path.join(dir, "sessions.json");
|
||||
const protectedKey = "agent:main:slack:channel:C123:thread:1710000000.000100";
|
||||
const removableKey = "agent:main:subagent:old-worker";
|
||||
const activeKey = "agent:main:main";
|
||||
const removableSessionFile = path.join(dir, "removable-worker.jsonl");
|
||||
const removableRuntime = resolveTrajectoryFilePath({
|
||||
env: {},
|
||||
sessionFile: removableSessionFile,
|
||||
sessionId: "removable-worker",
|
||||
});
|
||||
const store: Record<string, SessionEntry> = {
|
||||
[protectedKey]: {
|
||||
sessionId: "protected-thread",
|
||||
updatedAt: 1,
|
||||
displayName: "p".repeat(2000),
|
||||
},
|
||||
[removableKey]: {
|
||||
sessionId: "removable-worker",
|
||||
sessionFile: removableSessionFile,
|
||||
updatedAt: 2,
|
||||
displayName: "r".repeat(2000),
|
||||
},
|
||||
[activeKey]: {
|
||||
sessionId: "active",
|
||||
updatedAt: 3,
|
||||
},
|
||||
};
|
||||
await fs.writeFile(removableRuntime, "w".repeat(800), "utf-8");
|
||||
|
||||
const result = await enforceSessionDiskBudget({
|
||||
store,
|
||||
storePath,
|
||||
activeSessionKey: activeKey,
|
||||
maintenance: {
|
||||
maxDiskBytes: 600,
|
||||
highWaterBytes: 200,
|
||||
},
|
||||
warnOnly: false,
|
||||
});
|
||||
|
||||
expect(store[protectedKey]).toBeDefined();
|
||||
expect(store[removableKey]).toBeUndefined();
|
||||
expect(store[activeKey]).toBeDefined();
|
||||
expect(result).toEqual(
|
||||
expect.objectContaining({
|
||||
removedEntries: 1,
|
||||
removedFiles: 1,
|
||||
}),
|
||||
);
|
||||
});
|
||||
});
|
||||
});
|
||||
@@ -1,461 +0,0 @@
|
||||
import fs from "node:fs";
|
||||
import path from "node:path";
|
||||
import {
|
||||
normalizeLowercaseStringOrEmpty,
|
||||
normalizeOptionalLowercaseString,
|
||||
} from "../../shared/string-coerce.js";
|
||||
import {
|
||||
resolveTrajectoryFilePath,
|
||||
resolveTrajectoryPointerFilePath,
|
||||
} from "../../trajectory/paths.js";
|
||||
import { isTrajectorySessionArtifactName } from "./artifacts.js";
|
||||
import { resolveSessionFilePath } from "./paths.js";
|
||||
import { isProtectedSessionMaintenanceEntry } from "./store-maintenance.js";
|
||||
import type { SessionEntry } from "./types.js";
|
||||
|
||||
export type SessionDiskBudgetConfig = {
|
||||
maxDiskBytes: number | null;
|
||||
highWaterBytes: number | null;
|
||||
};
|
||||
|
||||
export type SessionDiskBudgetSweepResult = {
|
||||
totalBytesBefore: number;
|
||||
totalBytesAfter: number;
|
||||
removedFiles: number;
|
||||
removedEntries: number;
|
||||
freedBytes: number;
|
||||
maxBytes: number;
|
||||
highWaterBytes: number;
|
||||
overBudget: boolean;
|
||||
};
|
||||
|
||||
export type SessionUnreferencedArtifactSweepResult = {
|
||||
scannedFiles: number;
|
||||
removedFiles: number;
|
||||
freedBytes: number;
|
||||
olderThanMs: number;
|
||||
};
|
||||
|
||||
export type SessionDiskBudgetLogger = {
|
||||
warn: (message: string, context?: Record<string, unknown>) => void;
|
||||
info: (message: string, context?: Record<string, unknown>) => void;
|
||||
};
|
||||
|
||||
const NOOP_LOGGER: SessionDiskBudgetLogger = {
|
||||
warn: () => {},
|
||||
info: () => {},
|
||||
};
|
||||
|
||||
type SessionsDirFileStat = {
|
||||
path: string;
|
||||
canonicalPath: string;
|
||||
name: string;
|
||||
size: number;
|
||||
mtimeMs: number;
|
||||
};
|
||||
|
||||
function canonicalizePathForComparison(filePath: string): string {
|
||||
const resolved = path.resolve(filePath);
|
||||
try {
|
||||
return fs.realpathSync(resolved);
|
||||
} catch {
|
||||
return resolved;
|
||||
}
|
||||
}
|
||||
|
||||
function getEntryUpdatedAt(entry?: SessionEntry): number {
|
||||
if (!entry) {
|
||||
return 0;
|
||||
}
|
||||
const updatedAt = entry.updatedAt;
|
||||
return Number.isFinite(updatedAt) ? updatedAt : 0;
|
||||
}
|
||||
|
||||
function buildSessionIdRefCounts(store: Record<string, SessionEntry>): Map<string, number> {
|
||||
const counts = new Map<string, number>();
|
||||
for (const entry of Object.values(store)) {
|
||||
const sessionId = entry?.sessionId;
|
||||
if (!sessionId) {
|
||||
continue;
|
||||
}
|
||||
counts.set(sessionId, (counts.get(sessionId) ?? 0) + 1);
|
||||
}
|
||||
return counts;
|
||||
}
|
||||
|
||||
function resolveSessionTranscriptPathForEntry(params: {
|
||||
sessionsDir: string;
|
||||
entry: SessionEntry;
|
||||
}): string | null {
|
||||
if (!params.entry.sessionId) {
|
||||
return null;
|
||||
}
|
||||
try {
|
||||
const resolved = resolveSessionFilePath(params.entry.sessionId, params.entry, {
|
||||
sessionsDir: params.sessionsDir,
|
||||
});
|
||||
const resolvedSessionsDir = canonicalizePathForComparison(params.sessionsDir);
|
||||
const resolvedPath = canonicalizePathForComparison(resolved);
|
||||
const relative = path.relative(resolvedSessionsDir, resolvedPath);
|
||||
if (!relative || relative.startsWith("..") || path.isAbsolute(relative)) {
|
||||
return null;
|
||||
}
|
||||
return resolvedPath;
|
||||
} catch {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
function resolveSessionArtifactPathsForEntry(params: {
|
||||
sessionsDir: string;
|
||||
entry: SessionEntry;
|
||||
}): string[] {
|
||||
const transcriptPath = resolveSessionTranscriptPathForEntry(params);
|
||||
if (!transcriptPath) {
|
||||
return [];
|
||||
}
|
||||
const paths: string[] = [];
|
||||
if (params.entry.sessionId) {
|
||||
paths.push(resolveTrajectoryPointerFilePath(transcriptPath));
|
||||
paths.push(
|
||||
resolveTrajectoryFilePath({
|
||||
env: {},
|
||||
sessionFile: transcriptPath,
|
||||
sessionId: params.entry.sessionId,
|
||||
}),
|
||||
);
|
||||
}
|
||||
return paths;
|
||||
}
|
||||
|
||||
export function resolveSessionArtifactCanonicalPathsForEntry(params: {
|
||||
sessionsDir: string;
|
||||
entry: SessionEntry;
|
||||
}): string[] {
|
||||
return resolveSessionArtifactPathsForEntry(params).map(canonicalizePathForComparison);
|
||||
}
|
||||
|
||||
function resolveReferencedSessionArtifactPaths(params: {
|
||||
sessionsDir: string;
|
||||
store: Record<string, SessionEntry>;
|
||||
}): Set<string> {
|
||||
const referenced = new Set<string>();
|
||||
for (const entry of Object.values(params.store)) {
|
||||
for (const resolved of resolveSessionArtifactCanonicalPathsForEntry({
|
||||
sessionsDir: params.sessionsDir,
|
||||
entry,
|
||||
})) {
|
||||
referenced.add(resolved);
|
||||
}
|
||||
}
|
||||
return referenced;
|
||||
}
|
||||
|
||||
async function readSessionsDirFiles(sessionsDir: string): Promise<SessionsDirFileStat[]> {
|
||||
const dirEntries = await fs.promises
|
||||
.readdir(sessionsDir, { withFileTypes: true })
|
||||
.catch(() => []);
|
||||
const files: SessionsDirFileStat[] = [];
|
||||
for (const dirent of dirEntries) {
|
||||
if (!dirent.isFile()) {
|
||||
continue;
|
||||
}
|
||||
const filePath = path.join(sessionsDir, dirent.name);
|
||||
const stat = await fs.promises.stat(filePath).catch(() => null);
|
||||
if (!stat?.isFile()) {
|
||||
continue;
|
||||
}
|
||||
files.push({
|
||||
path: filePath,
|
||||
canonicalPath: canonicalizePathForComparison(filePath),
|
||||
name: dirent.name,
|
||||
size: stat.size,
|
||||
mtimeMs: stat.mtimeMs,
|
||||
});
|
||||
}
|
||||
return files;
|
||||
}
|
||||
|
||||
function isUnreferencedSessionArtifactFile(
|
||||
file: Pick<SessionsDirFileStat, "canonicalPath" | "name">,
|
||||
referencedPaths: ReadonlySet<string>,
|
||||
): boolean {
|
||||
if (referencedPaths.has(file.canonicalPath)) {
|
||||
return false;
|
||||
}
|
||||
return isTrajectorySessionArtifactName(file.name);
|
||||
}
|
||||
|
||||
function isDiskBudgetRemovableSessionFile(
|
||||
file: Pick<SessionsDirFileStat, "canonicalPath" | "name">,
|
||||
referencedPaths: ReadonlySet<string>,
|
||||
): boolean {
|
||||
return isUnreferencedSessionArtifactFile(file, referencedPaths);
|
||||
}
|
||||
|
||||
async function removeFileIfExists(filePath: string): Promise<number> {
|
||||
const stat = await fs.promises.stat(filePath).catch(() => null);
|
||||
if (!stat?.isFile()) {
|
||||
return 0;
|
||||
}
|
||||
await fs.promises.rm(filePath, { force: true }).catch(() => undefined);
|
||||
return stat.size;
|
||||
}
|
||||
|
||||
async function removeFileForBudget(params: {
|
||||
filePath: string;
|
||||
canonicalPath?: string;
|
||||
dryRun: boolean;
|
||||
fileSizesByPath: Map<string, number>;
|
||||
simulatedRemovedPaths: Set<string>;
|
||||
onRemovedPath?: (canonicalPath: string) => void;
|
||||
}): Promise<number> {
|
||||
const resolvedPath = path.resolve(params.filePath);
|
||||
const canonicalPath = params.canonicalPath ?? canonicalizePathForComparison(resolvedPath);
|
||||
if (params.dryRun) {
|
||||
if (params.simulatedRemovedPaths.has(canonicalPath)) {
|
||||
return 0;
|
||||
}
|
||||
const size = params.fileSizesByPath.get(canonicalPath) ?? 0;
|
||||
if (size <= 0) {
|
||||
return 0;
|
||||
}
|
||||
params.simulatedRemovedPaths.add(canonicalPath);
|
||||
params.onRemovedPath?.(canonicalPath);
|
||||
return size;
|
||||
}
|
||||
const size = await removeFileIfExists(resolvedPath);
|
||||
if (size > 0) {
|
||||
params.onRemovedPath?.(canonicalPath);
|
||||
}
|
||||
return size;
|
||||
}
|
||||
|
||||
export async function pruneUnreferencedSessionArtifacts(params: {
|
||||
store: Record<string, SessionEntry>;
|
||||
storePath: string;
|
||||
olderThanMs: number;
|
||||
dryRun?: boolean;
|
||||
excludeCanonicalPaths?: ReadonlySet<string>;
|
||||
}): Promise<SessionUnreferencedArtifactSweepResult> {
|
||||
const olderThanMs =
|
||||
Number.isFinite(params.olderThanMs) && params.olderThanMs > 0 ? params.olderThanMs : 0;
|
||||
const sessionsDir = path.dirname(params.storePath);
|
||||
const files = await readSessionsDirFiles(sessionsDir);
|
||||
const fileSizesByPath = new Map(files.map((file) => [file.canonicalPath, file.size]));
|
||||
const simulatedRemovedPaths = new Set<string>();
|
||||
const referencedPaths = resolveReferencedSessionArtifactPaths({
|
||||
sessionsDir,
|
||||
store: params.store,
|
||||
});
|
||||
const cutoffMs = Date.now() - olderThanMs;
|
||||
const removableFiles = files
|
||||
.filter(
|
||||
(file) =>
|
||||
!params.excludeCanonicalPaths?.has(file.canonicalPath) &&
|
||||
file.mtimeMs <= cutoffMs &&
|
||||
isUnreferencedSessionArtifactFile(file, referencedPaths),
|
||||
)
|
||||
.toSorted((a, b) => a.mtimeMs - b.mtimeMs);
|
||||
|
||||
let removedFiles = 0;
|
||||
let freedBytes = 0;
|
||||
for (const file of removableFiles) {
|
||||
const deletedBytes = await removeFileForBudget({
|
||||
filePath: file.path,
|
||||
canonicalPath: file.canonicalPath,
|
||||
dryRun: params.dryRun === true,
|
||||
fileSizesByPath,
|
||||
simulatedRemovedPaths,
|
||||
});
|
||||
if (deletedBytes <= 0) {
|
||||
continue;
|
||||
}
|
||||
removedFiles += 1;
|
||||
freedBytes += deletedBytes;
|
||||
}
|
||||
|
||||
return {
|
||||
scannedFiles: files.length,
|
||||
removedFiles,
|
||||
freedBytes,
|
||||
olderThanMs,
|
||||
};
|
||||
}
|
||||
|
||||
export async function enforceSessionDiskBudget(params: {
|
||||
store: Record<string, SessionEntry>;
|
||||
storePath: string;
|
||||
activeSessionKey?: string;
|
||||
maintenance: SessionDiskBudgetConfig;
|
||||
warnOnly: boolean;
|
||||
dryRun?: boolean;
|
||||
log?: SessionDiskBudgetLogger;
|
||||
onRemoveFile?: (canonicalPath: string) => void;
|
||||
}): Promise<SessionDiskBudgetSweepResult | null> {
|
||||
const maxBytes = params.maintenance.maxDiskBytes;
|
||||
const highWaterBytes = params.maintenance.highWaterBytes;
|
||||
if (maxBytes == null || highWaterBytes == null) {
|
||||
return null;
|
||||
}
|
||||
const log = params.log ?? NOOP_LOGGER;
|
||||
const dryRun = params.dryRun === true;
|
||||
const sessionsDir = path.dirname(params.storePath);
|
||||
const files = await readSessionsDirFiles(sessionsDir);
|
||||
const fileSizesByPath = new Map(files.map((file) => [file.canonicalPath, file.size]));
|
||||
const simulatedRemovedPaths = new Set<string>();
|
||||
const resolvedStorePath = canonicalizePathForComparison(params.storePath);
|
||||
let total = files
|
||||
.filter((file) => file.canonicalPath !== resolvedStorePath)
|
||||
.reduce((sum, file) => sum + file.size, 0);
|
||||
const totalBefore = total;
|
||||
if (total <= maxBytes) {
|
||||
return {
|
||||
totalBytesBefore: totalBefore,
|
||||
totalBytesAfter: total,
|
||||
removedFiles: 0,
|
||||
removedEntries: 0,
|
||||
freedBytes: 0,
|
||||
maxBytes,
|
||||
highWaterBytes,
|
||||
overBudget: false,
|
||||
};
|
||||
}
|
||||
|
||||
if (params.warnOnly) {
|
||||
log.warn("session disk budget exceeded (warn-only mode)", {
|
||||
sessionsDir,
|
||||
totalBytes: total,
|
||||
maxBytes,
|
||||
highWaterBytes,
|
||||
});
|
||||
return {
|
||||
totalBytesBefore: totalBefore,
|
||||
totalBytesAfter: total,
|
||||
removedFiles: 0,
|
||||
removedEntries: 0,
|
||||
freedBytes: 0,
|
||||
maxBytes,
|
||||
highWaterBytes,
|
||||
overBudget: true,
|
||||
};
|
||||
}
|
||||
|
||||
let removedFiles = 0;
|
||||
let removedEntries = 0;
|
||||
let freedBytes = 0;
|
||||
|
||||
const referencedPaths = resolveReferencedSessionArtifactPaths({
|
||||
sessionsDir,
|
||||
store: params.store,
|
||||
});
|
||||
const removableFileQueue = files
|
||||
.filter((file) => isDiskBudgetRemovableSessionFile(file, referencedPaths))
|
||||
.toSorted((a, b) => a.mtimeMs - b.mtimeMs);
|
||||
for (const file of removableFileQueue) {
|
||||
if (total <= highWaterBytes) {
|
||||
break;
|
||||
}
|
||||
const deletedBytes = await removeFileForBudget({
|
||||
filePath: file.path,
|
||||
canonicalPath: file.canonicalPath,
|
||||
dryRun,
|
||||
fileSizesByPath,
|
||||
simulatedRemovedPaths,
|
||||
onRemovedPath: params.onRemoveFile,
|
||||
});
|
||||
if (deletedBytes <= 0) {
|
||||
continue;
|
||||
}
|
||||
total -= deletedBytes;
|
||||
freedBytes += deletedBytes;
|
||||
removedFiles += 1;
|
||||
}
|
||||
|
||||
if (total > highWaterBytes) {
|
||||
const activeSessionKey = normalizeOptionalLowercaseString(params.activeSessionKey);
|
||||
const sessionIdRefCounts = buildSessionIdRefCounts(params.store);
|
||||
const keys = Object.keys(params.store).toSorted((a, b) => {
|
||||
const aTime = getEntryUpdatedAt(params.store[a]);
|
||||
const bTime = getEntryUpdatedAt(params.store[b]);
|
||||
return aTime - bTime;
|
||||
});
|
||||
for (const key of keys) {
|
||||
if (total <= highWaterBytes) {
|
||||
break;
|
||||
}
|
||||
if (activeSessionKey && normalizeLowercaseStringOrEmpty(key) === activeSessionKey) {
|
||||
continue;
|
||||
}
|
||||
const entry = params.store[key];
|
||||
if (!entry) {
|
||||
continue;
|
||||
}
|
||||
if (isProtectedSessionMaintenanceEntry(key, entry)) {
|
||||
continue;
|
||||
}
|
||||
delete params.store[key];
|
||||
removedEntries += 1;
|
||||
|
||||
const sessionId = entry.sessionId;
|
||||
if (!sessionId) {
|
||||
continue;
|
||||
}
|
||||
const nextRefCount = (sessionIdRefCounts.get(sessionId) ?? 1) - 1;
|
||||
if (nextRefCount > 0) {
|
||||
sessionIdRefCounts.set(sessionId, nextRefCount);
|
||||
continue;
|
||||
}
|
||||
sessionIdRefCounts.delete(sessionId);
|
||||
for (const artifactPath of resolveSessionArtifactPathsForEntry({ sessionsDir, entry })) {
|
||||
const deletedBytes = await removeFileForBudget({
|
||||
filePath: artifactPath,
|
||||
dryRun,
|
||||
fileSizesByPath,
|
||||
simulatedRemovedPaths,
|
||||
onRemovedPath: params.onRemoveFile,
|
||||
});
|
||||
if (deletedBytes <= 0) {
|
||||
continue;
|
||||
}
|
||||
total -= deletedBytes;
|
||||
freedBytes += deletedBytes;
|
||||
removedFiles += 1;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (!dryRun) {
|
||||
if (total > highWaterBytes) {
|
||||
log.warn("session disk budget still above high-water target after cleanup", {
|
||||
sessionsDir,
|
||||
totalBytes: total,
|
||||
maxBytes,
|
||||
highWaterBytes,
|
||||
removedFiles,
|
||||
removedEntries,
|
||||
});
|
||||
} else if (removedFiles > 0 || removedEntries > 0) {
|
||||
log.info("applied session disk budget cleanup", {
|
||||
sessionsDir,
|
||||
totalBytesBefore: totalBefore,
|
||||
totalBytesAfter: total,
|
||||
maxBytes,
|
||||
highWaterBytes,
|
||||
removedFiles,
|
||||
removedEntries,
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
return {
|
||||
totalBytesBefore: totalBefore,
|
||||
totalBytesAfter: total,
|
||||
removedFiles,
|
||||
removedEntries,
|
||||
freedBytes,
|
||||
maxBytes,
|
||||
highWaterBytes,
|
||||
overBudget: true,
|
||||
};
|
||||
}
|
||||
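The deleted `enforceSessionDiskBudget` above swept oldest-first (by `mtimeMs`) until the sessions directory dropped to the configured high-water mark. A minimal standalone sketch of that eviction loop, with illustrative names and shapes rather than the removed API:

```typescript
type FileStat = { name: string; size: number; mtimeMs: number };

// Evict oldest files first until the running total is at or below highWater.
// Returns the names of the files that would be removed.
function evictToHighWater(files: FileStat[], highWater: number): string[] {
  let total = files.reduce((sum, f) => sum + f.size, 0);
  const removed: string[] = [];
  // Oldest first, mirroring the mtimeMs sort in the removed implementation.
  for (const f of [...files].sort((a, b) => a.mtimeMs - b.mtimeMs)) {
    if (total <= highWater) break;
    total -= f.size;
    removed.push(f.name);
  }
  return removed;
}

const files: FileStat[] = [
  { name: "old.jsonl", size: 500, mtimeMs: 1 },
  { name: "mid.jsonl", size: 300, mtimeMs: 2 },
  { name: "new.jsonl", size: 200, mtimeMs: 3 },
];
console.log(evictToHighWater(files, 400)); // oldest two are evicted
```

With SQLite-only transcripts there is no per-file sweep to run, which is why the whole module is deleted in this commit.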
@@ -1,7 +1,7 @@
import fs from "node:fs";
import fsPromises from "node:fs/promises";
import path from "node:path";
import { afterAll, afterEach, beforeAll, describe, expect, it, vi } from "vitest";
import { afterAll, afterEach, beforeAll, describe, expect, it } from "vitest";
import { upsertAcpSessionMeta } from "../../acp/runtime/session-meta.js";
import { createSuiteTempRootTracker, withTempDirSync } from "../../test-helpers/temp-dir.js";
import type { OpenClawConfig } from "../config.js";

@@ -1,4 +1,3 @@
import { parseByteSize } from "../../cli/parse-bytes.js";
import { parseDurationMs } from "../../cli/parse-duration.js";
import { createSubsystemLogger } from "../../logging/subsystem.js";
import {
@@ -20,7 +19,6 @@ const log = createSubsystemLogger("sessions/store");
const DEFAULT_SESSION_PRUNE_AFTER_MS = 30 * 24 * 60 * 60 * 1000;
const DEFAULT_SESSION_MAX_ENTRIES = 500;
const DEFAULT_SESSION_MAINTENANCE_MODE: SessionMaintenanceMode = "enforce";
const DEFAULT_SESSION_DISK_BUDGET_HIGH_WATER_RATIO = 0.8;
const STRICT_ENTRY_MAINTENANCE_MAX_ENTRIES = 49;
const MIN_BATCHED_ENTRY_MAINTENANCE_SLACK = 25;
const BATCHED_ENTRY_MAINTENANCE_SLACK_RATIO = 0.1;
@@ -39,8 +37,6 @@ export type ResolvedSessionMaintenanceConfig = {
  mode: SessionMaintenanceMode;
  pruneAfterMs: number;
  maxEntries: number;
  maxDiskBytes: number | null;
  highWaterBytes: number | null;
};

function resolvePruneAfterMs(maintenance?: SessionMaintenanceConfig): number {
@@ -56,54 +52,6 @@ function resolvePruneAfterMs(maintenance?: SessionMaintenanceConfig): number {
  }
}

function resolveMaxDiskBytes(maintenance?: SessionMaintenanceConfig): number | null {
  const raw = maintenance?.maxDiskBytes;
  const normalized = normalizeStringifiedOptionalString(raw);
  if (!normalized) {
    return null;
  }
  try {
    return parseByteSize(normalized, { defaultUnit: "b" });
  } catch {
    return null;
  }
}

function resolveHighWaterBytes(
  maintenance: SessionMaintenanceConfig | undefined,
  maxDiskBytes: number | null,
): number | null {
  const computeDefault = () => {
    if (maxDiskBytes == null) {
      return null;
    }
    if (maxDiskBytes <= 0) {
      return 0;
    }
    return Math.max(
      1,
      Math.min(
        maxDiskBytes,
        Math.floor(maxDiskBytes * DEFAULT_SESSION_DISK_BUDGET_HIGH_WATER_RATIO),
      ),
    );
  };
  if (maxDiskBytes == null) {
    return null;
  }
  const raw = maintenance?.highWaterBytes;
  const normalized = normalizeStringifiedOptionalString(raw);
  if (!normalized) {
    return computeDefault();
  }
  try {
    const parsed = parseByteSize(normalized, { defaultUnit: "b" });
    return Math.min(parsed, maxDiskBytes);
  } catch {
    return computeDefault();
  }
}

/**
 * Resolve maintenance settings from openclaw.json (`session.maintenance`).
 * Falls back to built-in defaults when config is missing or unset.
@@ -112,13 +60,10 @@ export function resolveMaintenanceConfigFromInput(
  maintenance?: SessionMaintenanceConfig,
): ResolvedSessionMaintenanceConfig {
  const pruneAfterMs = resolvePruneAfterMs(maintenance);
  const maxDiskBytes = resolveMaxDiskBytes(maintenance);
  return {
    mode: maintenance?.mode ?? DEFAULT_SESSION_MAINTENANCE_MODE,
    pruneAfterMs,
    maxEntries: maintenance?.maxEntries ?? DEFAULT_SESSION_MAX_ENTRIES,
    maxDiskBytes,
    highWaterBytes: resolveHighWaterBytes(maintenance, maxDiskBytes),
  };
}

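The deleted `resolveHighWaterBytes` defaulted the high-water mark to 80% of `maxDiskBytes` (the `DEFAULT_SESSION_DISK_BUDGET_HIGH_WATER_RATIO` constant), clamped into `[1, maxDiskBytes]`, with `null` and non-positive budgets passed through. A compact sketch of just that default computation (function name is illustrative):

```typescript
// Hypothetical distillation of the removed default: 80% of maxDiskBytes,
// floored, clamped to at least 1 byte and at most maxDiskBytes.
function defaultHighWater(maxDiskBytes: number | null): number | null {
  if (maxDiskBytes == null) return null; // no budget configured
  if (maxDiskBytes <= 0) return 0; // degenerate budget collapses to zero
  return Math.max(1, Math.min(maxDiskBytes, Math.floor(maxDiskBytes * 0.8)));
}

console.log(defaultHighWater(1000)); // → 800
```

An explicit `highWaterBytes` setting, when parseable, was capped at `maxDiskBytes`; otherwise this default applied. Both knobs are deprecated no-ops after this commit.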
@@ -102,10 +102,6 @@ function parseAssistantTranscriptEventText(event: unknown): AssistantTranscriptT
|
||||
};
|
||||
}
|
||||
|
||||
function parseAssistantTranscriptText(line: string): AssistantTranscriptText | undefined {
|
||||
return parseAssistantTranscriptEventText(JSON.parse(line));
|
||||
}
|
||||
|
||||
export async function resolveSessionTranscriptFile(params: {
|
||||
sessionId: string;
|
||||
sessionKey: string;
|
||||
|
||||
@@ -210,7 +210,7 @@ export type SessionConfig = {
|
||||
};
|
||||
/** Shared defaults for thread-bound session routing across channels/providers. */
|
||||
threadBindings?: SessionThreadBindingsConfig;
|
||||
/** Automatic session store maintenance (pruning, capping, archive retention, disk budget). */
|
||||
/** Explicit SQLite session-row maintenance (age and entry-count retention). */
|
||||
maintenance?: SessionMaintenanceConfig;
|
||||
};
|
||||
|
||||
@@ -232,15 +232,9 @@ export type SessionMaintenanceConfig = {
|
||||
maxEntries?: number;
|
||||
/** @deprecated Ignored. Run `openclaw doctor --fix` to remove. */
|
||||
rotateBytes?: number | string;
|
||||
/**
|
||||
* Optional per-agent sessions-directory disk budget (e.g. "500mb").
|
||||
* When exceeded, warn (mode=warn) or enforce oldest-first cleanup (mode=enforce).
|
||||
*/
|
||||
/** @deprecated Ignored. Session transcripts are stored in SQLite. */
|
||||
maxDiskBytes?: number | string;
|
||||
/**
|
||||
* Target size after disk-budget cleanup (high-water mark), e.g. "400mb".
|
||||
* Default: 80% of maxDiskBytes.
|
||||
*/
|
||||
/** @deprecated Ignored with maxDiskBytes. */
|
||||
highWaterBytes?: number | string;
|
||||
};
|
||||
|
||||
|
||||
@@ -22,24 +22,14 @@ describe("SessionSchema maintenance extensions", () => {
|
||||
).toThrow(/acquireTimeoutMs|number/i);
|
||||
});
|
||||
|
||||
it("accepts valid maintenance extensions", () => {
|
||||
expect(
|
||||
SessionSchema.safeParse({
|
||||
maintenance: {
|
||||
maxDiskBytes: "500mb",
|
||||
highWaterBytes: "350mb",
|
||||
},
|
||||
}),
|
||||
).toMatchObject({ success: true });
|
||||
});
|
||||
|
||||
it("rejects invalid maintenance extension values", () => {
|
||||
it("accepts ignored legacy disk budget settings", () => {
|
||||
expect(() =>
|
||||
SessionSchema.parse({
|
||||
maintenance: {
|
||||
maxDiskBytes: "big",
|
||||
highWaterBytes: "legacy",
|
||||
},
|
||||
}),
|
||||
).toThrow(/maxDiskBytes|size/i);
|
||||
).not.toThrow();
|
||||
});
|
||||
});
|
||||
|
||||
@@ -1,5 +1,4 @@
|
||||
import { z } from "zod";
|
||||
import { parseByteSize } from "../cli/parse-bytes.js";
|
||||
import { parseDurationMs } from "../cli/parse-duration.js";
|
||||
import { normalizeStringifiedOptionalString } from "../shared/string-coerce.js";
|
||||
import { ElevatedAllowFromSchema } from "./zod-schema.agent-runtime.js";
|
||||
@@ -104,32 +103,6 @@ export const SessionSchema = z
           });
         }
       }
-      if (val.maxDiskBytes !== undefined) {
-        try {
-          parseByteSize(normalizeStringifiedOptionalString(val.maxDiskBytes) ?? "", {
-            defaultUnit: "b",
-          });
-        } catch {
-          ctx.addIssue({
-            code: z.ZodIssueCode.custom,
-            path: ["maxDiskBytes"],
-            message: "invalid size (use b, kb, mb, gb, tb)",
-          });
-        }
-      }
-      if (val.highWaterBytes !== undefined) {
-        try {
-          parseByteSize(normalizeStringifiedOptionalString(val.highWaterBytes) ?? "", {
-            defaultUnit: "b",
-          });
-        } catch {
-          ctx.addIssue({
-            code: z.ZodIssueCode.custom,
-            path: ["highWaterBytes"],
-            message: "invalid size (use b, kb, mb, gb, tb)",
-          });
-        }
-      }
     })
     .optional(),
 })

@@ -1,5 +1,4 @@
 import { createHash } from "node:crypto";
-import fs from "node:fs";
 import path from "node:path";
 import { resolveSendableOutboundReplyParts } from "openclaw/plugin-sdk/reply-payload";
 import type { AgentMessage } from "../../agents/agent-core-contract.js";

@@ -1,8 +1,11 @@
-import fs from "node:fs";
 import {
   resolveSessionFilePath,
   resolveSessionFilePathOptions,
 } from "../../config/sessions/paths.js";
+import {
+  loadSqliteSessionTranscriptEvents,
+  resolveSqliteSessionTranscriptScope,
+} from "../../config/sessions/transcript-store.sqlite.js";
 import type { SessionEntry } from "../../config/sessions/types.js";
 import type { OpenClawConfig } from "../../config/types.openclaw.js";
 import { loadProviderUsageSummary } from "../../infra/provider-usage.js";

@@ -64,6 +67,26 @@ type CostUsageCacheEntry = {
   inFlight?: Promise<CostUsageSummary>;
 };

+function readSessionTranscriptUpdatedAt(params: {
+  agentId?: string;
+  sessionId: string;
+  sessionFile?: string;
+}): number | undefined {
+  const scope = resolveSqliteSessionTranscriptScope({
+    agentId: params.agentId,
+    sessionId: params.sessionId,
+    transcriptPath: params.sessionFile,
+  });
+  if (!scope) {
+    return undefined;
+  }
+  const events = loadSqliteSessionTranscriptEvents(scope);
+  if (events.length === 0) {
+    return undefined;
+  }
+  return events.at(-1)?.createdAt;
+}
+
 const costUsageCache = new Map<string, CostUsageCacheEntry>();

 function findCostUsageCacheEvictionKey(): string | undefined {

@@ -877,24 +900,24 @@ export const usageHandlers: GatewayRequestHandlers = {
       }

       if (sessionFile) {
-        try {
-          const stats = fs.statSync(sessionFile);
-          if (stats.isFile()) {
-            maybeMergeFamilyEntry({
-              mergedEntries,
-              groupingMode,
-              base: {
-                key: resolvedStoreKey,
-                sessionId,
-                sessionFile,
-                label: storeEntry?.label,
-                updatedAt: storeEntry?.updatedAt ?? stats.mtimeMs,
-                storeEntry,
-              },
-            });
-          }
-        } catch {
-          // File doesn't exist - no results for this key
-        }
+        const transcriptUpdatedAt = readSessionTranscriptUpdatedAt({
+          agentId: agentIdFromKey,
+          sessionId,
+          sessionFile,
+        });
+        if (transcriptUpdatedAt !== undefined) {
+          maybeMergeFamilyEntry({
+            mergedEntries,
+            groupingMode,
+            base: {
+              key: resolvedStoreKey,
+              sessionId,
+              sessionFile,
+              label: storeEntry?.label,
+              updatedAt: storeEntry?.updatedAt ?? transcriptUpdatedAt,
+              storeEntry,
+            },
+          });
+        }
       } else {

@@ -11,7 +11,6 @@ import {
   piSdkMock,
   rpcReq,
   startConnectedServerWithClient,
-  writeSessionStore,
 } from "./test-helpers.js";
 import {
   setupGatewaySessionsTestHarness,

@@ -1,4 +1,3 @@
-import fs from "node:fs/promises";
 import path from "node:path";
 import { expect, test } from "vitest";
 import { loadSessionStore } from "../config/sessions.js";

@@ -1,4 +1,3 @@
-import fs from "node:fs/promises";
 import path from "node:path";
 import { expect, test } from "vitest";
 import { loadSessionStore } from "../config/sessions.js";
@@ -246,7 +245,7 @@ test("sessions.reset emits enriched session_end and session_start hooks", async
 });

 test("sessions.reset returns unavailable when active run does not stop", async () => {
-  const { dir, storePath } = await seedActiveMainSession();
+  const { storePath } = await seedActiveMainSession();
   const waitCallCountAtSnapshotClear: number[] = [];
   bootstrapCacheMocks.clearBootstrapSnapshot.mockImplementation(() => {
     waitCallCountAtSnapshotClear.push(embeddedRunMock.waitCalls.length);

@@ -1,6 +1,4 @@
 import { randomUUID } from "node:crypto";
-import fs from "node:fs";
-import path from "node:path";
 import { getAcpSessionManager } from "../acp/control-plane/manager.js";
 import { getAcpRuntimeBackend } from "../acp/runtime/registry.js";
 import { readAcpSessionEntry, upsertAcpSessionMeta } from "../acp/runtime/session-meta.js";

@@ -12,7 +12,6 @@ import {
   readLatestSessionUsageFromTranscript,
   readLatestSessionUsageFromTranscriptAsync,
   readRecentSessionMessages,
   readRecentSessionMessagesAsync,
   readRecentSessionMessagesWithStats,
   readRecentSessionMessagesWithStatsAsync,
-  readRecentSessionTranscriptLines,

@@ -2,7 +2,7 @@ import fs from "node:fs";
 import os from "node:os";
 import path from "node:path";
 import { afterEach, describe, expect, test, vi } from "vitest";
-import { resetConfigRuntimeState, setRuntimeConfigSnapshot } from "../config/config.js";
+import { resetConfigRuntimeState } from "../config/config.js";
 import type { OpenClawConfig } from "../config/config.js";
 import type { SessionEntry } from "../config/sessions.js";
 import { replaceSqliteSessionTranscriptEvents } from "../config/sessions/transcript-store.sqlite.js";

@@ -1,9 +1,10 @@
 import fs from "node:fs/promises";
 import os from "node:os";
 import path from "node:path";
-import { afterAll, beforeAll, describe, expect, it, vi } from "vitest";
+import { afterAll, afterEach, beforeAll, describe, expect, it, vi } from "vitest";
 import type { OpenClawConfig } from "../../../config/config.js";
-import { writeWorkspaceFile } from "../../../test-helpers/workspace.js";
+import { replaceSqliteSessionTranscriptEvents } from "../../../config/sessions/transcript-store.sqlite.js";
+import { closeOpenClawStateDatabaseForTest } from "../../../state/openclaw-state-db.js";
 import { withEnvAsync } from "../../../test-utils/env.js";
 import { createHookEvent } from "../../hooks.js";
 import { generateSlugViaLLM } from "../../llm-slug-generator.js";

@@ -18,6 +19,7 @@ let handler: typeof import("./handler.js").default;
 let flushSessionMemoryWritesForTest: typeof import("./handler.js").flushSessionMemoryWritesForTest;
 let suiteWorkspaceRoot = "";
 let workspaceCaseCounter = 0;
+let originalStateDir: string | undefined;

 async function createCaseWorkspace(prefix = "case"): Promise<string> {
   const dir = path.join(suiteWorkspaceRoot, `${prefix}-${workspaceCaseCounter}`);

@@ -27,11 +29,23 @@ async function createCaseWorkspace(prefix = "case"): Promise<string> {
 }

 beforeAll(async () => {
-  ({ default: handler, flushSessionMemoryWritesForTest } = await import("./handler.js"));
   suiteWorkspaceRoot = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-session-memory-"));
+  originalStateDir = process.env.OPENCLAW_STATE_DIR;
+  process.env.OPENCLAW_STATE_DIR = path.join(suiteWorkspaceRoot, "state");
+  ({ default: handler, flushSessionMemoryWritesForTest } = await import("./handler.js"));
 });

+afterEach(() => {
+  closeOpenClawStateDatabaseForTest();
+});
+
 afterAll(async () => {
+  closeOpenClawStateDatabaseForTest();
+  if (originalStateDir === undefined) {
+    delete process.env.OPENCLAW_STATE_DIR;
+  } else {
+    process.env.OPENCLAW_STATE_DIR = originalStateDir;
+  }
   if (!suiteWorkspaceRoot) {
     return;
   }

@@ -63,6 +77,29 @@ function createMockSessionContent(
     .join("\n");
 }

+function parseMockSessionContent(content: string): unknown[] {
+  return content
+    .split(/\r?\n/)
+    .map((line) => line.trim())
+    .filter(Boolean)
+    .map((line) => JSON.parse(line));
+}
+
+function seedSessionTranscript(params: {
+  sessionId: string;
+  sessionFile: string;
+  content: string;
+  agentId?: string;
+}): void {
+  replaceSqliteSessionTranscriptEvents({
+    agentId: params.agentId ?? "main",
+    sessionId: params.sessionId,
+    transcriptPath: params.sessionFile,
+    events: parseMockSessionContent(params.content),
+    now: () => 1_770_000_000_000,
+  });
+}
+
 async function runNewWithPreviousSessionEntry(params: {
   tempDir: string;
   previousSessionEntry: { sessionId: string; sessionFile?: string };

@@ -109,9 +146,10 @@ async function runNewWithPreviousSession(params: {
   const sessionsDir = path.join(tempDir, "sessions");
   await fs.mkdir(sessionsDir, { recursive: true });

-  const sessionFile = await writeWorkspaceFile({
-    dir: sessionsDir,
-    name: "test-session.jsonl",
+  const sessionFile = path.join(sessionsDir, "test-session.jsonl");
+  seedSessionTranscript({
+    sessionId: "test-123",
+    sessionFile,
     content: params.sessionContent,
   });

@@ -144,22 +182,25 @@ async function createSessionMemoryWorkspace(params?: {
     return { tempDir, sessionsDir };
   }

-  const activeSessionFile = await writeWorkspaceFile({
-    dir: sessionsDir,
-    name: params.activeSession.name,
+  const activeSessionFile = path.join(sessionsDir, params.activeSession.name);
+  seedSessionTranscript({
+    sessionId: path.basename(params.activeSession.name, ".jsonl"),
+    sessionFile: activeSessionFile,
     content: params.activeSession.content,
   });
   return { tempDir, sessionsDir, activeSessionFile };
 }

 async function writeSessionTranscript(params: {
+  sessionId?: string;
   name: string;
   content: string;
 }): Promise<{ tempDir: string; sessionsDir: string; sessionFile: string }> {
   const { tempDir, sessionsDir } = await createSessionMemoryWorkspace();
-  const sessionFile = await writeWorkspaceFile({
-    dir: sessionsDir,
-    name: params.name,
+  const sessionFile = path.join(sessionsDir, params.name);
+  seedSessionTranscript({
+    sessionId: params.sessionId ?? path.basename(params.name, ".jsonl"),
+    sessionFile,
     content: params.content,
   });
   return { tempDir, sessionsDir, sessionFile };

@@ -193,6 +234,16 @@ async function expectPathMissing(targetPath: string): Promise<void> {
   await expect(fs.access(targetPath)).rejects.toMatchObject({ code: "ENOENT" });
 }

+async function waitUntil(condition: () => boolean, timeoutMs = 500): Promise<void> {
+  const deadline = Date.now() + timeoutMs;
+  while (!condition()) {
+    if (Date.now() > deadline) {
+      throw new Error("condition was not met before timeout");
+    }
+    await new Promise((resolve) => setTimeout(resolve, 5));
+  }
+}
+
 describe("session-memory hook", () => {
   it("skips non-command events", async () => {
     const tempDir = await createCaseWorkspace("workspace");

@@ -310,9 +361,10 @@ describe("session-memory hook", () => {
     const sessionsDir = path.join(tempDir, "sessions");
     await fs.mkdir(sessionsDir, { recursive: true });

-    const sessionFile = await writeWorkspaceFile({
-      dir: sessionsDir,
-      name: "test-session.jsonl",
+    const sessionFile = path.join(sessionsDir, "test-session.jsonl");
+    seedSessionTranscript({
+      sessionId: "test-123",
+      sessionFile,
       content: createMockSessionContent([
         { role: "user", content: "Investigate slow WhatsApp reset" },
         { role: "assistant", content: "Checking reset hooks" },

@@ -444,9 +496,10 @@ describe("session-memory hook", () => {
     const naviSessionsDir = path.join(naviWorkspace, "sessions");
     await fs.mkdir(naviSessionsDir, { recursive: true });

-    const sessionFile = await writeWorkspaceFile({
-      dir: naviSessionsDir,
-      name: "navi-session.jsonl",
+    const sessionFile = path.join(naviSessionsDir, "navi-session.jsonl");
+    seedSessionTranscript({
+      sessionId: "navi-session",
+      sessionFile,
       content: createMockSessionContent([
         { role: "user", content: "Remember this under Navi" },
         { role: "assistant", content: "Stored in the bound workspace" },

@@ -582,9 +635,9 @@ describe("session-memory hook", () => {
     const { sessionsDir } = await createSessionMemoryWorkspace();

     const sessionId = "missing-session-file";
-    await writeWorkspaceFile({
-      dir: sessionsDir,
-      name: `${sessionId}.jsonl`,
+    seedSessionTranscript({
+      sessionId,
+      sessionFile: path.join(sessionsDir, `${sessionId}.jsonl`),
       content: createMockSessionContent([
         { role: "user", content: "Recovered with missing sessionFile pointer" },
         { role: "assistant", content: "Recovered by sessionId fallback" },

@@ -614,9 +667,10 @@ describe("session-memory hook", () => {
     const sessionsDir = path.join(customAgentWorkspace, "sessions");
     await fs.mkdir(sessionsDir, { recursive: true });

-    const sessionFile = await writeWorkspaceFile({
-      dir: sessionsDir,
-      name: "custom-agent-session.jsonl",
+    const sessionFile = path.join(sessionsDir, "custom-agent-session.jsonl");
+    seedSessionTranscript({
+      sessionId: "custom-agent-session",
+      sessionFile,
       content: createMockSessionContent([
         { role: "user", content: "Custom agent conversation" },
         { role: "assistant", content: "Stored in agent workspace" },

@@ -1,5 +1,11 @@
-import fs from "node:fs/promises";
 import path from "node:path";
+import {
+  listSqliteSessionTranscriptFiles,
+  loadSqliteSessionTranscriptEvents,
+  resolveSqliteSessionTranscriptScope,
+  resolveSqliteSessionTranscriptScopeForPath,
+  type SqliteSessionTranscriptScope,
+} from "../../../config/sessions/transcript-store.sqlite.js";
 import { hasInterSessionUserProvenance } from "../../../sessions/input-provenance.js";

 function extractTextMessageContent(content: unknown): string | undefined {

@@ -26,15 +32,17 @@ export async function getRecentSessionContent(
   messageCount: number = 15,
 ): Promise<string | null> {
   try {
-    const content = await fs.readFile(sessionFilePath, "utf-8");
-    const lines = content.trim().split("\n");
+    const scope = resolveScopeForTranscriptPath(sessionFilePath);
+    if (!scope) {
+      return null;
+    }
+    const events = loadSqliteSessionTranscriptEvents(scope);

     const allMessages: string[] = [];
-    for (const line of lines) {
+    for (const { event } of events) {
       try {
-        const entry = JSON.parse(line);
-        if (entry.type === "message" && entry.message) {
-          const msg = entry.message as {
+        if (isRecord(event) && event.type === "message" && event.message) {
+          const msg = event.message as {
             role?: unknown;
             content?: unknown;
             provenance?: unknown;

@@ -61,27 +69,80 @@ export async function getRecentSessionContent(
   }
 }

+function isRecord(value: unknown): value is Record<string, unknown> {
+  return Boolean(value && typeof value === "object" && !Array.isArray(value));
+}
+
+function extractSessionIdFromTranscriptPath(sessionFilePath: string): string | undefined {
+  const base = path.basename(sessionFilePath);
+  if (!base.endsWith(".jsonl")) {
+    return undefined;
+  }
+  const stem = base.slice(0, -".jsonl".length);
+  const topicIndex = stem.indexOf("-topic-");
+  return topicIndex > 0 ? stem.slice(0, topicIndex) : stem || undefined;
+}
+
+function resolveScopeForTranscriptPath(
+  sessionFilePath: string,
+): SqliteSessionTranscriptScope | undefined {
+  const byPath = resolveSqliteSessionTranscriptScopeForPath({ transcriptPath: sessionFilePath });
+  if (byPath) {
+    return byPath;
+  }
+  const sessionId = extractSessionIdFromTranscriptPath(sessionFilePath);
+  if (!sessionId) {
+    return undefined;
+  }
+  return resolveSqliteSessionTranscriptScope({
+    sessionId,
+    transcriptPath: sessionFilePath,
+  });
+}
+
+function resolveRememberedPathInSessionsDir(params: {
+  sessionsDir: string;
+  sessionId: string;
+}): string | undefined {
+  const sessionsDir = path.resolve(params.sessionsDir);
+  const candidates = listSqliteSessionTranscriptFiles()
+    .filter((file) => path.dirname(path.resolve(file.path)) === sessionsDir)
+    .filter((file) => file.sessionId === params.sessionId)
+    .toSorted((a, b) => {
+      const updatedDelta = b.updatedAt - a.updatedAt;
+      return updatedDelta || b.path.localeCompare(a.path);
+    });

+  if (candidates.length === 0) {
+    return undefined;
+  }
+
+  const canonicalPath = path.join(sessionsDir, `${params.sessionId}.jsonl`);
+  const canonical = candidates.find((file) => path.resolve(file.path) === canonicalPath);
+  return canonical?.path ?? candidates[0]?.path;
+}
+
 export async function findPreviousSessionFile(params: {
   sessionsDir: string;
   sessionId?: string;
 }): Promise<string | undefined> {
   try {
-    const files = await fs.readdir(params.sessionsDir);
-    const fileSet = new Set(files);

     const trimmedSessionId = params.sessionId?.trim();
     if (trimmedSessionId) {
-      const canonicalFile = `${trimmedSessionId}.jsonl`;
-      if (fileSet.has(canonicalFile)) {
-        return path.join(params.sessionsDir, canonicalFile);
+      const rememberedPath = resolveRememberedPathInSessionsDir({
+        sessionsDir: params.sessionsDir,
+        sessionId: trimmedSessionId,
+      });
+      if (rememberedPath) {
+        return rememberedPath;
       }

-      const topicVariants = files
-        .filter((name) => name.startsWith(`${trimmedSessionId}-topic-`) && name.endsWith(".jsonl"))
-        .toSorted()
-        .toReversed();
-      if (topicVariants.length > 0) {
-        return path.join(params.sessionsDir, topicVariants[0]);
+      const scope = resolveSqliteSessionTranscriptScope({
+        sessionId: trimmedSessionId,
+        transcriptPath: path.join(params.sessionsDir, `${trimmedSessionId}.jsonl`),
+      });
+      if (scope) {
+        return path.join(params.sessionsDir, `${trimmedSessionId}.jsonl`);
       }
     }
   } catch {