refactor: move session state migration to doctor

Peter Steinberger
2026-05-07 22:00:59 +01:00
parent 3e7945e74d
commit 012a1e966b
44 changed files with 1746 additions and 1762 deletions

View File

@@ -61,27 +61,22 @@ This plan has started landing in slices:
canonical SQLite stores avoid that path. The cron timer no longer runs a
dedicated session reaper; cron run sessions are maintained through the same
explicit session cleanup path as other rows.
- Transcript events have a SQLite store primitive with JSONL import/export.
Transcript append paths dual-write when the caller already has agent and
session scope, including gateway-injected assistant messages. Scoped appends
also import the current JSONL stream into SQLite when the SQLite transcript is
empty, so headers and legacy rows are not skipped before the new event is
mirrored. Scoped latest/tail assistant transcript reads can now use the
SQLite mirror first, and delivery-mirror idempotency/latest-match checks use
the same scoped mirror before falling back to JSONL for legacy or file-only
callers. `/export-session` and `before_reset` hook payload construction can
also read scoped SQLite transcript events when the compatibility JSONL is
missing, and silent session-rotation replay can use the scoped SQLite
transcript tail before falling back to JSONL. Shared async Gateway transcript
readers also have a scoped SQLite fallback for chat history, TUI history,
restart and subagent recovery, managed outgoing media indexing, token
estimation, title/preview/usage helpers, and bounded session inspection
surfaces. JSONL remains the compatibility file while the transcript moves to
OpenClaw-owned semantics. The remaining transcript tail rewrites for
recovery/yield cleanup are now isolated behind OpenClaw-owned helpers instead
of being duplicated inline, and live runs no longer need PI's private
first-run persistence normalization because OpenClaw's file-backed manager
persists the header and initial user message synchronously.
- Transcript events are SQLite-primary. OpenClaw-owned append paths require
agent/session scope and write `transcript_events` directly; `*.jsonl` is no
longer a runtime mirror for those paths. JSONL is now an explicit
import/export/debug shape only. The OpenClaw transcript session manager,
Gateway-injected assistant messages, CLI transcript persistence, Codex
app-server mirroring, compaction successor transcripts, manual compaction
boundary rewrites, and reset/header creation all persist through SQLite.
Scoped latest/tail assistant reads, delivery-mirror idempotency/latest-match
checks, `/export-session`, `before_reset` hook payloads, silent rotation
replay, chat/TUI history, restart/subagent recovery, managed media indexing,
token estimation, title/preview/usage helpers, runtime transcript repair,
bootstrap completion checks, and bounded inspection all use the scoped SQLite
transcript. Legacy JSONL import is doctor/import/debug only: `openclaw doctor
--fix` builds the transcript database from old files and removes the JSONL
sources after successful import. Runtime paths do not import, prune, or repair
JSONL files.
- `AgentFilesystem` and `SqliteVirtualAgentFs` exist for scratch storage, with
`disk`, `vfs-scratch`, and `vfs-only` filesystem modes at the runtime
boundary. VFS contents can be listed and exported for support bundles. When
@@ -195,12 +190,12 @@ This plan has started landing in slices:
OpenAI completion conversion subpaths route through narrow OpenClaw facades.
TUI imports route through `src/agents/pi-tui-contract.ts`, with
`src/tui/pi-tui-contract.ts` left as a local compatibility re-export.
- Transcript JSONL header, entry, tree, parser, legacy migration, context
- Transcript header, entry, tree, parser, legacy migration, context
builder, and session-manager structural types are now defined by OpenClaw's
transcript contract. The parser, migration, and context builder runtime
helpers have one OpenClaw-owned implementation under `src/agents/transcript`
instead of duplicated facade/file-state logic. OpenClaw also owns a
synchronous file-backed transcript session manager that implements the live
synchronous SQLite-backed transcript session manager that implements the live
`SessionManager` shape over `TranscriptFileState`, including header creation,
append persistence, tree, label, branch, session name, branch-summary,
in-memory, create/open, list/listAll, and fork APIs. Live embedded runs,
@@ -356,9 +351,10 @@ Migration order:
stores.
5. Import old `sessions.json` only from `openclaw doctor --fix`, then remove the
JSON index after SQLite has the rows. Done for session indexes.
6. Leave `*.jsonl` transcripts on disk while PI owns transcript semantics.
7. After session manager ownership moves behind OpenClaw APIs, store transcript
events in SQLite and export JSONL for compatibility.
6. Import old `*.jsonl` transcripts only from `openclaw doctor --fix`, then
remove the JSONL source after SQLite has the events. Done for canonical
transcript files.
7. Keep JSONL export as explicit debug/support output only.
Keep `openclaw.json` and `auth-profiles.json` file-backed until operator
repair, secret audit, and backup flows can handle the SQLite layout naturally.
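The import-then-remove ordering in step 6 can be sketched as follows. This is a minimal illustration, not OpenClaw's actual doctor code; `importJsonlTranscript` and the `Map` standing in for the SQLite `transcript_events` store are hypothetical names.

```typescript
import * as fs from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

type TranscriptEvent = { type: string; id?: string; [key: string]: unknown };

// Stand-in for the SQLite `transcript_events` store.
const store = new Map<string, TranscriptEvent[]>();

// Import one legacy JSONL transcript, then remove the source file only after
// every event has landed, mirroring "remove the JSONL source after SQLite has
// the events".
function importJsonlTranscript(agentId: string, jsonlPath: string): number {
  const sessionId = path.basename(jsonlPath).replace(/\.jsonl$/i, "");
  const events: TranscriptEvent[] = [];
  for (const line of fs.readFileSync(jsonlPath, "utf8").split(/\r?\n/)) {
    if (!line.trim()) continue;
    // A malformed line throws here and aborts the import, leaving the
    // JSONL source untouched for inspection.
    events.push(JSON.parse(line) as TranscriptEvent);
  }
  store.set(`${agentId}:${sessionId}`, events);
  fs.unlinkSync(jsonlPath);
  return events.length;
}
```

The key property is that deletion happens strictly after a fully successful import, so a crash mid-migration never loses the only copy of a transcript.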
@@ -588,7 +584,7 @@ Phase 5: transcript ownership
- Move transcript mutation behind OpenClaw APIs.
- Store transcript events in SQLite.
- Export JSONL for compatibility and debugging.
- Import legacy JSONL through doctor only; export JSONL for debugging/support.
- Remove direct PI `SessionManager` usage from non-adapter code.
Phase 6: internalize or replace PI pieces

View File

@@ -1,7 +1,7 @@
---
summary: "Deep dive: session store + transcripts, lifecycle, and (auto)compaction internals"
read_when:
- You need to debug session ids, transcript JSONL, SQLite session rows, or legacy sessions.json fields
- You need to debug session ids, SQLite session rows/events, or doctor migration of legacy sessions.json/JSONL files
- You are changing auto-compaction behavior or adding "pre-compaction" housekeeping
- You want to implement memory flushes or silent system turns
title: "Session management deep dive"
@@ -11,7 +11,7 @@ OpenClaw manages sessions end-to-end across these areas:
- **Session routing** (how inbound messages map to a `sessionKey`)
- **Session store** and what it tracks
- **Transcript persistence** (`*.jsonl`) and its structure
- **Transcript persistence** (SQLite event streams, doctor-only JSONL import, explicit debug export) and its structure
- **Transcript hygiene** (provider-specific fixups before runs)
- **Context limits** (context window vs tracked tokens)
- **Compaction** (manual and auto-compaction) and where to hook pre-compaction work
@@ -47,17 +47,15 @@ OpenClaw persists sessions in two layers:
- Tracks session metadata (current session id, last activity, toggles, token counters, etc.)
2. **Transcript (`<sessionId>.jsonl`)**
- Append-only transcript with tree structure (entries have `id` + `parentId`)
- SQLite-backed transcript event stream with tree structure (entries have `id` + `parentId`)
- Stores the actual conversation + tool calls + compaction summaries
- Used to rebuild the model context for future turns
- Mirrored into SQLite for scoped Gateway appends; scoped latest/tail
assistant-text lookups, session exports, and `before_reset` hook payloads
prefer that mirror and fall back to JSONL. Silent session rotations also
replay recent user/assistant turns from the scoped SQLite mirror when
available. Shared async Gateway transcript readers fall back to the scoped
SQLite mirror for chat history, TUI history, recovery, managed media
indexing, token estimation, title/preview/usage helpers, and bounded
session inspection when the compatibility JSONL is missing.
- Stored in SQLite for OpenClaw-owned runtime paths; JSONL is legacy
import/export/debug compatibility, not a runtime sidecar
- Scoped latest/tail assistant-text lookups, session exports, `before_reset`
hook payloads, silent session rotations, chat history, TUI history,
recovery, managed media indexing, token estimation, title/preview/usage
helpers, and bounded session inspection read the scoped SQLite transcript.
- Large pre-compaction debug checkpoints are skipped once the active
transcript exceeds the checkpoint size cap, avoiding a second giant
`.checkpoint.*.jsonl` copy.
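A scoped "latest/tail assistant-text" lookup over the event stream can be sketched like this. The event shapes are assumed from the structure described above (this is not OpenClaw's real reader), scanning from the tail as a latest-match check would:

```typescript
// Assumed event shape, loosely matching the documented structure.
type TranscriptEvent = {
  type: string;
  id?: string;
  parentId?: string | null;
  message?: { role?: string; content?: unknown };
};

// Return the text of the most recent assistant message, if any.
function latestAssistantText(events: TranscriptEvent[]): string | undefined {
  for (let i = events.length - 1; i >= 0; i--) {
    const ev = events[i];
    if (ev.type !== "message" || ev.message?.role !== "assistant") continue;
    const content = ev.message.content;
    if (typeof content === "string") return content;
    if (Array.isArray(content)) {
      // Content may be structured parts; keep only text parts.
      return (content as Array<{ type?: string; text?: string }>)
        .filter((p) => p.type === "text" && typeof p.text === "string")
        .map((p) => p.text as string)
        .join("");
    }
  }
  return undefined;
}
```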
@@ -78,8 +76,11 @@ Per agent, on the Gateway host:
imports legacy `~/.openclaw/agents/<agentId>/sessions/sessions.json` indexes
into SQLite and removes the JSON index after import; Gateway startup leaves
legacy indexes alone.
- Transcripts: `~/.openclaw/agents/<agentId>/sessions/<sessionId>.jsonl`
- Telegram topic sessions: `.../<sessionId>-topic-<threadId>.jsonl`
- Transcripts: `~/.openclaw/state/openclaw.sqlite` (`transcript_events` and
`transcript_files`). Legacy/export paths may still use
`~/.openclaw/agents/<agentId>/sessions/<sessionId>.jsonl` names as stable
handles.
- Telegram topic handles: `.../<sessionId>-topic-<threadId>.jsonl`
OpenClaw resolves these via `src/config/sessions/*`.
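Treating legacy `<sessionId>.jsonl` names as stable handles amounts to stripping the suffix from the basename, the same shape as the `sessionIdFromFile` test helper elsewhere in this commit. A minimal sketch (`sessionIdFromHandle` is an illustrative name):

```typescript
import * as path from "node:path";

// Derive the session id from a legacy `.jsonl` handle path; the suffix
// match is case-insensitive so `.JSONL` handles resolve identically.
function sessionIdFromHandle(handle: string): string {
  return path.basename(handle).replace(/\.jsonl$/i, "");
}
```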
@@ -105,10 +106,9 @@ configured age, count, or disk budget.
OpenClaw no longer creates automatic `sessions.json.bak.*` rotation backups during Gateway writes. The legacy `session.maintenance.rotateBytes` key is ignored and `openclaw doctor --fix` removes it from older configs.
Transcript mutations use a session write lock on the transcript file. Lock acquisition waits up to
`session.writeLock.acquireTimeoutMs` before surfacing a busy-session error; the default is `60000`
ms. Raise this only when legitimate prep, cleanup, compaction, or transcript mirror work contends
longer on slow machines. Stale-lock detection and maximum hold warnings remain separate policies.
Transcript mutations are serialized through SQLite transactions plus the
per-session append queue. The legacy `session.writeLock.acquireTimeoutMs`
setting remains for older import/debug paths that still touch JSONL files.
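The per-session append queue can be sketched as promise chaining keyed by session: appends for the same session run strictly in order, while different sessions proceed in parallel. This is an assumed implementation, not OpenClaw's actual code:

```typescript
// One pending-tail promise per session key.
const queues = new Map<string, Promise<void>>();

function enqueueAppend(sessionKey: string, write: () => Promise<void>): Promise<void> {
  const prev = queues.get(sessionKey) ?? Promise.resolve();
  // Chain after the previous append; swallow prior failures so one bad
  // write does not wedge the queue for later appends.
  const next = prev.catch(() => undefined).then(write);
  queues.set(sessionKey, next.catch(() => undefined));
  return next; // caller still observes this append's own failure
}
```

Combined with SQLite transactions, this gives the same serialization the old file write lock provided, without a lock file on disk.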
Enforcement order for disk budget cleanup (`mode: "enforce"`):
@@ -209,15 +209,18 @@ The store is safe to edit, but the Gateway is the authority: it may rewrite or r
---
## Transcript structure (`*.jsonl`)
## Transcript structure
Transcripts are managed by `@mariozechner/pi-coding-agent`'s `SessionManager`.
Transcripts are managed by OpenClaw's SQLite-backed `SessionManager`.
The file is JSONL:
The event stream is stored in `transcript_events`:
- First line: session header (`type: "session"`, includes `id`, `cwd`, `timestamp`, optional `parentSession`)
- First event: session header (`type: "session"`, includes `id`, `cwd`,
`timestamp`, optional `parentSession`)
- Then: session entries with `id` + `parentId` (tree)
JSONL import/export uses the same event shape, one JSON object per line.
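A concrete example of that shared shape, with illustrative field values, showing the header-first ordering and a lossless JSONL round trip:

```typescript
// Header event first, then tree entries (`id` + `parentId`).
const events = [
  { type: "session", id: "s1", cwd: "/work", timestamp: "2026-05-07T21:00:00Z" },
  { type: "message", id: "e1", parentId: null, message: { role: "user", content: "hello" } },
  { type: "message", id: "e2", parentId: "e1", message: { role: "assistant", content: "hi" } },
];

// JSONL export: one JSON object per line, in event order.
const jsonl = events.map((e) => JSON.stringify(e)).join("\n") + "\n";

// JSONL import: parse each non-empty line back into an event.
const roundTripped = jsonl
  .trim()
  .split("\n")
  .map((line) => JSON.parse(line));
```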
Notable entry types:
- `message`: user/assistant/toolResult messages

View File

@@ -2318,6 +2318,7 @@ async function mirrorTranscriptBestEffort(params: {
try {
await mirrorCodexAppServerTranscript({
sessionFile: params.params.sessionFile,
sessionId: params.params.sessionId,
agentId: params.agentId,
sessionKey: params.sessionKey,
messages: params.result.messagesSnapshot,

View File

@@ -13,7 +13,12 @@ import {
makeAgentAssistantMessage,
makeAgentUserMessage,
} from "openclaw/plugin-sdk/test-fixtures";
import { afterEach, describe, expect, it } from "vitest";
import { afterEach, describe, expect, it, vi } from "vitest";
import {
loadSqliteSessionTranscriptEvents,
replaceSqliteSessionTranscriptEvents,
} from "../../../../src/config/sessions/transcript-store.sqlite.js";
import { closeOpenClawStateDatabaseForTest } from "../../../../src/state/openclaw-state-db.js";
import { attachCodexMirrorIdentity, mirrorCodexAppServerTranscript } from "./transcript-mirror.js";
type MirroredAgentMessage = Extract<AgentMessage, { role: "user" | "assistant" | "toolResult" }>;
@@ -29,6 +34,8 @@ const tempDirs: string[] = [];
afterEach(async () => {
resetGlobalHookRunner();
closeOpenClawStateDatabaseForTest();
vi.unstubAllEnvs();
for (const dir of tempDirs.splice(0)) {
await fs.rm(dir, { recursive: true, force: true });
}
@@ -37,23 +44,31 @@ afterEach(async () => {
async function createTempSessionFile() {
const dir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-codex-transcript-"));
tempDirs.push(dir);
vi.stubEnv("OPENCLAW_STATE_DIR", dir);
return path.join(dir, "session.jsonl");
}
async function makeRoot(prefix: string): Promise<string> {
const root = await fs.mkdtemp(path.join(os.tmpdir(), prefix));
tempDirs.push(root);
vi.stubEnv("OPENCLAW_STATE_DIR", root);
return root;
}
function parseJsonLines<T>(raw: string): T[] {
const records: T[] = [];
for (const line of raw.trim().split("\n")) {
if (line.length > 0) {
records.push(JSON.parse(line) as T);
}
}
return records;
function sessionIdFromFile(sessionFile: string): string {
return path.basename(sessionFile).replace(/\.jsonl$/i, "");
}
function readTranscriptEvents(sessionFile: string, sessionId = sessionIdFromFile(sessionFile)) {
return loadSqliteSessionTranscriptEvents({
agentId: "main",
sessionId,
}).map((entry) => entry.event);
}
function readTranscriptRaw(sessionFile: string, sessionId = sessionIdFromFile(sessionFile)) {
const lines = readTranscriptEvents(sessionFile, sessionId).map((event) => JSON.stringify(event));
return lines.length ? `${lines.join("\n")}\n` : "";
}
describe("mirrorCodexAppServerTranscript", () => {
@@ -88,7 +103,7 @@ describe("mirrorCodexAppServerTranscript", () => {
idempotencyScope: "scope-1",
});
const raw = await fs.readFile(sessionFile, "utf8");
const raw = readTranscriptRaw(sessionFile);
expect(raw).toContain('"role":"user"');
expect(raw).toContain('"content":[{"type":"text","text":"hello"}]');
expect(raw).toContain('"role":"assistant"');
@@ -121,7 +136,7 @@ describe("mirrorCodexAppServerTranscript", () => {
idempotencyScope: "scope-1",
});
const raw = await fs.readFile(sessionFile, "utf8");
const raw = readTranscriptRaw(sessionFile);
expect(raw).toContain('"role":"assistant"');
expect(raw).toContain('"content":[{"type":"text","text":"first mirror"}]');
});
@@ -152,9 +167,11 @@ describe("mirrorCodexAppServerTranscript", () => {
idempotencyScope: "scope-1",
});
const records = parseJsonLines<{ type?: string; message?: { role?: string } }>(
await fs.readFile(sessionFile, "utf8"),
);
const records = readTranscriptRaw(sessionFile)
.trim()
.split("\n")
.filter(Boolean)
.map((line) => JSON.parse(line) as { type?: string; message?: { role?: string } });
expect(records.slice(1)).toHaveLength(2);
});
@@ -185,7 +202,7 @@ describe("mirrorCodexAppServerTranscript", () => {
idempotencyScope: "scope-1",
});
const raw = await fs.readFile(sessionFile, "utf8");
const raw = readTranscriptRaw(sessionFile);
expect(raw).toContain('"content":[{"type":"text","text":"hello [hooked]"}]');
// The idempotency fingerprint is derived from the pre-hook message so a
// hook rewrite cannot bypass dedupe by reshaping content on every retry.
@@ -221,7 +238,7 @@ describe("mirrorCodexAppServerTranscript", () => {
idempotencyScope: "scope-1",
});
const raw = await fs.readFile(sessionFile, "utf8");
const raw = readTranscriptRaw(sessionFile);
expect(raw).toContain(
`"idempotencyKey":"scope-1:assistant:${expectedFingerprint(sourceMessage)}"`,
);
@@ -251,33 +268,36 @@ describe("mirrorCodexAppServerTranscript", () => {
idempotencyScope: "scope-1",
});
await expect(fs.readFile(sessionFile, "utf8")).rejects.toMatchObject({ code: "ENOENT" });
expect(readTranscriptRaw(sessionFile)).toBe("");
});
it("migrates small linear transcripts before mirroring", async () => {
const sessionFile = await createTempSessionFile();
await fs.writeFile(
sessionFile,
[
JSON.stringify({
replaceSqliteSessionTranscriptEvents({
agentId: "main",
sessionId: "linear-codex-session",
transcriptPath: sessionFile,
events: [
{
type: "session",
version: 3,
id: "linear-codex-session",
timestamp: new Date().toISOString(),
cwd: process.cwd(),
}),
JSON.stringify({
},
{
type: "message",
id: "legacy-user",
parentId: null,
timestamp: new Date().toISOString(),
message: { role: "user", content: "legacy user" },
}),
].join("\n") + "\n",
"utf8",
);
},
],
});
await mirrorCodexAppServerTranscript({
sessionFile,
sessionId: "linear-codex-session",
sessionKey: "session-1",
messages: [
makeAgentAssistantMessage({
@@ -288,7 +308,7 @@ describe("mirrorCodexAppServerTranscript", () => {
idempotencyScope: "scope-1",
});
const records = (await fs.readFile(sessionFile, "utf8"))
const records = readTranscriptRaw(sessionFile, "linear-codex-session")
.trim()
.split("\n")
.map(
@@ -373,9 +393,7 @@ describe("mirrorCodexAppServerTranscript", () => {
idempotencyScope: "codex-app-server:thread-X",
});
const messageTexts = readFileMessages(await fs.readFile(sessionFile, "utf8")).map(
(m) => m.text,
);
const messageTexts = readFileMessages(readTranscriptRaw(sessionFile)).map((m) => m.text);
expect(messageTexts).toEqual(["hello", "hi there", "[Codex reasoning] thinking"]);
});
@@ -427,7 +445,7 @@ describe("mirrorCodexAppServerTranscript", () => {
idempotencyScope: "codex-app-server:thread-X",
});
expect(readFileMessages(await fs.readFile(sessionFile, "utf8"))).toEqual([
expect(readFileMessages(readTranscriptRaw(sessionFile))).toEqual([
{ role: "user", text: "yes" },
{ role: "assistant", text: "ok 1" },
{ role: "user", text: "yes" },
@@ -487,7 +505,7 @@ describe("mirrorCodexAppServerTranscript", () => {
idempotencyScope: "codex-app-server:thread-X",
});
expect(readFileMessages(await fs.readFile(sessionFile, "utf8"))).toEqual([
expect(readFileMessages(readTranscriptRaw(sessionFile))).toEqual([
{ role: "user", text: "msg1" },
{ role: "assistant", text: "reply1" },
{ role: "user", text: "msg2" },
@@ -517,7 +535,7 @@ describe("mirrorCodexAppServerTranscript", () => {
idempotencyScope: "scope-1",
});
const raw = await fs.readFile(sessionFile, "utf8");
const raw = readTranscriptRaw(sessionFile);
expect(raw).toContain(`"idempotencyKey":"scope-1:user:${expectedFingerprint(userMessage)}"`);
expect(raw).toContain(
`"idempotencyKey":"scope-1:assistant:${expectedFingerprint(assistantMessage)}"`,

View File

@@ -1,15 +1,15 @@
import { createHash } from "node:crypto";
import fs from "node:fs/promises";
import path from "node:path";
import {
acquireSessionWriteLock,
appendSessionTranscriptMessage,
emitSessionTranscriptUpdate,
resolveSessionWriteLockAcquireTimeoutMs,
runAgentHarnessBeforeMessageWriteHook,
type AgentMessage,
type SessionWriteLockAcquireTimeoutConfig,
} from "openclaw/plugin-sdk/agent-harness-runtime";
const DEFAULT_AGENT_ID = "main";
type MirroredAgentMessage = Extract<AgentMessage, { role: "user" | "assistant" | "toolResult" }>;
const MIRROR_IDENTITY_META_KEY = "mirrorIdentity" as const;
@@ -67,6 +67,7 @@ function buildMirrorDedupeIdentity(message: MirroredAgentMessage): string {
export async function mirrorCodexAppServerTranscript(params: {
sessionFile: string;
sessionId?: string;
sessionKey?: string;
agentId?: string;
messages: AgentMessage[];
@@ -81,51 +82,46 @@ export async function mirrorCodexAppServerTranscript(params: {
return;
}
const lock = await acquireSessionWriteLock({
sessionFile: params.sessionFile,
timeoutMs: resolveSessionWriteLockAcquireTimeoutMs(params.config),
});
try {
const existingIdempotencyKeys = await readTranscriptIdempotencyKeys(params.sessionFile);
for (const message of messages) {
const dedupeIdentity = buildMirrorDedupeIdentity(message);
const idempotencyKey = params.idempotencyScope
? `${params.idempotencyScope}:${dedupeIdentity}`
: undefined;
if (idempotencyKey && existingIdempotencyKeys.has(idempotencyKey)) {
continue;
}
const transcriptMessage = {
...message,
...(idempotencyKey ? { idempotencyKey } : {}),
} as AgentMessage;
const nextMessage = runAgentHarnessBeforeMessageWriteHook({
message: transcriptMessage,
agentId: params.agentId,
sessionKey: params.sessionKey,
});
if (!nextMessage) {
continue;
}
const messageToAppend = (
idempotencyKey
? {
...(nextMessage as unknown as Record<string, unknown>),
idempotencyKey,
}
: nextMessage
) as AgentMessage;
await appendSessionTranscriptMessage({
transcriptPath: params.sessionFile,
message: messageToAppend,
config: params.config,
});
if (idempotencyKey) {
existingIdempotencyKeys.add(idempotencyKey);
}
const agentId = params.agentId?.trim() || DEFAULT_AGENT_ID;
const sessionId =
params.sessionId?.trim() ||
path
.basename(params.sessionFile)
.replace(/\.jsonl$/i, "")
.trim();
for (const message of messages) {
const dedupeIdentity = buildMirrorDedupeIdentity(message);
const idempotencyKey = params.idempotencyScope
? `${params.idempotencyScope}:${dedupeIdentity}`
: undefined;
const transcriptMessage = {
...message,
...(idempotencyKey ? { idempotencyKey } : {}),
} as AgentMessage;
const nextMessage = runAgentHarnessBeforeMessageWriteHook({
message: transcriptMessage,
agentId: params.agentId,
sessionKey: params.sessionKey,
});
if (!nextMessage) {
continue;
}
} finally {
await lock.release();
const messageToAppend = (
idempotencyKey
? {
...(nextMessage as unknown as Record<string, unknown>),
idempotencyKey,
}
: nextMessage
) as AgentMessage;
await appendSessionTranscriptMessage({
transcriptPath: params.sessionFile,
agentId,
sessionId,
message: messageToAppend,
config: params.config,
});
}
if (params.sessionKey) {
@@ -134,30 +130,3 @@ export async function mirrorCodexAppServerTranscript(params: {
emitSessionTranscriptUpdate(params.sessionFile);
}
}
async function readTranscriptIdempotencyKeys(sessionFile: string): Promise<Set<string>> {
const keys = new Set<string>();
let raw: string;
try {
raw = await fs.readFile(sessionFile, "utf8");
} catch (error) {
if ((error as NodeJS.ErrnoException).code !== "ENOENT") {
throw error;
}
return keys;
}
for (const line of raw.split(/\r?\n/)) {
if (!line.trim()) {
continue;
}
try {
const parsed = JSON.parse(line) as { message?: { idempotencyKey?: unknown } };
if (typeof parsed.message?.idempotencyKey === "string") {
keys.add(parsed.message.idempotencyKey);
}
} catch {
continue;
}
}
return keys;
}

View File

@@ -1,11 +1,13 @@
import fs from "node:fs/promises";
import path from "node:path";
import { afterEach, beforeEach, describe, expect, it } from "vitest";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import { replaceSqliteSessionTranscriptEvents } from "../config/sessions/transcript-store.sqlite.js";
import {
clearInternalHooks,
registerInternalHook,
type AgentBootstrapHookContext,
} from "../hooks/internal-hooks.js";
import { closeOpenClawStateDatabaseForTest } from "../state/openclaw-state-db.js";
import { makeTempWorkspace } from "../test-helpers/workspace.js";
import {
_resetBootstrapWarningCacheForTest,
@@ -273,156 +275,135 @@ describe("hasCompletedBootstrapTurn", () => {
beforeEach(async () => {
tmpDir = await fs.mkdtemp(path.join(await fs.realpath("/tmp"), "openclaw-bootstrap-turn-"));
vi.stubEnv("OPENCLAW_STATE_DIR", tmpDir);
});
afterEach(async () => {
closeOpenClawStateDatabaseForTest();
vi.unstubAllEnvs();
await fs.rm(tmpDir, { recursive: true, force: true });
});
function writeTranscript(sessionFile: string, events: unknown[]): void {
const sessionId =
events.find((event): event is { type: "session"; id: string } =>
Boolean(
event &&
typeof event === "object" &&
(event as { type?: unknown }).type === "session" &&
typeof (event as { id?: unknown }).id === "string",
),
)?.id ?? path.basename(sessionFile, ".jsonl");
replaceSqliteSessionTranscriptEvents({
agentId: "main",
sessionId,
transcriptPath: sessionFile,
events,
});
}
it("returns false when session file does not exist", async () => {
expect(await hasCompletedBootstrapTurn(path.join(tmpDir, "missing.jsonl"))).toBe(false);
});
it("returns false for empty session files", async () => {
const sessionFile = path.join(tmpDir, "empty.jsonl");
await fs.writeFile(sessionFile, "", "utf8");
expect(await hasCompletedBootstrapTurn(sessionFile)).toBe(false);
});
it("returns false for header-only session files", async () => {
const sessionFile = path.join(tmpDir, "header-only.jsonl");
await fs.writeFile(sessionFile, `${JSON.stringify({ type: "session", id: "s1" })}\n`, "utf8");
writeTranscript(sessionFile, [{ type: "session", id: "s1" }]);
expect(await hasCompletedBootstrapTurn(sessionFile)).toBe(false);
});
it("returns false when no assistant turn has been flushed yet", async () => {
const sessionFile = path.join(tmpDir, "user-only.jsonl");
await fs.writeFile(
sessionFile,
[
JSON.stringify({ type: "session", id: "s1" }),
JSON.stringify({ type: "message", message: { role: "user", content: "hello" } }),
].join("\n") + "\n",
"utf8",
);
writeTranscript(sessionFile, [
{ type: "session", id: "s1" },
{ type: "message", message: { role: "user", content: "hello" } },
]);
expect(await hasCompletedBootstrapTurn(sessionFile)).toBe(false);
});
it("returns false for assistant turns without a recorded full bootstrap marker", async () => {
const sessionFile = path.join(tmpDir, "assistant-no-marker.jsonl");
await fs.writeFile(
sessionFile,
[
JSON.stringify({ type: "session", id: "s1" }),
JSON.stringify({ type: "message", message: { role: "user", content: "hello" } }),
JSON.stringify({ type: "message", message: { role: "assistant", content: "hi" } }),
].join("\n") + "\n",
"utf8",
);
writeTranscript(sessionFile, [
{ type: "session", id: "s1" },
{ type: "message", message: { role: "user", content: "hello" } },
{ type: "message", message: { role: "assistant", content: "hi" } },
]);
expect(await hasCompletedBootstrapTurn(sessionFile)).toBe(false);
});
it("returns true when a full bootstrap completion marker exists", async () => {
const sessionFile = path.join(tmpDir, "full-bootstrap.jsonl");
await fs.writeFile(
sessionFile,
[
JSON.stringify({ type: "message", message: { role: "assistant", content: "hi" } }),
JSON.stringify({
type: "custom",
customType: FULL_BOOTSTRAP_COMPLETED_CUSTOM_TYPE,
data: { timestamp: 1 },
}),
].join("\n") + "\n",
"utf8",
);
writeTranscript(sessionFile, [
{ type: "session", id: "s1" },
{ type: "message", message: { role: "assistant", content: "hi" } },
{
type: "custom",
customType: FULL_BOOTSTRAP_COMPLETED_CUSTOM_TYPE,
data: { timestamp: 1 },
},
]);
expect(await hasCompletedBootstrapTurn(sessionFile)).toBe(true);
});
it("returns false when compaction happened after the last assistant turn", async () => {
const sessionFile = path.join(tmpDir, "post-compaction.jsonl");
await fs.writeFile(
sessionFile,
[
JSON.stringify({
type: "custom",
customType: FULL_BOOTSTRAP_COMPLETED_CUSTOM_TYPE,
data: { timestamp: 1 },
}),
JSON.stringify({ type: "compaction", summary: "trimmed" }),
].join("\n") + "\n",
"utf8",
);
writeTranscript(sessionFile, [
{ type: "session", id: "s1" },
{
type: "custom",
customType: FULL_BOOTSTRAP_COMPLETED_CUSTOM_TYPE,
data: { timestamp: 1 },
},
{ type: "compaction", summary: "trimmed" },
]);
expect(await hasCompletedBootstrapTurn(sessionFile)).toBe(false);
});
it("returns true when a later full bootstrap marker happens after compaction", async () => {
const sessionFile = path.join(tmpDir, "assistant-after-compaction.jsonl");
await fs.writeFile(
sessionFile,
[
JSON.stringify({
type: "custom",
customType: FULL_BOOTSTRAP_COMPLETED_CUSTOM_TYPE,
data: { timestamp: 1 },
}),
JSON.stringify({ type: "compaction", summary: "trimmed" }),
JSON.stringify({ type: "message", message: { role: "user", content: "new ask" } }),
JSON.stringify({ type: "message", message: { role: "assistant", content: "new reply" } }),
JSON.stringify({
type: "custom",
customType: FULL_BOOTSTRAP_COMPLETED_CUSTOM_TYPE,
data: { timestamp: 2 },
}),
].join("\n") + "\n",
"utf8",
);
writeTranscript(sessionFile, [
{ type: "session", id: "s1" },
{
type: "custom",
customType: FULL_BOOTSTRAP_COMPLETED_CUSTOM_TYPE,
data: { timestamp: 1 },
},
{ type: "compaction", summary: "trimmed" },
{ type: "message", message: { role: "user", content: "new ask" } },
{ type: "message", message: { role: "assistant", content: "new reply" } },
{
type: "custom",
customType: FULL_BOOTSTRAP_COMPLETED_CUSTOM_TYPE,
data: { timestamp: 2 },
},
]);
expect(await hasCompletedBootstrapTurn(sessionFile)).toBe(true);
});
it("ignores malformed JSON lines", async () => {
const sessionFile = path.join(tmpDir, "malformed.jsonl");
await fs.writeFile(
sessionFile,
[
"{broken",
JSON.stringify({
type: "custom",
customType: FULL_BOOTSTRAP_COMPLETED_CUSTOM_TYPE,
data: { timestamp: 1 },
}),
].join("\n") + "\n",
"utf8",
);
expect(await hasCompletedBootstrapTurn(sessionFile)).toBe(true);
});
it("finds a recent full bootstrap marker even when the scan starts mid-file", async () => {
it("finds a recent full bootstrap marker after large earlier content", async () => {
const sessionFile = path.join(tmpDir, "large-prefix.jsonl");
const hugePrefix = "x".repeat(300 * 1024);
await fs.writeFile(
sessionFile,
[
JSON.stringify({ type: "message", message: { role: "user", content: hugePrefix } }),
JSON.stringify({
type: "custom",
customType: FULL_BOOTSTRAP_COMPLETED_CUSTOM_TYPE,
data: { timestamp: 1 },
}),
].join("\n") + "\n",
"utf8",
);
writeTranscript(sessionFile, [
{ type: "session", id: "s1" },
{ type: "message", message: { role: "user", content: hugePrefix } },
{
type: "custom",
customType: FULL_BOOTSTRAP_COMPLETED_CUSTOM_TYPE,
data: { timestamp: 1 },
},
]);
expect(await hasCompletedBootstrapTurn(sessionFile)).toBe(true);
});
it("returns false for symbolic links", async () => {
const realFile = path.join(tmpDir, "real.jsonl");
const linkFile = path.join(tmpDir, "link.jsonl");
await fs.writeFile(
realFile,
`${JSON.stringify({ type: "custom", customType: FULL_BOOTSTRAP_COMPLETED_CUSTOM_TYPE, data: { timestamp: 1 } })}\n`,
"utf8",
);
await fs.writeFile(realFile, "", "utf8");
await fs.symlink(realFile, linkFile);
expect(await hasCompletedBootstrapTurn(linkFile)).toBe(false);
});

View File

@@ -1,5 +1,8 @@
import fs from "node:fs/promises";
import path from "node:path";
import {
loadSqliteSessionTranscriptEvents,
resolveSqliteSessionTranscriptScopeForPath,
} from "../config/sessions/transcript-store.sqlite.js";
import type { AgentContextInjection } from "../config/types.agent-defaults.js";
import type { OpenClawConfig } from "../config/types.openclaw.js";
import { normalizeOptionalString } from "../shared/string-coerce.js";
@@ -24,7 +27,6 @@ import {
export type BootstrapContextMode = "full" | "lightweight";
type BootstrapContextRunKind = "default" | "heartbeat" | "cron";
const CONTINUATION_SCAN_MAX_TAIL_BYTES = 256 * 1024;
const CONTINUATION_SCAN_MAX_RECORDS = 500;
export const FULL_BOOTSTRAP_COMPLETED_CUSTOM_TYPE = "openclaw:bootstrap-context:full";
const BOOTSTRAP_WARNING_DEDUPE_LIMIT = 1024;
@@ -56,74 +58,34 @@ export function resolveContextInjectionMode(config?: OpenClawConfig): AgentConte
}
export async function hasCompletedBootstrapTurn(sessionFile: string): Promise<boolean> {
try {
const stat = await fs.lstat(sessionFile);
if (stat.isSymbolicLink()) {
return false;
}
const fh = await fs.open(sessionFile, "r");
try {
const bytesToRead = Math.min(stat.size, CONTINUATION_SCAN_MAX_TAIL_BYTES);
if (bytesToRead <= 0) {
return false;
}
const start = stat.size - bytesToRead;
const buffer = Buffer.allocUnsafe(bytesToRead);
const { bytesRead } = await fh.read(buffer, 0, bytesToRead, start);
let text = buffer.toString("utf-8", 0, bytesRead);
if (start > 0) {
const firstNewline = text.indexOf("\n");
if (firstNewline === -1) {
return false;
}
text = text.slice(firstNewline + 1);
}
const records = text
.split(/\r?\n/u)
.filter((line) => line.trim().length > 0)
.slice(-CONTINUATION_SCAN_MAX_RECORDS);
let compactedAfterLatestAssistant = false;
for (let i = records.length - 1; i >= 0; i--) {
const line = records[i];
if (!line) {
continue;
}
let entry: unknown;
try {
entry = JSON.parse(line);
} catch {
continue;
}
const record = entry as
| {
type?: string;
customType?: string;
message?: { role?: string };
}
| null
| undefined;
if (record?.type === "compaction") {
compactedAfterLatestAssistant = true;
continue;
}
if (
record?.type === "custom" &&
record.customType === FULL_BOOTSTRAP_COMPLETED_CUSTOM_TYPE
) {
return !compactedAfterLatestAssistant;
}
}
return false;
} finally {
await fh.close();
}
} catch {
const scope = resolveSqliteSessionTranscriptScopeForPath({ transcriptPath: sessionFile });
if (!scope) {
return false;
}
const records = loadSqliteSessionTranscriptEvents(scope)
.map((entry) => entry.event)
.slice(-CONTINUATION_SCAN_MAX_RECORDS);
let compactedAfterLatestAssistant = false;
for (let i = records.length - 1; i >= 0; i--) {
const record = records[i] as
| {
type?: string;
customType?: string;
message?: { role?: string };
}
| null
| undefined;
if (record?.type === "compaction") {
compactedAfterLatestAssistant = true;
continue;
}
if (record?.type === "custom" && record.customType === FULL_BOOTSTRAP_COMPLETED_CUSTOM_TYPE) {
return !compactedAfterLatestAssistant;
}
}
return false;
}
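Both branches of `hasCompletedBootstrapTurn` above — the JSONL tail scan and the SQLite fallback — apply the same reverse-scan rule: the bootstrap turn counts as completed only when the completion marker is found and no compaction record appears after it. A standalone sketch of that rule, with the custom-type constant stubbed locally and records assumed already parsed:

```typescript
// Sketch of the reverse-scan rule shared by the JSONL and SQLite paths.
// BOOTSTRAP_DONE stubs FULL_BOOTSTRAP_COMPLETED_CUSTOM_TYPE from the diff.
const BOOTSTRAP_DONE = "openclaw:bootstrap-context:full";

type TranscriptRecord = {
  type?: string;
  customType?: string;
};

function bootstrapTurnCompleted(records: TranscriptRecord[]): boolean {
  let compactedAfterLatestMarker = false;
  // Walk newest-to-oldest: a compaction seen before the marker in scan
  // order means the bootstrap context was compacted away afterwards.
  for (let i = records.length - 1; i >= 0; i--) {
    const record = records[i];
    if (record?.type === "compaction") {
      compactedAfterLatestMarker = true;
      continue;
    }
    if (record?.type === "custom" && record.customType === BOOTSTRAP_DONE) {
      return !compactedAfterLatestMarker;
    }
  }
  return false;
}

console.log(bootstrapTurnCompleted([{ type: "custom", customType: BOOTSTRAP_DONE }])); // true
console.log(
  bootstrapTurnCompleted([
    { type: "custom", customType: BOOTSTRAP_DONE },
    { type: "compaction" },
  ]),
); // false
```

A compaction record that precedes the marker in transcript order does not invalidate the turn; only one that lands after the latest marker does.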
export function makeBootstrapWarn(params: {

View File

@@ -2,9 +2,11 @@ import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import type { SessionEntry } from "../../config/sessions.js";
import { loadSessionStore, saveSessionStore, type SessionEntry } from "../../config/sessions.js";
import { appendSessionTranscriptMessage } from "../../config/sessions/transcript-append.js";
import { loadSqliteSessionTranscriptEvents } from "../../config/sessions/transcript-store.sqlite.js";
import type { OpenClawConfig } from "../../config/types.openclaw.js";
import { closeOpenClawStateDatabaseForTest } from "../../state/openclaw-state-db.js";
import { FailoverError } from "../failover-error.js";
import { runEmbeddedPiAgent, type EmbeddedPiRunResult } from "../pi-embedded.js";
import { persistCliTurnTranscript, runAgentAttempt } from "./attempt-execution.js";
@@ -62,7 +64,7 @@ function makeCliResult(text: string): EmbeddedPiRunResult {
}
async function readSessionMessages(sessionFile: string) {
return (await readSessionFileJsonLines<{ type?: string; message?: unknown }>(sessionFile))
return (await readSessionFileEntries(sessionFile))
.filter((entry) => entry.type === "message")
.map(
(entry) =>
@@ -71,25 +73,20 @@ async function readSessionMessages(sessionFile: string) {
}
async function readSessionFileEntries(sessionFile: string) {
return await readSessionFileJsonLines<{
type?: string;
id?: string;
parentId?: string | null;
cwd?: string;
message?: { role?: string };
}>(sessionFile);
}
async function readSessionFileJsonLines<T>(sessionFile: string): Promise<T[]> {
const raw = await fs.readFile(sessionFile, "utf-8");
const entries: T[] = [];
for (const line of raw.split(/\r?\n/)) {
if (line.length === 0) {
continue;
}
entries.push(JSON.parse(line) as T);
}
return entries;
const sessionId = path.basename(sessionFile).replace(/\.jsonl$/, "");
return loadSqliteSessionTranscriptEvents({
agentId: "main",
sessionId,
}).map(
(entry) =>
entry.event as {
type?: string;
id?: string;
parentId?: string | null;
cwd?: string;
message?: { role?: string };
},
);
}
describe("CLI attempt execution", () => {
@@ -98,7 +95,8 @@ describe("CLI attempt execution", () => {
beforeEach(async () => {
tmpDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-cli-attempt-"));
storePath = path.join(tmpDir, "sessions.json");
storePath = path.join(tmpDir, "agents", "main", "sessions", "sessions.json");
vi.stubEnv("OPENCLAW_STATE_DIR", tmpDir);
runCliAgentMock.mockReset();
runEmbeddedPiAgentMock.mockReset();
});
@@ -109,9 +107,15 @@ describe("CLI attempt execution", () => {
} else {
process.env.HOME = ORIGINAL_HOME;
}
closeOpenClawStateDatabaseForTest();
vi.unstubAllEnvs();
await fs.rm(tmpDir, { recursive: true, force: true });
});
async function writeStore(store: Record<string, SessionEntry>) {
await saveSessionStore(storePath, store);
}
async function runClaudeCliAttempt(params: {
sessionKey: string;
sessionEntry: SessionEntry;
@@ -171,7 +175,7 @@ describe("CLI attempt execution", () => {
claudeCliSessionId: "stale-legacy-session",
};
const sessionStore: Record<string, SessionEntry> = { [sessionKey]: sessionEntry };
await fs.writeFile(storePath, JSON.stringify(sessionStore, null, 2), "utf-8");
await writeStore(sessionStore);
runCliAgentMock
.mockRejectedValueOnce(
@@ -220,10 +224,7 @@ describe("CLI attempt execution", () => {
expect(sessionStore[sessionKey]?.cliSessionIds?.["claude-cli"]).toBeUndefined();
expect(sessionStore[sessionKey]?.claudeCliSessionId).toBeUndefined();
const persisted = JSON.parse(await fs.readFile(storePath, "utf-8")) as Record<
string,
SessionEntry
>;
const persisted = loadSessionStore(storePath);
expect(persisted[sessionKey]?.cliSessionIds?.["claude-cli"]).toBeUndefined();
expect(persisted[sessionKey]?.claudeCliSessionId).toBeUndefined();
});
@@ -245,7 +246,7 @@ describe("CLI attempt execution", () => {
claudeCliSessionId: "phantom-claude-session",
};
const sessionStore: Record<string, SessionEntry> = { [sessionKey]: sessionEntry };
await fs.writeFile(storePath, JSON.stringify(sessionStore, null, 2), "utf-8");
await writeStore(sessionStore);
runCliAgentMock.mockResolvedValueOnce(makeCliResult("fresh cli response"));
await runClaudeCliAttempt({
@@ -263,10 +264,7 @@ describe("CLI attempt execution", () => {
expect(sessionStore[sessionKey]?.cliSessionIds?.["claude-cli"]).toBeUndefined();
expect(sessionStore[sessionKey]?.claudeCliSessionId).toBeUndefined();
const persisted = JSON.parse(await fs.readFile(storePath, "utf-8")) as Record<
string,
SessionEntry
>;
const persisted = loadSessionStore(storePath);
expect(persisted[sessionKey]?.cliSessionBindings?.["claude-cli"]).toBeUndefined();
expect(persisted[sessionKey]?.cliSessionIds?.["claude-cli"]).toBeUndefined();
expect(persisted[sessionKey]?.claudeCliSessionId).toBeUndefined();
@@ -303,7 +301,7 @@ describe("CLI attempt execution", () => {
claudeCliSessionId: cliSessionId,
};
const sessionStore: Record<string, SessionEntry> = { [sessionKey]: sessionEntry };
await fs.writeFile(storePath, JSON.stringify(sessionStore, null, 2), "utf-8");
await writeStore(sessionStore);
runCliAgentMock.mockResolvedValueOnce(makeCliResult("resumed cli response"));
await runClaudeCliAttempt({
@@ -333,7 +331,7 @@ describe("CLI attempt execution", () => {
authProfileOverrideSource: "user",
};
const sessionStore: Record<string, SessionEntry> = { [sessionKey]: sessionEntry };
await fs.writeFile(storePath, JSON.stringify(sessionStore, null, 2), "utf-8");
await writeStore(sessionStore);
runCliAgentMock.mockResolvedValueOnce(makeCliResult("codex cli response"));
await runAgentAttempt({
@@ -377,7 +375,7 @@ describe("CLI attempt execution", () => {
updatedAt: Date.now(),
};
const sessionStore: Record<string, SessionEntry> = { [sessionKey]: sessionEntry };
await fs.writeFile(storePath, JSON.stringify(sessionStore, null, 2), "utf-8");
await writeStore(sessionStore);
const updatedEntry = await persistCliTurnTranscript({
body: "persist this",
@@ -429,7 +427,7 @@ describe("CLI attempt execution", () => {
updatedAt: Date.now(),
};
const sessionStore: Record<string, SessionEntry> = { [sessionKey]: sessionEntry };
await fs.writeFile(storePath, JSON.stringify(sessionStore, null, 2), "utf-8");
await writeStore(sessionStore);
const result = makeCliResult("already mirrored");
result.meta.executionTrace = {
@@ -486,7 +484,7 @@ describe("CLI attempt execution", () => {
updatedAt: Date.now(),
};
const sessionStore: Record<string, SessionEntry> = { [sessionKey]: sessionEntry };
await fs.writeFile(storePath, JSON.stringify(sessionStore, null, 2), "utf-8");
await writeStore(sessionStore);
const result = makeCliResult("same answer");
result.meta.executionTrace = {
@@ -518,6 +516,7 @@ describe("CLI attempt execution", () => {
await appendSessionTranscriptMessage({
transcriptPath: sessionFile,
agentId: "main",
sessionId: sessionEntry.sessionId,
cwd: tmpDir,
config: {},
@@ -557,7 +556,7 @@ describe("CLI attempt execution", () => {
updatedAt: Date.now(),
};
const sessionStore: Record<string, SessionEntry> = { [sessionKey]: sessionEntry };
await fs.writeFile(storePath, JSON.stringify(sessionStore, null, 2), "utf-8");
await writeStore(sessionStore);
const updatedEntry = await persistCliTurnTranscript({
body: [
@@ -593,7 +592,7 @@ describe("CLI attempt execution", () => {
updatedAt: Date.now(),
};
const sessionStore: Record<string, SessionEntry> = { [sessionKey]: sessionEntry };
await fs.writeFile(storePath, JSON.stringify(sessionStore, null, 2), "utf-8");
await writeStore(sessionStore);
runCliAgentMock.mockResolvedValueOnce(makeCliResult("channel aware"));
await runAgentAttempt({
@@ -697,7 +696,7 @@ describe("CLI attempt execution", () => {
updatedAt: Date.now(),
};
const sessionStore: Record<string, SessionEntry> = { [sessionKey]: sessionEntry };
await fs.writeFile(storePath, JSON.stringify(sessionStore, null, 2), "utf-8");
await writeStore(sessionStore);
runCliAgentMock.mockResolvedValueOnce(makeCliResult("canonical cli"));
await runAgentAttempt({
@@ -754,7 +753,7 @@ describe("CLI attempt execution", () => {
updatedAt: Date.now(),
};
const sessionStore: Record<string, SessionEntry> = { [sessionKey]: sessionEntry };
await fs.writeFile(storePath, JSON.stringify(sessionStore, null, 2), "utf-8");
await writeStore(sessionStore);
runCliAgentMock.mockResolvedValueOnce(makeCliResult("canonical codex cli"));
await runAgentAttempt({
@@ -811,7 +810,7 @@ describe("CLI attempt execution", () => {
updatedAt: Date.now(),
};
const sessionStore: Record<string, SessionEntry> = { [sessionKey]: sessionEntry };
await fs.writeFile(storePath, JSON.stringify(sessionStore, null, 2), "utf-8");
await writeStore(sessionStore);
runEmbeddedPiAgentMock.mockResolvedValueOnce({
meta: { durationMs: 1 },
} satisfies EmbeddedPiRunResult);
@@ -888,7 +887,7 @@ describe("CLI attempt execution", () => {
updatedAt: Date.now(),
};
const sessionStore: Record<string, SessionEntry> = { [sessionKey]: sessionEntry };
await fs.writeFile(storePath, JSON.stringify(sessionStore, null, 2), "utf-8");
await writeStore(sessionStore);
runCliAgentMock.mockResolvedValueOnce(makeCliResult("cleanup cli"));
await runAgentAttempt({

View File

@@ -9,6 +9,10 @@ import {
startsWithSilentToken,
stripLeadingSilentToken,
} from "../../auto-reply/tokens.js";
import {
loadSqliteSessionTranscriptEvents,
resolveSqliteSessionTranscriptScopeForPath,
} from "../../config/sessions/transcript-store.sqlite.js";
import {
type ClaudeCliFallbackSeed,
readClaudeCliFallbackSeed,
@@ -68,13 +72,23 @@ async function jsonlFileHasAssistantMessage(filePath: string | undefined): Promi
}
}
/**
* Check whether a session transcript file exists and contains at least one
* assistant message, indicating that the SessionManager has flushed the
* initial user+assistant exchange to disk.
*/
function sqliteTranscriptHasAssistantMessage(sessionFile: string | undefined): boolean {
if (!sessionFile) {
return false;
}
const scope = resolveSqliteSessionTranscriptScopeForPath({ transcriptPath: sessionFile });
if (!scope) {
return false;
}
return loadSqliteSessionTranscriptEvents(scope).some((entry) => {
const record = entry.event as Record<string, unknown> | null;
return (record?.message as Record<string, unknown> | undefined)?.role === "assistant";
});
}
/** Check whether the SQLite transcript contains at least one assistant message. */
export async function sessionFileHasContent(sessionFile: string | undefined): Promise<boolean> {
return await jsonlFileHasAssistantMessage(sessionFile);
return sqliteTranscriptHasAssistantMessage(sessionFile);
}
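With the SQLite store in place, `sessionFileHasContent` reduces to scanning stored events for any assistant-role message. The core predicate, sketched standalone under the event shape used by this diff's test fixtures:

```typescript
// Minimal sketch of the assistant-presence predicate behind
// sessionFileHasContent; the event shape mirrors the diff's fixtures.
type TranscriptEvent = {
  type?: string;
  message?: { role?: string; content?: unknown };
};

function hasAssistantMessage(events: TranscriptEvent[]): boolean {
  return events.some((event) => event.message?.role === "assistant");
}

const events: TranscriptEvent[] = [
  { type: "session" },
  { type: "message", message: { role: "user", content: "hello" } },
];
console.log(hasAssistantMessage(events)); // false — user turn not yet answered
events.push({ type: "message", message: { role: "assistant", content: "hi" } });
console.log(hasAssistantMessage(events)); // true
```

Because the events are already structured rows, the old JSONL concerns — tail-byte limits, spaced JSON, symlinked files — fall away into the store layer rather than this predicate.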
export async function claudeCliSessionTranscriptHasContent(params: {

View File

@@ -1,7 +1,9 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { afterEach, beforeEach, describe, expect, it } from "vitest";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import { replaceSqliteSessionTranscriptEvents } from "../../config/sessions/transcript-store.sqlite.js";
import { closeOpenClawStateDatabaseForTest } from "../../state/openclaw-state-db.js";
import {
buildClaudeCliFallbackContextPrelude,
claudeCliSessionTranscriptHasContent,
@@ -284,12 +286,24 @@ describe("sessionFileHasContent", () => {
beforeEach(async () => {
tmpDir = await fs.mkdtemp(path.join(os.tmpdir(), "oc-test-"));
vi.stubEnv("OPENCLAW_STATE_DIR", tmpDir);
});
afterEach(async () => {
closeOpenClawStateDatabaseForTest();
vi.unstubAllEnvs();
await fs.rm(tmpDir, { recursive: true, force: true });
});
function writeTranscript(file: string, events: unknown[]): void {
replaceSqliteSessionTranscriptEvents({
agentId: "main",
sessionId: path.basename(file, ".jsonl"),
transcriptPath: file,
events: [{ type: "session", id: path.basename(file, ".jsonl") }, ...events],
});
}
it("returns false for undefined sessionFile", async () => {
expect(await sessionFileHasContent(undefined)).toBe(false);
});
@@ -300,63 +314,45 @@ describe("sessionFileHasContent", () => {
it("returns false when session file is empty", async () => {
const file = path.join(tmpDir, "empty.jsonl");
await fs.writeFile(file, "", "utf-8");
expect(await sessionFileHasContent(file)).toBe(false);
});
it("returns false when session file has only user message (no assistant flush)", async () => {
const file = path.join(tmpDir, "user-only.jsonl");
await fs.writeFile(
file,
'{"type":"session","id":"s1"}\n{"type":"message","message":{"role":"user","content":"hello"}}\n',
"utf-8",
);
writeTranscript(file, [{ type: "message", message: { role: "user", content: "hello" } }]);
expect(await sessionFileHasContent(file)).toBe(false);
});
it("returns true when session file has assistant message (flushed)", async () => {
const file = path.join(tmpDir, "with-assistant.jsonl");
await fs.writeFile(
file,
'{"type":"session","id":"s1"}\n{"type":"message","message":{"role":"user","content":"hello"}}\n{"type":"message","message":{"role":"assistant","content":"hi"}}\n',
"utf-8",
);
writeTranscript(file, [
{ type: "message", message: { role: "user", content: "hello" } },
{ type: "message", message: { role: "assistant", content: "hi" } },
]);
expect(await sessionFileHasContent(file)).toBe(true);
});
it("returns true when session file has spaced JSON (role : assistant)", async () => {
const file = path.join(tmpDir, "spaced.jsonl");
await fs.writeFile(
file,
'{"type":"message","message":{"role": "assistant","content":"hi"}}\n',
"utf-8",
);
writeTranscript(file, [{ type: "message", message: { role: "assistant", content: "hi" } }]);
expect(await sessionFileHasContent(file)).toBe(true);
});
it("returns true when assistant message appears after large user content", async () => {
const file = path.join(tmpDir, "large-user.jsonl");
// Create a user message whose JSON line exceeds 256KB to ensure the
// transcript parser finds the assistant record after large earlier content.

const bigContent = "x".repeat(300 * 1024);
const lines =
[
`{"type":"session","id":"s1"}`,
`{"type":"message","message":{"role":"user","content":"${bigContent}"}}`,
`{"type":"message","message":{"role":"assistant","content":"done"}}`,
].join("\n") + "\n";
await fs.writeFile(file, lines, "utf-8");
writeTranscript(file, [
{ type: "message", message: { role: "user", content: bigContent } },
{ type: "message", message: { role: "assistant", content: "done" } },
]);
expect(await sessionFileHasContent(file)).toBe(true);
});
it("returns false when session file is a symbolic link", async () => {
const realFile = path.join(tmpDir, "real.jsonl");
await fs.writeFile(
realFile,
'{"type":"message","message":{"role":"assistant","content":"hi"}}\n',
"utf-8",
);
await fs.writeFile(realFile, "", "utf-8");
const link = path.join(tmpDir, "link.jsonl");
await fs.symlink(realFile, link);
expect(await sessionFileHasContent(link)).toBe(false);

View File

@@ -28,10 +28,6 @@ import { isCliProvider } from "../model-selection.js";
import { resolveOpenAIRuntimeProviderForPi } from "../openai-codex-routing.js";
import { runEmbeddedPiAgent, type EmbeddedPiRunResult } from "../pi-embedded.js";
import { buildAgentRuntimeAuthPlan } from "../runtime-plan/auth.js";
import {
acquireSessionWriteLock,
resolveSessionWriteLockAcquireTimeoutMs,
} from "../session-write-lock.js";
import { buildWorkspaceSkillSnapshot } from "../skills.js";
import { buildUsageWithNoCost } from "../stream-message-shared.js";
import {
@@ -207,13 +203,35 @@ async function persistTextTurnTranscript(
agentId: params.sessionAgentId,
threadId: params.threadId,
});
const lock = await acquireSessionWriteLock({
sessionFile,
timeoutMs: resolveSessionWriteLockAcquireTimeoutMs(params.config),
allowReentrant: true,
});
try {
if (promptText) {
if (promptText) {
await appendSessionTranscriptMessage({
transcriptPath: sessionFile,
agentId: params.sessionAgentId,
sessionId: params.sessionId,
cwd: params.sessionCwd,
config: params.config,
message: {
role: "user",
content: promptText,
timestamp: Date.now(),
},
});
}
if (replyText) {
let appendAssistant = true;
if (params.embeddedAssistantGapFill) {
const latest = await readTailAssistantTextFromSessionTranscript(sessionFile, {
agentId: params.sessionAgentId,
sessionId: params.sessionId,
});
const normalizedReply = normalizeTranscriptMirrorText(replyText);
const normalizedLatest = latest?.text ? normalizeTranscriptMirrorText(latest.text) : "";
if (normalizedLatest && normalizedLatest === normalizedReply) {
appendAssistant = false;
}
}
if (appendAssistant) {
await appendSessionTranscriptMessage({
transcriptPath: sessionFile,
agentId: params.sessionAgentId,
@@ -221,48 +239,17 @@ async function persistTextTurnTranscript(
cwd: params.sessionCwd,
config: params.config,
message: {
role: "user",
content: promptText,
role: "assistant",
content: [{ type: "text", text: replyText }],
api: params.assistant.api,
provider: params.assistant.provider,
model: params.assistant.model,
usage: resolveTranscriptUsage(params.assistant.usage),
stopReason: "stop",
timestamp: Date.now(),
},
});
}
if (replyText) {
let appendAssistant = true;
if (params.embeddedAssistantGapFill) {
const latest = await readTailAssistantTextFromSessionTranscript(sessionFile, {
agentId: params.sessionAgentId,
sessionId: params.sessionId,
});
const normalizedReply = normalizeTranscriptMirrorText(replyText);
const normalizedLatest = latest?.text ? normalizeTranscriptMirrorText(latest.text) : "";
if (normalizedLatest && normalizedLatest === normalizedReply) {
appendAssistant = false;
}
}
if (appendAssistant) {
await appendSessionTranscriptMessage({
transcriptPath: sessionFile,
agentId: params.sessionAgentId,
sessionId: params.sessionId,
cwd: params.sessionCwd,
config: params.config,
message: {
role: "assistant",
content: [{ type: "text", text: replyText }],
api: params.assistant.api,
provider: params.assistant.provider,
model: params.assistant.model,
usage: resolveTranscriptUsage(params.assistant.usage),
stopReason: "stop",
timestamp: Date.now(),
},
});
}
}
} finally {
await lock.release();
}
emitSessionTranscriptUpdate({ sessionFile, sessionKey: params.sessionKey });
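The `embeddedAssistantGapFill` branch above appends the assistant message only when its normalized text differs from the transcript's latest assistant text. That guard, sketched with a stand-in normalizer (the real `normalizeTranscriptMirrorText` is not shown in this diff, so whitespace collapsing is an assumption):

```typescript
// Sketch of the gap-fill dedupe guard: skip the append when the reply
// matches the latest mirrored assistant text after normalization.
// normalize() is a hypothetical stand-in for normalizeTranscriptMirrorText.
function normalize(text: string): string {
  return text.replace(/\s+/g, " ").trim();
}

function shouldAppendAssistant(params: {
  gapFill: boolean;
  latestAssistantText: string | undefined;
  replyText: string;
}): boolean {
  if (!params.gapFill) {
    return true;
  }
  const normalizedReply = normalize(params.replyText);
  const normalizedLatest = params.latestAssistantText
    ? normalize(params.latestAssistantText)
    : "";
  // Only a non-empty, exact normalized match suppresses the append.
  return !(normalizedLatest !== "" && normalizedLatest === normalizedReply);
}

console.log(
  shouldAppendAssistant({ gapFill: true, latestAssistantText: "ok  done", replyText: "ok done" }),
); // false — duplicate, skip the append
console.log(
  shouldAppendAssistant({ gapFill: true, latestAssistantText: undefined, replyText: "ok" }),
); // true — nothing mirrored yet, append
```

An empty or missing latest assistant text never suppresses the append, so a genuinely missing mirror entry is always backfilled.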

View File

@@ -2,6 +2,7 @@ import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import { replaceSqliteSessionTranscriptEvents } from "../../config/sessions/transcript-store.sqlite.js";
import type { SessionEntry } from "../../config/sessions/types.js";
import type { OpenClawConfig } from "../../config/types.openclaw.js";
import type { ContextEngine } from "../../context-engine/types.js";
@@ -43,24 +44,26 @@ function buildContextEngine(params: {
async function writeSessionFile(params: { sessionFile: string; sessionId: string }) {
await fs.mkdir(path.dirname(params.sessionFile), { recursive: true });
await fs.writeFile(
params.sessionFile,
[
JSON.stringify({
replaceSqliteSessionTranscriptEvents({
agentId: "main",
sessionId: params.sessionId,
transcriptPath: params.sessionFile,
events: [
{
type: "session",
version: CURRENT_SESSION_VERSION,
id: params.sessionId,
timestamp: new Date(0).toISOString(),
cwd: path.dirname(params.sessionFile),
}),
JSON.stringify({
},
{
type: "message",
id: "user-1",
parentId: null,
message: { role: "user", content: "old ask", timestamp: 1 },
timestamp: new Date(1).toISOString(),
}),
JSON.stringify({
},
{
type: "message",
id: "assistant-1",
parentId: "user-1",
@@ -70,11 +73,9 @@ async function writeSessionFile(params: { sessionFile: string; sessionId: string
timestamp: 2,
},
timestamp: new Date(2).toISOString(),
}),
"",
].join("\n"),
"utf-8",
);
},
],
});
}
describe("runCliTurnCompactionLifecycle", () => {
@@ -82,10 +83,12 @@ describe("runCliTurnCompactionLifecycle", () => {
beforeEach(async () => {
tmpDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-cli-compaction-"));
vi.stubEnv("OPENCLAW_STATE_DIR", tmpDir);
});
afterEach(async () => {
resetCliCompactionTestDeps();
vi.unstubAllEnvs();
await fs.rm(tmpDir, { recursive: true, force: true });
});

View File

@@ -2,6 +2,7 @@ import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it, vi } from "vitest";
import { closeOpenClawStateDatabaseForTest } from "../../state/openclaw-state-db.js";
import { makeAgentAssistantMessage } from "../test-helpers/agent-message-fixtures.js";
import { SessionManager } from "../transcript/session-transcript-contract.js";
import {
@@ -15,10 +16,13 @@ let tmpDir: string | undefined;
async function createTmpDir(): Promise<string> {
tmpDir = await fs.mkdtemp(path.join(os.tmpdir(), "compaction-successor-test-"));
vi.stubEnv("OPENCLAW_STATE_DIR", tmpDir);
return tmpDir;
}
afterEach(async () => {
closeOpenClawStateDatabaseForTest();
vi.unstubAllEnvs();
if (tmpDir) {
await fs.rm(tmpDir, { recursive: true, force: true }).catch(() => undefined);
tmpDir = undefined;
@@ -130,8 +134,8 @@ describe("rotateTranscriptAfterCompaction", () => {
it("creates a compacted successor transcript and leaves the archive untouched", async () => {
const dir = await createTmpDir();
const { manager, sessionFile, firstKeptId, oldUserId } = createCompactedSession(dir);
const originalBytes = await fs.readFile(sessionFile, "utf8");
const originalEntryCount = manager.getEntries().length;
const originalEntries = manager.getEntries();
const result = await rotateTranscriptAfterCompaction({
sessionManager: manager,
@@ -143,7 +147,7 @@ describe("rotateTranscriptAfterCompaction", () => {
const successorSessionId = requireString(result.sessionId, "successor session id");
const successorFile = requireString(result.sessionFile, "successor session file");
expect(successorFile).not.toBe(sessionFile);
expect(await fs.readFile(sessionFile, "utf8")).toBe(originalBytes);
expect(SessionManager.open(sessionFile).getEntries()).toEqual(originalEntries);
const successor = SessionManager.open(successorFile);
const header = requireValue(successor.getHeader(), "successor header");

View File

@@ -2,6 +2,12 @@ import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it, vi } from "vitest";
import {
exportSqliteSessionTranscriptJsonl,
replaceSqliteSessionTranscriptEvents,
resolveSqliteSessionTranscriptScopeForPath,
} from "../config/sessions/transcript-store.sqlite.js";
import { closeOpenClawStateDatabaseForTest } from "../state/openclaw-state-db.js";
import { BLANK_USER_FALLBACK_TEXT, repairSessionFileIfNeeded } from "./session-file-repair.js";
function buildSessionHeaderAndMessage() {
@@ -30,35 +36,50 @@ async function createTempSessionPath() {
return { dir, file: path.join(dir, "session.jsonl") };
}
function requireBackupPath(result: { backupPath?: string }): string {
if (!result.backupPath) {
throw new Error("expected session repair backup path");
}
return result.backupPath;
}
afterEach(async () => {
closeOpenClawStateDatabaseForTest();
await Promise.all(tempDirs.splice(0).map((dir) => fs.rm(dir, { recursive: true, force: true })));
});
function writeTranscriptEvents(file: string, events: unknown[]) {
const sessionId =
events.find((event): event is { type: "session"; id: string } =>
Boolean(
event &&
typeof event === "object" &&
(event as { type?: unknown }).type === "session" &&
typeof (event as { id?: unknown }).id === "string",
),
)?.id ?? path.basename(file, ".jsonl");
replaceSqliteSessionTranscriptEvents({
agentId: "main",
sessionId,
transcriptPath: file,
events,
});
}
async function readTranscriptJsonl(file: string): Promise<string> {
const scope = resolveSqliteSessionTranscriptScopeForPath({ transcriptPath: file });
return scope ? exportSqliteSessionTranscriptJsonl(scope) : "";
}
describe("repairSessionFileIfNeeded", () => {
it("rewrites session files that contain malformed lines", async () => {
const { file } = await createTempSessionPath();
const { header, message } = buildSessionHeaderAndMessage();
const content = `${JSON.stringify(header)}\n${JSON.stringify(message)}\n{"type":"message"`;
await fs.writeFile(file, content, "utf-8");
writeTranscriptEvents(file, [
header,
message,
{ type: "message", id: "corrupt", message: { role: null, content: "bad" } },
]);
const result = await repairSessionFileIfNeeded({ sessionFile: file });
expect(result.repaired).toBe(true);
expect(result.droppedLines).toBe(1);
const backupPath = requireBackupPath(result);
const repaired = await fs.readFile(file, "utf-8");
const repaired = await readTranscriptJsonl(file);
expect(repaired.trim().split("\n")).toHaveLength(2);
const backup = await fs.readFile(backupPath, "utf-8");
expect(backup).toBe(content);
});
it("does not drop CRLF-terminated JSONL lines", async () => {
@@ -80,8 +101,7 @@ describe("repairSessionFileIfNeeded", () => {
timestamp: new Date().toISOString(),
message: { role: "user", content: "hello" },
};
const content = `${JSON.stringify(badHeader)}\n{"type":"message"`;
await fs.writeFile(file, content, "utf-8");
writeTranscriptEvents(file, [badHeader]);
const warn = vi.fn();
const result = await repairSessionFileIfNeeded({ sessionFile: file, warn });
@@ -99,8 +119,8 @@ describe("repairSessionFileIfNeeded", () => {
const result = await repairSessionFileIfNeeded({ sessionFile: dir, warn });
expect(result.repaired).toBe(false);
expect(result.reason).toContain("failed to read session file");
expect(warn).toHaveBeenCalledTimes(1);
expect(result.reason).toBe("missing SQLite transcript");
expect(warn).not.toHaveBeenCalled();
});
it("rewrites persisted assistant messages with empty content arrays", async () => {
@@ -130,8 +150,7 @@ describe("repairSessionFileIfNeeded", () => {
timestamp: new Date().toISOString(),
message: { role: "user", content: "retry" },
};
const original = `${JSON.stringify(header)}\n${JSON.stringify(message)}\n${JSON.stringify(poisonedAssistantEntry)}\n${JSON.stringify(followUp)}\n`;
await fs.writeFile(file, original, "utf-8");
writeTranscriptEvents(file, [header, message, poisonedAssistantEntry, followUp]);
const debug = vi.fn();
const result = await repairSessionFileIfNeeded({ sessionFile: file, debug });
@@ -139,13 +158,12 @@ describe("repairSessionFileIfNeeded", () => {
expect(result.repaired).toBe(true);
expect(result.droppedLines).toBe(0);
expect(result.rewrittenAssistantMessages).toBe(1);
await expect(fs.readFile(requireBackupPath(result), "utf-8")).resolves.toBe(original);
expect(debug).toHaveBeenCalledTimes(1);
const debugMessage = debug.mock.calls[0]?.[0] as string;
expect(debugMessage).toContain("rewrote 1 assistant message(s)");
expect(debugMessage).not.toContain("dropped");
const repaired = await fs.readFile(file, "utf-8");
const repaired = await readTranscriptJsonl(file);
const repairedLines = repaired.trim().split("\n");
expect(repairedLines).toHaveLength(4);
const repairedEntry: { message: { content: { type: string; text: string }[] } } = JSON.parse(
@@ -169,8 +187,7 @@ describe("repairSessionFileIfNeeded", () => {
content: [{ type: "text", text: "" }],
},
};
const original = `${JSON.stringify(header)}\n${JSON.stringify(blankUserEntry)}\n${JSON.stringify(message)}\n`;
await fs.writeFile(file, original, "utf-8");
writeTranscriptEvents(file, [header, blankUserEntry, message]);
const debug = vi.fn();
const result = await repairSessionFileIfNeeded({ sessionFile: file, debug });
@@ -180,7 +197,7 @@ describe("repairSessionFileIfNeeded", () => {
expect(result.droppedBlankUserMessages).toBe(0);
expect(debug.mock.calls[0]?.[0]).toContain("rewrote 1 user message(s)");
const repaired = await fs.readFile(file, "utf-8");
const repaired = await readTranscriptJsonl(file);
const repairedLines = repaired.trim().split("\n");
expect(repairedLines).toHaveLength(3);
const rewrittenEntry = JSON.parse(repairedLines[1]);
@@ -203,15 +220,14 @@ describe("repairSessionFileIfNeeded", () => {
content: " ",
},
};
const original = `${JSON.stringify(header)}\n${JSON.stringify(blankStringUserEntry)}\n${JSON.stringify(message)}\n`;
await fs.writeFile(file, original, "utf-8");
writeTranscriptEvents(file, [header, blankStringUserEntry, message]);
const result = await repairSessionFileIfNeeded({ sessionFile: file });
expect(result.repaired).toBe(true);
expect(result.rewrittenUserMessages).toBe(1);
const repaired = await fs.readFile(file, "utf-8");
const repaired = await readTranscriptJsonl(file);
const repairedLines = repaired.trim().split("\n");
expect(repairedLines).toHaveLength(3);
const rewrittenEntry = JSON.parse(repairedLines[1]);
@@ -234,14 +250,13 @@ describe("repairSessionFileIfNeeded", () => {
],
},
};
const original = `${JSON.stringify(header)}\n${JSON.stringify(mediaUserEntry)}\n`;
await fs.writeFile(file, original, "utf-8");
writeTranscriptEvents(file, [header, mediaUserEntry]);
const result = await repairSessionFileIfNeeded({ sessionFile: file });
expect(result.repaired).toBe(true);
expect(result.rewrittenUserMessages).toBe(1);
const repaired = await fs.readFile(file, "utf-8");
const repaired = await readTranscriptJsonl(file);
const repairedEntry = JSON.parse(repaired.trim().split("\n")[1] ?? "{}");
expect(repairedEntry.message.content).toEqual([
{ type: "image", data: "AA==", mimeType: "image/png" },
@@ -266,8 +281,11 @@ describe("repairSessionFileIfNeeded", () => {
stopReason: "error",
},
};
const original = `${JSON.stringify(header)}\n${JSON.stringify(poisonedAssistantEntry)}\n{"type":"message"`;
await fs.writeFile(file, original, "utf-8");
writeTranscriptEvents(file, [
header,
poisonedAssistantEntry,
{ type: "message", id: "corrupt", message: { role: null, content: "bad" } },
]);
const debug = vi.fn();
const result = await repairSessionFileIfNeeded({ sessionFile: file, debug });
@@ -612,22 +630,13 @@ describe("repairSessionFileIfNeeded", () => {
message: { role: " ", content: "blank role" },
};
const content = [
JSON.stringify(header),
JSON.stringify(message),
JSON.stringify(nullRoleEntry),
JSON.stringify(missingRoleEntry),
JSON.stringify(emptyRoleEntry),
].join("\n");
await fs.writeFile(file, `${content}\n`, "utf-8");
writeTranscriptEvents(file, [header, message, nullRoleEntry, missingRoleEntry, emptyRoleEntry]);
const result = await repairSessionFileIfNeeded({ sessionFile: file });
expect(result.repaired).toBe(true);
expect(result.droppedLines).toBe(3);
await expect(fs.readFile(requireBackupPath(result), "utf-8")).resolves.toBe(`${content}\n`);
const after = await fs.readFile(file, "utf-8");
const after = await readTranscriptJsonl(file);
const lines = after.trimEnd().split("\n");
expect(lines).toHaveLength(2);
expect(JSON.parse(lines[0])).toEqual(header);
@@ -653,20 +662,14 @@ describe("repairSessionFileIfNeeded", () => {
message: "not an object",
};
const content = [
JSON.stringify(header),
JSON.stringify(message),
JSON.stringify(missingMessage),
JSON.stringify(stringMessage),
].join("\n");
await fs.writeFile(file, `${content}\n`, "utf-8");
writeTranscriptEvents(file, [header, message, missingMessage, stringMessage]);
const result = await repairSessionFileIfNeeded({ sessionFile: file });
expect(result.repaired).toBe(true);
expect(result.droppedLines).toBe(2);
const after = await fs.readFile(file, "utf-8");
const after = await readTranscriptJsonl(file);
const lines = after.trimEnd().split("\n");
expect(lines).toHaveLength(2);
});

View File

@@ -1,6 +1,9 @@
import fs from "node:fs/promises";
import path from "node:path";
import { replaceFileAtomic } from "../infra/replace-file.js";
import {
loadSqliteSessionTranscriptEvents,
replaceSqliteSessionTranscriptEvents,
resolveSqliteSessionTranscriptScopeForPath,
} from "../config/sessions/transcript-store.sqlite.js";
import { STREAM_ERROR_FALLBACK_TEXT } from "./stream-message-shared.js";
/** Placeholder for blank user messages — preserves the user turn so strict
@@ -193,68 +196,49 @@ export async function repairSessionFileIfNeeded(params: {
return { repaired: false, droppedLines: 0, reason: "missing session file" };
}
let content: string;
try {
content = await fs.readFile(sessionFile, "utf-8");
} catch (err) {
const code = (err as { code?: unknown } | undefined)?.code;
if (code === "ENOENT") {
return { repaired: false, droppedLines: 0, reason: "missing session file" };
}
const reason = `failed to read session file: ${err instanceof Error ? err.message : "unknown error"}`;
params.warn?.(`session file repair skipped: ${reason} (${path.basename(sessionFile)})`);
return { repaired: false, droppedLines: 0, reason };
const scope = resolveSqliteSessionTranscriptScopeForPath({ transcriptPath: sessionFile });
if (!scope) {
return { repaired: false, droppedLines: 0, reason: "missing SQLite transcript" };
}
const lines = content.split(/\r?\n/);
const storedEntries = loadSqliteSessionTranscriptEvents(scope).map((entry) => entry.event);
const entries: unknown[] = [];
let droppedLines = 0;
let rewrittenAssistantMessages = 0;
let droppedBlankUserMessages = 0;
let rewrittenUserMessages = 0;
for (const line of lines) {
if (!line.trim()) {
for (const entry of storedEntries) {
if (isStructurallyInvalidMessageEntry(entry)) {
// Drop "null role" / missing-role message entries the same way the old
// JSONL repair dropped malformed lines: providers cannot replay them.
droppedLines += 1;
continue;
}
try {
const entry: unknown = JSON.parse(line);
if (isStructurallyInvalidMessageEntry(entry)) {
// Drop "null role" / missing-role message entries the same way we
// drop unparseable JSONL: they cannot be replayed to any provider
// and preserving them through repair just relocates the corruption
// into the post-repair file (#77228: 935+ null-role entries
// surviving the auto-repair pass).
droppedLines += 1;
continue;
}
if (isAssistantEntryWithEmptyContent(entry)) {
entries.push(rewriteAssistantEntryWithEmptyContent(entry));
rewrittenAssistantMessages += 1;
continue;
}
if (
entry &&
typeof entry === "object" &&
(entry as { type?: unknown }).type === "message" &&
typeof (entry as { message?: unknown }).message === "object" &&
((entry as { message: { role?: unknown } }).message?.role ?? undefined) === "user"
) {
const repairedUser = repairUserEntryWithBlankTextContent(entry as SessionMessageEntry);
if (repairedUser.kind === "drop") {
droppedBlankUserMessages += 1;
continue;
}
if (repairedUser.kind === "rewrite") {
entries.push(repairedUser.entry);
rewrittenUserMessages += 1;
continue;
}
}
entries.push(entry);
} catch {
droppedLines += 1;
if (isAssistantEntryWithEmptyContent(entry)) {
entries.push(rewriteAssistantEntryWithEmptyContent(entry));
rewrittenAssistantMessages += 1;
continue;
}
if (
entry &&
typeof entry === "object" &&
(entry as { type?: unknown }).type === "message" &&
typeof (entry as { message?: unknown }).message === "object" &&
((entry as { message: { role?: unknown } }).message?.role ?? undefined) === "user"
) {
const repairedUser = repairUserEntryWithBlankTextContent(entry as SessionMessageEntry);
if (repairedUser.kind === "drop") {
droppedBlankUserMessages += 1;
continue;
}
if (repairedUser.kind === "rewrite") {
entries.push(repairedUser.entry);
rewrittenUserMessages += 1;
continue;
}
}
entries.push(entry);
}
if (entries.length === 0) {
@@ -277,19 +261,11 @@ export async function repairSessionFileIfNeeded(params: {
return { repaired: false, droppedLines: 0 };
}
const cleaned = `${entries.map((entry) => JSON.stringify(entry)).join("\n")}\n`;
const backupPath = `${sessionFile}.bak-${process.pid}-${Date.now()}`;
try {
const stat = await fs.stat(sessionFile).catch(() => null);
await fs.writeFile(backupPath, content, "utf-8");
if (stat) {
await fs.chmod(backupPath, stat.mode);
}
await replaceFileAtomic({
filePath: sessionFile,
content: cleaned,
preserveExistingMode: true,
tempPrefix: `${path.basename(sessionFile)}.repair`,
replaceSqliteSessionTranscriptEvents({
...scope,
transcriptPath: sessionFile,
events: entries,
});
} catch (err) {
return {
@@ -316,6 +292,5 @@ export async function repairSessionFileIfNeeded(params: {
rewrittenAssistantMessages,
droppedBlankUserMessages,
rewrittenUserMessages,
backupPath,
};
}
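The repair loop above gates every stored entry on a structural predicate before anything else. A minimal standalone sketch of such a predicate, inferred from the shapes the tests exercise (the exact checks in the repo's `isStructurallyInvalidMessageEntry` may differ):

```typescript
// Sketch: a "message" entry is structurally invalid when its payload is not
// an object or its role is missing, null, non-string, or blank — such turns
// cannot be replayed to any provider, so repair drops them.
// Assumed shape; illustrative only, not the project's actual helper.
type MaybeEntry = { type?: unknown; message?: unknown } | null | undefined;

function isStructurallyInvalidMessageEntry(entry: unknown): boolean {
  const e = entry as MaybeEntry;
  if (!e || typeof e !== "object" || e.type !== "message") {
    return false; // only message entries are checked here
  }
  const message = e.message;
  if (!message || typeof message !== "object") {
    return true; // e.g. message: "not an object" from the test above
  }
  const role = (message as { role?: unknown }).role;
  return typeof role !== "string" || role.trim().length === 0;
}
```

Under this sketch, the null-role, blank-role, and string-payload entries from the tests all count toward `droppedLines`, while headers and well-formed user messages pass through untouched.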


@@ -1,15 +1,34 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { describe, expect, it } from "vitest";
import { afterEach, describe, expect, it, vi } from "vitest";
import {
loadSqliteSessionTranscriptEvents,
resolveSqliteSessionTranscriptScopeForPath,
} from "../../config/sessions/transcript-store.sqlite.js";
import { closeOpenClawStateDatabaseForTest } from "../../state/openclaw-state-db.js";
import { openTranscriptSessionManager } from "./session-manager.js";
import { parseSessionEntries, SessionManager } from "./session-transcript-contract.js";
import { SessionManager } from "./session-transcript-contract.js";
async function makeTempSessionFile(name = "session.jsonl"): Promise<string> {
const dir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-transcript-session-"));
vi.stubEnv("OPENCLAW_STATE_DIR", dir);
return path.join(dir, name);
}
function readSessionEntries(sessionFile: string) {
const scope = resolveSqliteSessionTranscriptScopeForPath({ transcriptPath: sessionFile });
if (!scope) {
return [];
}
return loadSqliteSessionTranscriptEvents(scope).map((entry) => entry.event);
}
afterEach(() => {
closeOpenClawStateDatabaseForTest();
vi.unstubAllEnvs();
});
describe("TranscriptSessionManager", () => {
it("exposes create, in-memory, list, continue, and fork through the contract value", async () => {
const dir = path.dirname(await makeTempSessionFile());
@@ -61,7 +80,7 @@ describe("TranscriptSessionManager", () => {
expect(sessionManager.getCwd()).toBe("/tmp/workspace");
expect(sessionManager.getSessionFile()).toBe(sessionFile);
const entries = parseSessionEntries(await fs.readFile(sessionFile, "utf8"));
const entries = readSessionEntries(sessionFile);
expect(entries).toMatchObject([
{
type: "session",
@@ -86,7 +105,7 @@ describe("TranscriptSessionManager", () => {
timestamp: 1,
});
const afterUser = parseSessionEntries(await fs.readFile(sessionFile, "utf8"));
const afterUser = readSessionEntries(sessionFile);
expect(afterUser).toHaveLength(2);
expect(afterUser[1]).toMatchObject({
type: "message",


@@ -1,9 +1,17 @@
import { randomUUID } from "node:crypto";
import fs from "node:fs";
import fsPromises from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import {
appendSqliteSessionTranscriptEvent,
listSqliteSessionTranscriptFiles,
loadSqliteSessionTranscriptEvents,
replaceSqliteSessionTranscriptEvents,
resolveSqliteSessionTranscriptScopeForPath,
} from "../../config/sessions/transcript-store.sqlite.js";
import { DEFAULT_AGENT_ID, normalizeAgentId } from "../../routing/session-key.js";
import type {
FileEntry,
SessionContext,
SessionEntry,
SessionHeader,
@@ -13,27 +21,7 @@ import type {
SessionTreeNode,
} from "./session-transcript-contract.js";
import { CURRENT_SESSION_VERSION } from "./session-transcript-format.js";
import {
persistTranscriptStateMutationSync,
readTranscriptFileStateSync,
TranscriptFileState,
writeTranscriptFileAtomicSync,
} from "./transcript-file-state.js";
function transcriptHasSessionHeader(raw: string): boolean {
for (const line of raw.trim().split(/\r?\n/)) {
if (!line.trim()) {
continue;
}
try {
const parsed = JSON.parse(line) as { type?: unknown; id?: unknown };
return parsed.type === "session" && typeof parsed.id === "string";
} catch {
continue;
}
}
return false;
}
import { TranscriptFileState } from "./transcript-file-state.js";
function createSessionHeader(params: {
id?: string;
@@ -54,6 +42,12 @@ function createSessionFileName(header: SessionHeader): string {
return `${header.timestamp.replace(/[:.]/g, "-")}_${header.id}.jsonl`;
}
type TranscriptSqliteScope = {
agentId: string;
sessionId: string;
transcriptPath: string;
};
function encodeSessionCwd(cwd: string): string {
return `--${cwd.replace(/^[/\\]/, "").replace(/[/\\:]/g, "-")}--`;
}
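The `encodeSessionCwd` transform above flattens a working directory into a filesystem-safe token: strip one leading path separator, then map remaining separators and colons to dashes. A quick self-contained check (the sample paths are made up):

```typescript
// Same transform as encodeSessionCwd above, reproduced for illustration.
function encodeSessionCwd(cwd: string): string {
  return `--${cwd.replace(/^[/\\]/, "").replace(/[/\\:]/g, "-")}--`;
}

// "/Users/alice/project" -> "--Users-alice-project--"
// "C:\\work\\repo"       -> "--C--work-repo--"
```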
@@ -66,27 +60,78 @@ function ensureDirSync(dir: string): void {
fs.mkdirSync(dir, { recursive: true, mode: 0o700 });
}
function loadTranscriptState(params: {
sessionFile: string;
sessionId?: string;
cwd?: string;
}): TranscriptFileState {
if (fs.existsSync(params.sessionFile)) {
const raw = fs.readFileSync(params.sessionFile, "utf-8");
if (transcriptHasSessionHeader(raw)) {
const state = readTranscriptFileStateSync(params.sessionFile);
if (state.migrated) {
writeTranscriptFileAtomicSync(params.sessionFile, [
...(state.header ? [state.header] : []),
...state.entries,
]);
return new TranscriptFileState({
header: state.header,
entries: state.entries,
});
}
return state;
}
function resolveAgentIdFromSessionPath(sessionFile: string): string {
const resolved = path.resolve(sessionFile);
const sessionsDir = path.dirname(resolved);
const agentDir = path.dirname(sessionsDir);
const agentsDir = path.dirname(agentDir);
if (path.basename(sessionsDir) === "sessions" && path.basename(agentsDir) === "agents") {
return normalizeAgentId(path.basename(agentDir));
}
return DEFAULT_AGENT_ID;
}
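The agent-id resolution above relies purely on directory layout: a transcript under `.../agents/<agentId>/sessions/<file>.jsonl` yields that agent id, anything else falls back to the default. A simplified sketch without the repo's `normalizeAgentId` step (`"main"` is an assumed default here, not necessarily `DEFAULT_AGENT_ID`):

```typescript
import path from "node:path";

// Walk two directories up from the transcript and check the layout:
// <...>/agents/<agentId>/sessions/<file>.jsonl
function agentIdFromSessionPath(sessionFile: string, defaultAgentId = "main"): string {
  const resolved = path.resolve(sessionFile);
  const sessionsDir = path.dirname(resolved);
  const agentDir = path.dirname(sessionsDir);
  const agentsDir = path.dirname(agentDir);
  if (path.basename(sessionsDir) === "sessions" && path.basename(agentsDir) === "agents") {
    return path.basename(agentDir); // real code also normalizes this id
  }
  return defaultAgentId;
}
```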
function resolveFallbackSessionIdFromPath(sessionFile: string): string {
const basename = path.basename(sessionFile);
const stem = basename.endsWith(".jsonl") ? basename.slice(0, -".jsonl".length) : basename;
const timestampIdMatch = /^[0-9]{4}-[0-9]{2}-[0-9]{2}T.*_([^_]+)$/.exec(stem);
return timestampIdMatch?.[1] ?? stem;
}
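The fallback session-id parser above peels the trailing `_<id>` segment off timestamp-prefixed filenames and otherwise uses the whole stem. The same logic, adapted to take a bare basename so it can be exercised in isolation (sample names are invented):

```typescript
// Mirrors resolveFallbackSessionIdFromPath: strip ".jsonl", then pull the
// trailing "_<id>" off names shaped like "<ISO-ish timestamp>_<id>".
function fallbackSessionIdFromBasename(basename: string): string {
  const stem = basename.endsWith(".jsonl") ? basename.slice(0, -".jsonl".length) : basename;
  const timestampIdMatch = /^[0-9]{4}-[0-9]{2}-[0-9]{2}T.*_([^_]+)$/.exec(stem);
  return timestampIdMatch?.[1] ?? stem;
}
```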
function createTranscriptStateFromEvents(events: unknown[]): TranscriptFileState {
const fileEntries = events.filter((event): event is FileEntry =>
Boolean(event && typeof event === "object"),
);
const header =
fileEntries.find((entry): entry is SessionHeader => entry.type === "session") ?? null;
const entries = fileEntries.filter((entry): entry is SessionEntry => entry.type !== "session");
return new TranscriptFileState({ header, entries });
}
function persistFullTranscriptStateToSqlite(
scope: TranscriptSqliteScope,
state: TranscriptFileState,
): void {
replaceSqliteSessionTranscriptEvents({
agentId: scope.agentId,
sessionId: scope.sessionId,
transcriptPath: scope.transcriptPath,
events: [...(state.header ? [state.header] : []), ...state.entries],
});
}
function appendTranscriptEntryToSqlite(scope: TranscriptSqliteScope, entry: SessionEntry): void {
appendSqliteSessionTranscriptEvent({
agentId: scope.agentId,
sessionId: scope.sessionId,
transcriptPath: scope.transcriptPath,
event: entry,
});
}
function loadTranscriptState(params: { sessionFile: string; sessionId?: string; cwd?: string }): {
state: TranscriptFileState;
scope: TranscriptSqliteScope;
} {
const transcriptPath = path.resolve(params.sessionFile);
const existingScope = resolveSqliteSessionTranscriptScopeForPath({ transcriptPath });
const scope = {
agentId: existingScope?.agentId ?? resolveAgentIdFromSessionPath(transcriptPath),
sessionId:
existingScope?.sessionId ??
params.sessionId ??
resolveFallbackSessionIdFromPath(transcriptPath),
transcriptPath,
};
const sqliteEvents = loadSqliteSessionTranscriptEvents(scope).map((entry) => entry.event);
if (sqliteEvents.length > 0) {
return { state: createTranscriptStateFromEvents(sqliteEvents), scope };
}
if (fs.existsSync(params.sessionFile) && fs.statSync(params.sessionFile).size > 0) {
throw new Error(
`Legacy transcript has not been imported into SQLite: ${params.sessionFile}. Run "openclaw doctor --fix" to build the session database.`,
);
}
const header = createSessionHeader({
@@ -94,8 +139,9 @@ function loadTranscriptState(params: {
cwd: params.cwd ?? process.cwd(),
});
const state = new TranscriptFileState({ header, entries: [] });
writeTranscriptFileAtomicSync(params.sessionFile, [header]);
return state;
const headerScope = { ...scope, sessionId: header.id };
persistFullTranscriptStateToSqlite(headerScope, state);
return { state, scope: headerScope };
}
function isMessageWithContent(
@@ -129,14 +175,16 @@ function extractTextContent(message: { content: unknown }): string {
.join(" ");
}
async function buildSessionInfo(filePath: string): Promise<SessionInfo | null> {
function buildSessionInfoFromState(
filePath: string,
state: TranscriptFileState,
modifiedFallback: Date,
): SessionInfo | null {
const header = state.getHeader();
if (!header) {
return null;
}
try {
const state = readTranscriptFileStateSync(filePath);
const header = state.getHeader();
if (!header) {
return null;
}
const stats = await fsPromises.stat(filePath);
let messageCount = 0;
let firstMessage = "";
const allMessages: string[] = [];
@@ -179,13 +227,13 @@ async function buildSessionInfo(filePath: string): Promise<SessionInfo | null> {
cwd: header.cwd,
name: state.getSessionName(),
parentSessionPath: header.parentSession,
created: Number.isFinite(headerTime) ? new Date(headerTime) : stats.mtime,
created: Number.isFinite(headerTime) ? new Date(headerTime) : modifiedFallback,
modified:
typeof lastActivityTime === "number" && lastActivityTime > 0
? new Date(lastActivityTime)
: Number.isFinite(headerTime)
? new Date(headerTime)
: stats.mtime,
: modifiedFallback,
messageCount,
firstMessage: firstMessage || "(no messages)",
allMessagesText: allMessages.join(" "),
@@ -201,26 +249,28 @@ async function listSessionsFromDir(
progressOffset = 0,
progressTotal?: number,
): Promise<SessionInfo[]> {
try {
const entries = await fsPromises.readdir(dir);
const files = entries
.filter((entry) => entry.endsWith(".jsonl"))
.map((entry) => path.join(dir, entry));
const total = progressTotal ?? files.length;
const sessions: SessionInfo[] = [];
let loaded = 0;
for (const file of files) {
const info = await buildSessionInfo(file);
loaded += 1;
onProgress?.(progressOffset + loaded, total);
if (info) {
sessions.push(info);
}
const resolvedDir = path.resolve(dir);
const sqliteFiles = listSqliteSessionTranscriptFiles().filter(
(entry) => path.dirname(path.resolve(entry.path)) === resolvedDir,
);
const sessions: SessionInfo[] = [];
let loaded = 0;
const total = progressTotal ?? sqliteFiles.length;
for (const file of sqliteFiles) {
const state = createTranscriptStateFromEvents(
loadSqliteSessionTranscriptEvents({
agentId: file.agentId,
sessionId: file.sessionId,
}).map((entry) => entry.event),
);
loaded += 1;
onProgress?.(progressOffset + loaded, total);
const info = buildSessionInfoFromState(file.path, state, new Date(file.updatedAt));
if (info) {
sessions.push(info);
}
return sessions.toSorted((a, b) => b.modified.getTime() - a.modified.getTime());
} catch {
return [];
}
return sessions.toSorted((a, b) => b.modified.getTime() - a.modified.getTime());
}
export class TranscriptSessionManager implements SessionManager {
@@ -228,17 +278,20 @@ export class TranscriptSessionManager implements SessionManager {
private sessionFile: string | undefined;
private sessionDir: string;
private persist: boolean;
private sqliteScope: TranscriptSqliteScope | undefined;
private constructor(params: {
sessionDir: string;
state: TranscriptFileState;
sessionFile?: string;
persist: boolean;
sqliteScope?: TranscriptSqliteScope;
}) {
this.sessionFile = params.sessionFile ? path.resolve(params.sessionFile) : undefined;
this.sessionDir = path.resolve(params.sessionDir);
this.state = params.state;
this.persist = params.persist;
this.sqliteScope = params.sqliteScope;
}
static open(params: {
@@ -248,15 +301,17 @@ export class TranscriptSessionManager implements SessionManager {
sessionDir?: string;
}): TranscriptSessionManager {
const sessionFile = path.resolve(params.sessionFile);
const loaded = loadTranscriptState({
sessionFile,
sessionId: params.sessionId,
cwd: params.cwd,
});
return new TranscriptSessionManager({
sessionDir: params.sessionDir ? path.resolve(params.sessionDir) : path.dirname(sessionFile),
sessionFile,
persist: true,
state: loadTranscriptState({
sessionFile,
sessionId: params.sessionId,
cwd: params.cwd,
}),
state: loaded.state,
sqliteScope: loaded.scope,
});
}
@@ -265,12 +320,19 @@ export class TranscriptSessionManager implements SessionManager {
ensureDirSync(dir);
const header = createSessionHeader({ cwd });
const sessionFile = path.join(dir, createSessionFileName(header));
writeTranscriptFileAtomicSync(sessionFile, [header]);
const sqliteScope = {
agentId: resolveAgentIdFromSessionPath(sessionFile),
sessionId: header.id,
transcriptPath: path.resolve(sessionFile),
};
const state = new TranscriptFileState({ header, entries: [] });
persistFullTranscriptStateToSqlite(sqliteScope, state);
return new TranscriptSessionManager({
sessionDir: dir,
sessionFile,
persist: true,
state: new TranscriptFileState({ header, entries: [] }),
state,
sqliteScope,
});
}
@@ -280,21 +342,20 @@ export class TranscriptSessionManager implements SessionManager {
sessionDir: "",
persist: false,
state: new TranscriptFileState({ header, entries: [] }),
sqliteScope: undefined,
});
}
static continueRecent(cwd: string, sessionDir?: string): TranscriptSessionManager {
const dir = path.resolve(sessionDir ?? resolveDefaultSessionDir(cwd));
ensureDirSync(dir);
const newest = fs
.readdirSync(dir)
.filter((entry) => entry.endsWith(".jsonl"))
.map((entry) => path.join(dir, entry))
.filter((file) => fs.existsSync(file))
.toSorted((a, b) => fs.statSync(b).mtimeMs - fs.statSync(a).mtimeMs)[0];
return newest
? TranscriptSessionManager.open({ sessionFile: newest, cwd })
: TranscriptSessionManager.create(cwd, dir);
const newestSqlite = listSqliteSessionTranscriptFiles()
.filter((entry) => path.dirname(path.resolve(entry.path)) === dir)
.toSorted((a, b) => b.updatedAt - a.updatedAt)[0];
if (newestSqlite) {
return TranscriptSessionManager.open({ sessionFile: newestSqlite.path, cwd });
}
return TranscriptSessionManager.create(cwd, dir);
}
static forkFrom(
@@ -303,7 +364,15 @@ export class TranscriptSessionManager implements SessionManager {
sessionDir?: string,
): TranscriptSessionManager {
const sourceFile = path.resolve(sourcePath);
const sourceState = readTranscriptFileStateSync(sourceFile);
const sourceScope = resolveSqliteSessionTranscriptScopeForPath({ transcriptPath: sourceFile });
if (!sourceScope) {
throw new Error(
`Legacy transcript has not been imported into SQLite: ${sourceFile}. Run "openclaw doctor --fix" to build the session database.`,
);
}
const sourceState = createTranscriptStateFromEvents(
loadSqliteSessionTranscriptEvents(sourceScope).map((entry) => entry.event),
);
const dir = path.resolve(sessionDir ?? resolveDefaultSessionDir(targetCwd));
ensureDirSync(dir);
const header = createSessionHeader({
@@ -311,7 +380,13 @@ export class TranscriptSessionManager implements SessionManager {
parentSession: sourceFile,
});
const sessionFile = path.join(dir, createSessionFileName(header));
writeTranscriptFileAtomicSync(sessionFile, [header, ...sourceState.getEntries()]);
const state = new TranscriptFileState({ header, entries: sourceState.getEntries() });
const sqliteScope = {
agentId: resolveAgentIdFromSessionPath(sessionFile),
sessionId: header.id,
transcriptPath: path.resolve(sessionFile),
};
persistFullTranscriptStateToSqlite(sqliteScope, state);
return TranscriptSessionManager.open({ sessionFile, cwd: targetCwd });
}
@@ -327,46 +402,36 @@ export class TranscriptSessionManager implements SessionManager {
}
static async listAll(onProgress?: SessionListProgress): Promise<SessionInfo[]> {
const root = path.join(os.homedir(), ".openclaw", "sessions");
try {
const dirs = (await fsPromises.readdir(root, { withFileTypes: true })).filter((entry) =>
entry.isDirectory(),
const files = listSqliteSessionTranscriptFiles();
const sessions: SessionInfo[] = [];
let loaded = 0;
for (const file of files) {
const state = createTranscriptStateFromEvents(
loadSqliteSessionTranscriptEvents({
agentId: file.agentId,
sessionId: file.sessionId,
}).map((entry) => entry.event),
);
const totalFiles = (
await Promise.all(
dirs.map(async (entry) => {
try {
return (await fsPromises.readdir(path.join(root, entry.name))).filter((file) =>
file.endsWith(".jsonl"),
).length;
} catch {
return 0;
}
}),
)
).reduce((sum, count) => sum + count, 0);
const sessions: SessionInfo[] = [];
let offset = 0;
for (const dir of dirs) {
const dirPath = path.join(root, dir.name);
const listed = await listSessionsFromDir(dirPath, onProgress, offset, totalFiles);
offset += listed.length;
sessions.push(...listed);
loaded += 1;
onProgress?.(loaded, files.length);
const info = buildSessionInfoFromState(file.path, state, new Date(file.updatedAt));
if (info) {
sessions.push(info);
}
return sessions.toSorted((a, b) => b.modified.getTime() - a.modified.getTime());
} catch {
return [];
}
return sessions.toSorted((a, b) => b.modified.getTime() - a.modified.getTime());
}
setSessionFile(sessionFile: string): void {
this.sessionFile = path.resolve(sessionFile);
this.sessionDir = path.dirname(this.sessionFile);
this.persist = true;
this.state = loadTranscriptState({
const loaded = loadTranscriptState({
sessionFile: this.sessionFile,
cwd: this.getCwd(),
});
this.state = loaded.state;
this.sqliteScope = loaded.scope;
}
newSession(options?: { id?: string; parentSession?: string }): string | undefined {
@@ -379,7 +444,12 @@ export class TranscriptSessionManager implements SessionManager {
if (this.persist) {
this.sessionFile =
this.sessionFile ?? path.join(this.sessionDir, createSessionFileName(header));
writeTranscriptFileAtomicSync(this.sessionFile, [header]);
this.sqliteScope = {
agentId: resolveAgentIdFromSessionPath(this.sessionFile),
sessionId: header.id,
transcriptPath: path.resolve(this.sessionFile),
};
persistFullTranscriptStateToSqlite(this.sqliteScope, this.state);
}
return this.sessionFile;
}
@@ -528,22 +598,30 @@ export class TranscriptSessionManager implements SessionManager {
if (!this.persist) {
return undefined;
}
writeTranscriptFileAtomicSync(sessionFile, [
const state = new TranscriptFileState({
header,
...branch.filter((e) => e.type !== "label"),
]);
entries: branch.filter((e) => e.type !== "label"),
});
persistFullTranscriptStateToSqlite(
{
agentId: resolveAgentIdFromSessionPath(sessionFile),
sessionId: header.id,
transcriptPath: path.resolve(sessionFile),
},
state,
);
return sessionFile;
}
private persistAppendedEntry(entry: SessionEntry): string {
if (!this.persist || !this.sessionFile) {
if (!this.persist || !this.sessionFile || !this.sqliteScope) {
return entry.id;
}
persistTranscriptStateMutationSync({
sessionFile: this.sessionFile,
state: this.state,
appendedEntries: [entry],
});
if (this.state.migrated) {
persistFullTranscriptStateToSqlite(this.sqliteScope, this.state);
} else {
appendTranscriptEntryToSqlite(this.sqliteScope, entry);
}
return entry.id;
}
}


@@ -1,9 +1,12 @@
import { randomUUID } from "node:crypto";
import fsSync from "node:fs";
import fs from "node:fs/promises";
import path from "node:path";
import { appendRegularFile, appendRegularFileSync } from "../../infra/fs-safe.js";
import { privateFileStore, privateFileStoreSync } from "../../infra/private-file-store.js";
import {
appendSqliteSessionTranscriptEvent,
loadSqliteSessionTranscriptEvents,
replaceSqliteSessionTranscriptEvents,
resolveSqliteSessionTranscriptScopeForPath,
} from "../../config/sessions/transcript-store.sqlite.js";
import { DEFAULT_AGENT_ID, normalizeAgentId } from "../../routing/session-key.js";
import type {
FileEntry,
SessionContext,
@@ -15,7 +18,6 @@ import {
buildSessionContext,
CURRENT_SESSION_VERSION,
migrateSessionEntries,
parseSessionEntries,
} from "./session-transcript-format.js";
type BranchSummaryEntry = Extract<SessionEntry, { type: "branch_summary" }>;
@@ -46,8 +48,57 @@ function generateEntryId(byId: { has(id: string): boolean }): string {
return randomUUID();
}
function serializeTranscriptFileEntries(entries: FileEntry[]): string {
return `${entries.map((entry) => JSON.stringify(entry)).join("\n")}\n`;
function resolveAgentIdFromTranscriptPath(sessionFile: string): string {
const resolved = path.resolve(sessionFile);
const sessionsDir = path.dirname(resolved);
const agentDir = path.dirname(sessionsDir);
const agentsDir = path.dirname(agentDir);
if (path.basename(sessionsDir) === "sessions" && path.basename(agentsDir) === "agents") {
return normalizeAgentId(path.basename(agentDir));
}
return DEFAULT_AGENT_ID;
}
function transcriptStateFromFileEntries(fileEntries: FileEntry[]): TranscriptFileState {
const headerBeforeMigration =
fileEntries.find((entry): entry is SessionHeader => entry.type === "session") ?? null;
const migrated = sessionHeaderVersion(headerBeforeMigration) < CURRENT_SESSION_VERSION;
migrateSessionEntries(fileEntries);
const header =
fileEntries.find((entry): entry is SessionHeader => entry.type === "session") ?? null;
const entries = fileEntries.filter(isSessionEntry);
return new TranscriptFileState({ header, entries, migrated });
}
function transcriptStateFromSqlite(sessionFile: string): TranscriptFileState | undefined {
const scope = resolveSqliteSessionTranscriptScopeForPath({ transcriptPath: sessionFile });
if (!scope) {
return undefined;
}
const events = loadSqliteSessionTranscriptEvents(scope).map((entry) => entry.event);
if (events.length === 0) {
return undefined;
}
return transcriptStateFromFileEntries(
events.filter((event): event is FileEntry => Boolean(event && typeof event === "object")),
);
}
function resolveTranscriptWriteScope(
sessionFile: string,
entries: Array<SessionHeader | SessionEntry>,
): { agentId: string; sessionId: string; transcriptPath: string } | undefined {
const header = entries.find((entry): entry is SessionHeader => entry.type === "session");
const existing = resolveSqliteSessionTranscriptScopeForPath({ transcriptPath: sessionFile });
const sessionId = header?.id ?? existing?.sessionId;
if (!sessionId) {
return undefined;
}
return {
agentId: existing?.agentId ?? resolveAgentIdFromTranscriptPath(sessionFile),
sessionId,
transcriptPath: path.resolve(sessionFile),
};
}
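The write-scope resolution above has a clear precedence: the header's id wins, an already-registered SQLite scope is the fallback, and with neither there is no session id and the caller throws. That precedence can be sketched standalone (the types here are simplified stand-ins for the real contract, and `"main"` is an assumed fallback agent id):

```typescript
type Scope = { agentId: string; sessionId: string };

// Simplified precedence from resolveTranscriptWriteScope: header id first,
// then any existing registered scope; no id means the write is refused.
function pickWriteScope(
  headerId: string | undefined,
  existing: Scope | undefined,
  fallbackAgentId: string,
): Scope | undefined {
  const sessionId = headerId ?? existing?.sessionId;
  if (!sessionId) {
    return undefined; // caller throws: cannot write without a session header
  }
  return { agentId: existing?.agentId ?? fallbackAgentId, sessionId };
}
```

Note the asymmetry: the session id prefers the in-memory header, while the agent id prefers the already-registered scope, so a rewrite cannot silently move a transcript between agents.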
export class TranscriptFileState {
@@ -343,49 +394,51 @@ export class TranscriptFileState {
}
export async function readTranscriptFileState(sessionFile: string): Promise<TranscriptFileState> {
const raw = await fs.readFile(sessionFile, "utf-8");
const fileEntries = parseSessionEntries(raw);
const headerBeforeMigration =
fileEntries.find((entry): entry is SessionHeader => entry.type === "session") ?? null;
const migrated = sessionHeaderVersion(headerBeforeMigration) < CURRENT_SESSION_VERSION;
migrateSessionEntries(fileEntries);
const header =
fileEntries.find((entry): entry is SessionHeader => entry.type === "session") ?? null;
const entries = fileEntries.filter(isSessionEntry);
return new TranscriptFileState({ header, entries, migrated });
const sqliteState = transcriptStateFromSqlite(sessionFile);
if (sqliteState) {
return sqliteState;
}
throw new Error(
`Transcript is not in SQLite: ${sessionFile}. Run "openclaw doctor --fix" to import legacy JSONL transcripts.`,
);
}
export function readTranscriptFileStateSync(sessionFile: string): TranscriptFileState {
const raw = fsSync.readFileSync(sessionFile, "utf-8");
const fileEntries = parseSessionEntries(raw);
const headerBeforeMigration =
fileEntries.find((entry): entry is SessionHeader => entry.type === "session") ?? null;
const migrated = sessionHeaderVersion(headerBeforeMigration) < CURRENT_SESSION_VERSION;
migrateSessionEntries(fileEntries);
const header =
fileEntries.find((entry): entry is SessionHeader => entry.type === "session") ?? null;
const entries = fileEntries.filter(isSessionEntry);
return new TranscriptFileState({ header, entries, migrated });
const sqliteState = transcriptStateFromSqlite(sessionFile);
if (sqliteState) {
return sqliteState;
}
throw new Error(
`Transcript is not in SQLite: ${sessionFile}. Run "openclaw doctor --fix" to import legacy JSONL transcripts.`,
);
}
export async function writeTranscriptFileAtomic(
filePath: string,
entries: Array<SessionHeader | SessionEntry>,
): Promise<void> {
await privateFileStore(path.dirname(filePath)).writeText(
path.basename(filePath),
serializeTranscriptFileEntries(entries),
);
const scope = resolveTranscriptWriteScope(filePath, entries);
if (!scope) {
throw new Error(`Cannot write SQLite transcript without a session header: ${filePath}`);
}
replaceSqliteSessionTranscriptEvents({
...scope,
events: entries,
});
}
export function writeTranscriptFileAtomicSync(
filePath: string,
entries: Array<SessionHeader | SessionEntry>,
): void {
privateFileStoreSync(path.dirname(filePath)).writeText(
path.basename(filePath),
serializeTranscriptFileEntries(entries),
);
const scope = resolveTranscriptWriteScope(filePath, entries);
if (!scope) {
throw new Error(`Cannot write SQLite transcript without a session header: ${filePath}`);
}
replaceSqliteSessionTranscriptEvents({
...scope,
events: entries,
});
}
export async function persistTranscriptStateMutation(params: {
@@ -403,11 +456,18 @@ export async function persistTranscriptStateMutation(params: {
]);
return;
}
await appendRegularFile({
filePath: params.sessionFile,
content: `${params.appendedEntries.map((entry) => JSON.stringify(entry)).join("\n")}\n`,
rejectSymlinkParents: true,
});
const scope = resolveTranscriptWriteScope(params.sessionFile, [
...(params.state.header ? [params.state.header] : []),
...params.state.entries,
]);
if (!scope) {
throw new Error(
`Cannot append SQLite transcript without a session header: ${params.sessionFile}`,
);
}
for (const entry of params.appendedEntries) {
appendSqliteSessionTranscriptEvent({ ...scope, event: entry });
}
}
export function persistTranscriptStateMutationSync(params: {
@@ -425,9 +485,16 @@ export function persistTranscriptStateMutationSync(params: {
]);
return;
}
appendRegularFileSync({
filePath: params.sessionFile,
content: `${params.appendedEntries.map((entry) => JSON.stringify(entry)).join("\n")}\n`,
rejectSymlinkParents: true,
});
const scope = resolveTranscriptWriteScope(params.sessionFile, [
...(params.state.header ? [params.state.header] : []),
...params.state.entries,
]);
if (!scope) {
throw new Error(
`Cannot append SQLite transcript without a session header: ${params.sessionFile}`,
);
}
for (const entry of params.appendedEntries) {
appendSqliteSessionTranscriptEvent({ ...scope, event: entry });
}
}


@@ -14,12 +14,17 @@ import {
resolveSessionFilePath,
resolveSessionFilePathOptions,
} from "../../config/sessions/paths.js";
import {
replaceSqliteSessionTranscriptEvents,
resolveSqliteSessionTranscriptScopeForPath,
} from "../../config/sessions/transcript-store.sqlite.js";
import {
resolveFreshSessionTotalTokens,
type SessionEntry as StoreSessionEntry,
} from "../../config/sessions/types.js";
import { readLatestRecentSessionUsageFromTranscriptAsync } from "../../gateway/session-utils.fs.js";
import { readRegularFile } from "../../infra/fs-safe.js";
import { DEFAULT_AGENT_ID } from "../../routing/session-key.js";
type ForkSourceTranscript = {
cwd: string;
@@ -231,11 +236,14 @@ async function writeForkHeaderOnly(params: {
cwd: params.cwd,
parentSession: params.parentSessionFile,
} satisfies SessionHeader;
await fs.mkdir(path.dirname(sessionFile), { recursive: true });
await fs.writeFile(sessionFile, `${JSON.stringify(header)}\n`, {
encoding: "utf-8",
mode: 0o600,
flag: "wx",
const parentScope = resolveSqliteSessionTranscriptScopeForPath({
transcriptPath: params.parentSessionFile,
});
replaceSqliteSessionTranscriptEvents({
agentId: parentScope?.agentId ?? DEFAULT_AGENT_ID,
sessionId,
transcriptPath: sessionFile,
events: [header],
});
return { sessionId, sessionFile };
}
@@ -268,16 +276,15 @@ async function writeBranchedSession(params: {
(entry) => entry.type === "message" && entry.message.role === "assistant",
);
if (hasAssistant) {
const parentScope = resolveSqliteSessionTranscriptScopeForPath({
transcriptPath: params.parentSessionFile,
});
replaceSqliteSessionTranscriptEvents({
agentId: parentScope?.agentId ?? DEFAULT_AGENT_ID,
sessionId,
transcriptPath: sessionFile,
events: entries,
});
}
return { sessionId, sessionFile };
}
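The `hasAssistant` guard above can be reduced to a small predicate. This is a hedged standalone sketch with a simplified entry type (`SketchEntry` is an assumption, not the real `SessionEntry` shape):

```typescript
// Simplified stand-in for the transcript entry shape; the real type also
// carries ids, parent links, and timestamps.
type SketchEntry = { type: string; message?: { role: string } };

// Mirrors the hasAssistant check in writeBranchedSession: a branched session
// is only worth materializing when the copied prefix contains assistant output.
function shouldMaterializeBranch(entries: SketchEntry[]): boolean {
  return entries.some(
    (entry) => entry.type === "message" && entry.message?.role === "assistant",
  );
}
```

Header-only prefixes fall through to the `writeForkHeaderOnly` path instead.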


@@ -7,6 +7,10 @@ import {
type resolveSessionFilePathOptions,
} from "../config/sessions/paths.js";
import { updateSessionStore } from "../config/sessions/store.js";
import {
loadSqliteSessionTranscriptEvents,
resolveSqliteSessionTranscriptScopeForPath,
} from "../config/sessions/transcript-store.sqlite.js";
import type { SessionEntry } from "../config/sessions/types.js";
import type { OpenClawConfig } from "../config/types.openclaw.js";
import { parseAgentSessionKey } from "../sessions/session-key-utils.js";
@@ -56,13 +60,8 @@ function sessionEntryHasSyntheticHeartbeatOwnership(entry: SessionEntry): boolea
);
}
function parseTranscriptMessageEvent(event: unknown): { role: string; content?: unknown } | null {
const parsed = event;
const record = asNullableObjectRecord(parsed);
if (!record) {
return null;
@@ -79,12 +78,11 @@ function parseTranscriptMessageLine(line: string): { role: string; content?: unk
function summarizeTranscriptHeartbeatMessages(
transcriptPath: string,
): TranscriptHeartbeatSummary | null {
const scope = resolveSqliteSessionTranscriptScopeForPath({ transcriptPath });
if (!scope) {
return null;
}
const events = loadSqliteSessionTranscriptEvents(scope);
const summary: TranscriptHeartbeatSummary = {
inspectedMessages: 0,
userMessages: 0,
@@ -93,12 +91,8 @@ function summarizeTranscriptHeartbeatMessages(
assistantMessages: 0,
heartbeatOkAssistantMessages: 0,
};
for (const event of events) {
const message = parseTranscriptMessageEvent(event.event);
if (!message) {
continue;
}


@@ -9,6 +9,7 @@ vi.mock("../terminal/note.js", () => ({
note,
}));
import { loadSqliteSessionTranscriptEvents } from "../config/sessions/transcript-store.sqlite.js";
import {
noteSessionTranscriptHealth,
repairBrokenSessionTranscriptFile,
@@ -30,9 +31,11 @@ describe("doctor session transcript repair", () => {
beforeEach(async () => {
note.mockClear();
root = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-doctor-transcripts-"));
vi.stubEnv("OPENCLAW_STATE_DIR", root);
});
afterEach(async () => {
vi.unstubAllEnvs();
await fs.rm(root, { recursive: true, force: true });
});
@@ -135,11 +138,46 @@ describe("doctor session transcript repair", () => {
expect(note).toHaveBeenCalledTimes(1);
const [message, title] = note.mock.calls[0] as [string, string];
expect(title).toBe("Session transcripts");
expect(message).toContain("legacy transcript JSONL");
expect(message).toContain('Run "openclaw doctor --fix"');
expect(countNonEmptyLines(await fs.readFile(filePath, "utf-8"))).toBe(3);
});
it("imports legacy transcript files into SQLite during repair mode", async () => {
const filePath = await writeTranscript([
{
type: "session",
version: 3,
id: "session-1",
timestamp: "2026-04-25T00:00:00Z",
cwd: root,
},
{
type: "message",
id: "user-1",
parentId: null,
message: { role: "user", content: "hello" },
},
]);
const sessionsDir = path.dirname(filePath);
await noteSessionTranscriptHealth({ shouldRepair: true, sessionDirs: [sessionsDir] });
await expect(fs.access(filePath)).rejects.toThrow();
expect(
loadSqliteSessionTranscriptEvents({
agentId: "main",
sessionId: "session-1",
}).map((entry) => entry.event),
).toMatchObject([
{ type: "session", id: "session-1" },
{ type: "message", id: "user-1" },
]);
const [message, title] = note.mock.calls[0] as [string, string];
expect(title).toBe("Session transcripts");
expect(message).toContain("Imported 1 transcript file into SQLite");
});
it("ignores ordinary branch history without internal runtime context", async () => {
const filePath = await writeTranscript([
{ type: "session", version: 3, id: "session-1", timestamp: "2026-04-25T00:00:00Z" },


@@ -7,6 +7,8 @@ import {
} from "../agents/internal-runtime-context.js";
import { resolveAgentSessionDirs } from "../agents/session-dirs.js";
import { resolveStateDir } from "../config/paths.js";
import { replaceSqliteSessionTranscriptEvents } from "../config/sessions/transcript-store.sqlite.js";
import { DEFAULT_AGENT_ID, normalizeAgentId } from "../routing/session-key.js";
import { note } from "../terminal/note.js";
import { shortenHomePath } from "../utils.js";
@@ -27,6 +29,12 @@ type TranscriptRepairResult = {
reason?: string;
};
type TranscriptMigrationResult = TranscriptRepairResult & {
imported: boolean;
removedSource: boolean;
sessionId?: string;
};
function parseTranscriptEntries(raw: string): TranscriptEntry[] {
const entries: TranscriptEntry[] = [];
for (const line of raw.split(/\r?\n/)) {
@@ -45,6 +53,22 @@ function parseTranscriptEntries(raw: string): TranscriptEntry[] {
return entries;
}
function getSessionId(entries: TranscriptEntry[]): string | null {
const header = entries.find((entry) => entry.type === "session");
return typeof header?.id === "string" && header.id.trim() ? header.id : null;
}
function resolveAgentIdFromTranscriptPath(filePath: string): string {
const resolved = path.resolve(filePath);
const sessionsDir = path.dirname(resolved);
const agentDir = path.dirname(sessionsDir);
const agentsDir = path.dirname(agentDir);
if (path.basename(sessionsDir) === "sessions" && path.basename(agentsDir) === "agents") {
return normalizeAgentId(path.basename(agentDir));
}
return DEFAULT_AGENT_ID;
}
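`resolveAgentIdFromTranscriptPath` above trusts the path only when it matches the `agents/<agentId>/sessions/<file>` layout. A minimal standalone sketch of that walk, with a simplified `normalizeAgentId` stand-in (the real helper lives in `routing/session-key.js` and may apply stricter rules):

```typescript
import path from "node:path";

// Assumed defaults for the sketch; the real constants come from
// routing/session-key.js.
const DEFAULT_AGENT_ID = "main";
const normalizeAgentId = (id: string): string => id.trim().toLowerCase();

// Walk up: <...>/agents/<agentId>/sessions/<file>. Anything else falls back
// to the default agent rather than guessing.
function agentIdFromTranscriptPath(filePath: string): string {
  const resolved = path.resolve(filePath);
  const sessionsDir = path.dirname(resolved);
  const agentDir = path.dirname(sessionsDir);
  const agentsDir = path.dirname(agentDir);
  if (path.basename(sessionsDir) === "sessions" && path.basename(agentsDir) === "agents") {
    return normalizeAgentId(path.basename(agentDir));
  }
  return DEFAULT_AGENT_ID;
}
```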
function getEntryId(entry: TranscriptEntry): string | null {
return typeof entry.id === "string" && entry.id.trim() ? entry.id : null;
}
@@ -228,6 +252,81 @@ export async function repairBrokenSessionTranscriptFile(params: {
}
}
export async function migrateSessionTranscriptFileToSqlite(params: {
filePath: string;
shouldRepair: boolean;
agentId?: string;
transcriptPath?: string;
}): Promise<TranscriptMigrationResult> {
try {
const raw = await fs.readFile(params.filePath, "utf-8");
const entries = parseTranscriptEntries(raw);
const sessionId = getSessionId(entries);
if (!sessionId) {
return {
filePath: params.filePath,
broken: false,
repaired: false,
imported: false,
removedSource: false,
originalEntries: entries.length,
activeEntries: 0,
reason: "missing session header",
};
}
const activePath = selectActivePath(entries);
const broken = activePath ? hasBrokenPromptRewriteBranch(entries, activePath) : false;
const header = entries.find((entry) => entry.type === "session");
const events =
broken && params.shouldRepair && activePath && header ? [header, ...activePath] : entries;
if (!params.shouldRepair) {
return {
filePath: params.filePath,
broken,
repaired: false,
imported: false,
removedSource: false,
originalEntries: entries.length,
activeEntries: activePath?.length ?? 0,
sessionId,
};
}
const transcriptPath = path.resolve(params.transcriptPath ?? params.filePath);
replaceSqliteSessionTranscriptEvents({
agentId: params.agentId ?? resolveAgentIdFromTranscriptPath(transcriptPath),
sessionId,
transcriptPath,
events,
});
await fs.rm(params.filePath, { force: true });
return {
filePath: params.filePath,
broken,
repaired: broken,
imported: true,
removedSource: true,
originalEntries: entries.length,
activeEntries: activePath?.length ?? 0,
sessionId,
};
} catch (err) {
return {
filePath: params.filePath,
broken: false,
repaired: false,
imported: false,
removedSource: false,
originalEntries: 0,
activeEntries: 0,
reason: String(err),
};
}
}
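`parseTranscriptEntries` (called above) tolerates blank and malformed JSONL lines, so one corrupt line does not abort a whole migration. A minimal sketch of that contract, assuming entries are plain JSON objects (`parseTranscriptEntriesSketch` is an illustrative name, not the real helper):

```typescript
type SketchTranscriptEntry = Record<string, unknown> & { type?: string; id?: string };

// Parse JSONL, skipping blank lines and lines that are not valid JSON objects.
function parseTranscriptEntriesSketch(raw: string): SketchTranscriptEntry[] {
  const entries: SketchTranscriptEntry[] = [];
  for (const line of raw.split(/\r?\n/)) {
    const trimmed = line.trim();
    if (!trimmed) continue;
    try {
      const parsed = JSON.parse(trimmed);
      if (parsed && typeof parsed === "object" && !Array.isArray(parsed)) {
        entries.push(parsed as SketchTranscriptEntry);
      }
    } catch {
      // Malformed line: skip rather than fail the file.
    }
  }
  return entries;
}
```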
async function listSessionTranscriptFiles(sessionDirs: string[]): Promise<string[]> {
const files: string[] = [];
for (const sessionsDir of sessionDirs) {
@@ -264,31 +363,49 @@ export async function noteSessionTranscriptHealth(params?: {
return;
}
const results: TranscriptMigrationResult[] = [];
for (const filePath of files) {
results.push(await migrateSessionTranscriptFileToSqlite({ filePath, shouldRepair }));
}
const broken = results.filter((result) => result.broken);
if (broken.length === 0) {
return;
}
const imported = results.filter((result) => result.imported);
const failed = results.filter((result) => result.reason && !result.imported);
const repairedCount = broken.filter((result) => result.repaired).length;
const legacyCount = results.length;
const lines = [
`- Found ${legacyCount} legacy transcript JSONL file${legacyCount === 1 ? "" : "s"} outside the SQLite session database.`,
...results.slice(0, 20).map((result) => {
const status = result.imported
? result.repaired
? "imported with active-branch repair"
: "imported"
: result.broken
? "needs import + repair"
: "needs import";
const reason = result.reason ? ` reason=${result.reason}` : "";
return `- ${shortenHomePath(result.filePath)} ${status} entries=${result.originalEntries}${reason}`;
}),
];
if (results.length > 20) {
lines.push(`- ...and ${results.length - 20} more.`);
}
if (!shouldRepair) {
lines.push('- Run "openclaw doctor --fix" to import legacy transcripts into SQLite.');
} else if (imported.length > 0) {
lines.push(
`- Imported ${imported.length} transcript file${imported.length === 1 ? "" : "s"} into SQLite and removed the JSONL source${imported.length === 1 ? "" : "s"}.`,
);
if (repairedCount > 0) {
lines.push(
`- Repaired duplicated prompt-rewrite branches for ${repairedCount} transcript file${repairedCount === 1 ? "" : "s"} during import.`,
);
}
}
if (failed.length > 0) {
lines.push(
`- Could not import ${failed.length} transcript file${failed.length === 1 ? "" : "s"}; left source file${failed.length === 1 ? "" : "s"} in place.`,
);
}
note(lines.join("\n"), "Session transcripts");


@@ -22,6 +22,11 @@ import {
} from "../config/sessions/paths.js";
import { loadSessionStore } from "../config/sessions/store-load.js";
import { updateSessionStore } from "../config/sessions/store.js";
import {
hasSqliteSessionTranscriptEvents,
loadSqliteSessionTranscriptEvents,
resolveSqliteSessionTranscriptScopeForPath,
} from "../config/sessions/transcript-store.sqlite.js";
import type { OpenClawConfig } from "../config/types.openclaw.js";
import { resolveRequiredHomeDir } from "../infra/home-dir.js";
import { resolveMemoryBackendConfig } from "../memory-host-sdk/engine-storage.js";
@@ -218,6 +223,41 @@ function countJsonlLines(filePath: string): number {
}
}
function resolveTranscriptSqliteScope(params: {
agentId: string;
sessionId: string;
transcriptPath: string;
}): { agentId: string; sessionId: string } {
return (
resolveSqliteSessionTranscriptScopeForPath({ transcriptPath: params.transcriptPath }) ?? {
agentId: params.agentId,
sessionId: params.sessionId,
}
);
}
function hasSessionTranscript(params: {
agentId: string;
sessionId: string;
transcriptPath: string;
}): boolean {
const scope = resolveTranscriptSqliteScope(params);
return hasSqliteSessionTranscriptEvents(scope) || existsFile(params.transcriptPath);
}
function countSessionTranscriptEvents(params: {
agentId: string;
sessionId: string;
transcriptPath: string;
}): number {
const scope = resolveTranscriptSqliteScope(params);
const sqliteEvents = loadSqliteSessionTranscriptEvents(scope);
if (sqliteEvents.length > 0) {
return sqliteEvents.length;
}
return countJsonlLines(params.transcriptPath);
}
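`countSessionTranscriptEvents` above prefers the SQLite mirror and only falls back to counting JSONL lines when the mirror is empty. The same preference order as a generic helper (hypothetical names; the real code calls the store functions directly):

```typescript
// "Mirror first, legacy file second": loadMirror returns [] when the SQLite
// mirror has no rows; countLegacy reads the compatibility JSONL file.
function countWithMirrorFallback(
  loadMirror: () => unknown[],
  countLegacy: () => number,
): number {
  const mirrored = loadMirror();
  return mirrored.length > 0 ? mirrored.length : countLegacy();
}
```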
function findOtherStateDirs(stateDir: string): string[] {
const resolvedState = path.resolve(stateDir);
const roots =
@@ -858,7 +898,7 @@ export async function noteStateIntegrity(
return false;
}
const transcriptPath = resolveSessionFilePath(sessionId, entry, sessionPathOpts);
return !hasSessionTranscript({ agentId, sessionId, transcriptPath });
});
if (missing.length > 0) {
warnings.push(
@@ -951,15 +991,19 @@ export async function noteStateIntegrity(
mainEntry,
sessionPathOpts,
);
if (!hasSessionTranscript({ agentId, sessionId: mainEntry.sessionId, transcriptPath })) {
warnings.push(
`- Main session transcript missing (${shortenHomePath(transcriptPath)}). History will appear to reset.`,
);
} else {
const eventCount = countSessionTranscriptEvents({
agentId,
sessionId: mainEntry.sessionId,
transcriptPath,
});
if (eventCount <= 1) {
warnings.push(
`- Main session transcript has only ${eventCount} event. Session history may not be appending.`,
);
}
}


@@ -4,6 +4,7 @@ import path from "node:path";
import { afterEach, describe, expect, it, vi } from "vitest";
import type { OpenClawConfig } from "../config/config.js";
import { loadSqliteSessionStore } from "../config/sessions/store-backend.sqlite.js";
import { loadSqliteSessionTranscriptEvents } from "../config/sessions/transcript-store.sqlite.js";
import { closeOpenClawStateDatabaseForTest } from "../state/openclaw-state-db.js";
import {
autoMigrateLegacyStateDir,
@@ -369,8 +370,20 @@ describe("doctor legacy state migrations", () => {
"subagent:xyz": { sessionId: "e", updatedAt: 50 },
},
transcripts: {
"a.jsonl": `${JSON.stringify({
type: "session",
version: 3,
id: "a",
timestamp: "2026-04-25T00:00:00Z",
cwd: root,
})}\n`,
"b.jsonl": `${JSON.stringify({
type: "session",
version: 3,
id: "b",
timestamp: "2026-04-25T00:00:00Z",
cwd: root,
})}\n`,
},
});
@@ -385,8 +398,8 @@ describe("doctor legacy state migrations", () => {
expect(result.warnings).toStrictEqual([]);
const targetDir = path.join(root, "agents", "main", "sessions");
expect(fs.existsSync(path.join(targetDir, "a.jsonl"))).toBe(false);
expect(fs.existsSync(path.join(targetDir, "b.jsonl"))).toBe(false);
expect(fs.existsSync(path.join(legacySessionsDir, "a.jsonl"))).toBe(false);
expect(fs.existsSync(path.join(targetDir, "sessions.json"))).toBe(false);
@@ -399,6 +412,20 @@ describe("doctor legacy state migrations", () => {
expect(store["agent:main:slack:channel:c123"]?.sessionId).toBe("c");
expect(store["agent:main:unknown:group:abc"]?.sessionId).toBe("d");
expect(store["agent:main:subagent:xyz"]?.sessionId).toBe("e");
expect(
loadSqliteSessionTranscriptEvents({
agentId: "main",
sessionId: "a",
env: { OPENCLAW_STATE_DIR: root } as NodeJS.ProcessEnv,
}),
).toHaveLength(1);
expect(
loadSqliteSessionTranscriptEvents({
agentId: "main",
sessionId: "b",
env: { OPENCLAW_STATE_DIR: root } as NodeJS.ProcessEnv,
}),
).toHaveLength(1);
});
it("keeps shipped WhatsApp legacy group keys channel-qualified during migration", async () => {


@@ -1,4 +1,3 @@
import path from "node:path";
import { resolveDefaultAgentId } from "../../agents/agent-scope.js";
import { resolveStoredSessionOwnerAgentId } from "../../gateway/session-store-key.js";
@@ -11,11 +10,7 @@ import {
resolveSessionArtifactCanonicalPathsForEntry,
type SessionUnreferencedArtifactSweepResult,
} from "./disk-budget.js";
import { resolveStorePath } from "./paths.js";
import { resolveMaintenanceConfig } from "./store-maintenance-runtime.js";
import {
capEntryCount,
@@ -28,6 +23,7 @@ import {
type SessionStoreTarget,
type SessionStoreSelectionOptions,
} from "./targets.js";
import { hasSqliteSessionTranscriptEvents } from "./transcript-store.sqlite.js";
import type { SessionEntry } from "./types.js";
export type SessionsCleanupOptions = SessionStoreSelectionOptions & {
@@ -200,18 +196,19 @@ export function serializeSessionCleanupResult(params: {
function pruneMissingTranscriptEntries(params: {
store: Record<string, SessionEntry>;
storePath: string;
agentId: string;
onPruned?: (key: string) => void;
}): number {
let removed = 0;
for (const [key, entry] of Object.entries(params.store)) {
if (!entry?.sessionId) {
continue;
}
const hasTranscript = hasSqliteSessionTranscriptEvents({
agentId: params.agentId,
sessionId: entry.sessionId,
});
if (!hasTranscript) {
delete params.store[key];
removed += 1;
params.onPruned?.(key);
@@ -262,6 +259,7 @@ async function previewStoreCleanup(params: {
? pruneMissingTranscriptEntries({
store: previewStore,
storePath: params.target.storePath,
agentId: params.target.agentId,
onPruned: (key) => {
missingKeys.add(key);
},
@@ -411,6 +409,7 @@ export async function runSessionsCleanup(params: {
? pruneMissingTranscriptEntries({
store,
storePath: target.storePath,
agentId: target.agentId,
})
: 0;
let pruned = 0;


@@ -3,7 +3,6 @@ import fsPromises from "node:fs/promises";
import path from "node:path";
import { afterAll, afterEach, beforeAll, describe, expect, it, vi } from "vitest";
import { upsertAcpSessionMeta } from "../../acp/runtime/session-meta.js";
import * as jsonFiles from "../../infra/json-files.js";
import { createSuiteTempRootTracker, withTempDirSync } from "../../test-helpers/temp-dir.js";
import type { OpenClawConfig } from "../config.js";
import type { SessionConfig } from "../types.base.js";
@@ -16,7 +15,12 @@ import {
} from "./paths.js";
import { evaluateSessionFreshness, resolveSessionResetPolicy } from "./reset.js";
import { resolveAndPersistSessionFile } from "./session-file.js";
import {
clearSessionStoreCacheForTest,
loadSessionStore,
saveSessionStore,
updateSessionStore,
} from "./store.js";
import { useTempSessionsFixture } from "./test-helpers.js";
import { mergeSessionEntry, mergeSessionEntryWithPolicy, type SessionEntry } from "./types.js";
@@ -278,9 +282,9 @@ describe("session store writer queue", () => {
initial: Record<string, unknown> = {},
): Promise<{ dir: string; storePath: string }> {
const dir = await writerFixtureRootTracker.make("case");
const storePath = path.join(dir, "agents", "main", "sessions", "sessions.json");
if (Object.keys(initial).length > 0) {
await saveSessionStore(storePath, initial as Record<string, SessionEntry>);
}
return { dir, storePath };
}
@@ -319,18 +323,16 @@ describe("session store writer queue", () => {
expect((store[key] as Record<string, unknown>).counter).toBe(N);
});
it("persists SQLite stores even when payload is unchanged", async () => {
const key = "agent:main:no-op-save";
const { storePath } = await makeTmpStore({
[key]: { sessionId: "s-noop", updatedAt: Date.now() },
});
await updateSessionStore(storePath, async () => {
// Intentionally no-op mutation.
});
expect(loadSessionStore(storePath)[key]?.sessionId).toBe("s-noop");
});
it("keeps session store writes atomic while skipping durable fsync inside the writer lock", async () => {
@@ -528,7 +530,7 @@ describe("resolveAndPersistSessionFile", () => {
updatedAt: Date.now(),
},
};
await saveSessionStore(fixture.storePath(), store);
const sessionStore = loadSessionStore(fixture.storePath());
const fallbackSessionFile = resolveSessionTranscriptPathInDir(
sessionId,
@@ -554,7 +556,7 @@ describe("resolveAndPersistSessionFile", () => {
it("creates and persists entry when session is not yet present", async () => {
const sessionId = "new-session-id";
const sessionKey = "agent:main:telegram:group:123";
await saveSessionStore(fixture.storePath(), {});
const sessionStore = loadSessionStore(fixture.storePath());
const fallbackSessionFile = resolveSessionTranscriptPathInDir(sessionId, fixture.sessionsDir());
@@ -591,7 +593,7 @@ describe("resolveAndPersistSessionFile", () => {
sessionFile: previousSessionFile,
},
};
await saveSessionStore(fixture.storePath(), store);
const sessionStore = loadSessionStore(fixture.storePath());
const result = await resolveAndPersistSessionFile({


@@ -1,5 +1,6 @@
import fs, { readFileSync } from "node:fs";
import type { SQLInputValue, StatementSync } from "node:sqlite";
import { DEFAULT_AGENT_ID } from "../../routing/session-key.js";
import {
type OpenClawStateDatabase,
openOpenClawStateDatabase,
@@ -39,10 +40,7 @@ export function resolveSqliteSessionStoreOptionsForPath(
if (!isSqliteSessionStoreBackendEnabled(env)) {
return null;
}
const agentId = resolveAgentIdFromSessionStorePath(storePath) ?? DEFAULT_AGENT_ID;
return { agentId, env, sourcePath: storePath };
}


@@ -1,50 +1,15 @@
import fs from "node:fs";
import {
loadSqliteSessionStore,
resolveSqliteSessionStoreOptionsForPath,
} from "./store-backend.sqlite.js";
import { applySessionStoreMigrations } from "./store-migrations.js";
import { normalizeSessionStore } from "./store-normalize.js";
import type { SessionEntry } from "./types.js";
export { normalizeSessionStore } from "./store-normalize.js";
function isSessionStoreRecord(value: unknown): value is Record<string, SessionEntry> {
return !!value && typeof value === "object" && !Array.isArray(value);
}
export function loadSessionStore(storePath: string): Record<string, SessionEntry> {
const sqliteOptions = resolveSqliteSessionStoreOptionsForPath(storePath);
if (!sqliteOptions) {
throw new Error(`Session stores are SQLite-only; cannot resolve agent for ${storePath}`);
}
return loadSqliteSessionStore(sqliteOptions);
}


@@ -1,8 +1,6 @@
import fs from "node:fs";
import path from "node:path";
import type { MsgContext } from "../../auto-reply/templating.js";
import { writeTextAtomic } from "../../infra/json-files.js";
import { createSubsystemLogger } from "../../logging/subsystem.js";
import {
deliveryContextFromSession,
mergeDeliveryContext,
@@ -37,7 +35,6 @@ export { withSessionStoreWriterForTest } from "./store-writer.js";
export { loadSessionStore } from "./store-load.js";
export { normalizeStoreSessionKey, resolveSessionStoreEntry } from "./store-entry.js";
const log = createSubsystemLogger("sessions/store");
let sessionArchiveRuntimePromise: Promise<
typeof import("../../gateway/session-archive.runtime.js")
> | null = null;
@@ -148,57 +145,10 @@ async function saveSessionStoreUnlocked(
await fs.promises.mkdir(path.dirname(storePath), { recursive: true });
const sqliteOptions = resolveSqliteSessionStoreOptionsForPath(storePath);
if (!sqliteOptions) {
throw new Error(`Session stores are SQLite-only; cannot resolve agent for ${storePath}`);
}
saveSqliteSessionStore(sqliteOptions, store);
}
export async function saveSessionStore(
@@ -249,13 +199,6 @@ export async function runQuotaSuspensionMaintenance(params: {
);
}
function getErrorCode(error: unknown): string | null {
if (!error || typeof error !== "object" || !("code" in error)) {
return null;
}
return String((error as { code?: unknown }).code);
}
export async function archiveRemovedSessionTranscripts(params: {
removedSessionFiles: Iterable<[string, string | undefined]>;
referencedSessionIds: ReadonlySet<string>;
@@ -283,13 +226,6 @@ export async function archiveRemovedSessionTranscripts(params: {
return archivedDirs;
}
async function writeSessionStoreAtomic(params: {
storePath: string;
serialized: string;
}): Promise<void> {
await writeTextAtomic(params.storePath, params.serialized, { mode: 0o600 });
}
async function persistResolvedSessionEntry(params: {
storePath: string;
store: Record<string, SessionEntry>;


@@ -1,7 +1,7 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, beforeEach, vi } from "vitest";
export function useTempSessionsFixture(prefix: string) {
let tempDir = "";
@@ -12,7 +12,8 @@ export function useTempSessionsFixture(prefix: string) {
tempDir = fs.mkdtempSync(path.join(os.tmpdir(), prefix));
sessionsDir = path.join(tempDir, "agents", "main", "sessions");
fs.mkdirSync(sessionsDir, { recursive: true });
storePath = path.join(sessionsDir, "sessions.json");
vi.stubEnv("OPENCLAW_STATE_DIR", tempDir);
});
afterEach(() => {


@@ -1,21 +1,11 @@
import { randomUUID } from "node:crypto";
import fs from "node:fs/promises";
import path from "node:path";
import { StringDecoder } from "node:string_decoder";
import type { SessionWriteLockAcquireTimeoutConfig } from "../../agents/session-write-lock.js";
import {
appendSqliteSessionTranscriptEvent,
hasSqliteSessionTranscriptEvents,
importJsonlTranscriptToSqlite,
loadSqliteSessionTranscriptEvents,
} from "./transcript-store.sqlite.js";
const TRANSCRIPT_APPEND_SCAN_CHUNK_BYTES = 64 * 1024;
const SESSION_MANAGER_APPEND_MAX_BYTES = 8 * 1024 * 1024;
const transcriptAppendQueues = new Map<string, Promise<void>>();
async function loadCurrentSessionVersion(): Promise<number> {
@@ -23,229 +13,26 @@ async function loadCurrentSessionVersion(): Promise<number> {
.CURRENT_SESSION_VERSION;
}
type TranscriptLeafInfo = {
leafId?: string;
hasParentLinkedEntries: boolean;
nonSessionEntryCount: number;
};
async function yieldTranscriptAppendScan(): Promise<void> {
await new Promise<void>((resolve) => setImmediate(resolve));
}
function lineParentLinkedEntryId(line: string): string | undefined {
if (!line.trim()) {
return undefined;
}
try {
const parsed = JSON.parse(line) as { type?: unknown; id?: unknown; parentId?: unknown };
return parsed.type !== "session" && typeof parsed.id === "string" && "parentId" in parsed
? parsed.id
: undefined;
} catch {
return undefined;
}
}
function normalizeEntryId(value: unknown): string | undefined {
return typeof value === "string" && value.trim().length > 0 ? value : undefined;
}
function generateEntryId(existingIds: Set<string>): string {
for (let attempt = 0; attempt < 100; attempt += 1) {
const id = randomUUID().slice(0, 8);
if (!existingIds.has(id)) {
existingIds.add(id);
return id;
}
}
const id = randomUUID();
existingIds.add(id);
return id;
}
async function readTranscriptLeafInfo(transcriptPath: string): Promise<TranscriptLeafInfo> {
const handle = await fs.open(transcriptPath, "r");
try {
const decoder = new StringDecoder("utf8");
const buffer = Buffer.allocUnsafe(TRANSCRIPT_APPEND_SCAN_CHUNK_BYTES);
let carry = "";
let leafId: string | undefined;
let hasParentLinkedEntries = false;
let nonSessionEntryCount = 0;
while (true) {
const { bytesRead } = await handle.read(buffer, 0, buffer.length, null);
if (bytesRead <= 0) {
break;
}
const text = carry + decoder.write(buffer.subarray(0, bytesRead));
const lines = text.split(/\r?\n/);
carry = lines.pop() ?? "";
for (const line of lines) {
if (lineHasNonSessionEntry(line)) {
nonSessionEntryCount += 1;
}
const id = lineParentLinkedEntryId(line);
if (id) {
leafId = id;
hasParentLinkedEntries = true;
}
}
await yieldTranscriptAppendScan();
}
const tail = carry + decoder.end();
if (lineHasNonSessionEntry(tail)) {
nonSessionEntryCount += 1;
}
const id = lineParentLinkedEntryId(tail);
if (id) {
leafId = id;
hasParentLinkedEntries = true;
}
return {
...(leafId ? { leafId } : {}),
hasParentLinkedEntries,
nonSessionEntryCount,
};
} finally {
await handle.close();
}
}
function lineHasNonSessionEntry(line: string): boolean {
if (!line.trim()) {
return false;
}
try {
const parsed = JSON.parse(line) as { type?: unknown };
return parsed.type !== "session";
} catch {
return false;
}
}
function normalizeRequiredScope(params: {
  transcriptPath: string;
  agentId?: string;
  sessionId?: string;
}): { agentId: string; sessionId: string; queueKey: string } {
  const agentId = params.agentId?.trim();
  const sessionId = params.sessionId?.trim();
  if (!agentId || !sessionId) {
    throw new Error(
      `SQLite transcript appends require agentId and sessionId; path-only transcript writes are retired (${params.transcriptPath})`,
    );
  }
  return {
    agentId,
    sessionId,
    queueKey: `${agentId}\0${sessionId}`,
  };
}
async function migrateLinearTranscriptToParentLinked(transcriptPath: string): Promise<{
leafId?: string;
}> {
const raw = await fs.readFile(transcriptPath, "utf-8");
const currentSessionVersion = await loadCurrentSessionVersion();
const existingIds = new Set<string>();
const output: string[] = [];
let previousId: string | null = null;
let leafId: string | undefined;
for (const line of raw.split(/\r?\n/)) {
if (!line.trim()) {
continue;
}
let parsed: unknown;
try {
parsed = JSON.parse(line);
} catch {
output.push(line);
continue;
}
if (!parsed || typeof parsed !== "object" || Array.isArray(parsed)) {
output.push(line);
continue;
}
const record = parsed as Record<string, unknown>;
if (record.type === "session") {
output.push(JSON.stringify({ ...record, version: currentSessionVersion }));
continue;
}
const id = normalizeEntryId(record.id) ?? generateEntryId(existingIds);
existingIds.add(id);
record.id = id;
if (!Object.hasOwn(record, "parentId")) {
record.parentId = previousId;
}
previousId = id;
leafId = id;
output.push(JSON.stringify(record));
}
await fs.writeFile(transcriptPath, `${output.join("\n")}\n`, {
encoding: "utf-8",
mode: 0o600,
});
const result: { leafId?: string } = {};
if (leafId) {
result.leafId = leafId;
}
return result;
}
async function ensureTranscriptHeader(
transcriptPath: string,
params: { sessionId?: string; cwd?: string } = {},
): Promise<void> {
const stat = await fs.stat(transcriptPath).catch(() => null);
if (stat?.isFile() && stat.size > 0) {
return;
}
const currentSessionVersion = await loadCurrentSessionVersion();
await fs.mkdir(path.dirname(transcriptPath), { recursive: true });
const header = {
type: "session",
version: currentSessionVersion,
id: params.sessionId ?? randomUUID(),
timestamp: new Date().toISOString(),
cwd: params.cwd ?? process.cwd(),
  };
await fs.writeFile(transcriptPath, `${JSON.stringify(header)}\n`, {
encoding: "utf-8",
mode: 0o600,
flag: stat?.isFile() ? "w" : "wx",
});
}
async function resolveTranscriptAppendQueueKey(transcriptPath: string): Promise<string> {
const resolvedTranscriptPath = path.resolve(transcriptPath);
const transcriptDir = path.dirname(resolvedTranscriptPath);
await fs.mkdir(transcriptDir, { recursive: true });
try {
return path.join(await fs.realpath(transcriptDir), path.basename(resolvedTranscriptPath));
} catch {
return resolvedTranscriptPath;
}
}
async function withTranscriptAppendQueue<T>(queueKey: string, fn: () => Promise<T>): Promise<T> {
const previous = transcriptAppendQueues.get(queueKey) ?? Promise.resolve();
let releaseCurrent!: () => void;
const current = new Promise<void>((resolve) => {
@@ -264,6 +51,83 @@ async function withTranscriptAppendQueue<T>(
}
}
function latestParentLinkedEntryId(events: unknown[]): string | undefined {
for (const event of events.toReversed()) {
if (!event || typeof event !== "object" || Array.isArray(event)) {
continue;
}
const record = event as { type?: unknown; id?: unknown; parentId?: unknown };
if (
record.type !== "session" &&
typeof record.id === "string" &&
Object.hasOwn(record, "parentId")
) {
return record.id;
}
}
return undefined;
}
function readMessageIdempotencyKey(message: unknown): string | undefined {
if (!message || typeof message !== "object" || Array.isArray(message)) {
return undefined;
}
const key = (message as { idempotencyKey?: unknown }).idempotencyKey;
return typeof key === "string" && key.trim() ? key : undefined;
}
function findExistingMessageIdForIdempotencyKey(
events: unknown[],
idempotencyKey: string | undefined,
): string | undefined {
if (!idempotencyKey) {
return undefined;
}
for (const event of events) {
if (!event || typeof event !== "object" || Array.isArray(event)) {
continue;
}
const record = event as { id?: unknown; message?: { idempotencyKey?: unknown } };
if (record.message?.idempotencyKey === idempotencyKey && typeof record.id === "string") {
return record.id;
}
}
return undefined;
}
async function appendSessionHeaderIfEmpty(params: {
agentId: string;
sessionId: string;
transcriptPath: string;
cwd?: string;
now: number;
}): Promise<unknown[]> {
const existing = loadSqliteSessionTranscriptEvents({
agentId: params.agentId,
sessionId: params.sessionId,
}).map((entry) => entry.event);
if (existing.length > 0) {
return existing;
}
const currentSessionVersion = await loadCurrentSessionVersion();
const header = {
type: "session",
version: currentSessionVersion,
id: params.sessionId,
timestamp: new Date(params.now).toISOString(),
cwd: params.cwd ?? process.cwd(),
};
appendSqliteSessionTranscriptEvent({
agentId: params.agentId,
sessionId: params.sessionId,
transcriptPath: path.resolve(params.transcriptPath),
event: header,
now: () => params.now,
});
return [header];
}
export async function appendSessionTranscriptMessage(params: {
transcriptPath: string;
message: unknown;
@@ -274,81 +138,38 @@ export async function appendSessionTranscriptMessage(params: {
useRawWhenLinear?: boolean;
config?: SessionWriteLockAcquireTimeoutConfig;
}): Promise<{ messageId: string }> {
  const scope = normalizeRequiredScope(params);
  return await withTranscriptAppendQueue(scope.queueKey, async () => {
    const now = params.now ?? Date.now();
    const events = await appendSessionHeaderIfEmpty({
      agentId: scope.agentId,
      sessionId: scope.sessionId,
      transcriptPath: params.transcriptPath,
      cwd: params.cwd,
      now,
    });
    const existingMessageId = findExistingMessageIdForIdempotencyKey(
      events,
      readMessageIdempotencyKey(params.message),
    );
    if (existingMessageId) {
      return { messageId: existingMessageId };
    }
    const messageId = randomUUID();
    const entry = {
      type: "message",
      id: messageId,
      parentId: latestParentLinkedEntryId(events) ?? null,
      timestamp: new Date(now).toISOString(),
      message: params.message,
    };
    appendSqliteSessionTranscriptEvent({
      agentId: scope.agentId,
      sessionId: scope.sessionId,
      transcriptPath: path.resolve(params.transcriptPath),
      event: entry,
      now: () => now,
    });
    return { messageId };
  });
}
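For context on the serialization used above: `withTranscriptAppendQueue` chains per-key promises in a module-level Map so that appends sharing a queue key never interleave. A minimal standalone sketch of that pattern (hypothetical names; the real helper also resolves scope and write locks) could look like:

```typescript
// Per-key async serialization queue. Callers supply a stable key,
// e.g. `${agentId}\0${sessionId}`, and work for the same key runs in
// submission order while different keys proceed concurrently.
const queues = new Map<string, Promise<void>>();

async function withQueue<T>(queueKey: string, fn: () => Promise<T>): Promise<T> {
  const previous = queues.get(queueKey) ?? Promise.resolve();
  let release!: () => void;
  const current = new Promise<void>((resolve) => {
    release = resolve;
  });
  // Chain behind the previous task so later callers wait on us.
  const chained = previous.then(() => current);
  queues.set(queueKey, chained);
  await previous;
  try {
    return await fn();
  } finally {
    release();
    // Drop the map entry once no newer task has replaced our link.
    if (queues.get(queueKey) === chained) {
      queues.delete(queueKey);
    }
  }
}
```

Concurrent callers with the same key complete strictly in the order they were submitted, even when later tasks would otherwise finish first.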

View File

@@ -1,4 +1,5 @@
import fs from "node:fs";
import path from "node:path";
import { writeTextAtomic } from "../../infra/json-files.js";
import { normalizeAgentId } from "../../routing/session-key.js";
import {
@@ -37,6 +38,16 @@ export type ImportJsonlTranscriptToSqliteOptions = SqliteSessionTranscriptStoreO
export type ExportSqliteTranscriptJsonlOptions = SqliteSessionTranscriptStoreOptions;
export type SqliteSessionTranscriptScope = {
agentId: string;
sessionId: string;
};
export type SqliteSessionTranscriptFile = SqliteSessionTranscriptScope & {
path: string;
updatedAt: number;
};
function normalizeSessionId(value: string): string {
const sessionId = value.trim();
if (!sessionId) {
@@ -78,6 +89,7 @@ function rememberTranscriptFile(params: {
if (!transcriptPath) {
return;
}
const resolvedTranscriptPath = path.resolve(transcriptPath);
runOpenClawStateWriteTransaction((database) => {
database.db
.prepare(
@@ -97,13 +109,92 @@ function rememberTranscriptFile(params: {
.run(
params.agentId,
params.sessionId,
transcriptPath,
resolvedTranscriptPath,
params.importedAt ?? null,
params.exportedAt ?? null,
);
}, params.options);
}
export function resolveSqliteSessionTranscriptScopeForPath(
options: OpenClawStateDatabaseOptions & { transcriptPath: string },
): SqliteSessionTranscriptScope | undefined {
const transcriptPath = path.resolve(options.transcriptPath);
const database = openOpenClawStateDatabase(options);
const row = database.db
.prepare(
`
SELECT agent_id, session_id
FROM transcript_files
WHERE path = ?
ORDER BY COALESCE(imported_at, exported_at, 0) DESC
LIMIT 1
`,
)
.get(transcriptPath) as { agent_id?: unknown; session_id?: unknown } | undefined;
if (typeof row?.agent_id !== "string" || typeof row.session_id !== "string") {
return undefined;
}
return {
agentId: normalizeAgentId(row.agent_id),
sessionId: normalizeSessionId(row.session_id),
};
}
export function listSqliteSessionTranscriptFiles(
options: OpenClawStateDatabaseOptions = {},
): SqliteSessionTranscriptFile[] {
const database = openOpenClawStateDatabase(options);
return database.db
.prepare(
`
SELECT
files.agent_id,
files.session_id,
files.path,
MAX(
COALESCE(events.created_at, 0),
COALESCE(files.imported_at, 0),
COALESCE(files.exported_at, 0)
) AS updated_at
FROM transcript_files files
LEFT JOIN transcript_events events
ON events.agent_id = files.agent_id
AND events.session_id = files.session_id
GROUP BY files.agent_id, files.session_id, files.path
ORDER BY updated_at DESC, files.path ASC
`,
)
.all()
.flatMap((row) => {
const record = row as {
agent_id?: unknown;
session_id?: unknown;
path?: unknown;
updated_at?: unknown;
};
if (
typeof record.agent_id !== "string" ||
typeof record.session_id !== "string" ||
typeof record.path !== "string"
) {
return [];
}
const updatedAt =
typeof record.updated_at === "bigint"
? Number(record.updated_at)
: Number(record.updated_at ?? 0);
return [
{
agentId: normalizeAgentId(record.agent_id),
sessionId: normalizeSessionId(record.session_id),
path: record.path,
updatedAt: Number.isFinite(updatedAt) ? updatedAt : 0,
},
];
});
}
export function appendSqliteSessionTranscriptEvent(
options: AppendSqliteSessionTranscriptEventOptions,
): { seq: number } {

View File

@@ -5,6 +5,7 @@ import { afterEach, describe, expect, it, vi } from "vitest";
import * as transcriptEvents from "../../sessions/transcript-events.js";
import { closeOpenClawStateDatabaseForTest } from "../../state/openclaw-state-db.js";
import { resolveSessionTranscriptPathInDir } from "./paths.js";
import { saveSessionStore } from "./store.js";
import { useTempSessionsFixture } from "./test-helpers.js";
import { appendSessionTranscriptMessage } from "./transcript-append.js";
import {
@@ -17,6 +18,7 @@ import {
readLatestAssistantTextFromSessionTranscript,
readTailAssistantTextFromSessionTranscript,
} from "./transcript.js";
import type { SessionEntry } from "./types.js";
afterEach(() => {
closeOpenClawStateDatabaseForTest();
@@ -31,17 +33,37 @@ describe("appendAssistantMessageToSessionTranscript", () => {
typeof appendExactAssistantMessageToSessionTranscript
>[0]["message"];
async function writeTranscriptStore(
store: Record<string, SessionEntry> = {
[sessionKey]: {
sessionId,
chatType: "direct",
channel: "discord",
updatedAt: 1,
},
},
) {
await saveSessionStore(fixture.storePath(), store);
}
function readEvents(targetSessionId = sessionId) {
return loadSqliteSessionTranscriptEvents({
agentId: "main",
sessionId: targetSessionId,
}).map(
(entry) =>
entry.event as {
type?: string;
id?: string;
parentId?: string | null;
message?: {
role?: string;
provider?: string;
model?: string;
content?: Array<{ type?: string; text?: string }> | string;
idempotencyKey?: string;
};
},
    );
}
@@ -71,7 +93,7 @@ describe("appendAssistantMessageToSessionTranscript", () => {
}
it("creates transcript file and appends message for valid session", async () => {
await writeTranscriptStore();
const result = await appendAssistantMessageToSessionTranscript({
sessionKey,
@@ -81,36 +103,28 @@ describe("appendAssistantMessageToSessionTranscript", () => {
expect(result.ok).toBe(true);
if (result.ok) {
expect(fs.existsSync(result.sessionFile)).toBe(false);
const events = readEvents();
expect(events.length).toBe(2);
const header = events[0];
expect(header.type).toBe("session");
expect(header.id).toBe(sessionId);
const messageLine = events[1];
expect(messageLine.type).toBe("message");
expect(messageLine.message?.role).toBe("assistant");
const content = messageLine.message?.content;
expect(Array.isArray(content)).toBe(true);
expect(Array.isArray(content) ? content[0]?.type : undefined).toBe("text");
expect(Array.isArray(content) ? content[0]?.text : undefined).toBe(
"Hello from delivery mirror!",
);
}
});
it("emits transcript update events for delivery mirrors", async () => {
await writeTranscriptStore();
const emitSpy = vi.spyOn(transcriptEvents, "emitSessionTranscriptUpdate");
await appendAssistantMessageToSessionTranscript({
@@ -137,7 +151,7 @@ describe("appendAssistantMessageToSessionTranscript", () => {
});
it("does not append a duplicate delivery mirror for the same idempotency key", async () => {
await writeTranscriptStore();
await appendAssistantMessageToSessionTranscript({
sessionKey,
@@ -152,19 +166,20 @@ describe("appendAssistantMessageToSessionTranscript", () => {
storePath: fixture.storePath(),
});
const sessionFile = resolveSessionTranscriptPathInDir(sessionId, fixture.sessionsDir());
const events = readEvents();
expect(events.length).toBe(2);
const messageLine = events[1];
expect(messageLine?.message?.idempotencyKey).toBe("mirror:test-source-message");
const content = messageLine?.message?.content;
expect(Array.isArray(content) ? content[0]?.text : undefined).toBe(
"Hello from delivery mirror!",
);
});
it("uses scoped SQLite transcript events for delivery mirror idempotency", async () => {
const stateDir = fs.mkdtempSync(path.join(os.tmpdir(), "openclaw-transcript-state-"));
vi.stubEnv("OPENCLAW_STATE_DIR", stateDir);
await writeTranscriptStore();
appendSqliteSessionTranscriptEvent({
agentId: "main",
sessionId,
@@ -191,17 +206,13 @@ describe("appendAssistantMessageToSessionTranscript", () => {
ok: true,
messageId: "sqlite-mirror-message",
});
expect(readEvents()).toHaveLength(1);
fs.rmSync(stateDir, { recursive: true, force: true });
});
it("does not append a duplicate delivery mirror when the latest assistant message already matches", async () => {
await writeTranscriptStore();
const exactResult = await appendExactAssistantMessageToSessionTranscript({
sessionKey,
@@ -220,20 +231,21 @@ describe("appendAssistantMessageToSessionTranscript", () => {
expect(mirrorResult.ok).toBe(true);
if (exactResult.ok && mirrorResult.ok) {
expect(mirrorResult.messageId).toBe(exactResult.messageId);
const events = readEvents();
expect(events.length).toBe(2);
const messageLine = events[1];
expect(messageLine?.message?.provider).toBe("codex");
expect(messageLine?.message?.model).toBe("gpt-5.4");
const content = messageLine?.message?.content;
expect(Array.isArray(content) ? content[0]?.text : undefined).toBe("Hello from Codex!");
}
});
it("uses scoped SQLite transcript events for delivery mirror latest-match dedupe", async () => {
const stateDir = fs.mkdtempSync(path.join(os.tmpdir(), "openclaw-transcript-state-"));
vi.stubEnv("OPENCLAW_STATE_DIR", stateDir);
await writeTranscriptStore();
appendSqliteSessionTranscriptEvent({
agentId: "main",
sessionId,
@@ -255,17 +267,13 @@ describe("appendAssistantMessageToSessionTranscript", () => {
ok: true,
messageId: "sqlite-latest-assistant",
});
expect(readEvents()).toHaveLength(1);
fs.rmSync(stateDir, { recursive: true, force: true });
});
it("does not reuse an older matching assistant message across turns", async () => {
await writeTranscriptStore();
const olderResult = await appendExactAssistantMessageToSessionTranscript({
sessionKey,
@@ -292,26 +300,28 @@ describe("appendAssistantMessageToSessionTranscript", () => {
expect(mirrorResult.messageId).not.toBe(olderResult.messageId);
expect(mirrorResult.messageId).not.toBe(latestResult.messageId);
const events = readEvents();
expect(events.length).toBe(4);
const messageLine = events[3];
expect(messageLine?.message?.provider).toBe("openclaw");
expect(messageLine?.message?.model).toBe("delivery-mirror");
const content = messageLine?.message?.content;
expect(Array.isArray(content) ? content[0]?.text : undefined).toBe("Repeated answer");
}
});
it("finds session entry using normalized (lowercased) key", async () => {
const storeKey = "agent:main:imessage:direct:+15551234567";
const store: Record<string, SessionEntry> = {
[storeKey]: {
sessionId: "test-session-normalized",
chatType: "direct",
channel: "imessage",
updatedAt: 1,
},
};
await saveSessionStore(fixture.storePath(), store);
const result = await appendAssistantMessageToSessionTranscript({
sessionKey: "agent:main:iMessage:direct:+15551234567",
@@ -324,14 +334,15 @@ describe("appendAssistantMessageToSessionTranscript", () => {
it("finds Slack session entry using normalized (lowercased) key", async () => {
const storeKey = "agent:main:slack:direct:u12345abc";
const store: Record<string, SessionEntry> = {
[storeKey]: {
sessionId: "test-slack-session",
chatType: "direct",
channel: "slack",
updatedAt: 1,
},
};
await saveSessionStore(fixture.storePath(), store);
const result = await appendAssistantMessageToSessionTranscript({
sessionKey: "agent:main:slack:direct:U12345ABC",
@@ -343,31 +354,7 @@ describe("appendAssistantMessageToSessionTranscript", () => {
});
it("ignores malformed transcript lines when checking mirror idempotency", async () => {
await writeTranscriptStore();
const result = await appendAssistantMessageToSessionTranscript({
sessionKey,
@@ -377,12 +364,11 @@ describe("appendAssistantMessageToSessionTranscript", () => {
});
expect(result.ok).toBe(true);
expect(readEvents()).toHaveLength(2);
});
it("appends exact assistant transcript messages without rewriting phased content", async () => {
await writeTranscriptStore();
const result = await appendExactAssistantMessageToSessionTranscript({
sessionKey,
@@ -407,9 +393,8 @@ describe("appendAssistantMessageToSessionTranscript", () => {
expect(result.ok).toBe(true);
if (result.ok) {
const messageLine = readEvents()[1];
expect(messageLine?.message?.content).toEqual([
{
type: "text",
text: "internal reasoning",
@@ -425,7 +410,7 @@ describe("appendAssistantMessageToSessionTranscript", () => {
});
it("can emit file-only transcript refresh events for exact assistant appends", async () => {
await writeTranscriptStore();
const emitSpy = vi.spyOn(transcriptEvents, "emitSessionTranscriptUpdate");
const result = await appendExactAssistantMessageToSessionTranscript({
@@ -450,54 +435,37 @@ describe("appendAssistantMessageToSessionTranscript", () => {
});
it("serializes concurrent parent-linked transcript appends", async () => {
const targetSessionId = "concurrent-tree-session";
const sessionFile = resolveSessionTranscriptPathInDir(targetSessionId, fixture.sessionsDir());
appendSqliteSessionTranscriptEvent({
agentId: "main",
sessionId: targetSessionId,
event: { type: "session", id: targetSessionId },
});
appendSqliteSessionTranscriptEvent({
agentId: "main",
sessionId: targetSessionId,
event: {
type: "message",
id: "root-message",
parentId: null,
timestamp: new Date().toISOString(),
message: { role: "user", content: "root" },
},
});
await Promise.all(
Array.from({ length: 8 }, (_, index) =>
appendSessionTranscriptMessage({
transcriptPath: sessionFile,
agentId: "main",
sessionId: targetSessionId,
message: { role: "assistant", content: `reply ${index}` },
}),
),
);
const records = readEvents(targetSessionId).filter((record) => record.type === "message");
expect(records).toHaveLength(9);
for (let index = 1; index < records.length; index += 1) {
@@ -505,55 +473,45 @@ describe("appendAssistantMessageToSessionTranscript", () => {
}
});
it("appends to existing SQLite transcript chains", async () => {
const targetSessionId = "small-linear-session";
const sessionFile = resolveSessionTranscriptPathInDir(targetSessionId, fixture.sessionsDir());
appendSqliteSessionTranscriptEvent({
agentId: "main",
sessionId: targetSessionId,
event: { type: "session", version: 3, id: targetSessionId },
});
appendSqliteSessionTranscriptEvent({
agentId: "main",
sessionId: targetSessionId,
event: {
type: "message",
id: "legacy-first",
parentId: null,
timestamp: new Date().toISOString(),
message: { role: "user", content: "legacy first" },
},
});
appendSqliteSessionTranscriptEvent({
agentId: "main",
sessionId: targetSessionId,
event: {
type: "message",
id: "legacy-second",
parentId: "legacy-first",
timestamp: new Date().toISOString(),
message: { role: "assistant", content: "legacy second" },
},
});
const appended = await appendSessionTranscriptMessage({
transcriptPath: sessionFile,
agentId: "main",
sessionId: targetSessionId,
message: { role: "assistant", content: "new reply" },
});
const records = readEvents(targetSessionId);
const messages = records.filter((record) => record.type === "message");
expect(messages.map((record) => record.message?.content)).toEqual([
@@ -569,33 +527,31 @@ describe("appendAssistantMessageToSessionTranscript", () => {
});
});
it("appends scoped SQLite transcript entries without importing JSONL at runtime", async () => {
const stateDir = fs.mkdtempSync(path.join(os.tmpdir(), "openclaw-transcript-state-"));
vi.stubEnv("OPENCLAW_STATE_DIR", stateDir);
const sessionFile = resolveSessionTranscriptPathInDir(
"sqlite-import-session",
fixture.sessionsDir(),
);
appendSqliteSessionTranscriptEvent({
agentId: "main",
sessionId: "sqlite-import-session",
event: { type: "session", version: 3, id: "sqlite-import-session" },
now: () => 100,
});
appendSqliteSessionTranscriptEvent({
agentId: "main",
sessionId: "sqlite-import-session",
event: {
type: "message",
id: "legacy-first",
parentId: null,
timestamp: new Date().toISOString(),
message: { role: "user", content: "legacy first" },
},
now: () => 101,
});
const appended = await appendSessionTranscriptMessage({
transcriptPath: sessionFile,

View File

@@ -1,12 +1,13 @@
import fs from "node:fs";
import path from "node:path";
import type { SessionWriteLockAcquireTimeoutConfig } from "../../agents/session-write-lock.js";
import type { SessionManager } from "../../agents/transcript/session-transcript-contract.js";
import { formatErrorMessage } from "../../infra/errors.js";
import { DEFAULT_AGENT_ID, normalizeAgentId } from "../../routing/session-key.js";
import { emitSessionTranscriptUpdate } from "../../sessions/transcript-events.js";
import { extractAssistantVisibleText } from "../../shared/chat-message-content.js";
import {
resolveDefaultSessionStorePath,
resolveAgentIdFromSessionStorePath,
resolveSessionFilePath,
resolveSessionFilePathOptions,
resolveSessionTranscriptPath,
@@ -19,36 +20,10 @@ import { resolveMirroredTranscriptText } from "./transcript-mirror.js";
import {
hasSqliteSessionTranscriptEvents,
loadSqliteSessionTranscriptEvents,
resolveSqliteSessionTranscriptScopeForPath,
} from "./transcript-store.sqlite.js";
import type { SessionEntry } from "./types.js";
async function loadCurrentSessionVersion(): Promise<number> {
return (await import("../../agents/transcript/session-transcript-contract.js"))
.CURRENT_SESSION_VERSION;
}
async function ensureSessionHeader(params: {
sessionFile: string;
sessionId: string;
}): Promise<void> {
if (fs.existsSync(params.sessionFile)) {
return;
}
const CURRENT_SESSION_VERSION = await loadCurrentSessionVersion();
await fs.promises.mkdir(path.dirname(params.sessionFile), { recursive: true });
const header = {
type: "session",
version: CURRENT_SESSION_VERSION,
id: params.sessionId,
timestamp: new Date().toISOString(),
cwd: process.cwd(),
};
await fs.promises.writeFile(params.sessionFile, `${JSON.stringify(header)}\n`, {
encoding: "utf-8",
mode: 0o600,
});
}
export type SessionTranscriptAppendResult =
| { ok: true; sessionFile: string; messageId: string }
| { ok: false; reason: string };
@@ -80,15 +55,23 @@ function hasTranscriptQueryScope(scope?: TranscriptQueryScope): scope is {
return Boolean(scope?.agentId?.trim() && scope.sessionId?.trim());
}
function loadScopedSqliteTranscriptEvents(
scope?: TranscriptQueryScope,
transcriptPath?: string,
): unknown[] | undefined {
const resolvedScope = hasTranscriptQueryScope(scope)
? scope
: transcriptPath?.trim()
? resolveSqliteSessionTranscriptScopeForPath({ transcriptPath })
: undefined;
if (!resolvedScope) {
return undefined;
}
try {
if (!hasSqliteSessionTranscriptEvents(resolvedScope)) {
return undefined;
}
return loadSqliteSessionTranscriptEvents(resolvedScope).map((entry) => entry.event);
} catch {
return undefined;
}
@@ -172,7 +155,7 @@ export async function readLatestAssistantTextFromSessionTranscript(
sessionFile: string | undefined,
scope?: TranscriptQueryScope,
): Promise<LatestAssistantTranscriptText | undefined> {
const scopedEvents = loadScopedSqliteTranscriptEvents(scope);
const scopedEvents = loadScopedSqliteTranscriptEvents(scope, sessionFile);
if (scopedEvents) {
for (const event of scopedEvents.toReversed()) {
const assistantText = parseAssistantTranscriptEventText(event);
@@ -183,30 +166,6 @@ export async function readLatestAssistantTextFromSessionTranscript(
return undefined;
}
if (!sessionFile?.trim()) {
return undefined;
}
let raw: string;
try {
raw = await fs.promises.readFile(sessionFile, "utf-8");
} catch {
return undefined;
}
for (const line of raw.split(/\r?\n/).toReversed()) {
if (!line.trim()) {
continue;
}
try {
const assistantText = parseAssistantTranscriptText(line);
if (assistantText) {
return assistantText;
}
} catch {
continue;
}
}
return undefined;
}
@@ -214,33 +173,12 @@ export async function readTailAssistantTextFromSessionTranscript(
sessionFile: string | undefined,
scope?: TranscriptQueryScope,
): Promise<TailAssistantTranscriptText | undefined> {
const scopedEvents = loadScopedSqliteTranscriptEvents(scope);
const scopedEvents = loadScopedSqliteTranscriptEvents(scope, sessionFile);
if (scopedEvents) {
const tail = scopedEvents.at(-1);
return tail === undefined ? undefined : parseAssistantTranscriptEventText(tail);
}
if (!sessionFile?.trim()) {
return undefined;
}
let raw: string;
try {
raw = await fs.promises.readFile(sessionFile, "utf-8");
} catch {
return undefined;
}
for (const line of raw.split(/\r?\n/).toReversed()) {
if (!line.trim()) {
continue;
}
try {
return parseAssistantTranscriptText(line);
} catch {
return undefined;
}
}
return undefined;
}
@@ -319,6 +257,9 @@ export async function appendExactAssistantMessageToSessionTranscript(params: {
}
const storePath = params.storePath ?? resolveDefaultSessionStorePath(params.agentId);
const agentId = normalizeAgentId(
params.agentId ?? resolveAgentIdFromSessionStorePath(storePath) ?? DEFAULT_AGENT_ID,
);
const store = loadSessionStore(storePath);
const normalizedKey = normalizeStoreSessionKey(sessionKey);
const entry = (store[normalizedKey] ?? store[sessionKey]) as SessionEntry | undefined;
@@ -334,7 +275,7 @@ export async function appendExactAssistantMessageToSessionTranscript(params: {
sessionStore: store,
storePath,
sessionEntry: entry,
agentId: params.agentId,
agentId,
sessionsDir: path.dirname(storePath),
});
sessionFile = resolvedSessionFile.sessionFile;
@@ -345,13 +286,11 @@ export async function appendExactAssistantMessageToSessionTranscript(params: {
};
}
await ensureSessionHeader({ sessionFile, sessionId: entry.sessionId });
const explicitIdempotencyKey =
params.idempotencyKey ??
((params.message as { idempotencyKey?: unknown }).idempotencyKey as string | undefined);
const transcriptScope = {
agentId: params.agentId,
agentId,
sessionId: entry.sessionId,
};
const existingMessageId = explicitIdempotencyKey
@@ -375,10 +314,10 @@ export async function appendExactAssistantMessageToSessionTranscript(params: {
const message = {
...params.message,
...(explicitIdempotencyKey ? { idempotencyKey: explicitIdempotencyKey } : {}),
} as Parameters<SessionManager["appendMessage"]>[0];
};
const { messageId } = await appendSessionTranscriptMessage({
transcriptPath: sessionFile,
agentId: params.agentId,
agentId,
message,
sessionId: entry.sessionId,
config: params.config,
@@ -402,39 +341,10 @@ async function transcriptHasIdempotencyKey(
idempotencyKey: string,
scope?: TranscriptQueryScope,
): Promise<string | true | undefined> {
const scopedEvents = loadScopedSqliteTranscriptEvents(scope);
const scopedEvents = loadScopedSqliteTranscriptEvents(scope, transcriptPath);
if (scopedEvents) {
return findIdempotencyKeyInTranscriptEvents(scopedEvents, idempotencyKey);
}
try {
const raw = await fs.promises.readFile(transcriptPath, "utf-8");
for (const line of raw.split(/\r?\n/)) {
if (!line.trim()) {
continue;
}
try {
const parsed = JSON.parse(line) as {
id?: unknown;
message?: { idempotencyKey?: unknown };
};
if (
parsed.message?.idempotencyKey === idempotencyKey &&
typeof parsed.id === "string" &&
parsed.id
) {
return parsed.id;
}
if (parsed.message?.idempotencyKey === idempotencyKey) {
return true;
}
} catch {
continue;
}
}
} catch {
return undefined;
}
return undefined;
}
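The JSONL line scan that `transcriptHasIdempotencyKey` used to perform is replaced by a lookup over the scoped SQLite events. A minimal TypeScript sketch of that lookup; the function name and event type here are illustrative stand-ins for the real `findIdempotencyKeyInTranscriptEvents`:

```typescript
// Illustrative sketch of the scoped-events idempotency lookup; the real
// helper is findIdempotencyKeyInTranscriptEvents, this is a stand-in.
type TranscriptEvent = { id?: unknown; message?: { idempotencyKey?: unknown } };

// Returns the stored message id when the key matches an identified event,
// `true` when the key matches an event without an id, and undefined otherwise.
function findIdempotencyKey(
  events: unknown[],
  idempotencyKey: string,
): string | true | undefined {
  for (const raw of events) {
    const event = raw as TranscriptEvent;
    if (event?.message?.idempotencyKey !== idempotencyKey) {
      continue;
    }
    if (typeof event.id === "string" && event.id) {
      return event.id;
    }
    return true;
  }
  return undefined;
}
```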
@@ -497,44 +407,11 @@ async function findLatestEquivalentAssistantMessageId(
return undefined;
}
const scopedEvents = loadScopedSqliteTranscriptEvents(scope);
const scopedEvents = loadScopedSqliteTranscriptEvents(scope, transcriptPath);
if (scopedEvents) {
return findLatestEquivalentAssistantMessageIdInEvents(scopedEvents, expectedText);
}
try {
const raw = await fs.promises.readFile(transcriptPath, "utf-8");
const lines = raw.split(/\r?\n/);
for (let index = lines.length - 1; index >= 0; index -= 1) {
const line = lines[index];
if (!line.trim()) {
continue;
}
try {
const parsed = JSON.parse(line) as {
id?: unknown;
message?: SessionTranscriptAssistantMessage;
};
const candidate = parsed.message;
if (!candidate || candidate.role !== "assistant") {
continue;
}
const candidateText = extractAssistantMessageText(candidate);
if (candidateText !== expectedText) {
return undefined;
}
if (typeof parsed.id === "string" && parsed.id) {
return parsed.id;
}
return undefined;
} catch {
continue;
}
}
} catch {
return undefined;
}
return undefined;
}
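The readers above now resolve an effective scope before touching SQLite: an explicit agent/session scope wins, and only when it is absent is a scope derived from the transcript path. A hedged sketch of that precedence, where `resolveScopeForPath` stands in for the real `resolveSqliteSessionTranscriptScopeForPath` and the `agents/<id>/sessions/<id>.jsonl` layout is an assumption for illustration:

```typescript
// Hedged sketch of the scope-first, path-derived fallback; TranscriptScope and
// resolveScopeForPath are stand-ins, and the agents/<id>/sessions/<id>.jsonl
// layout is an assumption for this example.
type TranscriptScope = { agentId: string; sessionId: string };

function hasScope(scope?: Partial<TranscriptScope>): scope is TranscriptScope {
  return Boolean(scope?.agentId?.trim() && scope?.sessionId?.trim());
}

// Derive a scope from a path like ".../agents/main/sessions/sess-2.jsonl".
function resolveScopeForPath(transcriptPath: string): TranscriptScope | undefined {
  const match = /agents\/([^/]+)\/sessions\/([^/]+)\.jsonl$/.exec(transcriptPath);
  return match ? { agentId: match[1], sessionId: match[2] } : undefined;
}

// Explicit scope wins; otherwise fall back to the path-derived scope.
function resolveEffectiveScope(
  scope?: Partial<TranscriptScope>,
  transcriptPath?: string,
): TranscriptScope | undefined {
  if (hasScope(scope)) {
    return scope;
  }
  return transcriptPath?.trim() ? resolveScopeForPath(transcriptPath) : undefined;
}
```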


@@ -687,6 +687,11 @@ export function resolveDoctorHealthContributions(): DoctorHealthContribution[] {
label: "Plugin registry",
run: runPluginRegistryHealth,
}),
createDoctorHealthContribution({
id: "doctor:session-transcripts",
label: "Session transcripts",
run: runSessionTranscriptsHealth,
}),
createDoctorHealthContribution({
id: "doctor:state-integrity",
label: "State integrity",
@@ -702,11 +707,6 @@ export function resolveDoctorHealthContributions(): DoctorHealthContribution[] {
label: "Session locks",
run: runSessionLocksHealth,
}),
createDoctorHealthContribution({
id: "doctor:session-transcripts",
label: "Session transcripts",
run: runSessionTranscriptsHealth,
}),
createDoctorHealthContribution({
id: "doctor:legacy-cron",
label: "Legacy cron",


@@ -1,7 +1,9 @@
import type { SessionWriteLockAcquireTimeoutConfig } from "../../agents/session-write-lock.js";
import type { SessionManager } from "../../agents/transcript/session-transcript-contract.js";
import { appendSessionTranscriptMessage } from "../../config/sessions/transcript-append.js";
import { resolveSqliteSessionTranscriptScopeForPath } from "../../config/sessions/transcript-store.sqlite.js";
import { formatErrorMessage } from "../../infra/errors.js";
import { DEFAULT_AGENT_ID } from "../../routing/session-key.js";
import { emitSessionTranscriptUpdate } from "../../sessions/transcript-events.js";
type AppendMessageArg = Parameters<SessionManager["appendMessage"]>[0];
@@ -105,10 +107,13 @@ export async function appendInjectedAssistantMessageToTranscript(params: {
};
try {
const existingScope = resolveSqliteSessionTranscriptScopeForPath({
transcriptPath: params.transcriptPath,
});
const { messageId } = await appendSessionTranscriptMessage({
transcriptPath: params.transcriptPath,
...(params.agentId ? { agentId: params.agentId } : {}),
...(params.sessionId ? { sessionId: params.sessionId } : {}),
agentId: params.agentId ?? existingScope?.agentId ?? DEFAULT_AGENT_ID,
sessionId: params.sessionId ?? existingScope?.sessionId,
message: messageBody,
now,
useRawWhenLinear: true,
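The defaulting chain above (explicit params, then the scope already recorded for the transcript path, then `DEFAULT_AGENT_ID`) can be sketched in isolation; the constant's value here is an assumption for the example:

```typescript
// Sketch of the injected-message scope defaulting: explicit params win, then
// any scope already recorded for the transcript path, then the default agent.
// DEFAULT_AGENT_ID's value is assumed for illustration.
const DEFAULT_AGENT_ID = "main";

type PartialScope = { agentId?: string; sessionId?: string };

function resolveInjectScope(
  params: PartialScope,
  existingScope: PartialScope | undefined,
): { agentId: string; sessionId?: string } {
  return {
    agentId: params.agentId ?? existingScope?.agentId ?? DEFAULT_AGENT_ID,
    sessionId: params.sessionId ?? existingScope?.sessionId,
  };
}
```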


@@ -3,6 +3,11 @@ import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it, vi } from "vitest";
import { CURRENT_SESSION_VERSION } from "../../agents/transcript/session-transcript-contract.js";
import {
loadSqliteSessionTranscriptEvents,
replaceSqliteSessionTranscriptEvents,
} from "../../config/sessions/transcript-store.sqlite.js";
import { closeOpenClawStateDatabaseForTest } from "../../state/openclaw-state-db.js";
import {
createActiveRun,
createChatAbortContext,
@@ -45,23 +50,23 @@ async function writeTranscriptHeader(transcriptPath: string, sessionId: string)
timestamp: new Date(0).toISOString(),
cwd: "/tmp",
};
await fs.writeFile(transcriptPath, `${JSON.stringify(header)}\n`, "utf-8");
replaceSqliteSessionTranscriptEvents({
agentId: "main",
sessionId,
transcriptPath,
events: [header],
});
}
async function readTranscriptLines(transcriptPath: string): Promise<TranscriptLine[]> {
const raw = await fs.readFile(transcriptPath, "utf-8");
const lines: TranscriptLine[] = [];
for (const line of raw.split(/\r?\n/)) {
if (line.trim().length === 0) {
continue;
}
try {
lines.push(JSON.parse(line) as TranscriptLine);
} catch {
lines.push({});
}
async function readTranscriptLines(_transcriptPath: string): Promise<TranscriptLine[]> {
const sessionId = sessionEntryState.sessionId;
if (!sessionId) {
return [];
}
return lines;
return loadSqliteSessionTranscriptEvents({
agentId: "main",
sessionId,
}).map((entry) => entry.event as TranscriptLine);
}
function collectMessagesWithIdempotencyKey(
@@ -96,6 +101,7 @@ function setMockSessionEntry(transcriptPath: string, sessionId: string) {
async function createTranscriptFixture(prefix: string) {
const dir = await fs.mkdtemp(path.join(os.tmpdir(), prefix));
vi.stubEnv("OPENCLAW_STATE_DIR", dir);
const sessionId = "sess-main";
const transcriptPath = path.join(dir, `${sessionId}.jsonl`);
await writeTranscriptHeader(transcriptPath, sessionId);
@@ -104,7 +110,9 @@ async function createTranscriptFixture(prefix: string) {
}
afterEach(() => {
closeOpenClawStateDatabaseForTest();
vi.restoreAllMocks();
vi.unstubAllEnvs();
});
describe("chat abort transcript persistence", () => {


@@ -2,21 +2,14 @@ import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it, vi } from "vitest";
import { loadSqliteSessionTranscriptEvents } from "../../config/sessions/transcript-store.sqlite.js";
import {
appendSqliteSessionTranscriptEvent,
loadSqliteSessionTranscriptEvents,
} from "../../config/sessions/transcript-store.sqlite.js";
import { closeOpenClawStateDatabaseForTest } from "../../state/openclaw-state-db.js";
import { appendInjectedAssistantMessageToTranscript } from "./chat-transcript-inject.js";
import { createTranscriptFixtureSync } from "./chat.test-helpers.js";
function readTranscriptLines(transcriptPath: string): string[] {
const lines: string[] = [];
for (const line of fs.readFileSync(transcriptPath, "utf-8").split(/\r?\n/)) {
if (line.length > 0) {
lines.push(line);
}
}
return lines;
}
afterEach(() => {
closeOpenClawStateDatabaseForTest();
vi.unstubAllEnvs();
@@ -26,14 +19,16 @@ afterEach(() => {
// current leaf with a `parentId` and must not sever compaction history.
describe("gateway chat.inject transcript writes", () => {
it("appends a Pi session entry that includes parentId", async () => {
const { dir, transcriptPath } = createTranscriptFixtureSync({
const { dir, transcriptPath, sessionId } = createTranscriptFixtureSync({
prefix: "openclaw-chat-inject-",
sessionId: "sess-1",
});
vi.stubEnv("OPENCLAW_STATE_DIR", dir);
try {
const appended = await appendInjectedAssistantMessageToTranscript({
transcriptPath,
sessionId,
message: "hello",
});
expect(appended.ok).toBe(true);
@@ -44,13 +39,13 @@ describe("gateway chat.inject transcript writes", () => {
}
expect(messageId.length).toBeGreaterThan(0);
const lines = readTranscriptLines(transcriptPath);
expect(lines.length).toBeGreaterThanOrEqual(2);
const last = JSON.parse(lines.at(-1) as string) as Record<string, unknown>;
const events = loadSqliteSessionTranscriptEvents({
agentId: "main",
sessionId,
}).map((entry) => entry.event as Record<string, unknown>);
const last = events.at(-1) as Record<string, unknown>;
expect(last.type).toBe("message");
// The regression we saw: raw jsonl appends omitted this field entirely.
expect(Object.prototype.hasOwnProperty.call(last, "parentId")).toBe(true);
expect(last).toHaveProperty("id");
expect(last).toHaveProperty("message");
@@ -59,28 +54,32 @@ describe("gateway chat.inject transcript writes", () => {
}
});
it("uses raw append for oversized append-only transcripts", async () => {
const { dir, transcriptPath } = createTranscriptFixtureSync({
it("links injected messages after oversized SQLite transcript entries", async () => {
const { dir, transcriptPath, sessionId } = createTranscriptFixtureSync({
prefix: "openclaw-chat-inject-large-",
sessionId: "sess-1",
});
vi.stubEnv("OPENCLAW_STATE_DIR", dir);
try {
fs.appendFileSync(
appendSqliteSessionTranscriptEvent({
agentId: "main",
sessionId,
transcriptPath,
`${JSON.stringify({
event: {
type: "message",
id: "legacy-large-message",
parentId: null,
message: {
role: "assistant",
content: [{ type: "text", text: "x".repeat(9 * 1024 * 1024) }],
},
})}\n`,
"utf-8",
);
},
});
const appended = await appendInjectedAssistantMessageToTranscript({
transcriptPath,
sessionId,
message: "hello",
});
expect(appended.ok).toBe(true);
@@ -91,13 +90,16 @@ describe("gateway chat.inject transcript writes", () => {
}
expect(messageId.length).toBeGreaterThan(0);
const lines = readTranscriptLines(transcriptPath);
const last = JSON.parse(lines.at(-1) as string) as Record<string, unknown>;
const events = loadSqliteSessionTranscriptEvents({
agentId: "main",
sessionId,
}).map((entry) => entry.event as Record<string, unknown>);
const last = events.at(-1) as Record<string, unknown>;
expect(last.type).toBe("message");
expect(last).toHaveProperty("id", messageId);
expect(last).toHaveProperty("message");
expect(Object.prototype.hasOwnProperty.call(last, "parentId")).toBe(false);
expect(last).toHaveProperty("parentId", "legacy-large-message");
} finally {
fs.rmSync(dir, { recursive: true, force: true });
}


@@ -1,7 +1,6 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { CURRENT_SESSION_VERSION } from "../../agents/transcript/session-transcript-contract.js";
export function createTranscriptFixtureSync(params: {
prefix: string;
@@ -9,17 +8,6 @@ export function createTranscriptFixtureSync(params: {
fileName?: string;
}) {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), params.prefix));
const transcriptPath = path.join(dir, params.fileName ?? "sess.jsonl");
fs.writeFileSync(
transcriptPath,
`${JSON.stringify({
type: "session",
version: CURRENT_SESSION_VERSION,
id: params.sessionId,
timestamp: new Date(0).toISOString(),
cwd: "/tmp",
})}\n`,
"utf-8",
);
return { dir, transcriptPath };
const transcriptPath = path.join(dir, params.fileName ?? `${params.sessionId}.jsonl`);
return { dir, transcriptPath, sessionId: params.sessionId };
}


@@ -7,7 +7,6 @@ import { resolveAgentWorkspaceDir, resolveSessionAgentId } from "../../agents/ag
import { rewriteTranscriptEntriesInSessionFile } from "../../agents/pi-embedded-runner/transcript-rewrite.js";
import { ensureSandboxWorkspaceForSession } from "../../agents/sandbox/context.js";
import { resolveAgentTimeoutMs } from "../../agents/timeout.js";
import { CURRENT_SESSION_VERSION } from "../../agents/transcript/session-transcript-contract.js";
import { dispatchInboundMessage } from "../../auto-reply/dispatch.js";
import type { ReplyPayload } from "../../auto-reply/reply-payload.js";
import { createReplyDispatcher } from "../../auto-reply/reply/reply-dispatcher.js";
@@ -1325,53 +1324,6 @@ function resolveTranscriptPath(params: {
}
}
function ensureTranscriptFile(params: { transcriptPath: string; sessionId: string }): {
ok: boolean;
error?: string;
} {
if (fs.existsSync(params.transcriptPath)) {
return { ok: true };
}
try {
fs.mkdirSync(path.dirname(params.transcriptPath), { recursive: true });
const header = {
type: "session",
version: CURRENT_SESSION_VERSION,
id: params.sessionId,
timestamp: new Date().toISOString(),
cwd: process.cwd(),
};
fs.writeFileSync(params.transcriptPath, `${JSON.stringify(header)}\n`, {
encoding: "utf-8",
mode: 0o600,
});
return { ok: true };
} catch (err) {
return { ok: false, error: err instanceof Error ? err.message : String(err) };
}
}
async function transcriptHasIdempotencyKey(
transcriptPath: string,
idempotencyKey: string,
): Promise<boolean> {
try {
const lines = (await fs.promises.readFile(transcriptPath, "utf-8")).split(/\r?\n/);
for (const line of lines) {
if (!line.trim()) {
continue;
}
const parsed = JSON.parse(line) as { message?: { idempotencyKey?: unknown } };
if (parsed?.message?.idempotencyKey === idempotencyKey) {
return true;
}
}
return false;
} catch {
return false;
}
}
async function appendAssistantTranscriptMessage(params: {
message: string;
label?: string;
@@ -1399,26 +1351,6 @@ async function appendAssistantTranscriptMessage(params: {
return { ok: false, error: "transcript path not resolved" };
}
if (!fs.existsSync(transcriptPath)) {
if (!params.createIfMissing) {
return { ok: false, error: "transcript file not found" };
}
const ensured = ensureTranscriptFile({
transcriptPath,
sessionId: params.sessionId,
});
if (!ensured.ok) {
return { ok: false, error: ensured.error ?? "failed to create transcript file" };
}
}
if (
params.idempotencyKey &&
(await transcriptHasIdempotencyKey(transcriptPath, params.idempotencyKey))
) {
return { ok: true };
}
return await appendInjectedAssistantMessageToTranscript({
transcriptPath,
message: params.message,


@@ -27,6 +27,11 @@ import {
updateSessionStore,
} from "../../config/sessions.js";
import { resolveAgentMainSessionKey } from "../../config/sessions/main-session.js";
import {
appendSqliteSessionTranscriptEvent,
hasSqliteSessionTranscriptEvents,
replaceSqliteSessionTranscriptEvents,
} from "../../config/sessions/transcript-store.sqlite.js";
import type { OpenClawConfig } from "../../config/types.openclaw.js";
import {
createInternalHookEvent,
@@ -409,8 +414,9 @@ function ensureSessionTranscriptFile(params: {
agentId: params.agentId,
}),
);
if (!fs.existsSync(transcriptPath)) {
fs.mkdirSync(path.dirname(transcriptPath), { recursive: true });
if (
!hasSqliteSessionTranscriptEvents({ agentId: params.agentId, sessionId: params.sessionId })
) {
const header = {
type: "session",
version: CURRENT_SESSION_VERSION,
@@ -418,9 +424,11 @@ function ensureSessionTranscriptFile(params: {
timestamp: new Date().toISOString(),
cwd: process.cwd(),
};
fs.writeFileSync(transcriptPath, `${JSON.stringify(header)}\n`, {
encoding: "utf-8",
mode: 0o600,
appendSqliteSessionTranscriptEvent({
agentId: params.agentId,
sessionId: params.sessionId,
transcriptPath,
event: header,
});
}
return { ok: true, transcriptPath };
@@ -2077,8 +2085,13 @@ export const sessionsHandlers: GatewayRequestHandlers = {
return;
}
const archived = archiveFileOnDisk(filePath, "bak");
fs.writeFileSync(filePath, `${lines.join("\n")}\n`, "utf-8");
const archived = fs.existsSync(filePath) ? archiveFileOnDisk(filePath, "bak") : undefined;
replaceSqliteSessionTranscriptEvents({
agentId: target.agentId,
sessionId,
transcriptPath: filePath,
events: lines.map((line) => JSON.parse(line) as unknown),
});
await updateSessionStore(storePath, (store) => {
const entryKey = compactTarget.primaryKey;


@@ -14,8 +14,13 @@ import type {
SessionEntry,
} from "../config/sessions.js";
import { isCompactionCheckpointTranscriptFileName } from "../config/sessions/artifacts.js";
import {
replaceSqliteSessionTranscriptEvents,
resolveSqliteSessionTranscriptScopeForPath,
} from "../config/sessions/transcript-store.sqlite.js";
import type { OpenClawConfig } from "../config/types.openclaw.js";
import { createSubsystemLogger } from "../logging/subsystem.js";
import { DEFAULT_AGENT_ID } from "../routing/session-key.js";
import { resolveGatewaySessionStoreTarget } from "./session-utils.js";
const log = createSubsystemLogger("gateway/session-compaction-checkpoints");
@@ -274,21 +279,18 @@ export async function forkCompactionCheckpointTranscriptAsync(params: {
};
try {
await fs.mkdir(sessionDir, { recursive: true });
const lines = [JSON.stringify(header)];
for (const entry of entries) {
if ((entry as { type?: unknown }).type !== "session") {
lines.push(JSON.stringify(entry));
}
}
await fs.writeFile(sessionFile, `${lines.join("\n")}\n`, { encoding: "utf-8", flag: "wx" });
const sourceScope = resolveSqliteSessionTranscriptScopeForPath({ transcriptPath: sourceFile });
replaceSqliteSessionTranscriptEvents({
agentId: sourceScope?.agentId ?? DEFAULT_AGENT_ID,
sessionId,
transcriptPath: sessionFile,
events: [
header,
...entries.filter((entry) => (entry as { type?: unknown }).type !== "session"),
],
});
return { sessionId, sessionFile };
} catch {
try {
await fs.unlink(sessionFile);
} catch {
// Best-effort cleanup for partial fork files.
}
return null;
}
}
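The fork payload above, a fresh session header followed by every source event that is not itself a session header, can be sketched as a pure function (types here are illustrative, not the store's real shapes):

```typescript
// Minimal sketch of the checkpoint fork payload: a fresh session header
// followed by every source event that is not itself a session header.
// SessionHeader is an illustrative type, not the store's real shape.
type SessionHeader = { type: "session"; id: string; timestamp: string };

function forkTranscriptEvents(header: SessionHeader, entries: unknown[]): unknown[] {
  return [
    header,
    ...entries.filter((entry) => (entry as { type?: unknown }).type !== "session"),
  ];
}
```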


@@ -24,6 +24,7 @@ import {
import { resolveSessionFilePath, resolveSessionFilePathOptions } from "../config/sessions/paths.js";
import { resolveResetPreservedSelection } from "../config/sessions/reset-preserved-selection.js";
import {
appendSqliteSessionTranscriptEvent,
hasSqliteSessionTranscriptEvents,
loadSqliteSessionTranscriptEvents,
} from "../config/sessions/transcript-store.sqlite.js";
@@ -691,8 +692,7 @@ export async function performGatewaySessionReset(params: {
agentId: target.agentId,
reason: "reset",
});
fs.mkdirSync(path.dirname(next.sessionFile as string), { recursive: true });
if (!fs.existsSync(next.sessionFile as string)) {
if (!hasSqliteSessionTranscriptEvents({ agentId: target.agentId, sessionId: next.sessionId })) {
const header = {
type: "session",
version: CURRENT_SESSION_VERSION,
@@ -700,9 +700,11 @@ export async function performGatewaySessionReset(params: {
timestamp: new Date().toISOString(),
cwd: process.cwd(),
};
fs.writeFileSync(next.sessionFile as string, `${JSON.stringify(header)}\n`, {
encoding: "utf-8",
mode: 0o600,
appendSqliteSessionTranscriptEvent({
agentId: target.agentId,
sessionId: next.sessionId,
transcriptPath: next.sessionFile as string,
event: header,
});
}
emitGatewaySessionEndPluginHook({


@@ -16,6 +16,7 @@ import {
import type { SessionEntry } from "../config/sessions.js";
import { canonicalizeMainSessionAlias } from "../config/sessions/main-session.js";
import { mergeSqliteSessionStore } from "../config/sessions/store-backend.sqlite.js";
import { replaceSqliteSessionTranscriptEvents } from "../config/sessions/transcript-store.sqlite.js";
import type { SessionScope } from "../config/sessions/types.js";
import type { OpenClawConfig } from "../config/types.openclaw.js";
import { createSubsystemLogger } from "../logging/subsystem.js";
@@ -85,6 +86,59 @@ type LegacySessionSurface = {
}) => string | null | undefined;
};
function parseJsonlEvents(filePath: string): unknown[] {
const raw = fs.readFileSync(filePath, "utf-8");
const events: unknown[] = [];
for (const [index, line] of raw.split(/\r?\n/).entries()) {
if (!line.trim()) {
continue;
}
try {
events.push(JSON.parse(line));
} catch (err) {
throw new Error(`Invalid JSONL at ${filePath}:${index + 1}`, { cause: err });
}
}
return events;
}
function resolveSessionIdFromTranscriptEvents(events: unknown[]): string | null {
for (const event of events) {
if (
event &&
typeof event === "object" &&
!Array.isArray(event) &&
(event as { type?: unknown }).type === "session" &&
typeof (event as { id?: unknown }).id === "string" &&
(event as { id: string }).id.trim()
) {
return (event as { id: string }).id;
}
}
return null;
}
function importLegacyTranscriptFileToSqlite(params: {
sourcePath: string;
transcriptPath: string;
agentId: string;
env?: NodeJS.ProcessEnv;
}): { imported: number; sessionId: string } {
const events = parseJsonlEvents(params.sourcePath);
const sessionId = resolveSessionIdFromTranscriptEvents(events);
if (!sessionId) {
throw new Error(`Transcript missing session header: ${params.sourcePath}`);
}
replaceSqliteSessionTranscriptEvents({
agentId: params.agentId,
sessionId,
transcriptPath: params.transcriptPath,
events,
env: params.env,
});
return { imported: events.length, sessionId };
}
function getLegacySessionSurfaces(): LegacySessionSurface[] {
// Legacy migrations run on cold doctor/startup paths. Prefer the narrower
// setup plugin surface here so session-key cleanup does not materialize full
@@ -894,16 +948,46 @@ async function migrateLegacySessions(
if (entry.name === "sessions.json") {
continue;
}
const from = path.join(detected.sessions.legacyDir, entry.name);
const to = path.join(detected.sessions.targetDir, entry.name);
if (fileExists(to)) {
if (!entry.name.endsWith(".jsonl")) {
continue;
}
const from = path.join(detected.sessions.legacyDir, entry.name);
const to = path.join(detected.sessions.targetDir, entry.name);
try {
fs.renameSync(from, to);
changes.push(`Moved ${entry.name} → agents/${detected.targetAgentId}/sessions`);
const imported = importLegacyTranscriptFileToSqlite({
sourcePath: from,
transcriptPath: to,
agentId: detected.targetAgentId,
env: detected.env,
});
fs.rmSync(from, { force: true });
changes.push(
`Imported ${entry.name} transcript (${imported.imported} event(s)) into SQLite for agent ${detected.targetAgentId}`,
);
} catch (err) {
warnings.push(`Failed moving ${from}: ${String(err)}`);
warnings.push(`Failed importing transcript ${from}: ${String(err)}`);
}
}
const targetEntries = safeReadDir(detected.sessions.targetDir);
for (const entry of targetEntries) {
if (!entry.isFile() || !entry.name.endsWith(".jsonl")) {
continue;
}
const transcriptPath = path.join(detected.sessions.targetDir, entry.name);
try {
const imported = importLegacyTranscriptFileToSqlite({
sourcePath: transcriptPath,
transcriptPath,
agentId: detected.targetAgentId,
env: detected.env,
});
fs.rmSync(transcriptPath, { force: true });
changes.push(
`Imported canonical ${entry.name} transcript (${imported.imported} event(s)) into SQLite for agent ${detected.targetAgentId}`,
);
} catch (err) {
warnings.push(`Failed importing transcript ${transcriptPath}: ${String(err)}`);
}
}
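The migration's strict JSONL parse (skip blank lines, fail fast with a 1-based position on any malformed line) can be sketched standalone; the real `parseJsonlEvents` additionally attaches the parse error as the thrown error's `cause`:

```typescript
// Standalone sketch of the strict JSONL parse the migration relies on: blank
// lines are skipped and a malformed line fails with its 1-based position.
// (The real parseJsonlEvents also attaches the parse error as `cause`.)
function parseJsonl(raw: string, label = "transcript"): unknown[] {
  const events: unknown[] = [];
  for (const [index, line] of raw.split(/\r?\n/).entries()) {
    if (!line.trim()) {
      continue;
    }
    try {
      events.push(JSON.parse(line));
    } catch {
      throw new Error(`Invalid JSONL at ${label}:${index + 1}`);
    }
  }
  return events;
}
```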