feat: add local backup CLI (#40163)

Merged via squash.

Prepared head SHA: ed46625ae2
Co-authored-by: shichangs <46870204+shichangs@users.noreply.github.com>
Co-authored-by: gumadeiras <5599352+gumadeiras@users.noreply.github.com>
Reviewed-by: @gumadeiras
Authored by shichangs on 2026-03-09 04:21:20 +08:00, committed by GitHub.
parent a075baba84
commit 0ecfd37b44
22 changed files with 2256 additions and 12 deletions


@@ -12,6 +12,8 @@ Docs: https://docs.openclaw.ai
- CLI/install: include the short git commit hash in `openclaw --version` output when metadata is available, and keep installer version checks compatible with the decorated format. (#39712) Thanks @sourman.
- Docs/Web search: restore $5/month free-credit details, replace defunct "Data for Search"/"Data for AI" plan names with current "Search" plan, and note legacy subscription validity in Brave setup docs. Follows up on #26860. (#40111) Thanks @remusao.
- macOS/onboarding: add a remote gateway token field for remote mode, preserve existing non-plaintext `gateway.remote.token` config values until explicitly replaced, and warn when the loaded token shape cannot be used directly from the macOS app. (#40187, supersedes #34614) Thanks @cgdusek.
- CLI/backup: add `openclaw backup create` and `openclaw backup verify` for local state archives, including `--only-config`, `--no-include-workspace`, manifest/payload validation, and backup guidance in destructive flows. (#40163) Thanks @shichangs.
- CLI/backup: improve archive naming for date sorting, add config-only backup mode, and harden backup planning, publication, and verification edge cases. (#40163) Thanks @gumadeiras.
### Fixes

docs/cli/backup.md (new file)

@@ -0,0 +1,76 @@
---
summary: "CLI reference for `openclaw backup` (create local backup archives)"
read_when:
- You want a first-class backup archive for local OpenClaw state
- You want to preview which paths would be included before reset or uninstall
title: "backup"
---
# `openclaw backup`
Create a local backup archive for OpenClaw state, config, credentials, sessions, and optionally workspaces.
```bash
openclaw backup create
openclaw backup create --output ~/Backups
openclaw backup create --dry-run --json
openclaw backup create --verify
openclaw backup create --no-include-workspace
openclaw backup create --only-config
openclaw backup verify ./2026-03-09T00-00-00.000Z-openclaw-backup.tar.gz
```
## Notes
- The archive includes a `manifest.json` file with the resolved source paths and archive layout.
- Default output is a timestamped `.tar.gz` archive in the current working directory.
- If the current working directory is inside a backed-up source tree, OpenClaw falls back to your home directory for the default archive location.
- Existing archive files are never overwritten.
- Output paths inside the source state/workspace trees are rejected to avoid self-inclusion.
- `openclaw backup verify <archive>` validates that the archive contains exactly one root manifest, rejects traversal-style archive paths, and checks that every manifest-declared payload exists in the tarball.
- `openclaw backup create --verify` runs that validation immediately after writing the archive.
- `openclaw backup create --only-config` backs up just the active JSON config file.
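For orientation, a manifest for a config-only backup might look like the sketch below. The field names follow the manifest schema (`schemaVersion`, `createdAt`, `archiveRoot`, `runtimeVersion`, `platform`, `nodeVersion`, `assets` with `kind`/`sourcePath`/`archivePath`); the concrete values are illustrative, not taken from a real backup:

```json
{
  "schemaVersion": 1,
  "createdAt": "2026-03-09T00:00:00.000Z",
  "archiveRoot": "2026-03-09T00-00-00.000Z-openclaw-backup",
  "runtimeVersion": "1.2.3",
  "platform": "darwin",
  "nodeVersion": "v22.0.0",
  "assets": [
    {
      "kind": "config",
      "sourcePath": "/home/user/.openclaw/openclaw.json",
      "archivePath": "2026-03-09T00-00-00.000Z-openclaw-backup/payload/posix/home/user/.openclaw/openclaw.json"
    }
  ]
}
```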
## What gets backed up
`openclaw backup create` plans backup sources from your local OpenClaw install:
- The state directory returned by OpenClaw's local state resolver, usually `~/.openclaw`
- The active config file path
- The OAuth / credentials directory
- Workspace directories discovered from the current config, unless you pass `--no-include-workspace`
If you use `--only-config`, OpenClaw skips state, credentials, and workspace discovery and archives only the active config file path.
OpenClaw canonicalizes paths before building the archive. If config, credentials, or a workspace already live inside the state directory, they are not duplicated as separate top-level backup sources. Missing paths are skipped.
The archive payload stores file contents from those source trees, and the embedded `manifest.json` records the resolved absolute source paths plus the archive layout used for each asset.
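The de-duplication described above can be sketched roughly as follows. This is a simplified illustration, not OpenClaw's actual planner: `isWithin` and `dedupeSources` are hypothetical names, and the real implementation also tracks asset kinds and skip reasons.

```typescript
import path from "node:path";

// True when `child` equals `parent` or lives inside `parent`'s tree.
function isWithin(child: string, parent: string): boolean {
  const rel = path.relative(parent, child);
  return rel === "" || (!rel.startsWith("..") && !path.isAbsolute(rel));
}

// Keep only sources not already covered by a previously included tree.
function dedupeSources(canonicalPaths: string[]): string[] {
  // Shorter (parent) paths first, mirroring the planner's ordering.
  const sorted = [...canonicalPaths].sort((a, b) => a.length - b.length);
  const included: string[] = [];
  for (const candidate of sorted) {
    if (!included.some((parent) => isWithin(candidate, parent))) {
      included.push(candidate);
    }
  }
  return included;
}

// The state dir covers the config file inside it, so only two sources remain.
console.log(
  dedupeSources([
    "/home/u/.openclaw/openclaw.json",
    "/home/u/.openclaw",
    "/home/u/work",
  ]),
);
```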
## Invalid config behavior
`openclaw backup` intentionally bypasses the normal config preflight so it can still help during recovery. Because workspace discovery depends on a valid config, `openclaw backup create` fails fast when the config file exists but is invalid and workspace backup is still enabled.
If you still want a partial backup in that situation, rerun:
```bash
openclaw backup create --no-include-workspace
```
That keeps state, config, and credentials in scope while skipping workspace discovery entirely.
If you only need a copy of the config file itself, `--only-config` also works when the config is malformed because it does not rely on parsing the config for workspace discovery.
## Size and performance
OpenClaw does not enforce a built-in maximum backup size or per-file size limit.
Practical limits come from the local machine and destination filesystem:
- Available space for the temporary archive write plus the final archive
- Time to walk large workspace trees and compress them into a `.tar.gz`
- Time to rescan the archive if you use `openclaw backup create --verify` or run `openclaw backup verify`
- Filesystem behavior at the destination path: OpenClaw prefers a no-overwrite hard-link publish step and falls back to an exclusive copy when hard links are unsupported.
Large workspaces are usually the main driver of archive size. If you want a smaller or faster backup, use `--no-include-workspace`.
For the smallest archive, use `--only-config`.
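The no-overwrite publish step mentioned above can be sketched with standard Node.js filesystem calls. This is a hedged illustration of the general technique, not OpenClaw's actual code; `publishArchive`, `tmpPath`, and `finalPath` are hypothetical names:

```typescript
import fs from "node:fs/promises";
import { constants } from "node:fs";

// Publish a finished temp archive to its final path without ever
// overwriting an existing file: try a hard link first, then fall back
// to an exclusive copy when the filesystem rejects hard links
// (e.g. cross-device moves or filesystems without link support).
async function publishArchive(tmpPath: string, finalPath: string): Promise<void> {
  try {
    await fs.link(tmpPath, finalPath); // fails with EEXIST if finalPath exists
  } catch (err) {
    const code = (err as NodeJS.ErrnoException).code;
    if (code === "EEXIST") {
      throw err; // never overwrite an existing archive
    }
    // Hard link unsupported (EXDEV, EPERM, ...): copy, still refusing
    // to overwrite via COPYFILE_EXCL.
    await fs.copyFile(tmpPath, finalPath, constants.COPYFILE_EXCL);
  }
}
```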


@@ -19,6 +19,7 @@ This page describes the current CLI behavior. If commands change, update this do
- [`completion`](/cli/completion)
- [`doctor`](/cli/doctor)
- [`dashboard`](/cli/dashboard)
- [`backup`](/cli/backup)
- [`reset`](/cli/reset)
- [`uninstall`](/cli/uninstall)
- [`update`](/cli/update)
@@ -103,6 +104,9 @@ openclaw [--dev] [--profile <name>] <command>
completion
doctor
dashboard
backup
create
verify
security
audit
secrets


@@ -11,7 +11,10 @@ title: "reset"
Reset local config/state (keeps the CLI installed).
```bash
openclaw backup create
openclaw reset
openclaw reset --dry-run
openclaw reset --scope config+creds+sessions --yes --non-interactive
```
Run `openclaw backup create` first if you want a restorable snapshot before removing local state.


@@ -11,7 +11,10 @@ title: "uninstall"
Uninstall the gateway service + local data (CLI remains).
```bash
openclaw backup create
openclaw uninstall
openclaw uninstall --all --yes
openclaw uninstall --dry-run
```
Run `openclaw backup create` first if you want a restorable snapshot before removing state or workspaces.


@@ -273,22 +273,17 @@ describe("resolveCommandSecretRefsViaGateway", () => {
});
it("fails when configured refs remain unresolved after gateway assignments are applied", async () => {
const envKey = "TALK_API_KEY_STRICT_UNRESOLVED";
callGateway.mockResolvedValueOnce({
assignments: [],
diagnostics: [],
});
await expect(
resolveCommandSecretRefsViaGateway({
config: {
talk: {
apiKey: { source: "env", provider: "default", id: "TALK_API_KEY" },
},
} as OpenClawConfig,
commandName: "memory status",
targetIds: new Set(["talk.apiKey"]),
}),
).rejects.toThrow(/talk\.apiKey is unresolved in the active runtime snapshot/i);
await withEnvValue(envKey, undefined, async () => {
await expect(resolveTalkApiKey({ envKey })).rejects.toThrow(
/talk\.apiKey is unresolved in the active runtime snapshot/i,
);
});
});
it("allows unresolved refs when gateway diagnostics mark the target as inactive", async () => {


@@ -11,6 +11,13 @@ vi.mock("./register.agent.js", () => ({
},
}));
vi.mock("./register.backup.js", () => ({
registerBackupCommand: (program: Command) => {
const backup = program.command("backup");
backup.command("create");
},
}));
vi.mock("./register.maintenance.js", () => ({
registerMaintenanceCommands: (program: Command) => {
program.command("doctor");
@@ -67,6 +74,7 @@ describe("command-registry", () => {
expect(names).toContain("config");
expect(names).toContain("memory");
expect(names).toContain("agents");
expect(names).toContain("backup");
expect(names).toContain("browser");
expect(names).toContain("sessions");
expect(names).not.toContain("agent");


@@ -92,6 +92,19 @@ const coreEntries: CoreCliEntry[] = [
mod.registerConfigCli(program);
},
},
{
commands: [
{
name: "backup",
description: "Create and verify local backup archives for OpenClaw state",
hasSubcommands: true,
},
],
register: async ({ program }) => {
const mod = await import("./register.backup.js");
mod.registerBackupCommand(program);
},
},
{
commands: [
{


@@ -80,6 +80,11 @@ describe("registerPreActionHooks", () => {
function buildProgram() {
const program = new Command().name("openclaw");
program.command("status").action(() => {});
program
.command("backup")
.command("create")
.option("--json")
.action(() => {});
program.command("doctor").action(() => {});
program.command("completion").action(() => {});
program.command("secrets").action(() => {});
@@ -226,6 +231,15 @@ expect(ensureConfigReadyMock).not.toHaveBeenCalled();
expect(ensureConfigReadyMock).not.toHaveBeenCalled();
});
it("bypasses config guard for backup create", async () => {
await runPreAction({
parseArgv: ["backup", "create"],
processArgv: ["node", "openclaw", "backup", "create", "--json"],
});
expect(ensureConfigReadyMock).not.toHaveBeenCalled();
});
beforeAll(() => {
program = buildProgram();
const hooks = (


@@ -36,7 +36,7 @@ const PLUGIN_REQUIRED_COMMANDS = new Set([
"status",
"health",
]);
const CONFIG_GUARD_BYPASS_COMMANDS = new Set(["doctor", "completion", "secrets"]);
const CONFIG_GUARD_BYPASS_COMMANDS = new Set(["backup", "doctor", "completion", "secrets"]);
const JSON_PARSE_ONLY_COMMANDS = new Set(["config set"]);
let configGuardModulePromise: Promise<typeof import("./config-guard.js")> | undefined;
let pluginRegistryModulePromise: Promise<typeof import("../plugin-registry.js")> | undefined;


@@ -0,0 +1,104 @@
import { Command } from "commander";
import { beforeAll, beforeEach, describe, expect, it, vi } from "vitest";
const backupCreateCommand = vi.fn();
const backupVerifyCommand = vi.fn();
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
vi.mock("../../commands/backup.js", () => ({
backupCreateCommand,
}));
vi.mock("../../commands/backup-verify.js", () => ({
backupVerifyCommand,
}));
vi.mock("../../runtime.js", () => ({
defaultRuntime: runtime,
}));
let registerBackupCommand: typeof import("./register.backup.js").registerBackupCommand;
beforeAll(async () => {
({ registerBackupCommand } = await import("./register.backup.js"));
});
describe("registerBackupCommand", () => {
async function runCli(args: string[]) {
const program = new Command();
registerBackupCommand(program);
await program.parseAsync(args, { from: "user" });
}
beforeEach(() => {
vi.clearAllMocks();
backupCreateCommand.mockResolvedValue(undefined);
backupVerifyCommand.mockResolvedValue(undefined);
});
it("runs backup create with forwarded options", async () => {
await runCli(["backup", "create", "--output", "/tmp/backups", "--json", "--dry-run"]);
expect(backupCreateCommand).toHaveBeenCalledWith(
runtime,
expect.objectContaining({
output: "/tmp/backups",
json: true,
dryRun: true,
verify: false,
onlyConfig: false,
includeWorkspace: true,
}),
);
});
it("honors --no-include-workspace", async () => {
await runCli(["backup", "create", "--no-include-workspace"]);
expect(backupCreateCommand).toHaveBeenCalledWith(
runtime,
expect.objectContaining({
includeWorkspace: false,
}),
);
});
it("forwards --verify to backup create", async () => {
await runCli(["backup", "create", "--verify"]);
expect(backupCreateCommand).toHaveBeenCalledWith(
runtime,
expect.objectContaining({
verify: true,
}),
);
});
it("forwards --only-config to backup create", async () => {
await runCli(["backup", "create", "--only-config"]);
expect(backupCreateCommand).toHaveBeenCalledWith(
runtime,
expect.objectContaining({
onlyConfig: true,
}),
);
});
it("runs backup verify with forwarded options", async () => {
await runCli(["backup", "verify", "/tmp/openclaw-backup.tar.gz", "--json"]);
expect(backupVerifyCommand).toHaveBeenCalledWith(
runtime,
expect.objectContaining({
archive: "/tmp/openclaw-backup.tar.gz",
json: true,
}),
);
});
});


@@ -0,0 +1,92 @@
import type { Command } from "commander";
import { backupVerifyCommand } from "../../commands/backup-verify.js";
import { backupCreateCommand } from "../../commands/backup.js";
import { defaultRuntime } from "../../runtime.js";
import { formatDocsLink } from "../../terminal/links.js";
import { theme } from "../../terminal/theme.js";
import { runCommandWithRuntime } from "../cli-utils.js";
import { formatHelpExamples } from "../help-format.js";
export function registerBackupCommand(program: Command) {
const backup = program
.command("backup")
.description("Create and verify local backup archives for OpenClaw state")
.addHelpText(
"after",
() =>
`\n${theme.muted("Docs:")} ${formatDocsLink("/cli/backup", "docs.openclaw.ai/cli/backup")}\n`,
);
backup
.command("create")
.description("Write a backup archive for config, credentials, sessions, and workspaces")
.option("--output <path>", "Archive path or destination directory")
.option("--json", "Output JSON", false)
.option("--dry-run", "Print the backup plan without writing the archive", false)
.option("--verify", "Verify the archive after writing it", false)
.option("--only-config", "Back up only the active JSON config file", false)
.option("--no-include-workspace", "Exclude workspace directories from the backup")
.addHelpText(
"after",
() =>
`\n${theme.heading("Examples:")}\n${formatHelpExamples([
["openclaw backup create", "Create a timestamped backup in the current directory."],
[
"openclaw backup create --output ~/Backups",
"Write the archive into an existing backup directory.",
],
[
"openclaw backup create --dry-run --json",
"Preview the archive plan without writing any files.",
],
[
"openclaw backup create --verify",
"Create the archive and immediately validate its manifest and payload layout.",
],
[
"openclaw backup create --no-include-workspace",
"Back up state/config without agent workspace files.",
],
["openclaw backup create --only-config", "Back up only the active JSON config file."],
])}`,
)
.action(async (opts) => {
await runCommandWithRuntime(defaultRuntime, async () => {
await backupCreateCommand(defaultRuntime, {
output: opts.output as string | undefined,
json: Boolean(opts.json),
dryRun: Boolean(opts.dryRun),
verify: Boolean(opts.verify),
onlyConfig: Boolean(opts.onlyConfig),
includeWorkspace: opts.includeWorkspace as boolean,
});
});
});
backup
.command("verify <archive>")
.description("Validate a backup archive and its embedded manifest")
.option("--json", "Output JSON", false)
.addHelpText(
"after",
() =>
`\n${theme.heading("Examples:")}\n${formatHelpExamples([
[
"openclaw backup verify ./2026-03-09T00-00-00.000Z-openclaw-backup.tar.gz",
"Check that the archive structure and manifest are intact.",
],
[
"openclaw backup verify ~/Backups/latest.tar.gz --json",
"Emit machine-readable verification output.",
],
])}`,
)
.action(async (archive, opts) => {
await runCommandWithRuntime(defaultRuntime, async () => {
await backupVerifyCommand(defaultRuntime, {
archive: archive as string,
json: Boolean(opts.json),
});
});
});
}


@@ -0,0 +1,254 @@
import fs from "node:fs/promises";
import path from "node:path";
import {
readConfigFileSnapshot,
resolveConfigPath,
resolveOAuthDir,
resolveStateDir,
} from "../config/config.js";
import { formatSessionArchiveTimestamp } from "../config/sessions/artifacts.js";
import { pathExists, shortenHomePath } from "../utils.js";
import { buildCleanupPlan, isPathWithin } from "./cleanup-utils.js";
export type BackupAssetKind = "state" | "config" | "credentials" | "workspace";
export type BackupSkipReason = "covered" | "missing";
export type BackupAsset = {
kind: BackupAssetKind;
sourcePath: string;
displayPath: string;
archivePath: string;
};
export type SkippedBackupAsset = {
kind: BackupAssetKind;
sourcePath: string;
displayPath: string;
reason: BackupSkipReason;
coveredBy?: string;
};
export type BackupPlan = {
stateDir: string;
configPath: string;
oauthDir: string;
workspaceDirs: string[];
included: BackupAsset[];
skipped: SkippedBackupAsset[];
};
type BackupAssetCandidate = {
kind: BackupAssetKind;
sourcePath: string;
canonicalPath: string;
exists: boolean;
};
function backupAssetPriority(kind: BackupAssetKind): number {
switch (kind) {
case "state":
return 0;
case "config":
return 1;
case "credentials":
return 2;
case "workspace":
return 3;
}
}
export function buildBackupArchiveRoot(nowMs = Date.now()): string {
return `${formatSessionArchiveTimestamp(nowMs)}-openclaw-backup`;
}
export function buildBackupArchiveBasename(nowMs = Date.now()): string {
return `${buildBackupArchiveRoot(nowMs)}.tar.gz`;
}
export function encodeAbsolutePathForBackupArchive(sourcePath: string): string {
const normalized = sourcePath.replaceAll("\\", "/");
const windowsMatch = normalized.match(/^([A-Za-z]):\/(.*)$/);
if (windowsMatch) {
const drive = windowsMatch[1]?.toUpperCase() ?? "UNKNOWN";
const rest = windowsMatch[2] ?? "";
return path.posix.join("windows", drive, rest);
}
if (normalized.startsWith("/")) {
return path.posix.join("posix", normalized.slice(1));
}
return path.posix.join("relative", normalized);
}
export function buildBackupArchivePath(archiveRoot: string, sourcePath: string): string {
return path.posix.join(archiveRoot, "payload", encodeAbsolutePathForBackupArchive(sourcePath));
}
function compareCandidates(left: BackupAssetCandidate, right: BackupAssetCandidate): number {
const depthDelta = left.canonicalPath.length - right.canonicalPath.length;
if (depthDelta !== 0) {
return depthDelta;
}
const priorityDelta = backupAssetPriority(left.kind) - backupAssetPriority(right.kind);
if (priorityDelta !== 0) {
return priorityDelta;
}
return left.canonicalPath.localeCompare(right.canonicalPath);
}
async function canonicalizeExistingPath(targetPath: string): Promise<string> {
try {
return await fs.realpath(targetPath);
} catch {
return path.resolve(targetPath);
}
}
export async function resolveBackupPlanFromDisk(
params: {
includeWorkspace?: boolean;
onlyConfig?: boolean;
nowMs?: number;
} = {},
): Promise<BackupPlan> {
const includeWorkspace = params.includeWorkspace ?? true;
const onlyConfig = params.onlyConfig ?? false;
const stateDir = resolveStateDir();
const configPath = resolveConfigPath();
const oauthDir = resolveOAuthDir();
const archiveRoot = buildBackupArchiveRoot(params.nowMs);
if (onlyConfig) {
const resolvedConfigPath = path.resolve(configPath);
if (!(await pathExists(resolvedConfigPath))) {
return {
stateDir,
configPath,
oauthDir,
workspaceDirs: [],
included: [],
skipped: [
{
kind: "config",
sourcePath: resolvedConfigPath,
displayPath: shortenHomePath(resolvedConfigPath),
reason: "missing",
},
],
};
}
const canonicalConfigPath = await canonicalizeExistingPath(resolvedConfigPath);
return {
stateDir,
configPath,
oauthDir,
workspaceDirs: [],
included: [
{
kind: "config",
sourcePath: canonicalConfigPath,
displayPath: shortenHomePath(canonicalConfigPath),
archivePath: buildBackupArchivePath(archiveRoot, canonicalConfigPath),
},
],
skipped: [],
};
}
const configSnapshot = await readConfigFileSnapshot();
if (includeWorkspace && configSnapshot.exists && !configSnapshot.valid) {
throw new Error(
`Config invalid at ${shortenHomePath(configSnapshot.path)}. OpenClaw cannot reliably discover custom workspaces for backup. Fix the config or rerun with --no-include-workspace for a partial backup.`,
);
}
const cleanupPlan = buildCleanupPlan({
cfg: configSnapshot.config,
stateDir,
configPath,
oauthDir,
});
const workspaceDirs = includeWorkspace ? cleanupPlan.workspaceDirs : [];
const rawCandidates: Array<Pick<BackupAssetCandidate, "kind" | "sourcePath">> = [
{ kind: "state", sourcePath: path.resolve(stateDir) },
...(cleanupPlan.configInsideState
? []
: [{ kind: "config" as const, sourcePath: path.resolve(configPath) }]),
...(cleanupPlan.oauthInsideState
? []
: [{ kind: "credentials" as const, sourcePath: path.resolve(oauthDir) }]),
...(includeWorkspace
? workspaceDirs.map((workspaceDir) => ({
kind: "workspace" as const,
sourcePath: path.resolve(workspaceDir),
}))
: []),
];
const candidates: BackupAssetCandidate[] = await Promise.all(
rawCandidates.map(async (candidate) => {
const exists = await pathExists(candidate.sourcePath);
return {
...candidate,
exists,
canonicalPath: exists
? await canonicalizeExistingPath(candidate.sourcePath)
: path.resolve(candidate.sourcePath),
};
}),
);
const uniqueCandidates: BackupAssetCandidate[] = [];
const seenCanonicalPaths = new Set<string>();
for (const candidate of candidates.toSorted(compareCandidates)) {
if (seenCanonicalPaths.has(candidate.canonicalPath)) {
continue;
}
seenCanonicalPaths.add(candidate.canonicalPath);
uniqueCandidates.push(candidate);
}
const included: BackupAsset[] = [];
const skipped: SkippedBackupAsset[] = [];
for (const candidate of uniqueCandidates) {
if (!candidate.exists) {
skipped.push({
kind: candidate.kind,
sourcePath: candidate.sourcePath,
displayPath: shortenHomePath(candidate.sourcePath),
reason: "missing",
});
continue;
}
const coveredBy = included.find((asset) =>
isPathWithin(candidate.canonicalPath, asset.sourcePath),
);
if (coveredBy) {
skipped.push({
kind: candidate.kind,
sourcePath: candidate.canonicalPath,
displayPath: shortenHomePath(candidate.canonicalPath),
reason: "covered",
coveredBy: coveredBy.displayPath,
});
continue;
}
included.push({
kind: candidate.kind,
sourcePath: candidate.canonicalPath,
displayPath: shortenHomePath(candidate.canonicalPath),
archivePath: buildBackupArchivePath(archiveRoot, candidate.canonicalPath),
});
}
return {
stateDir,
configPath,
oauthDir,
workspaceDirs: workspaceDirs.map((entry) => path.resolve(entry)),
included,
skipped,
};
}


@@ -0,0 +1,274 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import * as tar from "tar";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import { createTempHomeEnv, type TempHomeEnv } from "../test-utils/temp-home.js";
import { buildBackupArchiveRoot } from "./backup-shared.js";
import { backupVerifyCommand } from "./backup-verify.js";
import { backupCreateCommand } from "./backup.js";
describe("backupVerifyCommand", () => {
let tempHome: TempHomeEnv;
beforeEach(async () => {
tempHome = await createTempHomeEnv("openclaw-backup-verify-test-");
});
afterEach(async () => {
await tempHome.restore();
});
it("verifies an archive created by backup create", async () => {
const stateDir = path.join(tempHome.home, ".openclaw");
const archiveDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-backup-verify-out-"));
try {
await fs.writeFile(path.join(stateDir, "openclaw.json"), JSON.stringify({}), "utf8");
await fs.writeFile(path.join(stateDir, "state.txt"), "hello\n", "utf8");
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
const nowMs = Date.UTC(2026, 2, 9, 0, 0, 0);
const created = await backupCreateCommand(runtime, { output: archiveDir, nowMs });
const verified = await backupVerifyCommand(runtime, { archive: created.archivePath });
expect(verified.ok).toBe(true);
expect(verified.archiveRoot).toBe(buildBackupArchiveRoot(nowMs));
expect(verified.assetCount).toBeGreaterThan(0);
} finally {
await fs.rm(archiveDir, { recursive: true, force: true });
}
});
it("fails when the archive does not contain a manifest", async () => {
const tempDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-backup-no-manifest-"));
const archivePath = path.join(tempDir, "broken.tar.gz");
try {
const root = path.join(tempDir, "root");
await fs.mkdir(path.join(root, "payload"), { recursive: true });
await fs.writeFile(path.join(root, "payload", "data.txt"), "x\n", "utf8");
await tar.c({ file: archivePath, gzip: true, cwd: tempDir }, ["root"]);
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
await expect(backupVerifyCommand(runtime, { archive: archivePath })).rejects.toThrow(
/expected exactly one backup manifest entry/i,
);
} finally {
await fs.rm(tempDir, { recursive: true, force: true });
}
});
it("fails when the manifest references a missing asset payload", async () => {
const tempDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-backup-missing-asset-"));
const archivePath = path.join(tempDir, "broken.tar.gz");
try {
const rootName = "2026-03-09T00-00-00.000Z-openclaw-backup";
const root = path.join(tempDir, rootName);
await fs.mkdir(root, { recursive: true });
const manifest = {
schemaVersion: 1,
createdAt: "2026-03-09T00:00:00.000Z",
archiveRoot: rootName,
runtimeVersion: "test",
platform: process.platform,
nodeVersion: process.version,
assets: [
{
kind: "state",
sourcePath: "/tmp/.openclaw",
archivePath: `${rootName}/payload/posix/tmp/.openclaw`,
},
],
};
await fs.writeFile(
path.join(root, "manifest.json"),
`${JSON.stringify(manifest, null, 2)}\n`,
);
await tar.c({ file: archivePath, gzip: true, cwd: tempDir }, [rootName]);
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
await expect(backupVerifyCommand(runtime, { archive: archivePath })).rejects.toThrow(
/missing payload for manifest asset/i,
);
} finally {
await fs.rm(tempDir, { recursive: true, force: true });
}
});
it("fails when archive paths contain traversal segments", async () => {
const tempDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-backup-traversal-"));
const archivePath = path.join(tempDir, "broken.tar.gz");
const manifestPath = path.join(tempDir, "manifest.json");
const payloadPath = path.join(tempDir, "payload.txt");
try {
const rootName = "2026-03-09T00-00-00.000Z-openclaw-backup";
const traversalPath = `${rootName}/payload/../escaped.txt`;
const manifest = {
schemaVersion: 1,
createdAt: "2026-03-09T00:00:00.000Z",
archiveRoot: rootName,
runtimeVersion: "test",
platform: process.platform,
nodeVersion: process.version,
assets: [
{
kind: "state",
sourcePath: "/tmp/.openclaw",
archivePath: traversalPath,
},
],
};
await fs.writeFile(manifestPath, `${JSON.stringify(manifest, null, 2)}\n`, "utf8");
await fs.writeFile(payloadPath, "payload\n", "utf8");
await tar.c(
{
file: archivePath,
gzip: true,
portable: true,
preservePaths: true,
onWriteEntry: (entry) => {
if (entry.path === manifestPath) {
entry.path = `${rootName}/manifest.json`;
return;
}
if (entry.path === payloadPath) {
entry.path = traversalPath;
}
},
},
[manifestPath, payloadPath],
);
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
await expect(backupVerifyCommand(runtime, { archive: archivePath })).rejects.toThrow(
/path traversal segments/i,
);
} finally {
await fs.rm(tempDir, { recursive: true, force: true });
}
});
it("ignores payload manifest.json files when locating the backup manifest", async () => {
const stateDir = path.join(tempHome.home, ".openclaw");
const externalWorkspace = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-workspace-"));
const configPath = path.join(tempHome.home, "custom-config.json");
const archiveDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-backup-verify-out-"));
try {
process.env.OPENCLAW_CONFIG_PATH = configPath;
await fs.writeFile(
configPath,
JSON.stringify({
agents: {
defaults: {
workspace: externalWorkspace,
},
},
}),
"utf8",
);
await fs.writeFile(path.join(stateDir, "openclaw.json"), JSON.stringify({}), "utf8");
await fs.writeFile(path.join(stateDir, "state.txt"), "hello\n", "utf8");
await fs.writeFile(
path.join(externalWorkspace, "manifest.json"),
JSON.stringify({ name: "workspace-payload" }),
"utf8",
);
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
const created = await backupCreateCommand(runtime, {
output: archiveDir,
includeWorkspace: true,
nowMs: Date.UTC(2026, 2, 9, 2, 0, 0),
});
const verified = await backupVerifyCommand(runtime, { archive: created.archivePath });
expect(verified.ok).toBe(true);
expect(verified.assetCount).toBeGreaterThanOrEqual(2);
} finally {
delete process.env.OPENCLAW_CONFIG_PATH;
await fs.rm(externalWorkspace, { recursive: true, force: true });
await fs.rm(archiveDir, { recursive: true, force: true });
}
});
it("fails when the archive contains duplicate root manifest entries", async () => {
const tempDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-backup-duplicate-manifest-"));
const archivePath = path.join(tempDir, "broken.tar.gz");
const manifestPath = path.join(tempDir, "manifest.json");
const payloadPath = path.join(tempDir, "payload.txt");
try {
const rootName = "2026-03-09T00-00-00.000Z-openclaw-backup";
const manifest = {
schemaVersion: 1,
createdAt: "2026-03-09T00:00:00.000Z",
archiveRoot: rootName,
runtimeVersion: "test",
platform: process.platform,
nodeVersion: process.version,
assets: [
{
kind: "state",
sourcePath: "/tmp/.openclaw",
archivePath: `${rootName}/payload/posix/tmp/.openclaw/payload.txt`,
},
],
};
await fs.writeFile(manifestPath, `${JSON.stringify(manifest, null, 2)}\n`, "utf8");
await fs.writeFile(payloadPath, "payload\n", "utf8");
await tar.c(
{
file: archivePath,
gzip: true,
portable: true,
preservePaths: true,
onWriteEntry: (entry) => {
if (entry.path === manifestPath) {
entry.path = `${rootName}/manifest.json`;
return;
}
if (entry.path === payloadPath) {
entry.path = `${rootName}/payload/posix/tmp/.openclaw/payload.txt`;
}
},
},
[manifestPath, manifestPath, payloadPath],
);
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
await expect(backupVerifyCommand(runtime, { archive: archivePath })).rejects.toThrow(
/expected exactly one backup manifest entry, found 2/i,
);
} finally {
await fs.rm(tempDir, { recursive: true, force: true });
}
});
});


@@ -0,0 +1,304 @@
import path from "node:path";
import * as tar from "tar";
import type { RuntimeEnv } from "../runtime.js";
import { resolveUserPath } from "../utils.js";
const WINDOWS_ABSOLUTE_ARCHIVE_PATH_RE = /^[A-Za-z]:[\\/]/;
type BackupManifestAsset = {
kind: string;
sourcePath: string;
archivePath: string;
};
type BackupManifest = {
schemaVersion: number;
createdAt: string;
archiveRoot: string;
runtimeVersion: string;
platform: string;
nodeVersion: string;
options?: {
includeWorkspace?: boolean;
};
paths?: {
stateDir?: string;
configPath?: string;
oauthDir?: string;
workspaceDirs?: string[];
};
assets: BackupManifestAsset[];
skipped?: Array<{
kind?: string;
sourcePath?: string;
reason?: string;
coveredBy?: string;
}>;
};
export type BackupVerifyOptions = {
archive: string;
json?: boolean;
};
export type BackupVerifyResult = {
ok: true;
archivePath: string;
archiveRoot: string;
createdAt: string;
runtimeVersion: string;
assetCount: number;
entryCount: number;
};
function isRecord(value: unknown): value is Record<string, unknown> {
return typeof value === "object" && value !== null && !Array.isArray(value);
}
function stripTrailingSlashes(value: string): string {
return value.replace(/\/+$/u, "");
}
function normalizeArchivePath(entryPath: string, label: string): string {
const trimmed = stripTrailingSlashes(entryPath.trim());
if (!trimmed) {
throw new Error(`${label} is empty.`);
}
if (trimmed.startsWith("/") || WINDOWS_ABSOLUTE_ARCHIVE_PATH_RE.test(trimmed)) {
throw new Error(`${label} must be relative: ${entryPath}`);
}
if (trimmed.split("/").some((segment) => segment === "." || segment === "..")) {
throw new Error(`${label} contains path traversal segments: ${entryPath}`);
}
const normalized = stripTrailingSlashes(path.posix.normalize(trimmed));
if (!normalized || normalized === "." || normalized === ".." || normalized.startsWith("../")) {
throw new Error(`${label} resolves outside the archive root: ${entryPath}`);
}
return normalized;
}
function normalizeArchiveRoot(rootName: string): string {
const normalized = normalizeArchivePath(rootName, "Backup manifest archiveRoot");
if (normalized.includes("/")) {
throw new Error(`Backup manifest archiveRoot must be a single path segment: ${rootName}`);
}
return normalized;
}
function isArchivePathWithin(child: string, parent: string): boolean {
const relative = path.posix.relative(parent, child);
return relative === "" || (!relative.startsWith("../") && relative !== "..");
}
function parseManifest(raw: string): BackupManifest {
let parsed: unknown;
try {
parsed = JSON.parse(raw);
} catch (err) {
throw new Error(`Backup manifest is not valid JSON: ${String(err)}`, { cause: err });
}
if (!isRecord(parsed)) {
throw new Error("Backup manifest must be an object.");
}
if (parsed.schemaVersion !== 1) {
throw new Error(`Unsupported backup manifest schemaVersion: ${String(parsed.schemaVersion)}`);
}
if (typeof parsed.archiveRoot !== "string" || !parsed.archiveRoot.trim()) {
throw new Error("Backup manifest is missing archiveRoot.");
}
if (typeof parsed.createdAt !== "string" || !parsed.createdAt.trim()) {
throw new Error("Backup manifest is missing createdAt.");
}
if (!Array.isArray(parsed.assets)) {
throw new Error("Backup manifest is missing assets.");
}
const assets: BackupManifestAsset[] = [];
for (const asset of parsed.assets) {
if (!isRecord(asset)) {
throw new Error("Backup manifest contains a non-object asset.");
}
if (typeof asset.kind !== "string" || !asset.kind.trim()) {
throw new Error("Backup manifest asset is missing kind.");
}
if (typeof asset.sourcePath !== "string" || !asset.sourcePath.trim()) {
throw new Error("Backup manifest asset is missing sourcePath.");
}
if (typeof asset.archivePath !== "string" || !asset.archivePath.trim()) {
throw new Error("Backup manifest asset is missing archivePath.");
}
assets.push({
kind: asset.kind,
sourcePath: asset.sourcePath,
archivePath: asset.archivePath,
});
}
return {
schemaVersion: 1,
archiveRoot: parsed.archiveRoot,
createdAt: parsed.createdAt,
runtimeVersion:
typeof parsed.runtimeVersion === "string" && parsed.runtimeVersion.trim()
? parsed.runtimeVersion
: "unknown",
platform: typeof parsed.platform === "string" ? parsed.platform : "unknown",
nodeVersion: typeof parsed.nodeVersion === "string" ? parsed.nodeVersion : "unknown",
    options: isRecord(parsed.options)
      ? {
          includeWorkspace:
            typeof parsed.options.includeWorkspace === "boolean"
              ? parsed.options.includeWorkspace
              : undefined,
        }
      : undefined,
paths: isRecord(parsed.paths)
? {
stateDir: typeof parsed.paths.stateDir === "string" ? parsed.paths.stateDir : undefined,
configPath:
typeof parsed.paths.configPath === "string" ? parsed.paths.configPath : undefined,
oauthDir: typeof parsed.paths.oauthDir === "string" ? parsed.paths.oauthDir : undefined,
workspaceDirs: Array.isArray(parsed.paths.workspaceDirs)
? parsed.paths.workspaceDirs.filter(
(entry): entry is string => typeof entry === "string",
)
: undefined,
}
: undefined,
assets,
skipped: Array.isArray(parsed.skipped) ? parsed.skipped : undefined,
};
}
async function listArchiveEntries(archivePath: string): Promise<string[]> {
const entries: string[] = [];
await tar.t({
file: archivePath,
gzip: true,
onentry: (entry) => {
entries.push(entry.path);
},
});
return entries;
}
async function extractManifest(params: {
archivePath: string;
manifestEntryPath: string;
}): Promise<string> {
let manifestContentPromise: Promise<string> | undefined;
await tar.t({
file: params.archivePath,
gzip: true,
onentry: (entry) => {
if (entry.path !== params.manifestEntryPath) {
entry.resume();
return;
}
manifestContentPromise = new Promise<string>((resolve, reject) => {
const chunks: Buffer[] = [];
entry.on("data", (chunk: Buffer | string) => {
chunks.push(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk));
});
entry.on("error", reject);
entry.on("end", () => {
resolve(Buffer.concat(chunks).toString("utf8"));
});
});
},
});
if (!manifestContentPromise) {
throw new Error(`Archive is missing manifest entry: ${params.manifestEntryPath}`);
}
return await manifestContentPromise;
}
function isRootManifestEntry(entryPath: string): boolean {
const parts = entryPath.split("/");
return parts.length === 2 && parts[0] !== "" && parts[1] === "manifest.json";
}
function verifyManifestAgainstEntries(manifest: BackupManifest, entries: Set<string>): void {
const archiveRoot = normalizeArchiveRoot(manifest.archiveRoot);
const manifestEntryPath = path.posix.join(archiveRoot, "manifest.json");
const normalizedEntries = [...entries];
const normalizedEntrySet = new Set(normalizedEntries);
if (!normalizedEntrySet.has(manifestEntryPath)) {
throw new Error(`Archive is missing manifest entry: ${manifestEntryPath}`);
}
for (const entry of normalizedEntries) {
if (!isArchivePathWithin(entry, archiveRoot)) {
throw new Error(`Archive entry is outside the declared archive root: ${entry}`);
}
}
const payloadRoot = path.posix.join(archiveRoot, "payload");
for (const asset of manifest.assets) {
const assetArchivePath = normalizeArchivePath(asset.archivePath, "Backup manifest asset path");
if (!isArchivePathWithin(assetArchivePath, payloadRoot)) {
throw new Error(`Manifest asset path is outside payload root: ${asset.archivePath}`);
}
const exact = normalizedEntrySet.has(assetArchivePath);
const nested = normalizedEntries.some(
(entry) => entry !== assetArchivePath && isArchivePathWithin(entry, assetArchivePath),
);
if (!exact && !nested) {
throw new Error(`Archive is missing payload for manifest asset: ${assetArchivePath}`);
}
}
}
function formatResult(result: BackupVerifyResult): string {
return [
`Backup archive OK: ${result.archivePath}`,
`Archive root: ${result.archiveRoot}`,
`Created at: ${result.createdAt}`,
`Runtime version: ${result.runtimeVersion}`,
`Assets verified: ${result.assetCount}`,
`Archive entries scanned: ${result.entryCount}`,
].join("\n");
}
export async function backupVerifyCommand(
runtime: RuntimeEnv,
opts: BackupVerifyOptions,
): Promise<BackupVerifyResult> {
const archivePath = resolveUserPath(opts.archive);
const rawEntries = await listArchiveEntries(archivePath);
if (rawEntries.length === 0) {
throw new Error("Backup archive is empty.");
}
const entries = rawEntries.map((entry) => ({
raw: entry,
normalized: normalizeArchivePath(entry, "Archive entry"),
}));
const normalizedEntrySet = new Set(entries.map((entry) => entry.normalized));
const manifestMatches = entries.filter((entry) => isRootManifestEntry(entry.normalized));
if (manifestMatches.length !== 1) {
throw new Error(`Expected exactly one backup manifest entry, found ${manifestMatches.length}.`);
}
const manifestEntryPath = manifestMatches[0]?.raw;
if (!manifestEntryPath) {
throw new Error("Backup archive manifest entry could not be resolved.");
}
const manifestRaw = await extractManifest({ archivePath, manifestEntryPath });
const manifest = parseManifest(manifestRaw);
verifyManifestAgainstEntries(manifest, normalizedEntrySet);
const result: BackupVerifyResult = {
ok: true,
archivePath,
archiveRoot: manifest.archiveRoot,
createdAt: manifest.createdAt,
runtimeVersion: manifest.runtimeVersion,
assetCount: manifest.assets.length,
entryCount: rawEntries.length,
};
runtime.log(opts.json ? JSON.stringify(result, null, 2) : formatResult(result));
return result;
}
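
The entry-path rules enforced above (relative only, no `.`/`..` segments, POSIX-normalized, never escaping the archive root) can be sketched standalone. This is an illustrative condensation of the same checks, not the exported API:

```typescript
import path from "node:path";

// Condensed sketch of the normalizeArchivePath rules: reject absolute paths
// (POSIX and Windows drive letters), reject any "." / ".." segment, and
// require the POSIX-normalized result to stay relative to the root.
function safeArchivePath(entryPath: string): string | null {
  const trimmed = entryPath.trim().replace(/\/+$/u, "");
  if (!trimmed) return null;
  if (trimmed.startsWith("/") || /^[A-Za-z]:[\\/]/.test(trimmed)) return null;
  if (trimmed.split("/").some((seg) => seg === "." || seg === "..")) return null;
  const normalized = path.posix.normalize(trimmed).replace(/\/+$/u, "");
  if (!normalized || normalized === "." || normalized.startsWith("..")) return null;
  return normalized;
}

console.log(safeArchivePath("backup-root/payload//state.txt")); // "backup-root/payload/state.txt"
console.log(safeArchivePath("../../etc/passwd")); // null (traversal)
console.log(safeArchivePath("C:\\Users\\x")); // null (absolute)
```

Unlike the real helper, the sketch returns `null` instead of throwing, which makes the accept/reject boundary easy to see at a glance.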


@@ -0,0 +1,133 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import { createTempHomeEnv, type TempHomeEnv } from "../test-utils/temp-home.js";
const tarCreateMock = vi.hoisted(() => vi.fn());
const backupVerifyCommandMock = vi.hoisted(() => vi.fn());
vi.mock("tar", () => ({
c: tarCreateMock,
}));
vi.mock("./backup-verify.js", () => ({
backupVerifyCommand: backupVerifyCommandMock,
}));
const { backupCreateCommand } = await import("./backup.js");
describe("backupCreateCommand atomic archive write", () => {
let tempHome: TempHomeEnv;
beforeEach(async () => {
tempHome = await createTempHomeEnv("openclaw-backup-atomic-test-");
tarCreateMock.mockReset();
backupVerifyCommandMock.mockReset();
});
afterEach(async () => {
await tempHome.restore();
});
it("does not leave a partial final archive behind when tar creation fails", async () => {
const stateDir = path.join(tempHome.home, ".openclaw");
const archiveDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-backup-failure-"));
try {
await fs.writeFile(path.join(stateDir, "openclaw.json"), JSON.stringify({}), "utf8");
await fs.writeFile(path.join(stateDir, "state.txt"), "state\n", "utf8");
tarCreateMock.mockRejectedValueOnce(new Error("disk full"));
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
const outputPath = path.join(archiveDir, "backup.tar.gz");
await expect(
backupCreateCommand(runtime, {
output: outputPath,
}),
).rejects.toThrow(/disk full/i);
await expect(fs.access(outputPath)).rejects.toThrow();
const remaining = await fs.readdir(archiveDir);
expect(remaining).toEqual([]);
} finally {
await fs.rm(archiveDir, { recursive: true, force: true });
}
});
it("does not overwrite an archive created after readiness checks complete", async () => {
const stateDir = path.join(tempHome.home, ".openclaw");
const archiveDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-backup-race-"));
const realLink = fs.link.bind(fs);
const linkSpy = vi.spyOn(fs, "link");
try {
await fs.writeFile(path.join(stateDir, "openclaw.json"), JSON.stringify({}), "utf8");
await fs.writeFile(path.join(stateDir, "state.txt"), "state\n", "utf8");
tarCreateMock.mockImplementationOnce(async ({ file }: { file: string }) => {
await fs.writeFile(file, "archive-bytes", "utf8");
});
linkSpy.mockImplementationOnce(async (existingPath, newPath) => {
await fs.writeFile(newPath, "concurrent-archive", "utf8");
return await realLink(existingPath, newPath);
});
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
const outputPath = path.join(archiveDir, "backup.tar.gz");
await expect(
backupCreateCommand(runtime, {
output: outputPath,
}),
).rejects.toThrow(/refusing to overwrite existing backup archive/i);
expect(await fs.readFile(outputPath, "utf8")).toBe("concurrent-archive");
} finally {
linkSpy.mockRestore();
await fs.rm(archiveDir, { recursive: true, force: true });
}
});
it("falls back to exclusive copy when hard-link publication is unsupported", async () => {
const stateDir = path.join(tempHome.home, ".openclaw");
const archiveDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-backup-copy-fallback-"));
const linkSpy = vi.spyOn(fs, "link");
try {
await fs.writeFile(path.join(stateDir, "openclaw.json"), JSON.stringify({}), "utf8");
await fs.writeFile(path.join(stateDir, "state.txt"), "state\n", "utf8");
tarCreateMock.mockImplementationOnce(async ({ file }: { file: string }) => {
await fs.writeFile(file, "archive-bytes", "utf8");
});
linkSpy.mockRejectedValueOnce(
Object.assign(new Error("hard links not supported"), { code: "EOPNOTSUPP" }),
);
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
const outputPath = path.join(archiveDir, "backup.tar.gz");
const result = await backupCreateCommand(runtime, {
output: outputPath,
});
expect(result.archivePath).toBe(outputPath);
expect(await fs.readFile(outputPath, "utf8")).toBe("archive-bytes");
} finally {
linkSpy.mockRestore();
await fs.rm(archiveDir, { recursive: true, force: true });
}
});
});
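
The behavior these tests pin down — write to a temp file, publish via hard link so a concurrently created archive is never clobbered, and fall back to an exclusive copy when links are unsupported — can be sketched in isolation. A minimal sketch under assumed POSIX-style semantics; `publishExclusive` is a hypothetical stand-in for the real publish helper, not its implementation:

```typescript
import { constants as fsConstants } from "node:fs";
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";

// Publish a finished temp archive at its final path without ever overwriting
// a file that appeared concurrently: fs.link fails atomically with EEXIST,
// and COPYFILE_EXCL gives the same guarantee on link-less filesystems.
async function publishExclusive(tempPath: string, finalPath: string): Promise<void> {
  try {
    await fs.link(tempPath, finalPath);
  } catch (err) {
    const code = (err as NodeJS.ErrnoException).code;
    if (code === "EEXIST") {
      throw new Error(`refusing to overwrite existing archive: ${finalPath}`);
    }
    // Hard link failed for another reason: try an exclusive copy instead.
    await fs.copyFile(tempPath, finalPath, fsConstants.COPYFILE_EXCL);
  } finally {
    await fs.rm(tempPath, { force: true });
  }
}

async function demo(): Promise<void> {
  const dir = await fs.mkdtemp(path.join(os.tmpdir(), "publish-demo-"));
  try {
    const temp = path.join(dir, "backup.tmp");
    const final = path.join(dir, "backup.tar.gz");
    await fs.writeFile(temp, "archive-bytes", "utf8");
    await publishExclusive(temp, final);
    console.log(await fs.readFile(final, "utf8")); // archive-bytes
  } finally {
    await fs.rm(dir, { recursive: true, force: true });
  }
}

void demo();
```

Note one deliberate simplification: the sketch falls back to a copy on any non-EEXIST link error, whereas the production helper only does so for link-unsupported error codes and rethrows everything else.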

src/commands/backup.test.ts

@@ -0,0 +1,434 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import * as tar from "tar";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import { createTempHomeEnv, type TempHomeEnv } from "../test-utils/temp-home.js";
import {
buildBackupArchiveRoot,
encodeAbsolutePathForBackupArchive,
resolveBackupPlanFromDisk,
} from "./backup-shared.js";
import { backupCreateCommand } from "./backup.js";
const backupVerifyCommandMock = vi.hoisted(() => vi.fn());
vi.mock("./backup-verify.js", () => ({
backupVerifyCommand: backupVerifyCommandMock,
}));
describe("backup commands", () => {
let tempHome: TempHomeEnv;
let previousCwd: string;
beforeEach(async () => {
tempHome = await createTempHomeEnv("openclaw-backup-test-");
previousCwd = process.cwd();
backupVerifyCommandMock.mockReset();
backupVerifyCommandMock.mockResolvedValue({
ok: true,
archivePath: "/tmp/fake.tar.gz",
archiveRoot: "fake",
createdAt: new Date().toISOString(),
runtimeVersion: "test",
assetCount: 1,
entryCount: 2,
});
});
afterEach(async () => {
process.chdir(previousCwd);
await tempHome.restore();
});
it("collapses default config, credentials, and workspace into the state backup root", async () => {
const stateDir = path.join(tempHome.home, ".openclaw");
await fs.writeFile(path.join(stateDir, "openclaw.json"), JSON.stringify({}), "utf8");
await fs.mkdir(path.join(stateDir, "credentials"), { recursive: true });
await fs.writeFile(path.join(stateDir, "credentials", "oauth.json"), "{}", "utf8");
await fs.mkdir(path.join(stateDir, "workspace"), { recursive: true });
await fs.writeFile(path.join(stateDir, "workspace", "SOUL.md"), "# soul\n", "utf8");
const plan = await resolveBackupPlanFromDisk({ includeWorkspace: true, nowMs: 123 });
expect(plan.included).toHaveLength(1);
expect(plan.included[0]?.kind).toBe("state");
expect(plan.skipped).toEqual(
expect.arrayContaining([expect.objectContaining({ kind: "workspace", reason: "covered" })]),
);
});
it("orders coverage checks by canonical path so symlinked workspaces do not duplicate state", async () => {
if (process.platform === "win32") {
return;
}
const stateDir = path.join(tempHome.home, ".openclaw");
const workspaceDir = path.join(stateDir, "workspace");
const symlinkDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-workspace-link-"));
const workspaceLink = path.join(symlinkDir, "ws-link");
try {
await fs.mkdir(workspaceDir, { recursive: true });
await fs.writeFile(path.join(workspaceDir, "SOUL.md"), "# soul\n", "utf8");
await fs.symlink(workspaceDir, workspaceLink);
await fs.writeFile(
path.join(stateDir, "openclaw.json"),
JSON.stringify({
agents: {
defaults: {
workspace: workspaceLink,
},
},
}),
"utf8",
);
const plan = await resolveBackupPlanFromDisk({ includeWorkspace: true, nowMs: 123 });
expect(plan.included).toHaveLength(1);
expect(plan.included[0]?.kind).toBe("state");
expect(plan.skipped).toEqual(
expect.arrayContaining([expect.objectContaining({ kind: "workspace", reason: "covered" })]),
);
} finally {
await fs.rm(symlinkDir, { recursive: true, force: true });
}
});
it("creates an archive with a manifest and external workspace payload", async () => {
const stateDir = path.join(tempHome.home, ".openclaw");
const externalWorkspace = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-workspace-"));
const configPath = path.join(tempHome.home, "custom-config.json");
const backupDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-backups-"));
try {
process.env.OPENCLAW_CONFIG_PATH = configPath;
await fs.writeFile(
configPath,
JSON.stringify({
agents: {
defaults: {
workspace: externalWorkspace,
},
},
}),
"utf8",
);
await fs.writeFile(path.join(stateDir, "state.txt"), "state\n", "utf8");
await fs.writeFile(path.join(externalWorkspace, "SOUL.md"), "# external\n", "utf8");
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
const nowMs = Date.UTC(2026, 2, 9, 0, 0, 0);
const result = await backupCreateCommand(runtime, {
output: backupDir,
includeWorkspace: true,
nowMs,
});
expect(result.archivePath).toBe(
path.join(backupDir, `${buildBackupArchiveRoot(nowMs)}.tar.gz`),
);
const extractDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-backup-extract-"));
try {
await tar.x({ file: result.archivePath, cwd: extractDir, gzip: true });
const archiveRoot = path.join(extractDir, buildBackupArchiveRoot(nowMs));
const manifest = JSON.parse(
await fs.readFile(path.join(archiveRoot, "manifest.json"), "utf8"),
) as {
assets: Array<{ kind: string; archivePath: string }>;
};
expect(manifest.assets).toEqual(
expect.arrayContaining([
expect.objectContaining({ kind: "state" }),
expect.objectContaining({ kind: "config" }),
expect.objectContaining({ kind: "workspace" }),
]),
);
const stateAsset = result.assets.find((asset) => asset.kind === "state");
const workspaceAsset = result.assets.find((asset) => asset.kind === "workspace");
expect(stateAsset).toBeDefined();
expect(workspaceAsset).toBeDefined();
const encodedStatePath = path.join(
archiveRoot,
"payload",
encodeAbsolutePathForBackupArchive(stateAsset!.sourcePath),
"state.txt",
);
const encodedWorkspacePath = path.join(
archiveRoot,
"payload",
encodeAbsolutePathForBackupArchive(workspaceAsset!.sourcePath),
"SOUL.md",
);
expect(await fs.readFile(encodedStatePath, "utf8")).toBe("state\n");
expect(await fs.readFile(encodedWorkspacePath, "utf8")).toBe("# external\n");
} finally {
await fs.rm(extractDir, { recursive: true, force: true });
}
} finally {
delete process.env.OPENCLAW_CONFIG_PATH;
await fs.rm(externalWorkspace, { recursive: true, force: true });
await fs.rm(backupDir, { recursive: true, force: true });
}
});
it("optionally verifies the archive after writing it", async () => {
const stateDir = path.join(tempHome.home, ".openclaw");
const archiveDir = await fs.mkdtemp(
path.join(os.tmpdir(), "openclaw-backup-verify-on-create-"),
);
try {
await fs.writeFile(path.join(stateDir, "openclaw.json"), JSON.stringify({}), "utf8");
await fs.writeFile(path.join(stateDir, "state.txt"), "state\n", "utf8");
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
const result = await backupCreateCommand(runtime, {
output: archiveDir,
verify: true,
});
expect(result.verified).toBe(true);
expect(backupVerifyCommandMock).toHaveBeenCalledWith(
expect.objectContaining({ log: expect.any(Function) }),
expect.objectContaining({ archive: result.archivePath, json: false }),
);
} finally {
await fs.rm(archiveDir, { recursive: true, force: true });
}
});
it("rejects output paths that would be created inside a backed-up directory", async () => {
const stateDir = path.join(tempHome.home, ".openclaw");
await fs.writeFile(path.join(stateDir, "openclaw.json"), JSON.stringify({}), "utf8");
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
await expect(
backupCreateCommand(runtime, {
output: path.join(stateDir, "backups"),
}),
).rejects.toThrow(/must not be written inside a source path/i);
});
it("rejects symlinked output paths even when intermediate directories do not exist yet", async () => {
if (process.platform === "win32") {
return;
}
const stateDir = path.join(tempHome.home, ".openclaw");
const symlinkDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-backup-link-"));
const symlinkPath = path.join(symlinkDir, "linked-state");
try {
await fs.writeFile(path.join(stateDir, "openclaw.json"), JSON.stringify({}), "utf8");
await fs.symlink(stateDir, symlinkPath);
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
await expect(
backupCreateCommand(runtime, {
output: path.join(symlinkPath, "new", "subdir", "backup.tar.gz"),
}),
).rejects.toThrow(/must not be written inside a source path/i);
} finally {
await fs.rm(symlinkDir, { recursive: true, force: true });
}
});
it("falls back to the home directory when cwd is inside a backed-up source tree", async () => {
const stateDir = path.join(tempHome.home, ".openclaw");
const workspaceDir = path.join(stateDir, "workspace");
await fs.writeFile(path.join(stateDir, "openclaw.json"), JSON.stringify({}), "utf8");
await fs.mkdir(workspaceDir, { recursive: true });
await fs.writeFile(path.join(workspaceDir, "SOUL.md"), "# soul\n", "utf8");
process.chdir(workspaceDir);
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
const nowMs = Date.UTC(2026, 2, 9, 1, 2, 3);
const result = await backupCreateCommand(runtime, { nowMs });
expect(result.archivePath).toBe(
path.join(tempHome.home, `${buildBackupArchiveRoot(nowMs)}.tar.gz`),
);
await fs.rm(result.archivePath, { force: true });
});
it("falls back to the home directory when cwd is a symlink into a backed-up source tree", async () => {
if (process.platform === "win32") {
return;
}
const stateDir = path.join(tempHome.home, ".openclaw");
const workspaceDir = path.join(stateDir, "workspace");
const linkParent = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-backup-cwd-link-"));
const workspaceLink = path.join(linkParent, "workspace-link");
try {
await fs.writeFile(path.join(stateDir, "openclaw.json"), JSON.stringify({}), "utf8");
await fs.mkdir(workspaceDir, { recursive: true });
await fs.writeFile(path.join(workspaceDir, "SOUL.md"), "# soul\n", "utf8");
await fs.symlink(workspaceDir, workspaceLink);
process.chdir(workspaceLink);
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
const nowMs = Date.UTC(2026, 2, 9, 1, 3, 4);
const result = await backupCreateCommand(runtime, { nowMs });
expect(result.archivePath).toBe(
path.join(tempHome.home, `${buildBackupArchiveRoot(nowMs)}.tar.gz`),
);
await fs.rm(result.archivePath, { force: true });
} finally {
await fs.rm(linkParent, { recursive: true, force: true });
}
});
it("allows dry-run preview even when the target archive already exists", async () => {
const stateDir = path.join(tempHome.home, ".openclaw");
const existingArchive = path.join(tempHome.home, "existing-backup.tar.gz");
await fs.writeFile(path.join(stateDir, "openclaw.json"), JSON.stringify({}), "utf8");
await fs.writeFile(existingArchive, "already here", "utf8");
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
const result = await backupCreateCommand(runtime, {
output: existingArchive,
dryRun: true,
});
expect(result.dryRun).toBe(true);
expect(result.verified).toBe(false);
expect(result.archivePath).toBe(existingArchive);
expect(await fs.readFile(existingArchive, "utf8")).toBe("already here");
});
it("fails fast when config is invalid and workspace backup is enabled", async () => {
const stateDir = path.join(tempHome.home, ".openclaw");
const configPath = path.join(tempHome.home, "custom-config.json");
process.env.OPENCLAW_CONFIG_PATH = configPath;
await fs.writeFile(path.join(stateDir, "openclaw.json"), JSON.stringify({}), "utf8");
await fs.writeFile(configPath, '{"agents": { defaults: { workspace: ', "utf8");
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
try {
await expect(backupCreateCommand(runtime, { dryRun: true })).rejects.toThrow(
/--no-include-workspace/i,
);
} finally {
delete process.env.OPENCLAW_CONFIG_PATH;
}
});
it("allows explicit partial backups when config is invalid", async () => {
const stateDir = path.join(tempHome.home, ".openclaw");
const configPath = path.join(tempHome.home, "custom-config.json");
process.env.OPENCLAW_CONFIG_PATH = configPath;
await fs.writeFile(path.join(stateDir, "openclaw.json"), JSON.stringify({}), "utf8");
await fs.writeFile(configPath, '{"agents": { defaults: { workspace: ', "utf8");
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
try {
const result = await backupCreateCommand(runtime, {
dryRun: true,
includeWorkspace: false,
});
expect(result.includeWorkspace).toBe(false);
expect(result.assets.some((asset) => asset.kind === "workspace")).toBe(false);
} finally {
delete process.env.OPENCLAW_CONFIG_PATH;
}
});
it("backs up only the active config file when --only-config is requested", async () => {
const stateDir = path.join(tempHome.home, ".openclaw");
const configPath = path.join(stateDir, "openclaw.json");
await fs.mkdir(path.join(stateDir, "credentials"), { recursive: true });
await fs.writeFile(configPath, JSON.stringify({ theme: "config-only" }), "utf8");
await fs.writeFile(path.join(stateDir, "state.txt"), "state\n", "utf8");
await fs.writeFile(path.join(stateDir, "credentials", "oauth.json"), "{}", "utf8");
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
const result = await backupCreateCommand(runtime, {
dryRun: true,
onlyConfig: true,
});
expect(result.onlyConfig).toBe(true);
expect(result.includeWorkspace).toBe(false);
expect(result.assets).toHaveLength(1);
expect(result.assets[0]?.kind).toBe("config");
});
it("allows config-only backups even when the config file is invalid", async () => {
const configPath = path.join(tempHome.home, "custom-config.json");
process.env.OPENCLAW_CONFIG_PATH = configPath;
await fs.writeFile(configPath, '{"agents": { defaults: { workspace: ', "utf8");
const runtime = {
log: vi.fn(),
error: vi.fn(),
exit: vi.fn(),
};
try {
const result = await backupCreateCommand(runtime, {
dryRun: true,
onlyConfig: true,
});
expect(result.assets).toHaveLength(1);
expect(result.assets[0]?.kind).toBe("config");
} finally {
delete process.env.OPENCLAW_CONFIG_PATH;
}
});
});
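
For reference, a manifest matching the shape these tests read back from `manifest.json` looks roughly like this. All concrete values below are illustrative: the real archive-root name and payload path encoding come from `buildBackupArchiveRoot` and `encodeAbsolutePathForBackupArchive`, which are not shown here.

```json
{
  "schemaVersion": 1,
  "createdAt": "2026-03-09T00:00:00.000Z",
  "archiveRoot": "openclaw-backup-2026-03-09T00-00-00Z",
  "runtimeVersion": "unknown",
  "platform": "linux",
  "nodeVersion": "v22.0.0",
  "options": { "includeWorkspace": true, "onlyConfig": false },
  "paths": {
    "stateDir": "/home/user/.openclaw",
    "configPath": "/home/user/.openclaw/openclaw.json",
    "oauthDir": "/home/user/.openclaw/credentials",
    "workspaceDirs": ["/home/user/.openclaw/workspace"]
  },
  "assets": [
    {
      "kind": "state",
      "sourcePath": "/home/user/.openclaw",
      "archivePath": "openclaw-backup-2026-03-09T00-00-00Z/payload/..."
    }
  ],
  "skipped": [
    {
      "kind": "workspace",
      "sourcePath": "/home/user/.openclaw/workspace",
      "reason": "covered",
      "coveredBy": "/home/user/.openclaw"
    }
  ]
}
```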

src/commands/backup.ts

@@ -0,0 +1,382 @@
import { randomUUID } from "node:crypto";
import { constants as fsConstants } from "node:fs";
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import * as tar from "tar";
import type { RuntimeEnv } from "../runtime.js";
import { resolveHomeDir, resolveUserPath } from "../utils.js";
import { resolveRuntimeServiceVersion } from "../version.js";
import {
buildBackupArchiveBasename,
buildBackupArchiveRoot,
buildBackupArchivePath,
type BackupAsset,
resolveBackupPlanFromDisk,
} from "./backup-shared.js";
import { backupVerifyCommand } from "./backup-verify.js";
import { isPathWithin } from "./cleanup-utils.js";
export type BackupCreateOptions = {
output?: string;
dryRun?: boolean;
includeWorkspace?: boolean;
onlyConfig?: boolean;
verify?: boolean;
json?: boolean;
nowMs?: number;
};
type BackupManifestAsset = {
kind: BackupAsset["kind"];
sourcePath: string;
archivePath: string;
};
type BackupManifest = {
schemaVersion: 1;
createdAt: string;
archiveRoot: string;
runtimeVersion: string;
platform: NodeJS.Platform;
nodeVersion: string;
options: {
includeWorkspace: boolean;
onlyConfig?: boolean;
};
paths: {
stateDir: string;
configPath: string;
oauthDir: string;
workspaceDirs: string[];
};
assets: BackupManifestAsset[];
skipped: Array<{
kind: string;
sourcePath: string;
reason: string;
coveredBy?: string;
}>;
};
export type BackupCreateResult = {
createdAt: string;
archiveRoot: string;
archivePath: string;
dryRun: boolean;
includeWorkspace: boolean;
onlyConfig: boolean;
verified: boolean;
assets: BackupAsset[];
skipped: Array<{
kind: string;
sourcePath: string;
displayPath: string;
reason: string;
coveredBy?: string;
}>;
};
async function resolveOutputPath(params: {
output?: string;
nowMs: number;
includedAssets: BackupAsset[];
stateDir: string;
}): Promise<string> {
const basename = buildBackupArchiveBasename(params.nowMs);
const rawOutput = params.output?.trim();
if (!rawOutput) {
const cwd = path.resolve(process.cwd());
const canonicalCwd = await fs.realpath(cwd).catch(() => cwd);
const cwdInsideSource = params.includedAssets.some((asset) =>
isPathWithin(canonicalCwd, asset.sourcePath),
);
const defaultDir = cwdInsideSource ? (resolveHomeDir() ?? path.dirname(params.stateDir)) : cwd;
return path.resolve(defaultDir, basename);
}
const resolved = resolveUserPath(rawOutput);
if (rawOutput.endsWith("/") || rawOutput.endsWith("\\")) {
return path.join(resolved, basename);
}
try {
const stat = await fs.stat(resolved);
if (stat.isDirectory()) {
return path.join(resolved, basename);
}
} catch {
// Treat as a file path when the target does not exist yet.
}
return resolved;
}
async function assertOutputPathReady(outputPath: string): Promise<void> {
  let exists = true;
  try {
    await fs.access(outputPath);
  } catch (err) {
    const code = (err as NodeJS.ErrnoException | undefined)?.code;
    if (code !== "ENOENT") {
      throw err;
    }
    exists = false;
  }
  if (exists) {
    throw new Error(`Refusing to overwrite existing backup archive: ${outputPath}`);
  }
}
function buildTempArchivePath(outputPath: string): string {
return `${outputPath}.${randomUUID()}.tmp`;
}
function isLinkUnsupportedError(code: string | undefined): boolean {
return code === "ENOTSUP" || code === "EOPNOTSUPP" || code === "EPERM";
}
async function publishTempArchive(params: {
tempArchivePath: string;
outputPath: string;
}): Promise<void> {
try {
await fs.link(params.tempArchivePath, params.outputPath);
} catch (err) {
const code = (err as NodeJS.ErrnoException | undefined)?.code;
if (code === "EEXIST") {
throw new Error(`Refusing to overwrite existing backup archive: ${params.outputPath}`, {
cause: err,
});
}
if (!isLinkUnsupportedError(code)) {
throw err;
}
try {
// Some backup targets support ordinary files but not hard links.
await fs.copyFile(params.tempArchivePath, params.outputPath, fsConstants.COPYFILE_EXCL);
    } catch (copyErr) {
      const copyCode = (copyErr as NodeJS.ErrnoException | undefined)?.code;
      if (copyCode === "EEXIST") {
        throw new Error(`Refusing to overwrite existing backup archive: ${params.outputPath}`, {
          cause: copyErr,
        });
      }
      // Clean up any partially copied archive before surfacing the failure.
      await fs.rm(params.outputPath, { force: true }).catch(() => undefined);
      throw copyErr;
    }
}
await fs.rm(params.tempArchivePath, { force: true });
}
async function canonicalizePathForContainment(targetPath: string): Promise<string> {
const resolved = path.resolve(targetPath);
const suffix: string[] = [];
let probe = resolved;
while (true) {
try {
const realProbe = await fs.realpath(probe);
return suffix.length === 0 ? realProbe : path.join(realProbe, ...suffix.toReversed());
} catch {
const parent = path.dirname(probe);
if (parent === probe) {
return resolved;
}
suffix.push(path.basename(probe));
probe = parent;
}
}
}
function buildManifest(params: {
createdAt: string;
archiveRoot: string;
includeWorkspace: boolean;
onlyConfig: boolean;
assets: BackupAsset[];
skipped: BackupCreateResult["skipped"];
stateDir: string;
configPath: string;
oauthDir: string;
workspaceDirs: string[];
}): BackupManifest {
return {
schemaVersion: 1,
createdAt: params.createdAt,
archiveRoot: params.archiveRoot,
runtimeVersion: resolveRuntimeServiceVersion(),
platform: process.platform,
nodeVersion: process.version,
options: {
includeWorkspace: params.includeWorkspace,
onlyConfig: params.onlyConfig,
},
paths: {
stateDir: params.stateDir,
configPath: params.configPath,
oauthDir: params.oauthDir,
workspaceDirs: params.workspaceDirs,
},
assets: params.assets.map((asset) => ({
kind: asset.kind,
sourcePath: asset.sourcePath,
archivePath: asset.archivePath,
})),
skipped: params.skipped.map((entry) => ({
kind: entry.kind,
sourcePath: entry.sourcePath,
reason: entry.reason,
coveredBy: entry.coveredBy,
})),
};
}
function formatTextSummary(result: BackupCreateResult): string[] {
const lines = [`Backup archive: ${result.archivePath}`];
lines.push(`Included ${result.assets.length} path${result.assets.length === 1 ? "" : "s"}:`);
for (const asset of result.assets) {
lines.push(`- ${asset.kind}: ${asset.displayPath}`);
}
if (result.skipped.length > 0) {
lines.push(`Skipped ${result.skipped.length} path${result.skipped.length === 1 ? "" : "s"}:`);
for (const entry of result.skipped) {
if (entry.reason === "covered" && entry.coveredBy) {
lines.push(`- ${entry.kind}: ${entry.displayPath} (${entry.reason} by ${entry.coveredBy})`);
} else {
lines.push(`- ${entry.kind}: ${entry.displayPath} (${entry.reason})`);
}
}
}
if (result.dryRun) {
lines.push("Dry run only; archive was not written.");
} else {
lines.push(`Created ${result.archivePath}`);
if (result.verified) {
lines.push("Archive verification: passed");
}
}
return lines;
}
function remapArchiveEntryPath(params: {
entryPath: string;
manifestPath: string;
archiveRoot: string;
}): string {
const normalizedEntry = path.resolve(params.entryPath);
if (normalizedEntry === params.manifestPath) {
return path.posix.join(params.archiveRoot, "manifest.json");
}
return buildBackupArchivePath(params.archiveRoot, normalizedEntry);
}

export async function backupCreateCommand(
  runtime: RuntimeEnv,
  opts: BackupCreateOptions = {},
): Promise<BackupCreateResult> {
  const nowMs = opts.nowMs ?? Date.now();
  const archiveRoot = buildBackupArchiveRoot(nowMs);
  const onlyConfig = Boolean(opts.onlyConfig);
  const includeWorkspace = onlyConfig ? false : (opts.includeWorkspace ?? true);
  const plan = await resolveBackupPlanFromDisk({ includeWorkspace, onlyConfig, nowMs });
  // Fail on an empty plan before deriving an output path from it.
  if (plan.included.length === 0) {
    throw new Error(
      onlyConfig
        ? "No OpenClaw config file was found to back up."
        : "No local OpenClaw state was found to back up.",
    );
  }
  const outputPath = await resolveOutputPath({
    output: opts.output,
    nowMs,
    includedAssets: plan.included,
    stateDir: plan.stateDir,
  });
  const canonicalOutputPath = await canonicalizePathForContainment(outputPath);
  const overlappingAsset = plan.included.find((asset) =>
    isPathWithin(canonicalOutputPath, asset.sourcePath),
  );
  if (overlappingAsset) {
    throw new Error(
      `Backup output must not be written inside a source path: ${outputPath} is inside ${overlappingAsset.sourcePath}`,
    );
  }
  if (!opts.dryRun) {
    await assertOutputPathReady(outputPath);
  }
  const createdAt = new Date(nowMs).toISOString();
  const result: BackupCreateResult = {
    createdAt,
    archiveRoot,
    archivePath: outputPath,
    dryRun: Boolean(opts.dryRun),
    includeWorkspace,
    onlyConfig,
    verified: false,
    assets: plan.included,
    skipped: plan.skipped,
  };
  if (!opts.dryRun) {
    await fs.mkdir(path.dirname(outputPath), { recursive: true });
    const tempDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-backup-"));
    const manifestPath = path.join(tempDir, "manifest.json");
    const tempArchivePath = buildTempArchivePath(outputPath);
    try {
      const manifest = buildManifest({
        createdAt,
        archiveRoot,
        includeWorkspace,
        onlyConfig,
        assets: result.assets,
        skipped: result.skipped,
        stateDir: plan.stateDir,
        configPath: plan.configPath,
        oauthDir: plan.oauthDir,
        workspaceDirs: plan.workspaceDirs,
      });
      await fs.writeFile(manifestPath, `${JSON.stringify(manifest, null, 2)}\n`, "utf8");
      await tar.c(
        {
          file: tempArchivePath,
          gzip: true,
          portable: true,
          preservePaths: true,
          onWriteEntry: (entry) => {
            entry.path = remapArchiveEntryPath({
              entryPath: entry.path,
              manifestPath,
              archiveRoot,
            });
          },
        },
        [manifestPath, ...result.assets.map((asset) => asset.sourcePath)],
      );
      await publishTempArchive({ tempArchivePath, outputPath });
    } finally {
      await fs.rm(tempArchivePath, { force: true }).catch(() => undefined);
      await fs.rm(tempDir, { recursive: true, force: true }).catch(() => undefined);
    }
    if (opts.verify) {
      await backupVerifyCommand(
        {
          ...runtime,
          log: () => {},
        },
        { archive: outputPath, json: false },
      );
      result.verified = true;
    }
  }
  const output = opts.json ? JSON.stringify(result, null, 2) : formatTextSummary(result).join("\n");
  runtime.log(output);
  return result;
}
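
Two helpers referenced above, `isPathWithin` and `buildBackupArchiveRoot`, live elsewhere in the module and are not part of this hunk. A minimal sketch of plausible implementations, inferred from the call sites — the names come from the diff, but every detail of the bodies (segment-aware prefix check, the exact timestamp format) is an assumption, not the shipped code:

```typescript
import path from "node:path";

// Assumed sketch of the containment check behind the output-overlap guard:
// a candidate is "within" a root when its resolved form equals the root or
// sits below it. Appending path.sep before the prefix test avoids treating
// "/tmp/.openclaw-extra" as inside "/tmp/.openclaw".
function isPathWithin(candidate: string, root: string): boolean {
  const resolvedCandidate = path.resolve(candidate);
  const resolvedRoot = path.resolve(root);
  if (resolvedCandidate === resolvedRoot) {
    return true;
  }
  return resolvedCandidate.startsWith(resolvedRoot + path.sep);
}

// Assumed sketch of a date-sortable archive root name: a zero-padded UTC
// stamp makes plain lexicographic filename sorting match chronological order.
function buildBackupArchiveRoot(nowMs: number): string {
  const stamp = new Date(nowMs)
    .toISOString()
    .replace(/:/g, "-") // colons are not filesystem-safe everywhere
    .replace(/\.\d+Z$/, "Z"); // drop milliseconds
  return `openclaw-backup-${stamp}`;
}
```

With this shape, `buildBackupArchiveRoot(0)` yields `openclaw-backup-1970-01-01T00-00-00Z`, which sorts before any later stamp, and the guard rejects an output path that falls under a directory being archived. The real helpers may differ; only the sorting and containment properties are implied by the surrounding code.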

View File

@@ -0,0 +1,69 @@
import { beforeEach, describe, expect, it, vi } from "vitest";
import { createNonExitingRuntime } from "../runtime.js";

const resolveCleanupPlanFromDisk = vi.fn();
const removePath = vi.fn();
const listAgentSessionDirs = vi.fn();
const removeStateAndLinkedPaths = vi.fn();
const removeWorkspaceDirs = vi.fn();

vi.mock("../config/config.js", () => ({
  isNixMode: false,
}));
vi.mock("./cleanup-plan.js", () => ({
  resolveCleanupPlanFromDisk,
}));
vi.mock("./cleanup-utils.js", () => ({
  removePath,
  listAgentSessionDirs,
  removeStateAndLinkedPaths,
  removeWorkspaceDirs,
}));

const { resetCommand } = await import("./reset.js");

describe("resetCommand", () => {
  const runtime = createNonExitingRuntime();

  beforeEach(() => {
    vi.clearAllMocks();
    resolveCleanupPlanFromDisk.mockReturnValue({
      stateDir: "/tmp/.openclaw",
      configPath: "/tmp/.openclaw/openclaw.json",
      oauthDir: "/tmp/.openclaw/credentials",
      configInsideState: true,
      oauthInsideState: true,
      workspaceDirs: ["/tmp/.openclaw/workspace"],
    });
    removePath.mockResolvedValue({ ok: true });
    listAgentSessionDirs.mockResolvedValue(["/tmp/.openclaw/agents/main/sessions"]);
    removeStateAndLinkedPaths.mockResolvedValue(undefined);
    removeWorkspaceDirs.mockResolvedValue(undefined);
    vi.spyOn(runtime, "log").mockImplementation(() => {});
    vi.spyOn(runtime, "error").mockImplementation(() => {});
  });

  it("recommends creating a backup before state-destructive reset scopes", async () => {
    await resetCommand(runtime, {
      scope: "config+creds+sessions",
      yes: true,
      nonInteractive: true,
      dryRun: true,
    });
    expect(runtime.log).toHaveBeenCalledWith(expect.stringContaining("openclaw backup create"));
  });

  it("does not recommend backup for config-only reset", async () => {
    await resetCommand(runtime, {
      scope: "config",
      yes: true,
      nonInteractive: true,
      dryRun: true,
    });
    expect(runtime.log).not.toHaveBeenCalledWith(expect.stringContaining("openclaw backup create"));
  });
});

View File

@@ -44,6 +44,10 @@ async function stopGatewayIfRunning(runtime: RuntimeEnv) {
  }
}

function logBackupRecommendation(runtime: RuntimeEnv) {
  runtime.log(`Recommended first: ${formatCliCommand("openclaw backup create")}`);
}

export async function resetCommand(runtime: RuntimeEnv, opts: ResetOptions) {
  const interactive = !opts.nonInteractive;
  if (!interactive && !opts.yes) {
@@ -110,6 +114,7 @@ export async function resetCommand(runtime: RuntimeEnv, opts: ResetOptions) {
    resolveCleanupPlanFromDisk();

  if (scope !== "config") {
    logBackupRecommendation(runtime);
    if (dryRun) {
      runtime.log("[dry-run] stop gateway service");
    } else {

View File

@@ -0,0 +1,66 @@
import { beforeEach, describe, expect, it, vi } from "vitest";
import { createNonExitingRuntime } from "../runtime.js";

const resolveCleanupPlanFromDisk = vi.fn();
const removePath = vi.fn();
const removeStateAndLinkedPaths = vi.fn();
const removeWorkspaceDirs = vi.fn();

vi.mock("../config/config.js", () => ({
  isNixMode: false,
}));
vi.mock("./cleanup-plan.js", () => ({
  resolveCleanupPlanFromDisk,
}));
vi.mock("./cleanup-utils.js", () => ({
  removePath,
  removeStateAndLinkedPaths,
  removeWorkspaceDirs,
}));

const { uninstallCommand } = await import("./uninstall.js");

describe("uninstallCommand", () => {
  const runtime = createNonExitingRuntime();

  beforeEach(() => {
    vi.clearAllMocks();
    resolveCleanupPlanFromDisk.mockReturnValue({
      stateDir: "/tmp/.openclaw",
      configPath: "/tmp/.openclaw/openclaw.json",
      oauthDir: "/tmp/.openclaw/credentials",
      configInsideState: true,
      oauthInsideState: true,
      workspaceDirs: ["/tmp/.openclaw/workspace"],
    });
    removePath.mockResolvedValue({ ok: true });
    removeStateAndLinkedPaths.mockResolvedValue(undefined);
    removeWorkspaceDirs.mockResolvedValue(undefined);
    vi.spyOn(runtime, "log").mockImplementation(() => {});
    vi.spyOn(runtime, "error").mockImplementation(() => {});
  });

  it("recommends creating a backup before removing state or workspaces", async () => {
    await uninstallCommand(runtime, {
      state: true,
      yes: true,
      nonInteractive: true,
      dryRun: true,
    });
    expect(runtime.log).toHaveBeenCalledWith(expect.stringContaining("openclaw backup create"));
  });

  it("does not recommend backup for service-only uninstall", async () => {
    await uninstallCommand(runtime, {
      service: true,
      yes: true,
      nonInteractive: true,
      dryRun: true,
    });
    expect(runtime.log).not.toHaveBeenCalledWith(expect.stringContaining("openclaw backup create"));
  });
});

View File

@@ -1,5 +1,6 @@
import path from "node:path";
import { cancel, confirm, isCancel, multiselect } from "@clack/prompts";
import { formatCliCommand } from "../cli/command-format.js";
import { isNixMode } from "../config/config.js";
import { resolveGatewayService } from "../daemon/service.js";
import type { RuntimeEnv } from "../runtime.js";
@@ -92,6 +93,10 @@ async function removeMacApp(runtime: RuntimeEnv, dryRun?: boolean) {
  });
}

function logBackupRecommendation(runtime: RuntimeEnv) {
  runtime.log(`Recommended first: ${formatCliCommand("openclaw backup create")}`);
}

export async function uninstallCommand(runtime: RuntimeEnv, opts: UninstallOptions) {
  const { scopes, hadExplicit } = buildScopeSelection(opts);
  const interactive = !opts.nonInteractive;
@@ -155,6 +160,10 @@ export async function uninstallCommand(runtime: RuntimeEnv, opts: UninstallOptions) {
  const { stateDir, configPath, oauthDir, configInsideState, oauthInsideState, workspaceDirs } =
    resolveCleanupPlanFromDisk();

  if (scopes.has("state") || scopes.has("workspace")) {
    logBackupRecommendation(runtime);
  }

  if (scopes.has("service")) {
    if (dryRun) {
      runtime.log("[dry-run] remove gateway service");