build: refresh deps and route testbox through crabbox

This commit is contained in:
Peter Steinberger
2026-05-11 03:35:00 +01:00
parent 8ccd3e9236
commit 15cf49222f
531 changed files with 1096 additions and 2487 deletions


@@ -294,18 +294,19 @@ Common Crabbox-only failures:
report the capacity blocker.
If Crabbox cannot dispatch, sync, attach, or stop but Blacksmith itself works,
use direct Blacksmith from the repo root:
first try the same command through the repo wrapper with `--debug` and
`--timing-json`:
```sh
blacksmith testbox warmup ci-check-testbox.yml --ref main --idle-timeout 90
blacksmith testbox run --id <tbx_id> "env CI=1 NODE_OPTIONS=--max-old-space-size=4096 OPENCLAW_TEST_PROJECTS_PARALLEL=6 OPENCLAW_VITEST_MAX_WORKERS=1 OPENCLAW_VITEST_NO_OUTPUT_TIMEOUT_MS=900000 pnpm test:changed"
blacksmith testbox stop --id <tbx_id>
pnpm crabbox:run -- --provider blacksmith-testbox --debug --timing-json -- \
CI=1 NODE_OPTIONS=--max-old-space-size=4096 OPENCLAW_TEST_PROJECTS_PARALLEL=6 OPENCLAW_VITEST_MAX_WORKERS=1 OPENCLAW_VITEST_NO_OUTPUT_TIMEOUT_MS=900000 pnpm test:changed
```
Direct full suite:
Full suite:
```sh
blacksmith testbox run --id <tbx_id> "env CI=1 NODE_OPTIONS=--max-old-space-size=4096 OPENCLAW_TEST_PROJECTS_PARALLEL=6 OPENCLAW_VITEST_MAX_WORKERS=1 OPENCLAW_VITEST_NO_OUTPUT_TIMEOUT_MS=900000 pnpm test"
pnpm crabbox:run -- --provider blacksmith-testbox --debug --timing-json -- \
CI=1 NODE_OPTIONS=--max-old-space-size=4096 OPENCLAW_TEST_PROJECTS_PARALLEL=6 OPENCLAW_VITEST_MAX_WORKERS=1 OPENCLAW_VITEST_NO_OUTPUT_TIMEOUT_MS=900000 pnpm test
```
Auth fallback, only when `blacksmith` says auth is missing:
@@ -340,16 +341,15 @@ The hydration workflow owns checkout, Node/pnpm setup, dependency install,
secrets, ready marker, and keepalive. Crabbox owns dispatch, sync, SSH command
execution, timing, logs/results, and cleanup.
Minimal direct Blacksmith fallback, from repo root:
Minimal Blacksmith-backed Crabbox run, from repo root:
```sh
blacksmith testbox warmup ci-check-testbox.yml --ref main --idle-timeout 90
blacksmith testbox run --id <tbx_id> "env CI=1 NODE_OPTIONS=--max-old-space-size=4096 OPENCLAW_TEST_PROJECTS_PARALLEL=6 OPENCLAW_VITEST_MAX_WORKERS=1 pnpm test:changed"
blacksmith testbox stop --id <tbx_id>
pnpm crabbox:run -- --provider blacksmith-testbox --timing-json -- \
CI=1 NODE_OPTIONS=--max-old-space-size=4096 OPENCLAW_TEST_PROJECTS_PARALLEL=6 OPENCLAW_VITEST_MAX_WORKERS=1 pnpm test:changed
```
Use direct Blacksmith only when Crabbox is the broken layer and Blacksmith
itself still works. Prefer direct `blacksmith testbox list` for cleanup
Use direct Blacksmith only when Crabbox is the broken layer and you are
isolating a Crabbox bug. Prefer direct `blacksmith testbox list` for cleanup
diagnostics, not as a reusable work queue.
Important Blacksmith footguns:


@@ -92,11 +92,11 @@ barrels, package-boundary tests, or extension suites.
- runtime capture should be quiet and config-tolerant.
- command output should include wall time, exit code, and peak RSS when
available.
4. For broad or package-heavy plugin proof, use Blacksmith Testbox by default on
maintainer machines. Warm once and reuse the same box:
- `blacksmith testbox warmup ci-check-testbox.yml --ref main --idle-timeout 90`
- `blacksmith testbox run --id <ID> "OPENCLAW_TESTBOX=1 pnpm test:extensions:batch <ids>"`
- stop the box when done.
4. For broad or package-heavy plugin proof, use Crabbox-backed Blacksmith
Testbox by default on maintainer machines:
- `pnpm crabbox:run -- --provider blacksmith-testbox --timing-json -- OPENCLAW_TESTBOX=1 pnpm test:extensions:batch <ids>`
- add `--keep`/`--id <id-or-slug>` only when several commands must share one
warmed box; stop it with `pnpm crabbox:stop -- <id-or-slug>`.
5. If plugin performance is package-artifact sensitive, switch to
`openclaw-pre-release-plugin-testing` and Package Acceptance rather than
trusting source-only timing.


@@ -36,14 +36,11 @@ Prove the touched surface first. Do not reflexively run the whole suite.
- Prefer GitHub Actions for release/Docker proof when the workflow already has the prepared image and secrets.
- Use `scripts/committer "<msg>" <paths...>` when committing; stage only your files.
- If deps are missing, run `pnpm install`, retry once, then report the first actionable error.
- For Blacksmith Testbox proof, reuse only an id warmed and claimed in this
operator session. `blacksmith testbox list` is diagnostics only; a listed id
can have a local key and still carry stale rsync state from another lane.
After warmup, run `pnpm testbox:claim --id <id>`, then prefer
`pnpm testbox:run --id <id> -- "<command>"` for OpenClaw gates so stale
org-visible ids fail fast before syncing. Claims older than 12 hours are
stale unless `OPENCLAW_TESTBOX_CLAIM_TTL_MINUTES` is explicitly set for long
work.
- For Blacksmith Testbox proof, use Crabbox first. `pnpm crabbox:run -- --provider
blacksmith-testbox --timing-json -- <command...>` warms, claims, syncs, runs,
reports, and cleans up one-shot boxes. Reuse only an id/slug created in this
operator session; `blacksmith testbox list` is diagnostics only, not a shared
work queue.
## Local Test Shortcuts


@@ -31,6 +31,8 @@ Docs: https://docs.openclaw.ai
- Gateway/skills: add an opt-in private skill archive upload install path gated by `skills.install.allowUploadedArchives`, so trusted Gateway clients can stage and install zip-backed skills only when operators explicitly enable the code-install surface. (#74430) Thanks @samzong.
- Codex app-server: enable Codex native code-mode-only for harness threads so deferred OpenClaw dynamic tools run through Codex's own searchable code execution surface instead of a PI-style wrapper.
- Dependencies: refresh workspace pins and patch targets, including ACPX `@agentclientprotocol/claude-agent-acp` `0.33.1`, Codex ACP `0.14.0`, Baileys `7.0.0-rc10`, Google GenAI `2.0.1`, OpenAI `6.37.0`, AWS SDK `3.1045.0`, Kysely `0.29.0`, Tlon skill `0.3.6`, Aimock `1.19.5`, and tsdown `0.22.0`.
- Dependencies: move embedded Pi packages to the `@earendil-works` namespace, refresh Twitch Twurple packages, and move `@openclaw/fs-safe` from the GitHub release pin to the published npm package.
- Build: route Testbox changed-check delegation through Crabbox and remove the OpenClaw-specific Blacksmith Testbox helper scripts.
- Agents/compaction: preserve scoped background exec/process session references across embedded compaction and after-turn runtime contexts without exposing sessions from unrelated scopes. Fixes #79284. (#79307) Thanks @TurboTheTurtle.
- Agents/process: tell agents to inspect background sessions with `process log` before sending interactive input and to use `waitingForInput`/`stdinWritable` hints from `log`/`poll`.
- CLI/onboarding: improve setup, onboarding, configure, and channel command wayfinding so terminal flows explain the next useful command instead of relying on terse setup labels.


@@ -99,7 +99,7 @@ enum ModelCatalogLoader {
]
for root in roots {
let candidate = root
.appendingPathComponent("node_modules/@mariozechner/pi-ai/dist/models.generated.js")
.appendingPathComponent("node_modules/@earendil-works/pi-ai/dist/models.generated.js")
if FileManager().isReadableFile(atPath: candidate.path) {
return candidate.path
}


@@ -493,13 +493,23 @@ Local changed-test routing lives in `scripts/test-projects.test-support.mjs` and
## Testbox validation
Run Testbox from the repo root and prefer a fresh warmed box for broad proof. Before spending a slow gate on a box that was reused, expired, or just reported an unexpectedly large sync, run `pnpm testbox:sanity` inside the box first.
Crabbox is the repo-owned remote-box wrapper for maintainer Linux proof. Use it
from the repo root when a check is too broad for a local edit loop, when CI
parity matters, or when the proof needs secrets, Docker, package lanes,
reusable boxes, or remote logs. The normal OpenClaw backend is
`blacksmith-testbox`; owned AWS/Hetzner capacity is a fallback for Blacksmith
outages, quota issues, or explicit owned-capacity testing.
The sanity check fails fast when required root files such as `pnpm-lock.yaml` disappeared or when `git status --short` shows at least 200 tracked deletions. That usually means the remote sync state is not a trustworthy copy of the PR; stop that box and warm a fresh one instead of debugging the product test failure. For intentional large-deletion PRs, set `OPENCLAW_TESTBOX_ALLOW_MASS_DELETIONS=1` for that sanity run.
Crabbox-backed Blacksmith runs warm, claim, sync, run, report, and clean up
one-shot Testboxes. The built-in sync sanity check fails fast when required
root files such as `pnpm-lock.yaml` disappear or when `git status --short`
shows at least 200 tracked deletions. For intentional large-deletion PRs, set
`OPENCLAW_TESTBOX_ALLOW_MASS_DELETIONS=1` for the remote command.
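As a rough illustration of the guard described above, a deletion check of this shape could be sketched in shell — variable names, the sample `git status --short` output, and the wiring are assumptions for illustration, not Crabbox's actual implementation:

```shell
# Hypothetical sketch of the mass-deletion sanity guard (names assumed).
# Count tracked deletions in `git status --short`-style output and refuse
# to trust the sync past a threshold unless the operator opts out.
status_output=' D packages/a/index.ts
 D packages/b/index.ts
 M pnpm-lock.yaml'
deletions=$(printf '%s\n' "$status_output" | grep -c '^.D ')
threshold=200
if [ "$deletions" -ge "$threshold" ] && [ "${OPENCLAW_TESTBOX_ALLOW_MASS_DELETIONS:-0}" != "1" ]; then
  echo "sanity: $deletions tracked deletions, refusing to trust this sync"
else
  echo "sanity: $deletions deletions, under threshold"
fi
```

With the sample output above, two ` D` entries are counted and the run stays under the threshold, so the box would be treated as trustworthy.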
`pnpm testbox:run` also terminates a local Blacksmith CLI invocation that stays in the sync phase for more than five minutes without post-sync output. Set `OPENCLAW_TESTBOX_SYNC_TIMEOUT_MS=0` to disable that guard, or use a larger millisecond value for unusually large local diffs.
Crabbox is the repo-owned remote-box wrapper for maintainer Linux proof. Use it when a check is too broad for a local edit loop, when CI parity matters, or when the proof needs secrets, Docker, package lanes, reusable boxes, or remote logs. The normal OpenClaw backend is `blacksmith-testbox`; owned AWS/Hetzner capacity is a fallback for Blacksmith outages, quota issues, or explicit owned-capacity testing.
Crabbox also terminates a local Blacksmith CLI invocation that stays in the
sync phase for more than five minutes without post-sync output. Set
`CRABBOX_BLACKSMITH_SYNC_TIMEOUT_MS=0` to disable that guard, or use a larger
millisecond value for unusually large local diffs.
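The watchdog behavior described above can be approximated with coreutils `timeout` — this is a hedged sketch of the idea, not the Crabbox code; the variable handling and messages are assumptions:

```shell
# Hypothetical sketch of a sync-phase watchdog (not Crabbox's implementation).
# A value of 0 disables the guard; otherwise kill the sync command when it
# produces no completion within the timeout.
sync_timeout_ms="${CRABBOX_BLACKSMITH_SYNC_TIMEOUT_MS:-300000}"
if [ "$sync_timeout_ms" -eq 0 ]; then
  result="sync watchdog disabled"
else
  result=$(timeout "$((sync_timeout_ms / 1000))" sh -c 'echo "sync: done"') \
    || result="sync watchdog fired after ${sync_timeout_ms}ms"
fi
echo "$result"
```

Here the inner command finishes immediately, so the watchdog never fires; a real sync that hangs past the deadline would be killed by `timeout` and reported instead.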
Before a first run, check the wrapper from the repo root:
@@ -569,13 +579,9 @@ pnpm crabbox:run -- --provider blacksmith-testbox --id <tbx_id> --no-sync --timi
pnpm crabbox:stop -- <tbx_id>
```
If Crabbox is the broken layer but Blacksmith itself works, use direct Blacksmith as a narrow fallback:
```bash
blacksmith testbox warmup ci-check-testbox.yml --ref main --idle-timeout 90
blacksmith testbox run --id <tbx_id> "env CI=1 NODE_OPTIONS=--max-old-space-size=4096 OPENCLAW_TEST_PROJECTS_PARALLEL=6 OPENCLAW_VITEST_MAX_WORKERS=1 OPENCLAW_VITEST_NO_OUTPUT_TIMEOUT_MS=900000 pnpm check:changed"
blacksmith testbox stop --id <tbx_id>
```
If Crabbox is the broken layer but Blacksmith itself works, use direct
Blacksmith only for diagnostics such as `list`, `status`, and cleanup. Fix the
Crabbox path before treating a direct Blacksmith run as maintainer proof.
If `blacksmith testbox list --all` and `blacksmith testbox status` work but new
warmups sit `queued` with no IP or Actions run URL after a couple of minutes,


@@ -105,7 +105,7 @@ Claude login on the host, onboarding/configure can reuse it directly.
## OAuth exchange (how login works)
OpenClaw's interactive login flows are implemented in `@mariozechner/pi-ai` and wired into the wizards/commands.
OpenClaw's interactive login flows are implemented in `@earendil-works/pi-ai` and wired into the wizards/commands.
### Anthropic setup-token


@@ -232,7 +232,7 @@ That stages grounded durable candidates into the short-term dreaming store while
</Accordion>
<Accordion title="2b. OpenCode provider overrides">
If you've added `models.providers.opencode`, `opencode-zen`, or `opencode-go` manually, it overrides the built-in OpenCode catalog from `@mariozechner/pi-ai`. That can force models onto the wrong API or zero out costs. Doctor warns so you can remove the override and restore per-model API routing + costs.
If you've added `models.providers.opencode`, `opencode-zen`, or `opencode-go` manually, it overrides the built-in OpenCode catalog from `@earendil-works/pi-ai`. That can force models onto the wrong API or zero out costs. Doctor warns so you can remove the override and restore per-model API routing + costs.
</Accordion>
<Accordion title="2c. Browser migration and Chrome MCP readiness">
If your browser config still points at the removed Chrome extension path, doctor normalizes it to the current host-local Chrome MCP attach model:


@@ -23,10 +23,10 @@ OpenClaw uses the pi SDK to embed an AI coding agent into its messaging gateway
```json
{
"@mariozechner/pi-agent-core": "0.73.0",
"@mariozechner/pi-ai": "0.73.0",
"@mariozechner/pi-coding-agent": "0.73.0",
"@mariozechner/pi-tui": "0.73.0"
"@earendil-works/pi-agent-core": "0.74.0",
"@earendil-works/pi-ai": "0.74.0",
"@earendil-works/pi-coding-agent": "0.74.0",
"@earendil-works/pi-tui": "0.74.0"
}
```
@@ -171,7 +171,7 @@ import {
DefaultResourceLoader,
SessionManager,
SettingsManager,
} from "@mariozechner/pi-coding-agent";
} from "@earendil-works/pi-coding-agent";
const resourceLoader = new DefaultResourceLoader({
cwd: resolvedWorkspace,
@@ -518,7 +518,7 @@ OpenClaw also has a local TUI mode that uses pi-tui components directly:
```typescript
// src/tui/tui.ts
import { ... } from "@mariozechner/pi-tui";
import { ... } from "@earendil-works/pi-tui";
```
This provides the interactive terminal experience similar to pi's native mode.


@@ -69,16 +69,14 @@ not publish the inspector binary from the main `openclaw` package.
### Maintainer acceptance lane
Use Blacksmith Testbox for the installable-package acceptance lane when validating
the external inspector against OpenClaw plugin packages. Run it from a clean
OpenClaw checkout after the package is built:
Use Crabbox-backed Blacksmith Testbox for the installable-package acceptance
lane when validating the external inspector against OpenClaw plugin packages.
Run it from a clean OpenClaw checkout after the package is built:
```sh
blacksmith testbox warmup ci-check-testbox.yml --ref main --idle-timeout 90
blacksmith testbox run --id <tbx_id> "pnpm install && pnpm build && npm exec --yes @openclaw/plugin-inspector@0.1.0 -- ./extensions/telegram --json"
blacksmith testbox run --id <tbx_id> "npm exec --yes @openclaw/plugin-inspector@0.1.0 -- ./extensions/discord --json"
blacksmith testbox run --id <tbx_id> "npm exec --yes @openclaw/plugin-inspector@0.1.0 -- <clawhub-plugin-dir> --json"
blacksmith testbox stop <tbx_id>
pnpm crabbox:run -- --provider blacksmith-testbox --timing-json --shell -- "pnpm install && pnpm build && npm exec --yes @openclaw/plugin-inspector@0.1.0 -- ./extensions/telegram --json"
pnpm crabbox:run -- --provider blacksmith-testbox --timing-json --shell -- "npm exec --yes @openclaw/plugin-inspector@0.1.0 -- ./extensions/discord --json"
pnpm crabbox:run -- --provider blacksmith-testbox --timing-json --shell -- "npm exec --yes @openclaw/plugin-inspector@0.1.0 -- <clawhub-plugin-dir> --json"
```
Keep this lane opt-in for maintainers because it installs an external npm


@@ -200,7 +200,7 @@ The store is safe to edit, but the Gateway is the authority: it may rewrite or r
## Transcript structure (`*.jsonl`)
Transcripts are managed by `@mariozechner/pi-coding-agent`'s `SessionManager`.
Transcripts are managed by `@earendil-works/pi-coding-agent`'s `SessionManager`.
The file is JSONL:


@@ -1,4 +1,4 @@
import type { Api, Model } from "@mariozechner/pi-ai";
import type { Api, Model } from "@earendil-works/pi-ai";
import { describe, expect, it, vi } from "vitest";
import {
createMantleAnthropicStreamFn,


@@ -1,7 +1,7 @@
import Anthropic from "@anthropic-ai/sdk";
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { Api, Model, SimpleStreamOptions } from "@mariozechner/pi-ai";
import { streamAnthropic } from "@mariozechner/pi-ai/anthropic";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import type { Api, Model, SimpleStreamOptions } from "@earendil-works/pi-ai";
import { streamAnthropic } from "@earendil-works/pi-ai/anthropic";
const MANTLE_ANTHROPIC_BETA = "fine-grained-tool-streaming-2025-05-14";
type AnthropicOptions = ConstructorParameters<typeof Anthropic>[0];


@@ -7,7 +7,7 @@
"dependencies": {
"@anthropic-ai/sdk": "0.95.1",
"@aws/bedrock-token-generator": "^1.1.0",
"@mariozechner/pi-ai": "0.73.1"
"@earendil-works/pi-ai": "0.74.0"
},
"devDependencies": {
"@openclaw/plugin-sdk": "workspace:*"


@@ -8,7 +8,7 @@
"@aws-sdk/client-bedrock": "3.1045.0",
"@aws-sdk/client-bedrock-runtime": "3.1045.0",
"@aws-sdk/credential-provider-node": "3.972.39",
"@mariozechner/pi-ai": "0.73.1",
"@earendil-works/pi-ai": "0.74.0",
"@smithy/shared-ini-file-loader": "4.4.9"
},
"devDependencies": {


@@ -1,5 +1,5 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import { streamSimple } from "@mariozechner/pi-ai";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import { streamSimple } from "@earendil-works/pi-ai";
import type { OpenClawConfig } from "openclaw/plugin-sdk/config-contracts";
import { resolvePluginConfigObject } from "openclaw/plugin-sdk/plugin-config-runtime";
import type { OpenClawPluginApi } from "openclaw/plugin-sdk/plugin-entry";
@@ -122,7 +122,7 @@ function createGuardrailWrapStreamFn(
/**
* Mirrors the shipped pi-ai Bedrock `supportsPromptCaching` matcher.
* Keep this in sync with node_modules/@mariozechner/pi-ai/dist/providers/amazon-bedrock.js.
* Keep this in sync with node_modules/@earendil-works/pi-ai/dist/providers/amazon-bedrock.js.
*/
function matchesPiAiPromptCachingModelId(modelId: string): boolean {
const id = modelId.toLowerCase();


@@ -1,4 +1,4 @@
import { createAssistantMessageEventStream, type Model } from "@mariozechner/pi-ai";
import { createAssistantMessageEventStream, type Model } from "@earendil-works/pi-ai";
import { beforeAll, describe, expect, it, vi } from "vitest";
import type { AnthropicVertexStreamDeps } from "./stream-runtime.js";


@@ -1,4 +1,4 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import type { AnthropicVertexStreamDeps } from "./stream-runtime.js";
export {


@@ -6,8 +6,8 @@
"type": "module",
"dependencies": {
"@anthropic-ai/vertex-sdk": "^0.16.0",
"@mariozechner/pi-agent-core": "0.73.1",
"@mariozechner/pi-ai": "0.73.1"
"@earendil-works/pi-agent-core": "0.74.0",
"@earendil-works/pi-ai": "0.74.0"
},
"devDependencies": {
"@openclaw/plugin-sdk": "workspace:*"


@@ -1,4 +1,4 @@
import { createAssistantMessageEventStream, type Model } from "@mariozechner/pi-ai";
import { createAssistantMessageEventStream, type Model } from "@earendil-works/pi-ai";
import { beforeAll, describe, expect, it, vi } from "vitest";
import type { AnthropicVertexStreamDeps } from "./stream-runtime.js";


@@ -1,10 +1,10 @@
import { AnthropicVertex as AnthropicVertexSdk } from "@anthropic-ai/vertex-sdk";
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import {
streamAnthropic as streamAnthropicDefault,
type AnthropicOptions,
type Model,
} from "@mariozechner/pi-ai";
} from "@earendil-works/pi-ai";
import {
applyAnthropicPayloadPolicyToParams,
resolveAnthropicPayloadPolicy,


@@ -5,7 +5,7 @@
"description": "OpenClaw Anthropic provider plugin",
"type": "module",
"dependencies": {
"@mariozechner/pi-ai": "0.73.1"
"@earendil-works/pi-ai": "0.74.0"
},
"devDependencies": {
"@openclaw/plugin-sdk": "workspace:*"


@@ -1,4 +1,4 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import { afterEach, describe, expect, it, vi } from "vitest";
import {
__testing,


@@ -1,5 +1,5 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import { streamSimple } from "@mariozechner/pi-ai";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import { streamSimple } from "@earendil-works/pi-ai";
import type { ProviderWrapStreamFnContext } from "openclaw/plugin-sdk/plugin-entry";
import {
applyAnthropicPayloadPolicyToParams,


@@ -1,4 +1,4 @@
import type { AgentToolResult } from "@mariozechner/pi-agent-core";
import type { AgentToolResult } from "@earendil-works/pi-agent-core";
import {
DEFAULT_AI_SNAPSHOT_MAX_CHARS,
browserAct,


@@ -1,4 +1,4 @@
import { completeSimple, type Model } from "@mariozechner/pi-ai";
import { completeSimple, type Model } from "@earendil-works/pi-ai";
import {
createSingleUserPromptMessage,
extractNonEmptyAssistantText,


@@ -1,5 +1,5 @@
import { randomBytes } from "node:crypto";
import type { OAuthCredentials } from "@mariozechner/pi-ai";
import type { OAuthCredentials } from "@earendil-works/pi-ai";
import { generatePkceVerifierChallenge, toFormUrlEncoded } from "openclaw/plugin-sdk/provider-auth";
import {
parseOAuthCallbackInput,


@@ -1,4 +1,4 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import { capturePluginRegistration } from "openclaw/plugin-sdk/plugin-test-runtime";
import { describe, expect, it } from "vitest";
import plugin from "./index.js";


@@ -1,4 +1,4 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import { afterAll, beforeEach, describe, expect, it, vi } from "vitest";
import {
__testing,


@@ -1,4 +1,4 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import type { ProviderWrapStreamFnContext } from "openclaw/plugin-sdk/plugin-entry";
import { createAnthropicThinkingPrefillPayloadWrapper } from "openclaw/plugin-sdk/provider-stream-shared";
import { createSubsystemLogger } from "openclaw/plugin-sdk/runtime-env";


@@ -8,7 +8,7 @@
},
"type": "module",
"dependencies": {
"@mariozechner/pi-coding-agent": "0.73.1",
"@earendil-works/pi-coding-agent": "0.74.0",
"@openai/codex": "0.130.0",
"ajv": "^8.20.0",
"ws": "^8.20.0",


@@ -38,7 +38,7 @@ const providerRuntimeMocks = vi.hoisted(() => ({
),
}));
vi.mock("@mariozechner/pi-ai/oauth", () => ({
vi.mock("@earendil-works/pi-ai/oauth", () => ({
getOAuthApiKey: vi.fn(),
getOAuthProviders: () => [],
loginOpenAICodex: vi.fn(),


@@ -1,4 +1,4 @@
import type { AgentMessage } from "@mariozechner/pi-agent-core";
import type { AgentMessage } from "@earendil-works/pi-agent-core";
import { describe, expect, it } from "vitest";
import { projectContextEngineAssemblyForCodex } from "./context-engine-projection.js";


@@ -1,7 +1,7 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { SessionManager } from "@mariozechner/pi-coding-agent";
import { SessionManager } from "@earendil-works/pi-coding-agent";
import type { EmbeddedRunAttemptParams } from "openclaw/plugin-sdk/agent-harness";
import { DELIVERY_NO_REPLY_RUNTIME_CONTRACT } from "openclaw/plugin-sdk/agent-runtime-test-contracts";
import { isSilentReplyPayloadText } from "openclaw/plugin-sdk/reply-chunking";


@@ -1,4 +1,4 @@
import type { AgentToolResult } from "@mariozechner/pi-agent-core";
import type { AgentToolResult } from "@earendil-works/pi-agent-core";
import type { AnyAgentTool } from "openclaw/plugin-sdk/agent-harness";
import {
HEARTBEAT_RESPONSE_TOOL_NAME,


@@ -1,5 +1,5 @@
import type { AgentToolResult } from "@mariozechner/pi-agent-core";
import type { ImageContent, TextContent } from "@mariozechner/pi-ai";
import type { AgentToolResult } from "@earendil-works/pi-agent-core";
import type { ImageContent, TextContent } from "@earendil-works/pi-ai";
import {
createAgentToolResultMiddlewareRunner,
createCodexAppServerToolResultExtensionRunner,


@@ -1,7 +1,7 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { SessionManager } from "@mariozechner/pi-coding-agent";
import { SessionManager } from "@earendil-works/pi-coding-agent";
import type { EmbeddedRunAttemptParams } from "openclaw/plugin-sdk/agent-harness";
import { resetAgentEventsForTest } from "openclaw/plugin-sdk/agent-harness-runtime";
import {


@@ -1,4 +1,4 @@
import type { AssistantMessage, Usage } from "@mariozechner/pi-ai";
import type { AssistantMessage, Usage } from "@earendil-works/pi-ai";
import {
classifyAgentHarnessTerminalOutcome,
embeddedAgentLog,


@@ -1,7 +1,7 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { SessionManager } from "@mariozechner/pi-coding-agent";
import { SessionManager } from "@earendil-works/pi-coding-agent";
import type { EmbeddedRunAttemptParams } from "openclaw/plugin-sdk/agent-harness";
import { classifyEmbeddedPiRunResultForModelFallback } from "openclaw/plugin-sdk/agent-harness-runtime";
import {


@@ -1,8 +1,8 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import type { AgentMessage } from "@mariozechner/pi-agent-core";
import { SessionManager } from "@mariozechner/pi-coding-agent";
import type { AgentMessage } from "@earendil-works/pi-agent-core";
import { SessionManager } from "@earendil-works/pi-coding-agent";
import type { EmbeddedRunAttemptParams } from "openclaw/plugin-sdk/agent-harness";
import {
embeddedAgentLog,


@@ -1,7 +1,7 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { SessionManager } from "@mariozechner/pi-coding-agent";
import { SessionManager } from "@earendil-works/pi-coding-agent";
import {
abortAgentHarnessRun,
embeddedAgentLog,


@@ -1,10 +1,10 @@
import fs from "node:fs/promises";
import type { SessionEntry } from "@mariozechner/pi-coding-agent";
import type { SessionEntry } from "@earendil-works/pi-coding-agent";
import {
buildSessionContext,
migrateSessionEntries,
parseSessionEntries,
} from "@mariozechner/pi-coding-agent";
} from "@earendil-works/pi-coding-agent";
import type { AgentMessage } from "openclaw/plugin-sdk/agent-harness-runtime";
function isMissingFileError(error: unknown): boolean {


@@ -1,6 +1,6 @@
import { EventEmitter } from "node:events";
import { PassThrough, Writable } from "node:stream";
import type { Api, Model } from "@mariozechner/pi-ai";
import type { Api, Model } from "@earendil-works/pi-ai";
import { vi } from "vitest";
import { CodexAppServerClient } from "./client.js";


@@ -12,7 +12,7 @@ describe("codex package manifest", () => {
fs.readFileSync(new URL("../package.json", import.meta.url), "utf8"),
) as CodexPackageManifest;
expect(packageJson.dependencies).toHaveProperty("@mariozechner/pi-coding-agent");
expect(packageJson.dependencies).toHaveProperty("@earendil-works/pi-coding-agent");
expect(packageJson.dependencies?.["@openai/codex"]).toBe(
MANAGED_CODEX_APP_SERVER_PACKAGE_VERSION,
);


@@ -5,7 +5,7 @@ import {
type AssistantMessage,
type Context,
type Model,
} from "@mariozechner/pi-ai";
} from "@earendil-works/pi-ai";
import {
createSingleUserPromptMessage,
extractNonEmptyAssistantText,


@@ -1,5 +1,5 @@
import type { Context, Model } from "@mariozechner/pi-ai";
import { createAssistantMessageEventStream } from "@mariozechner/pi-ai";
import type { Context, Model } from "@earendil-works/pi-ai";
import { createAssistantMessageEventStream } from "@earendil-works/pi-ai";
import {
registerSingleProviderPlugin,
resolveProviderPluginChoice,


@@ -1,4 +1,4 @@
import type { AgentToolResult } from "@mariozechner/pi-agent-core";
import type { AgentToolResult } from "@earendil-works/pi-agent-core";
import {
readNumberParam,
readStringArrayParam,


@@ -1,4 +1,4 @@
import type { AgentToolResult } from "@mariozechner/pi-agent-core";
import type { AgentToolResult } from "@earendil-works/pi-agent-core";
import {
readNumberParam,
readStringArrayParam,


@@ -1,4 +1,4 @@
import type { AgentToolResult } from "@mariozechner/pi-agent-core";
import type { AgentToolResult } from "@earendil-works/pi-agent-core";
import { resolveDefaultDiscordAccountId } from "../accounts.js";
import { getPresence } from "../monitor/presence-cache.js";
import {


@@ -1,4 +1,4 @@
import type { AgentToolResult } from "@mariozechner/pi-agent-core";
import type { AgentToolResult } from "@earendil-works/pi-agent-core";
import type { ActionGate, DiscordActionConfig, OpenClawConfig } from "../runtime-api.js";
import { handleDiscordMessageManagementAction } from "./runtime.messaging.messages.js";
import { handleDiscordReactionMessagingAction } from "./runtime.messaging.reactions.js";


@@ -1,4 +1,4 @@
import type { AgentToolResult } from "@mariozechner/pi-agent-core";
import type { AgentToolResult } from "@earendil-works/pi-agent-core";
import {
type ActionGate,
jsonResult,


@@ -1,4 +1,4 @@
import type { AgentToolResult } from "@mariozechner/pi-agent-core";
import type { AgentToolResult } from "@earendil-works/pi-agent-core";
import { normalizeLowercaseStringOrEmpty } from "openclaw/plugin-sdk/string-coerce-runtime";
import type { Activity, UpdatePresenceData } from "../internal/gateway.js";
import { getGateway } from "../monitor/gateway-registry.js";


@@ -1,4 +1,4 @@
import type { AgentToolResult } from "@mariozechner/pi-agent-core";
import type { AgentToolResult } from "@earendil-works/pi-agent-core";
import { createDiscordActionGate } from "../accounts.js";
import { readStringParam, type OpenClawConfig } from "../runtime-api.js";
import { handleDiscordGuildAction } from "./runtime.guild.js";


@@ -5,7 +5,7 @@
"description": "OpenClaw Fireworks provider plugin",
"type": "module",
"dependencies": {
"@mariozechner/pi-ai": "0.73.1"
"@earendil-works/pi-ai": "0.74.0"
},
"devDependencies": {
"@openclaw/plugin-sdk": "workspace:*"


@@ -1,5 +1,5 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { Context, Model } from "@mariozechner/pi-ai";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import type { Context, Model } from "@earendil-works/pi-ai";
import { describe, expect, it } from "vitest";
import {
createFireworksKimiThinkingDisabledWrapper,


@@ -1,5 +1,5 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import { streamSimple } from "@mariozechner/pi-ai";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import { streamSimple } from "@earendil-works/pi-ai";
import type { ProviderWrapStreamFnContext } from "openclaw/plugin-sdk/plugin-entry";
import { normalizeProviderId } from "openclaw/plugin-sdk/provider-model-shared";
import { streamWithPayloadPatch } from "openclaw/plugin-sdk/provider-stream-shared";


@@ -1,4 +1,4 @@
import { streamOpenAIResponses, type AssistantMessage, type Model } from "@mariozechner/pi-ai";
import { streamOpenAIResponses, type AssistantMessage, type Model } from "@earendil-works/pi-ai";
import { describe, expect, it } from "vitest";
import { resolveFirstGithubToken } from "./auth.js";
import { buildCopilotDynamicHeaders } from "./stream.js";


@@ -3,9 +3,9 @@ import { afterAll, beforeEach, describe, expect, it, vi } from "vitest";
import { buildCopilotModelDefinition, getDefaultCopilotModelIds } from "./models-defaults.js";
import { fetchCopilotUsage } from "./usage.js";
vi.mock("@mariozechner/pi-ai/oauth", async () => {
const actual = await vi.importActual<typeof import("@mariozechner/pi-ai/oauth")>(
"@mariozechner/pi-ai/oauth",
vi.mock("@earendil-works/pi-ai/oauth", async () => {
const actual = await vi.importActual<typeof import("@earendil-works/pi-ai/oauth")>(
"@earendil-works/pi-ai/oauth",
);
return {
...actual,
@@ -41,7 +41,7 @@ import type { ProviderResolveDynamicModelContext } from "openclaw/plugin-sdk/cor
import { fetchCopilotModelCatalog, resolveCopilotForwardCompatModel } from "./models.js";
afterAll(() => {
vi.doUnmock("@mariozechner/pi-ai/oauth");
vi.doUnmock("@earendil-works/pi-ai/oauth");
vi.doUnmock("openclaw/plugin-sdk/provider-model-shared");
vi.doUnmock("openclaw/plugin-sdk/json-store");
vi.doUnmock("openclaw/plugin-sdk/state-paths");

@@ -8,7 +8,7 @@
"@clack/prompts": "^1.3.0"
},
"devDependencies": {
"@mariozechner/pi-ai": "0.73.1",
"@earendil-works/pi-ai": "0.74.0",
"@openclaw/plugin-sdk": "workspace:*"
},
"openclaw": {

@@ -1,5 +1,5 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { Context } from "@mariozechner/pi-ai";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import type { Context } from "@earendil-works/pi-ai";
import type { ProviderWrapStreamFnContext } from "openclaw/plugin-sdk/plugin-entry";
import { buildCopilotIdeHeaders, COPILOT_INTEGRATION_ID } from "openclaw/plugin-sdk/provider-auth";
import {

@@ -1,4 +1,4 @@
import type { Model } from "@mariozechner/pi-ai";
import type { Model } from "@earendil-works/pi-ai";
import { expect } from "vitest";
function makeZeroUsageSnapshot() {

@@ -1,9 +1,9 @@
import type { Context, Tool } from "@mariozechner/pi-ai";
import type { Context, Tool } from "@earendil-works/pi-ai";
import { describe, expect, it } from "vitest";
import {
convertMessages,
convertTools,
} from "../../node_modules/@mariozechner/pi-ai/dist/providers/google-shared.js";
} from "../../node_modules/@earendil-works/pi-ai/dist/providers/google-shared.js";
import {
asRecord,
expectConvertedRoles,

@@ -1,4 +1,4 @@
import type { Context, Model } from "@mariozechner/pi-ai";
import type { Context, Model } from "@earendil-works/pi-ai";
import type {
ProviderReplaySessionEntry,
ProviderSanitizeReplayHistoryContext,

@@ -6,7 +6,7 @@
"type": "module",
"dependencies": {
"@google/genai": "^2.0.1",
"@mariozechner/pi-ai": "0.73.1"
"@earendil-works/pi-ai": "0.74.0"
},
"devDependencies": {
"@openclaw/plugin-sdk": "workspace:*"

@@ -1,7 +1,7 @@
import { mkdir, mkdtemp, writeFile } from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import type { Model } from "@mariozechner/pi-ai";
import type { Model } from "@earendil-works/pi-ai";
import { afterAll, afterEach, beforeAll, beforeEach, describe, expect, it, vi } from "vitest";
const { buildGuardedModelFetchMock, guardedFetchMock } = vi.hoisted(() => ({

@@ -1,4 +1,4 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import {
calculateCost,
getEnvApiKey,
@@ -6,7 +6,7 @@ import {
type Model,
type SimpleStreamOptions,
type ThinkingLevel,
} from "@mariozechner/pi-ai";
} from "@earendil-works/pi-ai";
import { createProviderHttpError } from "openclaw/plugin-sdk/provider-http";
import {
buildGuardedModelFetch,

@@ -1,5 +1,5 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { Context, Model } from "@mariozechner/pi-ai";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import type { Context, Model } from "@earendil-works/pi-ai";
import { registerSingleProviderPlugin } from "openclaw/plugin-sdk/plugin-test-runtime";
import { expectPassthroughReplayPolicy } from "openclaw/plugin-sdk/provider-test-contracts";
import { describe, expect, it } from "vitest";

@@ -5,7 +5,7 @@
"description": "OpenClaw Kimi provider plugin",
"type": "module",
"dependencies": {
"@mariozechner/pi-ai": "0.73.1"
"@earendil-works/pi-ai": "0.74.0"
},
"devDependencies": {
"@openclaw/plugin-sdk": "workspace:*"

@@ -1,5 +1,5 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { Context, Model } from "@mariozechner/pi-ai";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import type { Context, Model } from "@earendil-works/pi-ai";
import { describe, expect, it } from "vitest";
import {
createKimiThinkingWrapper,

@@ -1,5 +1,5 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import { streamSimple } from "@mariozechner/pi-ai";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import { streamSimple } from "@earendil-works/pi-ai";
import type { ProviderWrapStreamFnContext } from "openclaw/plugin-sdk/plugin-entry";
import { streamWithPayloadPatch } from "openclaw/plugin-sdk/provider-stream-shared";
import { normalizeOptionalLowercaseString } from "openclaw/plugin-sdk/string-coerce-runtime";

@@ -5,7 +5,7 @@
"description": "OpenClaw LM Studio provider plugin",
"type": "module",
"dependencies": {
"@mariozechner/pi-ai": "0.73.1"
"@earendil-works/pi-ai": "0.74.0"
},
"openclaw": {
"extensions": [

@@ -1,5 +1,5 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import { createAssistantMessageEventStream } from "@mariozechner/pi-ai";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import { createAssistantMessageEventStream } from "@earendil-works/pi-ai";
import { afterAll, afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import { __resetLmstudioPreloadCooldownForTest, wrapLmstudioInferencePreload } from "./stream.js";

@@ -1,5 +1,5 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import { createAssistantMessageEventStream, streamSimple } from "@mariozechner/pi-ai";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import { createAssistantMessageEventStream, streamSimple } from "@earendil-works/pi-ai";
import { createSubsystemLogger } from "openclaw/plugin-sdk/logging-core";
import type { ProviderWrapStreamFnContext } from "openclaw/plugin-sdk/plugin-entry";
import { ssrfPolicyFromHttpBaseUrlAllowedHostname } from "openclaw/plugin-sdk/ssrf-runtime";

@@ -1,4 +1,4 @@
import type { AgentToolResult } from "@mariozechner/pi-agent-core";
import type { AgentToolResult } from "@earendil-works/pi-agent-core";
import { normalizeOptionalLowercaseString } from "openclaw/plugin-sdk/string-coerce-runtime";
import { resolveMatrixAccountConfig } from "./matrix/accounts.js";
import {

@@ -1,5 +1,5 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { Context, Model } from "@mariozechner/pi-ai";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import type { Context, Model } from "@earendil-works/pi-ai";
import {
registerProviderPlugin,
requireRegisteredProvider,

@@ -1,5 +1,5 @@
import fs from "node:fs";
import type { Context, Model } from "@mariozechner/pi-ai";
import type { Context, Model } from "@earendil-works/pi-ai";
import { registerSingleProviderPlugin } from "openclaw/plugin-sdk/plugin-test-runtime";
import { createCapturedThinkingConfigStream } from "openclaw/plugin-sdk/provider-test-contracts";
import { describe, expect, it } from "vitest";

@@ -5,7 +5,7 @@
"description": "OpenClaw Ollama provider plugin",
"type": "module",
"dependencies": {
"@mariozechner/pi-ai": "0.73.1",
"@earendil-works/pi-ai": "0.74.0",
"typebox": "1.1.38"
},
"devDependencies": {

@@ -1,5 +1,5 @@
import { randomUUID } from "node:crypto";
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import type {
AssistantMessage,
StopReason,
@@ -8,8 +8,8 @@ import type {
ToolCall,
Tool,
Usage,
} from "@mariozechner/pi-ai";
import { createAssistantMessageEventStream, streamSimple } from "@mariozechner/pi-ai";
} from "@earendil-works/pi-ai";
import { createAssistantMessageEventStream, streamSimple } from "@earendil-works/pi-ai";
import { formatErrorMessage } from "openclaw/plugin-sdk/error-runtime";
import type {
OpenClawConfig,

@@ -32,7 +32,7 @@ vi.mock("openclaw/plugin-sdk/runtime-env", async () => {
};
});
vi.mock("@mariozechner/pi-ai/oauth", () => ({
vi.mock("@earendil-works/pi-ai/oauth", () => ({
getOAuthApiKey: vi.fn(),
getOAuthProviders: () => [],
loginOpenAICodex: vi.fn(),

@@ -1,5 +1,5 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import { streamSimple } from "@mariozechner/pi-ai";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import { streamSimple } from "@earendil-works/pi-ai";
import type { OpenClawConfig } from "openclaw/plugin-sdk/config-contracts";
import { normalizeProviderId } from "openclaw/plugin-sdk/provider-model-shared";
import { streamWithPayloadPatch } from "openclaw/plugin-sdk/provider-stream-shared";

@@ -1,5 +1,5 @@
import path from "node:path";
import { loginOpenAICodex, type OAuthCredentials } from "@mariozechner/pi-ai/oauth";
import { loginOpenAICodex, type OAuthCredentials } from "@earendil-works/pi-ai/oauth";
import { formatErrorMessage } from "openclaw/plugin-sdk/error-runtime";
import type { ProviderAuthContext } from "openclaw/plugin-sdk/plugin-entry";
import { ensureGlobalUndiciEnvProxyDispatcher } from "openclaw/plugin-sdk/runtime-env";

@@ -1,7 +1,7 @@
import {
getOAuthApiKey as getOAuthApiKeyFromPi,
refreshOpenAICodexToken as refreshOpenAICodexTokenFromPi,
} from "@mariozechner/pi-ai/oauth";
} from "@earendil-works/pi-ai/oauth";
import { ensureGlobalUndiciEnvProxyDispatcher } from "openclaw/plugin-sdk/runtime-env";
type OpenAICodexProviderRuntimeDeps = {

@@ -1,4 +1,4 @@
import { getModel, type Api, type Model } from "@mariozechner/pi-ai";
import { getModel, type Api, type Model } from "@earendil-works/pi-ai";
import OpenAI from "openai";
import type { ProviderRuntimeModel } from "openclaw/plugin-sdk/plugin-entry";
import { describe, expect, it } from "vitest";

@@ -1,5 +1,5 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { Context, Model, SimpleStreamOptions } from "@mariozechner/pi-ai";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import type { Context, Model, SimpleStreamOptions } from "@earendil-works/pi-ai";
import { beforeEach, describe, expect, it, vi } from "vitest";
import { buildOpenAICodexProviderPlugin } from "./openai-codex-provider.js";
import { buildOpenAIProvider } from "./openai-provider.js";

@@ -1,8 +1,8 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { getModel, type Api, type Model } from "@mariozechner/pi-ai";
import { AuthStorage, ModelRegistry } from "@mariozechner/pi-coding-agent";
import { getModel, type Api, type Model } from "@earendil-works/pi-ai";
import { AuthStorage, ModelRegistry } from "@earendil-works/pi-coding-agent";
import OpenAI from "openai";
import type { ResolvedTtsConfig } from "openclaw/plugin-sdk/agent-runtime";
import type { OpenClawConfig } from "openclaw/plugin-sdk/config-contracts";

@@ -68,7 +68,7 @@ function providerWizardByKey() {
describe("OpenAI plugin manifest", () => {
it("keeps runtime dependencies in the package manifest", () => {
expect(packageJson.dependencies?.["@mariozechner/pi-ai"]).toBe("0.73.1");
expect(packageJson.dependencies?.["@earendil-works/pi-ai"]).toBe("0.74.0");
expect(packageJson.dependencies?.ws).toBe("^8.20.0");
});

@@ -5,7 +5,7 @@
"description": "OpenClaw OpenAI provider plugins",
"type": "module",
"dependencies": {
"@mariozechner/pi-ai": "0.73.1",
"@earendil-works/pi-ai": "0.74.0",
"ws": "^8.20.0"
},
"devDependencies": {

@@ -1,4 +1,4 @@
import { getModels } from "@mariozechner/pi-ai";
import { getModels } from "@earendil-works/pi-ai";
import {
registerProviderPlugin,
registerSingleProviderPlugin,

@@ -1,4 +1,4 @@
import type { ProviderStreamOptions } from "@mariozechner/pi-ai";
import type { ProviderStreamOptions } from "@earendil-works/pi-ai";
import {
describeImageWithModelPayloadTransform,
describeImagesWithModelPayloadTransform,

@@ -196,7 +196,7 @@ describe("openrouter provider hooks", () => {
it("injects provider routing into compat before applying stream wrappers", async () => {
const provider = await registerSingleProviderPlugin(openrouterPlugin);
const baseStreamFn = vi.fn(
(..._args: Parameters<import("@mariozechner/pi-agent-core").StreamFn>) =>
(..._args: Parameters<import("@earendil-works/pi-agent-core").StreamFn>) =>
({ async *[Symbol.asyncIterator]() {} }) as never,
);
@@ -235,8 +235,8 @@ describe("openrouter provider hooks", () => {
let capturedPayload: Record<string, unknown> | undefined;
const baseStreamFn = vi.fn(
(
...args: Parameters<import("@mariozechner/pi-agent-core").StreamFn>
): ReturnType<import("@mariozechner/pi-agent-core").StreamFn> => {
...args: Parameters<import("@earendil-works/pi-agent-core").StreamFn>
): ReturnType<import("@earendil-works/pi-agent-core").StreamFn> => {
void args[2]?.onPayload?.({}, args[0]);
return { async *[Symbol.asyncIterator]() {} } as never;
},
@@ -274,8 +274,8 @@ describe("openrouter provider hooks", () => {
let capturedPayload: Record<string, unknown> | undefined;
const baseStreamFn = vi.fn(
(
...args: Parameters<import("@mariozechner/pi-agent-core").StreamFn>
): ReturnType<import("@mariozechner/pi-agent-core").StreamFn> => {
...args: Parameters<import("@earendil-works/pi-agent-core").StreamFn>
): ReturnType<import("@earendil-works/pi-agent-core").StreamFn> => {
const payload = {
messages: [
{ role: "user", content: "read file" },
@@ -329,8 +329,8 @@ describe("openrouter provider hooks", () => {
const payloads: Array<Record<string, unknown>> = [];
const baseStreamFn = vi.fn(
(
...args: Parameters<import("@mariozechner/pi-agent-core").StreamFn>
): ReturnType<import("@mariozechner/pi-agent-core").StreamFn> => {
...args: Parameters<import("@earendil-works/pi-agent-core").StreamFn>
): ReturnType<import("@earendil-works/pi-agent-core").StreamFn> => {
const payload = { messages: [] };
void args[2]?.onPayload?.(payload, args[0]);
payloads.push(payload);
@@ -373,8 +373,8 @@ describe("openrouter provider hooks", () => {
const payloads: Array<Record<string, unknown>> = [];
const baseStreamFn = vi.fn(
(
...args: Parameters<import("@mariozechner/pi-agent-core").StreamFn>
): ReturnType<import("@mariozechner/pi-agent-core").StreamFn> => {
...args: Parameters<import("@earendil-works/pi-agent-core").StreamFn>
): ReturnType<import("@earendil-works/pi-agent-core").StreamFn> => {
const payload = {
messages: [{ role: "assistant", tool_calls: [{ id: "call_1", type: "function" }] }],
};
@@ -437,8 +437,8 @@ describe("openrouter provider hooks", () => {
let capturedPayload: Record<string, unknown> | undefined;
const baseStreamFn = vi.fn(
(
...args: Parameters<import("@mariozechner/pi-agent-core").StreamFn>
): ReturnType<import("@mariozechner/pi-agent-core").StreamFn> => {
...args: Parameters<import("@earendil-works/pi-agent-core").StreamFn>
): ReturnType<import("@earendil-works/pi-agent-core").StreamFn> => {
const payload = {
messages: [
{ role: "user", content: "Return JSON." },
@@ -480,8 +480,8 @@ describe("openrouter provider hooks", () => {
const payloads: Array<Record<string, unknown>> = [];
const baseStreamFn = vi.fn(
(
...args: Parameters<import("@mariozechner/pi-agent-core").StreamFn>
): ReturnType<import("@mariozechner/pi-agent-core").StreamFn> => {
...args: Parameters<import("@earendil-works/pi-agent-core").StreamFn>
): ReturnType<import("@earendil-works/pi-agent-core").StreamFn> => {
const payload = {
messages: [
{ role: "user", content: "Return JSON." },

@@ -1,4 +1,4 @@
import { AuthStorage, ModelRegistry } from "@mariozechner/pi-coding-agent";
import { AuthStorage, ModelRegistry } from "@earendil-works/pi-coding-agent";
import OpenAI from "openai";
import {
registerProviderPlugin,

@@ -1,4 +1,4 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import type { ProviderWrapStreamFnContext } from "openclaw/plugin-sdk/plugin-entry";
import { OPENROUTER_THINKING_STREAM_HOOKS } from "openclaw/plugin-sdk/provider-stream-family";
import {

@@ -1,5 +1,5 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { Context, Model } from "@mariozechner/pi-ai";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import type { Context, Model } from "@earendil-works/pi-ai";
import { describe, expect, it } from "vitest";
import { createQwenThinkingWrapper, wrapQwenProviderStream } from "./stream.js";

@@ -1,4 +1,4 @@
import type { StreamFn } from "@mariozechner/pi-agent-core";
import type { StreamFn } from "@earendil-works/pi-agent-core";
import type { ProviderWrapStreamFnContext } from "openclaw/plugin-sdk/plugin-entry";
import { normalizeProviderId } from "openclaw/plugin-sdk/provider-model-shared";
import {

@@ -1,4 +1,4 @@
import type { AgentToolResult } from "@mariozechner/pi-agent-core";
import type { AgentToolResult } from "@earendil-works/pi-agent-core";
import { readBooleanParam } from "openclaw/plugin-sdk/boolean-param";
import { isSingleUseReplyToMode } from "openclaw/plugin-sdk/reply-reference";
import { normalizeLowercaseStringOrEmpty } from "openclaw/plugin-sdk/string-coerce-runtime";

@@ -1,4 +1,4 @@
import type { AgentToolResult } from "@mariozechner/pi-agent-core";
import type { AgentToolResult } from "@earendil-works/pi-agent-core";
import type { ChannelMessageActionAdapter } from "openclaw/plugin-sdk/channel-contract";
import type { SlackActionContext } from "./action-runtime.js";
import { handleSlackMessageAction } from "./message-action-dispatch.js";

@@ -1,4 +1,4 @@
import type { AgentToolResult } from "@mariozechner/pi-agent-core";
import type { AgentToolResult } from "@earendil-works/pi-agent-core";
import { readBooleanParam } from "openclaw/plugin-sdk/boolean-param";
import type { ChannelMessageActionContext } from "openclaw/plugin-sdk/channel-contract";
import {

@@ -1,4 +1,4 @@
import type { AgentToolResult } from "@mariozechner/pi-agent-core";
import type { AgentToolResult } from "@earendil-works/pi-agent-core";
import { readBooleanParam } from "openclaw/plugin-sdk/boolean-param";
import {
jsonResult,

@@ -1,4 +1,4 @@
import type { ModelRegistry } from "@mariozechner/pi-coding-agent";
import type { ModelRegistry } from "@earendil-works/pi-coding-agent";
import type {
ProviderCatalogContext,
ProviderResolveDynamicModelContext,

@@ -8,9 +8,9 @@
},
"type": "module",
"dependencies": {
"@twurple/api": "^8.1.3",
"@twurple/auth": "^8.1.3",
"@twurple/chat": "^8.1.3",
"@twurple/api": "^8.1.4",
"@twurple/auth": "^8.1.4",
"@twurple/chat": "^8.1.4",
"zod": "^4.4.3"
},
"devDependencies": {

Some files were not shown because too many files have changed in this diff Show More
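The scope rename applied across these hunks (`@mariozechner/*` → `@earendil-works/*`) is easy to leave half-finished when a diff is this large. A minimal sketch of a post-apply spot check, assuming a POSIX shell and that the old scope should no longer appear anywhere in the tree (`check_rename` is a hypothetical helper, not part of the repo):

```shell
# Spot-check that a package-scope rename left no stale references.
# Scope names below are taken from the hunks in this diff.
old_scope="@mariozechner"
new_scope="@earendil-works"

check_rename() {
  dir="$1"
  # grep -rq exits 0 when a match exists, so a hit means a stale reference.
  if grep -rq "$old_scope" "$dir"; then
    echo "stale"
  else
    echo "clean"
  fi
}

# Example against a throwaway tree containing only renamed imports:
tmp=$(mktemp -d)
printf 'import "%s/pi-ai";\n' "$new_scope" > "$tmp/a.ts"
check_rename "$tmp"   # prints "clean"
```

Running the same check against the real checkout after applying the diff would flag any file the rename missed, including lockfiles and test fixtures.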