Compare commits
14 Commits

Author SHA1 Message Date
Nikhil Sonti
7b607e4fdc refactor: rework 0327-harden_cli_installers based on feedback 2026-03-27 11:10:14 -07:00
Nikhil Sonti
3c335d832f fix: harden cli installer bootstrap 2026-03-27 10:59:01 -07:00
Nikhil
39a7d49c25 feat: add workspace-centric bdev cli (#585)
* fix: clean-up bdev

* feat: add workspace-centric bdev cli

* fix: address review comments for 0326-bdev_cli_redesign

* fix: address review feedback for PR #585

* fix: address review feedback for PR #585
2026-03-27 08:48:23 -07:00
shivammittal274
ed948f4b59 Feat/cli launch ready v2 (#600)
* test: temporarily allow release workflow on any branch

* fix(cli): restore main-only guard, remove goreleaser dependency

Replaces GoReleaser (Pro-only monorepo feature) with plain go build.
Tested: RC release created successfully on branch with all 6 binaries.

* fix(cli): fix hdiutil mount detection, update README with install/launch/init flow
2026-03-27 20:20:17 +05:30
shivammittal274
aad5bc16fd Feat/cli launch ready v2 (#599)
* test: temporarily allow release workflow on any branch

* fix(cli): restore main-only guard, remove goreleaser dependency

Replaces GoReleaser (Pro-only monorepo feature) with plain go build.
Tested: RC release created successfully on branch with all 6 binaries.

* fix(cli): remove -quiet from hdiutil so mount point is detected
2026-03-27 20:17:13 +05:30
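The hdiutil fix above hinges on output parsing: `hdiutil attach` prints a table whose last column is the mount point, and the `-quiet` flag suppresses that table entirely, leaving nothing to detect. The actual CLI is written in Go; the sketch below is a hypothetical TypeScript illustration of the parsing step only — the function name and column handling are assumptions, not the shipped code.

```typescript
// Extract the mount point from `hdiutil attach` output (without -quiet).
// Typical line: "/dev/disk4s1  Apple_HFS  /Volumes/BrowserOS"
// Columns are separated by runs of whitespace; the mount point, when present,
// is the last column and lives under /Volumes.
function parseHdiutilMountPoint(output: string): string | undefined {
  for (const line of output.trim().split('\n')) {
    const cols = line.trim().split(/\s{2,}|\t+/)
    const mount = cols[cols.length - 1]
    if (mount?.startsWith('/Volumes/')) return mount
  }
  // With -quiet there is no output, so this path is always hit — the bug.
  return undefined
}
```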
Dani Akash
cee318a40b fix: improve chat history freshness and reduce query payload (#598)
* fix: add refresh indicator to chat history when fetching latest conversations

Show a non-blocking "Fetching latest conversations" indicator at the top
of the history list while the cached data is being refreshed. Users can
still interact with the cached conversation list during the refresh.

* perf: reduce chat history query payload — fetch last 2 messages instead of 5

The conversation list only displays the last user message as a preview.
Fetching 5 messages per conversation was wasteful — each message contains
the full UIMessage object (tool calls, reasoning, etc.) multiplied by
50 conversations per page. Reduced to last 2 which is sufficient to
find the last user message in a user→assistant exchange.

* perf: use first+DESC instead of last+ASC to push LIMIT down to SQL

PostGraphile's `last: N` doesn't map to SQL LIMIT — it uses a padded
LIMIT 10 and slices in application code. Changing to `first: 2` with
ORDER_INDEX_DESC generates a true SQL LIMIT 2, reducing rows scanned
from 500 to 100 per page (50 conversations × 2 vs 10 messages each).

No UX impact — extractLastUserMessage() filters by role regardless
of message order.

* chore: update react query packages

* feat: replace localforage with idb-keyval
2026-03-27 19:49:47 +05:30
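The `first`+DESC change above can be sketched as follows. The query shape and field names below are assumptions modeled on the commit message, not the app's actual schema; the point is that `first: 2` with a descending order maps to a real SQL `LIMIT 2`, while `last: N` in PostGraphile pads the LIMIT and slices in application code. Because messages then arrive newest-first, the preview extractor must filter by role rather than rely on position.

```typescript
// Hypothetical message shape for the conversation preview.
type ChatMessage = { role: 'user' | 'assistant'; text: string }

// Illustrative query: `last: 5` + ASC replaced by `first: 2` + DESC,
// which PostGraphile can push down as a true SQL LIMIT.
const previewQuery = /* GraphQL */ `
  query ConversationPreviews {
    conversations(first: 50) {
      nodes {
        messages(first: 2, orderBy: ORDER_INDEX_DESC) {
          nodes { role text }
        }
      }
    }
  }
`

// Order-agnostic extraction: scan newest-first and take the first user
// message, so flipping ASC to DESC has no UX impact.
function extractLastUserMessage(
  messages: ChatMessage[],
  newestFirst = true,
): ChatMessage | undefined {
  const ordered = newestFirst ? messages : [...messages].reverse()
  return ordered.find((m) => m.role === 'user')
}
```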
Dani Akash
febaf58f91 fix: guard filesystem tools behind workspace selection and handle mid-conversation changes (#595)
* fix: remove filesystem tools when no workspace is selected

- Make workingDir optional on ResolvedAgentConfig
- Remove resolveSessionDir() fallback that always created a session dir,
  masking the no-workspace state and keeping filesystem tools available
- Gate buildFilesystemToolSet() on workingDir being defined
- Add workspace change detection mid-conversation — rebuilds the agent
  session when workspace is added, removed, or switched (same pattern
  as existing MCP server change detection)
- download_file falls back to tmpdir() when no workspace is set
- Memory/soul tools are unaffected — they use ~/BrowserOS/ paths

* fix: sanitize message history when session rebuilds with different tools

When a session is rebuilt due to workspace or MCP changes, the carried-over
message history may contain tool parts for tools that no longer exist in
the new session. The AI SDK validates messages against the current toolset
and rejects parts with no matching schema.

- Add toolNames getter to AiSdkAgent exposing registered tool names
- Add sanitizeMessagesForToolset() to strip tool parts referencing
  removed tools from carried-over messages
- Apply sanitization in both MCP and workspace session rebuilds

* fix: prepend tool-change context to user message on session rebuild

When workspace or MCP integrations change mid-conversation, prepend a
[Context: ...] block to the user's message explaining what changed.
This prevents the LLM from hallucinating tool usage based on patterns
in the carried-over conversation history.

Context messages vary by change type:
- Workspace removed: lists unavailable filesystem tools, suggests
  selecting a working directory
- Workspace added: confirms filesystem tools are available with path
- Workspace switched: notes the new working directory
- MCP changed: notes that some integration tools may have changed

Only fires on the first message after a rebuild. Invisible in the UI.

* fix: make MCP change context specific about which apps were added/removed

Diff the old and new MCP server keys to produce specific context like:
- "The following app integrations were disconnected: Gmail, Slack."
- "The following app integrations were connected: Linear."
instead of a generic "some tools may no longer be available" message.

* refactor: extract shared rebuildSession helper in ChatService

Eliminates the duplicated 20-line dispose→create→sanitize→store flow
that existed separately in both the MCP and workspace change-detection
blocks.

Co-authored-by: Dani Akash <DaniAkash@users.noreply.github.com>

* test: add sanitizeMessagesForToolset test suite

Tests for the message sanitization that runs when a session rebuilds
with a different toolset (workspace or MCP change mid-conversation):

- Preserves messages with no tool parts
- Preserves tool parts when tool is in the toolset
- Strips tool parts when tool is NOT in the toolset
- Strips multiple removed tool parts from same message
- Keeps browser tools while removing filesystem tools
- Removes messages that become empty after stripping
- Preserves non-tool parts (reasoning, step-start, file)
- Returns same references when no filtering needed
- Handles empty message array and empty toolset

* style: fix biome formatting in chat-service.ts

---------

Co-authored-by: claude[bot] <41898282+claude[bot]@users.noreply.github.com>
2026-03-27 18:30:25 +05:30
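The sanitization described above can be sketched roughly as follows. The message and part shapes are assumptions loosely modeled on AI SDK `UIMessage` parts, not the repo's actual types; the behavior mirrors the commit's test suite: strip tool parts whose tool is absent from the new toolset, keep non-tool parts, and drop messages left empty.

```typescript
// Simplified stand-ins for the carried-over message history.
type MessagePart =
  | { type: 'text'; text: string }
  | { type: 'reasoning'; text: string }
  | { type: 'tool'; toolName: string; output?: unknown }

type Message = { role: 'user' | 'assistant'; parts: MessagePart[] }

// Remove tool parts referencing tools the rebuilt session no longer has,
// then drop any message that becomes empty after stripping.
function sanitizeMessagesForToolset(
  messages: Message[],
  toolNames: Set<string>,
): Message[] {
  return messages
    .map((msg) => ({
      ...msg,
      parts: msg.parts.filter(
        (p) => p.type !== 'tool' || toolNames.has(p.toolName),
      ),
    }))
    .filter((msg) => msg.parts.length > 0)
}
```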
Dani Akash
aacb47f7ee feat: isolate new-tab agent navigation from origin tab (#593)
* feat: isolate new-tab agent navigation from origin tab

Add origin-aware navigation isolation so the agent never navigates
away from the new-tab chat UI. This is a two-layer defense:

1. Prompt adaptation: When origin is 'newtab', the system prompt's
   execution and tool-selection sections are rewritten to prohibit
   navigating the active tab and default all lookups to new_page.

2. Tool-level guards: navigate_page and close_page reject attempts
   to act on the origin tab when in newtab mode, returning an error
   that teaches the agent to self-correct.

The client now sends an `origin` field ('sidepanel' | 'newtab')
instead of injecting a soft NEWTAB_SYSTEM_PROMPT that LLMs could
ignore. Backwards compatible — defaults to 'sidepanel'.

Closes TKT-592, addresses TKT-564

* test: add newtab origin navigation guard tests

- 14 new prompt tests verifying the system prompt adapts correctly
  for newtab vs sidepanel origin (execution rules, tool selection table,
  absence of conflicting single-tab guidance)
- 6 new integration tests for navigate_page and close_page guards:
  rejects origin tab in newtab mode, allows non-origin tabs, allows
  all tabs in sidepanel mode, backwards compatible with no session
2026-03-27 12:06:32 +05:30
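The tool-level guard from this commit can be sketched as below. The session shape, function name, and error wording are assumptions; only the behavior follows the commit message: in newtab mode, `navigate_page`/`close_page` reject the origin tab with a corrective error, allow every other tab, and stay backwards compatible when no session/origin info exists.

```typescript
// Hypothetical session info carried with each agent request.
type Session = { origin: 'sidepanel' | 'newtab'; originTabId: number }

type GuardResult = { ok: true } | { ok: false; error: string }

// Called by navigate_page / close_page before acting on a tab.
function checkOriginGuard(
  session: Session | undefined,
  targetTabId: number,
): GuardResult {
  // Backwards compatible: no session (older clients) or sidepanel origin — allow.
  if (!session || session.origin !== 'newtab') return { ok: true }
  // Non-origin tabs are fair game even in newtab mode.
  if (targetTabId !== session.originTabId) return { ok: true }
  // Reject with an error that teaches the agent to self-correct.
  return {
    ok: false,
    error:
      'This tab hosts the new-tab chat UI and cannot be navigated or closed. ' +
      'Use new_page to open a separate tab instead.',
  }
}
```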
Dani Akash
b3003542d8 docs: overhaul READMEs across all major packages (#594)
* docs: overhaul READMEs across all major packages

- Root README: restructure with feature table, LLM provider table,
  comparison matrix, architecture map, and docs link
- New: packages/browseros/README.md (Chromium fork build system)
- New: apps/server/README.md (MCP server + agent loop)
- New: packages/cdp-protocol/README.md (CDP type bindings)
- Polish: agent-sdk (badges, prerequisites, multi-step example, links)
- Polish: cli (badges, install section, MCP server section, links)
- Polish: agent extension (badges, WXT mention, architecture context)
- Polish: eval (badges, paper links)

* fix: address review — consistent tool count and correct default port

- CLI README: "54 MCP tools" → "53+ MCP tools" to match root and server docs
- Agent SDK README: localhost:3000 → localhost:9100 to match documented default

* docs: add detailed comparison links to How We Compare section

* docs: update comparison table with verified competitor data

Research all 5 competitors via official websites and docs:
- Chrome: no AI agent, Gemini Nano only, MV3 weakening ad blocking
- Brave: BYOM feature, local models via BYOM, Shields ad blocking, MV2+MV3
- Dia: Skills-based AI, no BYOK, cloud AI, acquired by Atlassian
- Comet: full cloud-based agent, built-in ad blocking, extensions on desktop
- Atlas: standalone Chromium browser with Agent Mode, 30-day cloud memory

Renamed Arc/Dia column to just Dia (Arc is sunset).

* docs: simplify comparison table with clean checkmarks and key differentiators

* docs: update browseros-agent README — remove submodule note, add missing packages
2026-03-27 11:59:04 +05:30
Nikhil
aba7a10430 chore: server release (#592) 2026-03-26 19:13:56 -07:00
Nikhil
b7462aa042 fix(cli): move install instructions below What's Changed in release notes (#591)
The installer block was appearing above the changelog. Reorder so
What's Changed comes first and install instructions follow.
2026-03-26 18:16:23 -07:00
Nikhil
883bcc9670 fix: clean up README CLI wording and add Vertical Tabs feature (#590)
- Simplify CLI section: remove confusing MCP jargon, clarify it works
  from terminal and AI coding agents
- Replace "point the CLI at your MCP server" with plain language
- Add Vertical Tabs to the features list
2026-03-26 18:05:54 -07:00
Nikhil
279b41fdc4 feat(cli): add install commands to GitHub release notes (#589)
* feat(cli): add install commands to release notes

* fix(cli): add install header to release workflow
2026-03-26 18:04:58 -07:00
Nikhil
220577b41c feat: add CDN-hosted CLI installer flow (#588)
* feat: add CDN upload flow for cli installers

* fix: move cli install docs to top-level readme

* fix: bun.lock update
2026-03-26 17:41:03 -07:00
116 changed files with 5268 additions and 4452 deletions


@@ -74,31 +74,52 @@ jobs:
run: |
CLI_PATH="packages/browseros-agent/apps/cli"
TAG="browseros-cli-v${{ inputs.version }}"
CHANGELOG_FILE="/tmp/release-changelog.md"
PREV_TAG=$(git tag -l "browseros-cli-v*" --sort=-v:refname | grep -v "^${TAG}$" | head -n 1)
if [ -z "$PREV_TAG" ]; then
echo "Initial release of browseros-cli." > /tmp/release-notes.md
echo "Initial release of browseros-cli." > "$CHANGELOG_FILE"
else
COMMITS=$(git log "$PREV_TAG"..HEAD --pretty=format:"%H" -- "$CLI_PATH")
if [ -z "$COMMITS" ]; then
echo "No notable changes." > /tmp/release-notes.md
echo "No notable changes." > "$CHANGELOG_FILE"
else
echo "## What's Changed" > /tmp/release-notes.md
echo "" >> /tmp/release-notes.md
echo "## What's Changed" > "$CHANGELOG_FILE"
echo "" >> "$CHANGELOG_FILE"
while IFS= read -r SHA; do
SUBJECT=$(git log -1 --pretty=format:"%s" "$SHA")
PR_NUM=$(gh api "/repos/${{ github.repository }}/commits/${SHA}/pulls" --jq '.[0].number // empty' 2>/dev/null)
if [ -n "$PR_NUM" ] && ! echo "$SUBJECT" | grep -qF "(#${PR_NUM})"; then
echo "- ${SUBJECT} (#${PR_NUM})" >> /tmp/release-notes.md
echo "- ${SUBJECT} (#${PR_NUM})" >> "$CHANGELOG_FILE"
else
echo "- ${SUBJECT}" >> /tmp/release-notes.md
echo "- ${SUBJECT}" >> "$CHANGELOG_FILE"
fi
done <<< "$COMMITS"
fi
fi
cat "$CHANGELOG_FILE" > /tmp/release-notes.md
cat >> /tmp/release-notes.md <<'EOF'
## Install `browseros-cli`
### macOS / Linux
```bash
curl -fsSL https://cdn.browseros.com/cli/install.sh | bash
```
### Windows
```powershell
irm https://cdn.browseros.com/cli/install.ps1 | iex
```
After install, run `browseros-cli init` to point the CLI at your BrowserOS MCP server.
EOF
working-directory: ${{ github.workspace }}
- name: Create tag and release

README.md

@@ -6,6 +6,7 @@
[![Slack](https://img.shields.io/badge/Slack-Join%20us-4A154B?logo=slack&logoColor=white)](https://dub.sh/browserOS-slack)
[![Twitter](https://img.shields.io/twitter/follow/browserOS_ai?style=social)](https://twitter.com/browseros_ai)
[![License: AGPL v3](https://img.shields.io/badge/License-AGPL%20v3-blue.svg)](LICENSE)
[![Docs](https://img.shields.io/badge/Docs-docs.browseros.com-blue)](https://docs.browseros.com)
<br></br>
<a href="https://files.browseros.com/download/BrowserOS.dmg">
<img src="https://img.shields.io/badge/Download-macOS-black?style=flat&logo=apple&logoColor=white" alt="Download for macOS (beta)" />
@@ -22,163 +23,183 @@
<br />
</div>
##
🌐 BrowserOS is an open-source Chromium fork that runs AI agents natively. **The privacy-first alternative to ChatGPT Atlas, Perplexity Comet, and Dia.**
BrowserOS is an open-source Chromium fork that runs AI agents natively. **The privacy-first alternative to ChatGPT Atlas, Perplexity Comet, and Dia.**
🔒 Use your own API keys or run local models with Ollama. Your data never leaves your machine.
Use your own API keys or run local models with Ollama. Your data never leaves your machine.
💡 Join our [Discord](https://discord.gg/YKwjt5vuKr) or [Slack](https://dub.sh/browserOS-slack) and help us build! Have feature requests? [Suggest here](https://github.com/browseros-ai/BrowserOS/issues/99).
> **[Documentation](https://docs.browseros.com)** · **[Discord](https://discord.gg/YKwjt5vuKr)** · **[Slack](https://dub.sh/browserOS-slack)** · **[Twitter](https://x.com/browserOS_ai)** · **[Feature Requests](https://github.com/browseros-ai/BrowserOS/issues/99)**
## Quick start
## Quick Start
1. Download and install BrowserOS:
- [macOS](https://files.browseros.com/download/BrowserOS.dmg)
- [Windows](https://files.browseros.com/download/BrowserOS_installer.exe)
- [Linux (AppImage)](https://files.browseros.com/download/BrowserOS.AppImage)
- [Linux (Debian)](https://cdn.browseros.com/download/BrowserOS.deb)
1. **Download and install** BrowserOS — [macOS](https://files.browseros.com/download/BrowserOS.dmg) · [Windows](https://files.browseros.com/download/BrowserOS_installer.exe) · [Linux (AppImage)](https://files.browseros.com/download/BrowserOS.AppImage) · [Linux (Debian)](https://cdn.browseros.com/download/BrowserOS.deb)
2. **Import your Chrome data** (optional) — bookmarks, passwords, extensions all carry over
3. **Connect your AI provider** — Claude, OpenAI, Gemini, ChatGPT Pro via OAuth, or local models via Ollama/LM Studio
2. Import your Chrome data (optional)
## Features
3. Connect your AI provider — use Claude, OpenAI, Gemini, or local models via Ollama and LMStudio.
4. Start automating!
## Install `browseros-cli`
Use `browseros-cli` when you want to control BrowserOS from the terminal or scripts via the BrowserOS MCP server.
### macOS / Linux
```bash
curl -fsSL https://cdn.browseros.com/cli/install.sh | bash
```
### Windows
```powershell
irm https://cdn.browseros.com/cli/install.ps1 | iex
```
After install, run `browseros-cli init` to point the CLI at your BrowserOS MCP server.
## What makes BrowserOS special
- 🏠 Feels like home — same Chrome interface, all your extensions just work
- 🤖 AI agents that run on YOUR browser, not in the cloud
- 🔒 Privacy first — bring your own keys or run local models with Ollama. Your browsing history stays on your machine
- 🤝 [BrowserOS as MCP server](https://docs.browseros.com/features/use-with-claude-code) — control the browser from `claude-code`, `gemini-cli`, or any MCP client (31 tools)
- 🔄 [Workflows](https://docs.browseros.com/features/workflows) — build repeatable browser automations with a visual graph builder
- 📂 [Cowork](https://docs.browseros.com/features/cowork) — combine browser automation with local file operations. Research the web, save reports to your folder
- ⏰ [Scheduled Tasks](https://docs.browseros.com/features/scheduled-tasks) — run the agent on autopilot, daily or every few minutes
- 💬 [LLM Hub](https://docs.browseros.com/features/llm-chat-hub) — compare Claude, ChatGPT, and Gemini side-by-side on any page
- 🛡️ Built-in ad blocker — [10x more protection than Chrome](https://docs.browseros.com/features/ad-blocking) with uBlock Origin + Manifest V2 support
- 🚀 100% open source under AGPL-3.0
| Feature | Description | Docs |
|---------|-------------|------|
| **AI Agent** | 53+ browser automation tools — navigate, click, type, extract data, all with natural language | [Guide](https://docs.browseros.com/getting-started) |
| **MCP Server** | Control the browser from Claude Code, Gemini CLI, or any MCP client | [Setup](https://docs.browseros.com/features/use-with-claude-code) |
| **Workflows** | Build repeatable browser automations with a visual graph builder | [Docs](https://docs.browseros.com/features/workflows) |
| **Cowork** | Combine browser automation with local file operations — research the web, save reports to your folder | [Docs](https://docs.browseros.com/features/cowork) |
| **Scheduled Tasks** | Run agents on autopilot — daily, hourly, or every few minutes | [Docs](https://docs.browseros.com/features/scheduled-tasks) |
| **Memory** | Persistent memory across conversations — your assistant remembers context over time | [Docs](https://docs.browseros.com/features/memory) |
| **SOUL.md** | Define your AI's personality and instructions in a single markdown file | [Docs](https://docs.browseros.com/features/soul-md) |
| **LLM Hub** | Compare Claude, ChatGPT, and Gemini responses side-by-side on any page | [Docs](https://docs.browseros.com/features/llm-chat-hub) |
| **40+ App Integrations** | Gmail, Slack, GitHub, Linear, Notion, Figma, Salesforce, and more via MCP | [Docs](https://docs.browseros.com/features/connect-apps) |
| **Vertical Tabs** | Side-panel tab management — stay organized even with 100+ tabs open | [Docs](https://docs.browseros.com/features/vertical-tabs) |
| **Ad Blocking** | uBlock Origin + Manifest V2 support — [10x more protection](https://docs.browseros.com/features/ad-blocking) than Chrome | [Docs](https://docs.browseros.com/features/ad-blocking) |
| **Cloud Sync** | Sync browser config and agent history across devices | [Docs](https://docs.browseros.com/features/sync) |
| **Skills** | Custom instruction sets that shape how your AI assistant behaves | [Docs](https://docs.browseros.com/features/skills) |
| **Smart Nudges** | Contextual suggestions to connect apps and use features at the right moment | [Docs](https://docs.browseros.com/features/smart-nudges) |
## Demos
### 🤖 BrowserOS agent in action
### BrowserOS agent in action
[![BrowserOS agent in action](docs/videos/browserOS-agent-in-action.gif)](https://www.youtube.com/watch?v=SoSFev5R5dI)
<br/><br/>
### 🎇 Install [BrowserOS as MCP](https://docs.browseros.com/features/use-with-claude-code) and control it from `claude-code`
### Install [BrowserOS as MCP](https://docs.browseros.com/features/use-with-claude-code) and control it from `claude-code`
https://github.com/user-attachments/assets/c725d6df-1a0d-40eb-a125-ea009bf664dc
<br/><br/>
### 💬 Use BrowserOS to chat
### Use BrowserOS to chat
https://github.com/user-attachments/assets/726803c5-8e36-420e-8694-c63a2607beca
<br/><br/>
### Use BrowserOS to scrape data
### Use BrowserOS to scrape data
https://github.com/user-attachments/assets/9f038216-bc24-4555-abf1-af2adcb7ebc0
<br/><br/>
## Why We're Building BrowserOS
## Install `browseros-cli`
For the first time since Netscape pioneered the web in 1994, AI gives us the chance to completely reimagine the browser. We've seen tools like Cursor deliver 10x productivity gains for developers—yet everyday browsing remains frustratingly archaic.
Use `browseros-cli` to launch and control BrowserOS from the terminal or from AI coding agents like Claude Code.
You're likely juggling 70+ tabs, battling your browser instead of having it assist you. Routine tasks, like ordering something from Amazon or filling out a form, should be handled seamlessly by AI agents.
**macOS / Linux:**
At BrowserOS, we're convinced that AI should empower you by automating tasks locally and securely—keeping your data private. We are building the best browser for this future!
```bash
curl -fsSL https://cdn.browseros.com/cli/install.sh | bash
```
## How we compare
**Windows:**
<details>
<summary><b>vs Chrome</b></summary>
<br>
We're grateful to Google for open-sourcing Chromium, but Chrome hasn't evolved much in 10 years. No AI features, no automation, no MCP support.
</details>
```powershell
irm https://cdn.browseros.com/cli/install.ps1 | iex
```
<details>
<summary><b>vs Brave</b></summary>
<br>
We love what Brave started, but they've spread themselves too thin with crypto, search, VPNs. We're laser-focused on AI-powered browsing.
</details>
After install, run `browseros-cli init` to connect the CLI to your running BrowserOS instance.
<details>
<summary><b>vs Arc/Dia</b></summary>
<br>
Many loved Arc, but it was closed source. When they abandoned users, there was no recourse. We're 100% open source - fork it anytime!
</details>
## LLM Providers
<details>
<summary><b>vs Perplexity Comet</b></summary>
<br>
They're a search/ad company. Your browser history becomes their product. We keep everything local.
</details>
BrowserOS works with any LLM. Bring your own keys, use OAuth, or run models locally.
<details>
<summary><b>vs ChatGPT Atlas</b></summary>
<br>
Your browsing data could be used for ads or to train their models. We keep your history and agent interactions strictly local.
</details>
| Provider | Type | Auth |
|----------|------|------|
| Kimi K2.5 | Cloud (default) | Built-in |
| ChatGPT Pro/Plus | Cloud | [OAuth](https://docs.browseros.com/features/chatgpt) |
| GitHub Copilot | Cloud | [OAuth](https://docs.browseros.com/features/github-copilot) |
| Qwen Code | Cloud | [OAuth](https://docs.browseros.com/features/qwen-code) |
| Claude (Anthropic) | Cloud | API key |
| GPT-4o / o3 (OpenAI) | Cloud | API key |
| Gemini (Google) | Cloud | API key |
| Azure OpenAI | Cloud | API key |
| AWS Bedrock | Cloud | IAM credentials |
| OpenRouter | Cloud | API key |
| Ollama | Local | [Setup](https://docs.browseros.com/features/ollama) |
| LM Studio | Local | [Setup](https://docs.browseros.com/features/lm-studio) |
## How We Compare
| | BrowserOS | Chrome | Brave | Dia | Comet | Atlas |
|---|:---:|:---:|:---:|:---:|:---:|:---:|
| Open Source | ✅ | ❌ | ✅ | ❌ | ❌ | ❌ |
| AI Agent | ✅ | ❌ | ❌ | ❌ | ✅ | ✅ |
| MCP Server | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ |
| Visual Workflows | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ |
| Cowork (files + browser) | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ |
| Scheduled Tasks | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ |
| Bring Your Own Keys | ✅ | ❌ | ✅ | ❌ | ❌ | ❌ |
| Local Models (Ollama) | ✅ | ❌ | ✅ | ❌ | ❌ | ❌ |
| Local-first Privacy | ✅ | ❌ | ✅ | ❌ | ❌ | ❌ |
| Ad Blocking (MV2) | ✅ | ❌ | ✅ | ❌ | ✅ | ❌ |
**Detailed comparisons:**
- [BrowserOS vs Chrome DevTools MCP](https://docs.browseros.com/comparisons/chrome-devtools-mcp) — developer-focused comparison for browser automation
- [BrowserOS vs Claude Cowork](https://docs.browseros.com/comparisons/claude-cowork) — getting real work done with AI
- [BrowserOS vs OpenClaw](https://docs.browseros.com/comparisons/openclaw) — everyday AI assistance
## Architecture
BrowserOS is a monorepo with two main subsystems: the **browser** (Chromium fork) and the **agent platform** (TypeScript/Go).
```
BrowserOS/
├── packages/browseros/ # Chromium fork + build system (Python)
│ ├── chromium_patches/ # Patches applied to Chromium source
│ ├── build/ # Build CLI and modules
│ └── resources/ # Icons, entitlements, signing
├── packages/browseros-agent/ # Agent platform (TypeScript/Go)
│ ├── apps/
│ │ ├── server/ # MCP server + AI agent loop (Bun)
│ │ ├── agent/ # Browser extension UI (WXT + React)
│ │ ├── cli/ # CLI tool (Go)
│ │ ├── eval/ # Benchmark framework
│ │ └── controller-ext/ # Chrome API bridge extension
│ │
│ └── packages/
│ ├── agent-sdk/ # Node.js SDK (npm: @browseros-ai/agent-sdk)
│ ├── cdp-protocol/ # CDP type bindings
│ └── shared/ # Shared constants
```
| Package | What it does |
|---------|-------------|
| [`packages/browseros`](packages/browseros/) | Chromium fork — patches, build system, signing |
| [`apps/server`](packages/browseros-agent/apps/server/) | Bun server exposing 53+ MCP tools and running the AI agent loop |
| [`apps/agent`](packages/browseros-agent/apps/agent/) | Browser extension — new tab, side panel chat, onboarding, settings |
| [`apps/cli`](packages/browseros-agent/apps/cli/) | Go CLI — control BrowserOS from the terminal or AI coding agents |
| [`apps/eval`](packages/browseros-agent/apps/eval/) | Benchmark framework — WebVoyager, Mind2Web evaluation |
| [`agent-sdk`](packages/browseros-agent/packages/agent-sdk/) | Node.js SDK for browser automation with natural language |
| [`cdp-protocol`](packages/browseros-agent/packages/cdp-protocol/) | Type-safe Chrome DevTools Protocol bindings |
## Contributing
We'd love your help making BrowserOS better!
We'd love your help making BrowserOS better! See our [Contributing Guide](CONTRIBUTING.md) for details.
- 🐛 [Report bugs](https://github.com/browseros-ai/BrowserOS/issues)
- 💡 [Suggest features](https://github.com/browseros-ai/BrowserOS/issues/99)
- 💬 [Join Discord](https://discord.gg/YKwjt5vuKr)
- 🐦 [Follow on Twitter](https://x.com/browserOS_ai)
- [Report bugs](https://github.com/browseros-ai/BrowserOS/issues)
- [Suggest features](https://github.com/browseros-ai/BrowserOS/issues/99)
- [Join Discord](https://discord.gg/YKwjt5vuKr) · [Join Slack](https://dub.sh/browserOS-slack)
- [Follow on Twitter](https://x.com/browserOS_ai)
**Agent development** (TypeScript/Go) — see the [agent monorepo README](packages/browseros-agent/README.md) for setup instructions.
**Browser development** (C++/Python) — requires ~100GB disk space. See [`packages/browseros`](packages/browseros/) for build instructions.
## Credits
- [ungoogled-chromium](https://github.com/ungoogled-software/ungoogled-chromium) — BrowserOS uses some patches for enhanced privacy. Thanks to everyone behind this project!
- [The Chromium Project](https://www.chromium.org/) — at the core of BrowserOS, making it possible to exist in the first place.
## License
BrowserOS is open source under the [AGPL-3.0 license](LICENSE).
## Credits
- [ungoogled-chromium](https://github.com/ungoogled-software/ungoogled-chromium) - BrowserOS uses some patches for enhanced privacy. Thanks to everyone behind this project!
- [The Chromium Project](https://www.chromium.org/) - At the core of BrowserOS, making it possible to exist in the first place.
## Citation
If you use BrowserOS in your research or project, please cite:
```bibtex
@software{browseros2025,
author = {Sonti, Nithin and Sonti, Nikhil and {BrowserOS-team}},
title = {BrowserOS: The open-source Agentic browser},
url = {https://github.com/browseros-ai/BrowserOS},
year = {2025},
publisher = {GitHub},
license = {AGPL-3.0},
}
```
Copyright &copy; 2025 Felafax, Inc.
## Stargazers
Thank you to all our supporters!
[![Star History Chart](https://api.star-history.com/svg?repos=browseros-ai/BrowserOS&type=Date)](https://www.star-history.com/#browseros-ai/BrowserOS&Date)
##
<p align="center">
Built with ❤️ from San Francisco
</p>


@@ -1,8 +1,6 @@
# BrowserOS Agent
Monorepo for the BrowserOS-agent -- contains 3 packages: agent-UI, server (which contains the agent loop) and controller-extension (which is used by the tools within the agent loop).
> **⚠️ NOTE:** This is only a submodule, the main project is at -- https://github.com/browseros-ai/BrowserOS
The agent platform powering [BrowserOS](https://github.com/browseros-ai/BrowserOS) — contains the MCP server, agent UI, CLI, evaluation framework, and SDK.
## Monorepo Structure
@@ -10,17 +8,25 @@ Monorepo for the BrowserOS-agent -- contains 3 packages: agent-UI, server (which
apps/
server/ # Bun server - MCP endpoints + agent loop
agent/ # Agent UI (Chrome extension)
cli/ # Go CLI for controlling BrowserOS from the terminal
eval/ # Evaluation framework for benchmarking agents
controller-ext/ # BrowserOS Controller (Chrome extension for chrome.* APIs)
packages/
agent-sdk/ # Node.js SDK (@browseros-ai/agent-sdk)
cdp-protocol/ # Type-safe Chrome DevTools Protocol bindings
shared/ # Shared constants (ports, timeouts, limits)
```
| Package | Description |
|---------|-------------|
| `apps/server` | Bun server exposing MCP tools and running the agent loop |
| `apps/agent` | Agent UI - Chrome extension for the chat interface |
| `apps/controller-ext` | BrowserOS Controller - Chrome extension that bridges `chrome.*` APIs (tabs, bookmarks, history) to the server via WebSocket |
| `apps/agent` | Agent UI Chrome extension for the chat interface |
| `apps/cli` | Go CLI — control BrowserOS from the terminal or AI coding agents |
| `apps/eval` | Benchmark framework — WebVoyager, Mind2Web evaluation |
| `apps/controller-ext` | BrowserOS Controller — bridges `chrome.*` APIs to the server via WebSocket |
| `packages/agent-sdk` | Node.js SDK for browser automation with natural language |
| `packages/cdp-protocol` | Auto-generated CDP type bindings used by the server |
| `packages/shared` | Shared constants used across packages |
## Architecture


@@ -1,16 +1,24 @@
# BrowserOS Agent Chrome Extension
# BrowserOS Agent Extension
The official Chrome extension for BrowserOS Agent, providing the UI layer for interacting with BrowserOS Core and Controllers. This extension enables intelligent browser automation, AI-powered search, and seamless integration with multiple LLM providers.
[![License: AGPL v3](https://img.shields.io/badge/License-AGPL%20v3-blue.svg)](../../../../LICENSE)
The built-in browser extension that powers BrowserOS's AI interface — new tab with unified search, side panel chat, onboarding, and settings. Built with [WXT](https://wxt.dev) and React.
> For user-facing feature documentation, see [docs.browseros.com](https://docs.browseros.com).
## Features
- **AI-Powered New Tab**: Custom new tab page with unified search across Google and AI assistants
- **Side Panel Chat**: Full-featured chat interface for interacting with BrowserOS Core
- **Side Panel Chat**: Full-featured chat interface for interacting with BrowserOS
- **Multi-Provider Support**: Connect to various LLM providers (OpenAI, Anthropic, Azure, Bedrock, and more)
- **MCP Integration**: Model Context Protocol support for extending AI capabilities
- **Visual Feedback**: Animated glow effect on tabs during AI agent operations
- **Privacy-First**: Local data handling with configurable provider settings
## How It Connects
The extension communicates with the [BrowserOS Server](../../apps/server/) running locally. The server handles the AI agent loop, MCP tools, and CDP connections — the extension provides the UI layer.
## Project Structure
```
@@ -80,47 +88,20 @@ Settings dashboard with multiple sections:
Content script that creates a visual indicator (pulsing orange glow) around the browser viewport when an AI agent is actively working on a tab.
## Development
### Prerequisites
- [Bun](https://bun.sh) installed
- Chrome or Chromium-based browser
- BrowserOS Server running locally (for full functionality)
### Setup
```bash
# Copy environment file
cp .env.example .env.development
# Install dependencies
bun install
@@ -153,12 +134,30 @@ SENTRY_AUTH_TOKEN=your-token
### GraphQL Schema
Codegen requires a GraphQL schema. By default it uses the bundled `schema/schema.graphql`, so no extra setup is needed. If you have access to the original API source, you can set the following environment variable:
```env
GRAPHQL_SCHEMA_PATH=/path/to/api-repo/.../schema.graphql
```
## Development Tooling
### Bun
Bun is the exclusive runtime and package manager:
- All scripts use `bun run <script>` instead of npm
- Package installation via `bun install`
- Environment files automatically loaded (no dotenv needed)
- Enforced via `engines` field in `package.json`
### Biome
Unified linter and formatter configured in `biome.json`:
- **Formatting**: 2-space indentation, single quotes, no semicolons
- **Linting**: Recommended rules plus custom rules for unused imports/variables
- **CSS Support**: Tailwind directives parsing enabled
- **Import Organization**: Automatic import sorting via assist actions
## Scripts
| Script | Description |
@@ -169,4 +168,5 @@ GRAPHQL_SCHEMA_PATH=/path/to/api-repo/.../schema.graphql
| `bun run lint` | Run Biome linter |
| `bun run lint:fix` | Auto-fix linting issues |
| `bun run typecheck` | Run TypeScript type checking |
| `bun run codegen` | Generate GraphQL types |
| `bun run clean:cache` | Clear build caches |

View File

@@ -1,5 +1,5 @@
import { useQueryClient } from '@tanstack/react-query'
import { clear } from 'idb-keyval'
import { Loader2 } from 'lucide-react'
import type { FC } from 'react'
import { useEffect } from 'react'
@@ -25,7 +25,7 @@ export const LogoutPage: FC = () => {
await providersStorage.removeValue()
await scheduledJobStorage.removeValue()
queryClient.clear()
await clear()
resetIdentity()
await signOut()

View File

@@ -32,6 +32,7 @@ const RemoteChatHistory: FC<{ userId: string }> = ({ userId }) => {
const {
data: graphqlData,
isLoading: isLoadingConversations,
isFetching,
hasNextPage,
isFetchingNextPage,
fetchNextPage,
@@ -112,6 +113,7 @@ const RemoteChatHistory: FC<{ userId: string }> = ({ userId }) => {
hasNextPage={hasNextPage}
isFetchingNextPage={isFetchingNextPage}
onLoadMore={fetchNextPage}
isRefreshing={isFetching && !isLoadingConversations}
/>
)
}

View File

@@ -12,6 +12,7 @@ interface ConversationListProps {
hasNextPage?: boolean
isFetchingNextPage?: boolean
onLoadMore?: () => void
isRefreshing?: boolean
}
export const ConversationList: FC<ConversationListProps> = ({
@@ -21,6 +22,7 @@ export const ConversationList: FC<ConversationListProps> = ({
hasNextPage,
isFetchingNextPage,
onLoadMore,
isRefreshing,
}) => {
const loadMoreRef = useRef<HTMLDivElement>(null)
@@ -57,6 +59,12 @@ export const ConversationList: FC<ConversationListProps> = ({
return (
<main className="mt-4 flex h-full flex-1 flex-col space-y-4 overflow-y-auto">
<div className="w-full p-3">
{isRefreshing && (
<div className="flex items-center justify-center gap-2 pb-3 text-muted-foreground text-xs">
<Loader2 className="h-3 w-3 animate-spin" />
<span>Fetching latest conversations</span>
</div>
)}
{!hasConversations ? (
<div className="flex flex-col items-center justify-center py-12 text-center">
<MessageSquare className="mb-3 h-10 w-10 text-muted-foreground/50" />

View File

@@ -11,7 +11,7 @@ export const GetConversationsForHistoryDocument = graphql(`
nodes {
rowId
lastMessagedAt
conversationMessages(first: 2, orderBy: ORDER_INDEX_DESC) {
nodes {
message
}

View File

@@ -76,8 +76,6 @@ export interface ChatSessionOptions {
isIntegrationsSynced?: boolean
}
const NEWTAB_SYSTEM_PROMPT = `IMPORTANT: The user is chatting from the New Tab page. When performing browser actions, ALWAYS open content in a NEW TAB rather than navigating the current tab. The user's new tab page should remain accessible.`
export const useChatSession = (options?: ChatSessionOptions) => {
const {
selectedLlmProviderRef,
@@ -344,12 +342,8 @@ export const useChatSession = (options?: ChatSessionOptions) => {
reasoningEffort: provider?.reasoningEffort,
reasoningSummary: provider?.reasoningSummary,
browserContext,
origin: options?.origin ?? 'sidepanel',
userSystemPrompt: personalizationRef.current,
userWorkingDir: workingDirRef.current,
supportsImages: provider?.supportsImages,
previousConversation,

View File

@@ -1,7 +1,10 @@
import { createAsyncStoragePersister } from '@tanstack/query-async-storage-persister'
import { QueryClient } from '@tanstack/react-query'
import {
type AsyncStorage,
PersistQueryClientProvider,
} from '@tanstack/react-query-persist-client'
import { del, get, set } from 'idb-keyval'
import type { FC, ReactNode } from 'react'
const queryClient = new QueryClient({
@@ -12,8 +15,14 @@ const queryClient = new QueryClient({
},
})
const idbStorage: AsyncStorage<string> = {
getItem: (key: string) => get<string>(key).then((v) => v ?? null),
setItem: (key: string, value: string) => set(key, value),
removeItem: (key: string) => del(key),
}
const asyncStoragePersister = createAsyncStoragePersister({
storage: idbStorage,
})
export const QueryProvider: FC<{ children: ReactNode }> = ({ children }) => {

View File

@@ -44,9 +44,9 @@
"@radix-ui/react-use-controllable-state": "^1.2.2",
"@sentry/react": "^10.31.0",
"@sentry/vite-plugin": "^4.6.1",
"@tanstack/query-async-storage-persister": "^5.95.2",
"@tanstack/react-query": "^5.95.2",
"@tanstack/react-query-persist-client": "^5.95.2",
"@types/cytoscape": "^3.31.0",
"@types/dompurify": "^3.2.0",
"@webext-core/messaging": "^2.3.0",
@@ -69,8 +69,8 @@
"eventsource-parser": "^3.0.6",
"graphql": "^16.12.0",
"hono": "^4.12.3",
"idb-keyval": "^6.2.2",
"klavis": "^2.15.0",
"lucide-react": "^0.562.0",
"motion": "^12.23.24",
"nanoid": "^5.1.6",

View File

@@ -1,25 +1,58 @@
# browseros-cli
[![License: AGPL v3](https://img.shields.io/badge/License-AGPL%20v3-blue.svg)](../../../../LICENSE)
Command-line interface for controlling BrowserOS — launch and automate the browser from the terminal or from AI coding agents like Claude Code and Gemini CLI.
Communicates with the BrowserOS MCP server over JSON-RPC 2.0 / StreamableHTTP. All 53+ MCP tools are mapped to CLI commands.
## Install
### macOS / Linux
```bash
curl -fsSL https://cdn.browseros.com/cli/install.sh | bash
```
### Windows
```powershell
irm https://cdn.browseros.com/cli/install.ps1 | iex
```
### Build from Source
Requires Go 1.25+.
```bash
make # Build binary
make install # Install to $GOPATH/bin
```
## Quick Start
```bash
# If BrowserOS is not installed yet
browseros-cli install # downloads BrowserOS for your platform
# If BrowserOS is installed but not running
browseros-cli launch # opens BrowserOS, waits for server
# Configure the CLI (auto-discovers running BrowserOS)
browseros-cli init --auto # detects server URL and saves config
# Verify connection
browseros-cli health
```
### Other init modes
```bash
browseros-cli init <url> # non-interactive — pass URL directly
browseros-cli init # interactive — prompts for URL
```
Config is saved to `~/.config/browseros-cli/config.yaml`. The CLI also auto-discovers the server from `~/.browseros/server.json` (written by BrowserOS on startup).
## Usage
@@ -67,6 +100,12 @@ browseros-cli history recent
browseros-cli group list
```
## Use as MCP Server
BrowserOS exposes an MCP server that AI coding agents can connect to directly. The CLI is the easiest way to verify the connection and interact with tools from the terminal.
To connect Claude Code, Gemini CLI, or any MCP client, see the [MCP setup guide](https://docs.browseros.com/features/use-with-claude-code).
## Global Flags
| Flag | Env Var | Description |
@@ -77,9 +116,9 @@ browseros-cli group list
| `--debug` | `BOS_DEBUG=1` | Debug output |
| `--timeout, -t` | | Request timeout (default: 2m) |
Priority for server URL: `--server` flag > `BROWSEROS_URL` env > `~/.browseros/server.json` > config file
If no server URL is configured, the CLI exits with setup instructions pointing to `install`, `launch`, and `init`.
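The lookup order can be sketched as a single fall-through (illustrative TypeScript — the CLI itself is written in Go, and these field names are hypothetical stand-ins for the real config plumbing):

```typescript
// First non-empty source wins, in the priority order documented above.
// All names here are hypothetical — the real CLI is implemented in Go.
interface UrlSources {
  flag?: string      // --server
  env?: string       // BROWSEROS_URL
  discovery?: string // ~/.browseros/server.json
  config?: string    // ~/.config/browseros-cli/config.yaml
}

function resolveServerUrl(s: UrlSources): string | null {
  return s.flag ?? s.env ?? s.discovery ?? s.config ?? null
}
```

Returning `null` (rather than a guessed localhost port) is what lets the CLI print setup instructions instead of failing against the wrong server.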
## Testing
@@ -130,7 +169,9 @@ apps/cli/
│ └── config.go # Config file (~/.config/browseros-cli/config.yaml)
├── cmd/
│ ├── root.go # Root command, global flags
│ ├── init.go # Server URL configuration
│ ├── init.go # Server URL configuration (URL arg, --auto, interactive)
│ ├── install.go # install (download BrowserOS for current platform)
│ ├── launch.go # launch (find and start BrowserOS, wait for server)
│ ├── open.go # open (new_page / new_hidden_page)
│ ├── nav.go # nav, back, forward, reload
│ ├── pages.go # pages, active, close
@@ -163,4 +204,8 @@ The CLI communicates with BrowserOS via two HTTP POST requests per command:
1. `initialize` — MCP handshake
2. `tools/call` — execute the actual tool
All 53+ MCP tools are mapped to CLI commands.
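A sketch of the two payloads (field names follow the JSON-RPC 2.0 / MCP wire format; the protocol version string and example tool name are illustrative):

```typescript
// Payloads for the two POSTs described above. The CLI sends `initialize`
// first (MCP handshake), then `tools/call` with the mapped tool name.
function initializePayload(id: number) {
  return {
    jsonrpc: '2.0',
    id,
    method: 'initialize',
    params: {
      protocolVersion: '2025-03-26', // illustrative version string
      capabilities: {},
      clientInfo: { name: 'browseros-cli', version: '0.0.0' },
    },
  }
}

function toolCallPayload(id: number, name: string, args: Record<string, unknown>) {
  return {
    jsonrpc: '2.0',
    id,
    method: 'tools/call',
    params: { name, arguments: args },
  }
}

// e.g. a nav command would POST a body shaped like:
const body = toolCallPayload(2, 'navigate', { url: 'https://example.com' })
```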
## Links
- [Documentation](https://docs.browseros.com)
- [MCP Setup Guide](https://docs.browseros.com/features/use-with-claude-code)
- [Changelog](./CHANGELOG.md)

View File

@@ -148,7 +148,7 @@ func runPostInstall(path string, deb bool, dim *color.Color) {
// installMacOS mounts the DMG and copies BrowserOS.app to /Applications.
func installMacOS(dmgPath string, dim *color.Color) {
fmt.Println("Mounting disk image...")
mountOut, err := exec.Command("hdiutil", "attach", dmgPath, "-nobrowse", "-quiet").Output()
mountOut, err := exec.Command("hdiutil", "attach", dmgPath, "-nobrowse").Output()
if err != nil {
dim.Println("Could not mount DMG automatically.")
dim.Printf(" Open it manually: open %s\n", dmgPath)

View File

@@ -68,7 +68,6 @@ if (-not [Environment]::Is64BitOperatingSystem) {
$Tag = "browseros-cli-v$Version"
$Filename = "${Binary}_${Version}_windows_${Arch}.zip"
$Url = "https://github.com/$Repo/releases/download/$Tag/$Filename"
$ChecksumUrl = "https://github.com/$Repo/releases/download/$Tag/checksums.txt"
$TmpDir = Join-Path ([System.IO.Path]::GetTempPath()) ("browseros-cli-install-" + [System.IO.Path]::GetRandomFileName())
try {
@@ -79,37 +78,6 @@ try {
Write-Host "Downloading $Url..."
Invoke-WebRequest -Uri $Url -OutFile $ZipPath -UseBasicParsing
$ChecksumPath = Join-Path $TmpDir "checksums.txt"
$ChecksumAvailable = $true
try {
Invoke-WebRequest -Uri $ChecksumUrl -OutFile $ChecksumPath -UseBasicParsing
} catch {
$ChecksumAvailable = $false
Write-Warning "Could not fetch checksums.txt; skipping checksum verification. $($_.Exception.Message)"
}
if ($ChecksumAvailable) {
$ExpectedChecksum = $null
foreach ($line in Get-Content $ChecksumPath) {
$parts = $line -split '\s+', 2
if ($parts.Length -eq 2 -and $parts[1] -eq $Filename) {
$ExpectedChecksum = $parts[0].ToLowerInvariant()
break
}
}
if ($ExpectedChecksum) {
$ActualChecksum = (Get-FileHash -Path $ZipPath -Algorithm SHA256).Hash.ToLowerInvariant()
if ($ActualChecksum -ne $ExpectedChecksum) {
Write-Error "Checksum mismatch (expected $ExpectedChecksum, got $ActualChecksum)"
exit 1
}
Write-Host "Checksum verified."
} else {
Write-Warning "Checksum not found in checksums.txt; skipping checksum verification."
}
}
Expand-Archive -Path $ZipPath -DestinationPath $TmpDir -Force
$Exe = Get-ChildItem -Path $TmpDir -Filter "$Binary.exe" -File -Recurse | Select-Object -First 1

View File

@@ -1,6 +1,8 @@
# BrowserOS Eval
[![License: AGPL v3](https://img.shields.io/badge/License-AGPL%20v3-blue.svg)](../../../../LICENSE)
Evaluation framework for benchmarking BrowserOS browser automation agents. Runs tasks from standard datasets ([WebVoyager](https://arxiv.org/abs/2401.13919), [Mind2Web](https://arxiv.org/abs/2306.06070)), captures trajectories with screenshots, and grades results automatically.
## Prerequisites

View File

@@ -0,0 +1,181 @@
# BrowserOS Server
MCP server and AI agent loop powering BrowserOS browser automation. This is the core backend — it connects to Chromium via CDP, exposes 53+ MCP tools, and runs the AI agent that interprets natural language into browser actions.
> **Runtime:** [Bun](https://bun.sh) · **Framework:** [Hono](https://hono.dev) · **AI:** [Vercel AI SDK](https://sdk.vercel.ai) · **License:** [AGPL-3.0](../../../../LICENSE)
## Architecture
```
┌──────────────────────────────────────────────────────────────────────┐
│ MCP Clients │
│ (Agent UI, Claude Code, Gemini CLI, browseros-cli) │
└──────────────────────────────────────────────────────────────────────┘
│ HTTP / SSE / StreamableHTTP
┌──────────────────────────────────────────────────────────────────────┐
│ BrowserOS Server (Bun) │
│ │
│ /mcp ─────── MCP tool endpoints (53+ tools) │
│ /chat ────── Agent streaming (AI SDK) │
│ /health ─── Health check │
│ │
│ ┌─────────────────────────────────────────────────────────────┐ │
│ │ Agent Loop │ │
│ │ ├── Multi-provider AI SDK (OpenAI, Anthropic, Google, ...) │ │
│ │ ├── Session & conversation management │ │
│ │ ├── Context overflow handling + compaction │ │
│ │ └── MCP client for external tool servers │ │
│ └─────────────────────────────────────────────────────────────┘ │
│ │
│ ┌────────────────────┐ ┌────────────────────────────────────┐ │
│ │ CDP Tools │ │ Controller Tools │ │
│ │ (screenshots, │ │ (tabs, bookmarks, history, │ │
│ │ DOM, network, │ │ navigation, tab groups) │ │
│ │ console, input) │ │ │ │
│ └────────────────────┘ └────────────────────────────────────┘ │
└──────────────────────────────────────────────────────────────────────┘
│ │
│ Chrome DevTools Protocol │ WebSocket
▼ ▼
┌─────────────────────┐ ┌─────────────────────────────────┐
│ Chromium CDP │ │ Controller Extension │
│ (port 9000) │ │ (port 9300) │
│ │ │ │
│ DOM, network, │ │ chrome.tabs, chrome.history, │
│ input, screenshots │ │ chrome.bookmarks │
└─────────────────────┘ └─────────────────────────────────┘
```
## MCP Tools
53+ tools organized by category:
| Category | Tools |
|----------|-------|
| **Navigation** | `new_page`, `navigate`, `go_back`, `go_forward`, `reload` |
| **Input** | `click`, `type`, `press_key`, `hover`, `scroll`, `drag`, `fill`, `clear`, `focus`, `check`, `uncheck`, `select_option`, `upload_file` |
| **Observation** | `take_snapshot`, `take_enhanced_snapshot`, `extract_text`, `extract_links` |
| **Screenshots** | `take_screenshot`, `save_screenshot` |
| **Evaluation** | `evaluate_script` |
| **Pages** | `list_pages`, `active_page`, `close_page`, `new_hidden_page` |
| **Windows** | `window_list`, `window_create`, `window_close`, `window_activate` |
| **Bookmarks** | `bookmark_list`, `bookmark_create`, `bookmark_remove`, `bookmark_update`, `bookmark_move`, `bookmark_search` |
| **History** | `history_search`, `history_recent`, `history_delete`, `history_delete_range` |
| **Tab Groups** | `group_list`, `group_create`, `group_update`, `group_ungroup`, `group_close` |
| **Filesystem** | `ls`, `read`, `write`, `edit`, `find`, `grep`, `bash` |
| **Memory** | `read_core`, `update_core`, `read_soul`, `update_soul`, `search_memory`, `write_memory` |
| **DOM** | `dom`, `dom_search` |
| **Console** | `get_console_messages` |
| **Other** | `browseros_info`, `handle_dialog`, `wait_for`, `download`, `export_pdf`, `output_file`, `nudges` |
## Agent Loop
The agent loop uses the [Vercel AI SDK](https://sdk.vercel.ai) to orchestrate multi-step browser automation:
- **Multi-provider support** — OpenAI, Anthropic, Google, Azure, Bedrock, OpenRouter, Ollama, LM Studio, and any OpenAI-compatible endpoint
- **Session management** — conversations persist in a local SQLite database
- **Context overflow handling** — automatic message compaction when context windows fill up
- **MCP client** — connects to external MCP servers for additional tool access (40+ app integrations)
- **Tool adapter** — bridges MCP tool definitions to AI SDK tool format
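The tool-adapter step can be illustrated with a minimal sketch (simplified stand-in types — the real bridge in `src/agent/tool-adapter.ts` also handles schema and output conversion):

```typescript
// Map an MCP tool definition (name + JSON Schema input) onto the object
// shape an AI SDK ToolSet expects. `callTool` stands in for the MCP client.
interface McpToolDef {
  name: string
  description?: string
  inputSchema: object
}

type CallTool = (name: string, args: Record<string, unknown>) => Promise<string>

function adaptToolSet(defs: McpToolDef[], callTool: CallTool) {
  return Object.fromEntries(
    defs.map((def) => [
      def.name,
      {
        description: def.description ?? def.name,
        inputSchema: def.inputSchema,
        // Forward execution back to the MCP server under the original name.
        execute: (args: Record<string, unknown>) => callTool(def.name, args),
      },
    ]),
  )
}
```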
### Provider Factory
The provider factory (`src/agent/provider-factory.ts`) creates AI SDK providers from runtime configuration, supporting hot-swapping between providers without restart.
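A hedged sketch of that shape (hypothetical names and default URLs — consult `src/agent/provider-factory.ts` for the real mapping):

```typescript
// Resolve a base URL from runtime config on every call, so switching
// providers in settings takes effect without restarting the server.
interface ProviderConfig {
  provider: string // 'openai' | 'anthropic' | 'ollama' | ...
  baseUrl?: string // explicit override, e.g. an OpenAI-compatible endpoint
}

function resolveBaseUrl(cfg: ProviderConfig): string {
  if (cfg.baseUrl) return cfg.baseUrl
  switch (cfg.provider) {
    case 'openai':
      return 'https://api.openai.com/v1'
    case 'anthropic':
      return 'https://api.anthropic.com'
    case 'ollama':
      return 'http://localhost:11434/v1'
    default:
      throw new Error(`unknown provider: ${cfg.provider}`)
  }
}
```

Because the factory is invoked per request rather than at startup, hot-swapping providers amounts to passing a different config object.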
## Skills System
Skills are custom instruction sets that shape agent behavior:
- **Catalog** (`src/skills/catalog.ts`) — registry of available skills
- **Defaults** (`src/skills/defaults/`) — built-in skill definitions
- **Loader** (`src/skills/loader.ts`) — loads skills from local and remote sources
- **Remote sync** (`src/skills/remote-sync.ts`) — syncs skills from the BrowserOS cloud
## Graph Executor (Workflows)
The graph executor (`src/graph/executor.ts`) runs visual workflow graphs built in the BrowserOS workflow editor. Each node in the graph maps to agent actions, conditionals, or data transformations.
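In miniature, executing such a graph is a walk from node to node, where each node mutates shared state and names its successor (node shapes invented for illustration):

```typescript
// Walk a workflow graph: each node updates shared state and returns the id
// of the next node, or null to stop. A conditional simply picks between ids.
interface GraphNode {
  id: string
  run(state: Record<string, unknown>): string | null
}

function executeGraph(nodes: GraphNode[], startId: string) {
  const byId = new Map(nodes.map((n) => [n.id, n]))
  const state: Record<string, unknown> = {}
  let current: string | null = startId
  while (current !== null) {
    const node = byId.get(current)
    if (!node) throw new Error(`missing node: ${current}`)
    current = node.run(state)
  }
  return state
}
```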
## Directory Structure
```
apps/server/
├── src/
│ ├── index.ts # Server entry point
│ ├── main.ts # Server initialization
│ ├── api/ # HTTP route handlers
│ ├── agent/ # Agent loop
│ │ ├── ai-sdk-agent.ts # Main agent implementation
│ │ ├── provider-factory.ts # LLM provider factory
│ │ ├── session-store.ts # Conversation persistence
│ │ ├── compaction.ts # Context window management
│ │ ├── mcp-builder.ts # External MCP client setup
│ │ └── tool-adapter.ts # MCP → AI SDK tool bridge
│ ├── browser/ # Browser connection layer
│ ├── tools/ # MCP tool implementations
│ │ ├── navigation.ts
│ │ ├── input.ts
│ │ ├── snapshot.ts
│ │ ├── memory/
│ │ ├── filesystem/
│ │ └── ...
│ ├── skills/ # Skills system
│ ├── graph/ # Workflow graph executor
│ ├── lib/ # Shared utilities
│ └── rpc.ts # JSON-RPC type definitions
├── tests/
│ ├── tools/ # Tool-level tests
│ ├── sdk/ # SDK integration tests
│ └── server.integration.test.ts
├── graph/ # Workflow graph definitions
└── package.json
```
## Development
### Prerequisites
- [Bun](https://bun.sh) runtime
- A running BrowserOS instance (for CDP and controller connections)
### Setup
```bash
# Copy environment files
cp .env.example .env.development
# Start the server (with hot reload)
bun run start
```
See the [agent monorepo README](../../README.md) for full environment variable reference and `process-compose` setup.
### Testing
```bash
bun run test:tools # Tool-level tests
bun run test:integration # Full integration tests (requires running BrowserOS)
bun run test:sdk # SDK integration tests
```
### Building
```bash
# Build cross-platform server binaries
bun run build
# Build for specific targets
bun scripts/build/server.ts --target=darwin-arm64,linux-x64
# Build without uploading to R2
bun scripts/build/server.ts --target=all --no-upload
```
## Ports
| Port | Env Variable | Purpose |
|------|-------------|---------|
| 9100 | `BROWSEROS_SERVER_PORT` | HTTP server (MCP, chat, health) |
| 9000 | `BROWSEROS_CDP_PORT` | Chromium CDP (server connects as client) |
| 9300 | `BROWSEROS_EXTENSION_PORT` | WebSocket for controller extension |
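The defaults and env overrides in the table can be resolved with a small helper (a sketch — the actual lookup lives in server startup code):

```typescript
// Read a port from the environment, falling back to the documented default.
function resolvePort(envVar: string, fallback: number): number {
  const raw = process.env[envVar]
  const parsed = raw === undefined ? Number.NaN : Number.parseInt(raw, 10)
  return Number.isFinite(parsed) ? parsed : fallback
}

const ports = {
  server: resolvePort('BROWSEROS_SERVER_PORT', 9100), // MCP, chat, health
  cdp: resolvePort('BROWSEROS_CDP_PORT', 9000), // Chromium CDP
  extension: resolvePort('BROWSEROS_EXTENSION_PORT', 9300), // controller WS
}
```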

View File

@@ -1,6 +1,6 @@
{
"name": "@browseros/server",
"version": "0.0.79",
"version": "0.0.80",
"description": "BrowserOS server",
"type": "module",
"main": "./src/index.ts",

View File

@@ -54,8 +54,14 @@ export class AiSdkAgent {
private _messages: UIMessage[],
private _mcpClients: Array<{ close(): Promise<void> }>,
private conversationId: string,
private _toolNames: Set<string>,
) {}
/** Tool names registered on this agent — used to sanitize messages during session rebuilds. */
get toolNames(): Set<string> {
return this._toolNames
}
static async create(config: AiSdkAgentConfig): Promise<AiSdkAgent> {
const contextWindow =
config.resolvedConfig.contextWindowSize ??
@@ -92,10 +98,15 @@ export class AiSdkAgent {
}
// Build browser tools from the unified tool registry
const originPageId = config.browserContext?.activeTab?.pageId
const allBrowserTools = buildBrowserToolSet(
config.registry,
config.browser,
config.resolvedConfig.workingDir,
{
origin: config.resolvedConfig.origin,
originPageId,
},
)
const browserTools = config.resolvedConfig.chatMode
? Object.fromEntries(
@@ -155,10 +166,11 @@ export class AiSdkAgent {
}
}
// Add filesystem tools — skip in chat mode (read-only) and when no workspace is selected
const filesystemTools =
!config.resolvedConfig.chatMode && config.resolvedConfig.workingDir
? buildFilesystemToolSet(config.resolvedConfig.workingDir)
: {}
const memoryTools = config.resolvedConfig.chatMode
? {}
: buildMemoryToolSet()
@@ -205,6 +217,7 @@ export class AiSdkAgent {
connectedApps: config.browserContext?.enabledMcpServers,
declinedApps: config.resolvedConfig.declinedApps,
skillsCatalog,
origin: config.resolvedConfig.origin,
})
// Configure compaction for context window management
@@ -263,6 +276,7 @@ export class AiSdkAgent {
[],
clients,
config.resolvedConfig.conversationId,
new Set(Object.keys(tools)),
)
}

View File

@@ -44,3 +44,37 @@ export function hasMessageContent(message: UIMessage): boolean {
export function filterValidMessages(messages: UIMessage[]): UIMessage[] {
return messages.filter(hasMessageContent)
}
/**
* Remove tool parts that reference tools not present in the given toolset.
*
* When a session is rebuilt with a different set of tools (e.g., workspace
* removed mid-conversation or MCP server disconnected), the carried-over
* message history may contain tool parts for tools that no longer exist.
* The AI SDK validates messages against the current toolset and rejects
* parts with no matching schema.
*
* Tool parts use the type format `tool-${toolName}` (static tools) or
* `dynamic-tool` (dynamic tools). This function filters out static tool
* parts whose tool name is not in the provided set.
*/
export function sanitizeMessagesForToolset(
messages: UIMessage[],
toolNames: Set<string>,
): UIMessage[] {
return messages
.map((msg) => {
const filteredParts = msg.parts.filter((part) => {
// Static tool parts have type `tool-${toolName}`
if (typeof part.type === 'string' && part.type.startsWith('tool-')) {
const toolName = part.type.slice(5)
if (!toolNames.has(toolName)) return false
}
return true
})
if (filteredParts.length === msg.parts.length) return msg
return { ...msg, parts: filteredParts }
})
.filter(hasMessageContent)
}

View File

@@ -193,19 +193,41 @@ You have persistent memory across sessions and an evolving personality. See the
function getExecution(
_exclude: Set<string>,
options?: BuildSystemPromptOptions,
): string {
const isNewTab = options?.origin === 'newtab'
let executionContent = `<execution>
## Execution
### Philosophy
- Execute tasks end-to-end. Don't delegate ("I found the button, you can click it").
- Don't ask permission for routine steps. Act, then report.
- Do not refuse by default, attempt tasks even when outcomes are uncertain.
- For ambiguous/unclear requests, ask one targeted clarifying question.`
if (isNewTab) {
executionContent += `
### New-Tab Origin Rules
You are operating from the user's **New Tab page**. The active tab (Page ID from Browser Context) is the chat UI itself.
**CRITICAL RULES:**
1. **NEVER call \`navigate_page\` on the active tab** — this would destroy the chat UI and navigate the user away.
2. **NEVER call \`close_page\` on the active tab** — same reason.
3. For ALL browsing tasks (including single-page lookups), use \`new_page\` (background) to open URLs.
4. For single-page lookups, open a background tab, extract data, then close it.
5. For multi-page research, open background tabs and group them with \`group_tabs\`.
### Multi-tab workflow`
} else {
executionContent += `
- Stay on the current page for single-page tasks. Use \`navigate_page\` to move within one tab.
### Multi-tab workflow`
}
executionContent += `
When a task requires working on multiple pages simultaneously:
1. **Inform the user** that you're creating background tabs for the task.
2. **Open new tabs in background** using \`new_page\` (opens in background by default) — never steal focus from the user's current tab.
@@ -216,15 +238,23 @@ When a task requires working on multiple pages simultaneously:
7. **Never force-switch the user's active tab.** If you need user interaction on a background tab (e.g., login, CAPTCHA), tell the user which tab needs attention and let them switch manually.
8. **Never navigate the user's current tab** during a multi-tab task. The current tab is the user's anchor — use it only for reading (snapshots, content extraction). All navigation should happen on background tabs.
**Do NOT use \`create_hidden_window\` or \`new_hidden_page\` for user-requested tasks.** Hidden windows are invisible to the user and cannot be screenshotted. Use \`new_page\` (background mode) instead — tabs appear in the user's tab strip and can be inspected. Reserve hidden windows for automated/scheduled runs only.`
if (!isNewTab) {
executionContent += `
For single-page lookups (e.g., "go to X and read Y"), use \`navigate_page\` on the current tab. Only create new tabs when the task requires multiple pages open simultaneously.`
}
executionContent += `
### Tab retry discipline
When a background tab fails (404, wrong content, unexpected redirect):
- **Navigate the existing tab** to the correct URL with \`navigate_page\` — do NOT open a new tab for retries.
- If you must abandon a tab, close it with \`close_page\` before opening a replacement.
- Never let orphan tabs accumulate — each task should end with only the tabs that contain useful content.`
executionContent += `
### Observe → Act → Verify
- **Before acting**: Take a snapshot to get interactive element IDs.
@@ -241,13 +271,38 @@ Some tools automatically include a fresh snapshot in their response (labeled "Ad
- 2FA → notify user, pause for completion
- Page not found (404) or server error (500) → report the error to the user
</execution>`
return executionContent
}
// -----------------------------------------------------------------------------
// section: tool-selection
// -----------------------------------------------------------------------------
function getToolSelection(
_exclude: Set<string>,
options?: BuildSystemPromptOptions,
): string {
const isNewTab = options?.origin === 'newtab'
const navTable = isNewTab
? `### Navigation: single-tab vs multi-tab
| Task | Approach |
|------|----------|
| Look up one page | \`new_page\` (background) → extract data → \`close_page\` |
| Research across multiple sites | \`new_page\` (background) for each site + \`group_tabs\` |
| Compare two pages side by side | \`new_page\` (background) × 2 + \`group_tabs\` |
| User says "open a new tab" | \`new_page\` (background) |
**Remember:** The active tab is the New Tab chat UI. Never navigate or close it.`
: `### Navigation: single-tab vs multi-tab
| Task | Approach |
|------|----------|
| Look up one page | \`navigate_page\` on current tab |
| Research across multiple sites | \`new_page\` (background) for each site + \`group_tabs\` |
| Compare two pages side by side | \`new_page\` (background) × 2 + \`group_tabs\` |
| User says "open a new tab" | \`new_page\` (background) — don't steal focus |`
return `<tool_selection>
## Tool Selection
@@ -268,13 +323,7 @@ function getToolSelection(): string {
- Prefer \`fill\` over \`press_key\` for text input. Use \`press_key\` for keyboard shortcuts (Enter, Escape, Tab, Ctrl+A, etc.).
- Prefer clicking links over \`navigate_page\` when the link is visible. Use \`navigate_page\` for direct URL access, back/forward, or reload.
${navTable}
### Connected apps: Strata vs browser
When an app is Connected, prefer Strata tools over browser automation. Strata is faster, more reliable, and works without navigating away from the user's current page.
@@ -668,7 +717,10 @@ const promptSections: Record<string, PromptSectionFn> = {
security: getSecurity,
capabilities: getCapabilities,
execution: getExecution,
'tool-selection': (
_exclude: Set<string>,
options?: BuildSystemPromptOptions,
) => getToolSelection(_exclude, options),
'external-integrations': getExternalIntegrations,
'error-recovery': getErrorRecovery,
'memory-and-identity': getMemoryAndIdentity,
@@ -695,6 +747,8 @@ export interface BuildSystemPromptOptions {
/** Apps the user previously declined to connect (chose "do it manually"). */
declinedApps?: string[]
skillsCatalog?: string
/** Where the chat session originates from — determines navigation behavior. */
origin?: 'sidepanel' | 'newtab'
}
export function buildSystemPrompt(options?: BuildSystemPromptOptions): string {

View File

@@ -9,6 +9,8 @@ export interface AgentSession {
browserContext?: BrowserContext
/** MCP server names used when the session was created, for change detection. */
mcpServerKey?: string
/** Workspace directory when the session was created, for change detection. */
workingDir?: string
}
export class SessionStore {

View File

@@ -38,12 +38,14 @@ function contentToModelOutput(
export function buildBrowserToolSet(
registry: ToolRegistry,
browser: Browser,
workingDir: string | undefined,
session?: { origin?: 'sidepanel' | 'newtab'; originPageId?: number },
): ToolSet {
const toolSet: ToolSet = {}
const ctx: ToolContext = {
browser,
directories: { workingDir },
session,
}
for (const def of registry.all()) {

View File

@@ -35,7 +35,7 @@ export interface ResolvedAgentConfig {
reasoningSummary?: string
contextWindowSize?: number
userSystemPrompt?: string
workingDir: string
workingDir?: string
/** Whether the model supports image inputs (vision). Defaults to true. */
supportsImages?: boolean
/** Eval mode - enables window management tools. Defaults to false. */
@@ -46,6 +46,8 @@ export interface ResolvedAgentConfig {
isScheduledTask?: boolean
/** Apps the user previously declined to connect via MCP (chose "do it manually"). */
declinedApps?: string[]
/** Where the chat session originates from — determines navigation behavior. */
origin?: 'sidepanel' | 'newtab'
/** BrowserOS installation ID for credit-based tracking. */
browserosId?: string
}


@@ -4,16 +4,16 @@
* SPDX-License-Identifier: AGPL-3.0-or-later
*/
import { mkdir, utimes } from 'node:fs/promises'
import path from 'node:path'
import { createAgentUIStreamResponse, type UIMessage } from 'ai'
import { AiSdkAgent } from '../../agent/ai-sdk-agent'
import { formatUserMessage } from '../../agent/format-message'
import { filterValidMessages } from '../../agent/message-validation'
import type { SessionStore } from '../../agent/session-store'
import {
filterValidMessages,
sanitizeMessagesForToolset,
} from '../../agent/message-validation'
import type { AgentSession, SessionStore } from '../../agent/session-store'
import type { ResolvedAgentConfig } from '../../agent/types'
import type { Browser } from '../../browser/browser'
import { getSessionsDir } from '../../lib/browseros-dir'
import type { KlavisClient } from '../../lib/clients/klavis/klavis-client'
import { resolveLLMConfig } from '../../lib/clients/llm/config'
import { logger } from '../../lib/logger'
@@ -40,8 +40,6 @@ export class ChatService {
const llmConfig = await resolveLLMConfig(request, this.deps.browserosId)
const workingDir = await this.resolveSessionDir(request)
const agentConfig: ResolvedAgentConfig = {
conversationId: request.conversationId,
provider: llmConfig.provider,
@@ -59,16 +57,18 @@ export class ChatService {
reasoningSummary: request.reasoningSummary,
contextWindowSize: request.contextWindowSize,
userSystemPrompt: request.userSystemPrompt,
workingDir,
workingDir: request.userWorkingDir,
supportsImages: request.supportsImages,
chatMode: request.mode === 'chat',
isScheduledTask: request.isScheduledTask,
origin: request.origin,
declinedApps: request.declinedApps,
browserosId: this.deps.browserosId,
}
let session = sessionStore.get(request.conversationId)
let isNewSession = false
const contextChanges: string[] = []
// Build a stable key from enabled MCP servers for change detection
const mcpServerKey = this.buildMcpServerKey(request.browserContext)
@@ -80,23 +80,68 @@ export class ChatService {
previous: session.mcpServerKey,
current: mcpServerKey,
})
const previousMessages = session.agent.messages
await session.agent.dispose()
sessionStore.remove(request.conversationId)
const previousMcpKey = session.mcpServerKey
session = await this.rebuildSession(
session,
request,
agentConfig,
mcpServerKey,
)
const browserContext = await this.resolvePageIds(request.browserContext)
const agent = await AiSdkAgent.create({
resolvedConfig: agentConfig,
browser: this.deps.browser,
registry: this.deps.registry,
browserContext,
klavisClient: this.deps.klavisClient,
browserosId: this.deps.browserosId,
aiSdkDevtoolsEnabled: this.deps.aiSdkDevtoolsEnabled,
const oldServers = new Set(
(previousMcpKey ?? '').split(',').filter(Boolean),
)
const newServers = new Set(mcpServerKey.split(',').filter(Boolean))
const added = [...newServers].filter((s) => !oldServers.has(s))
const removed = [...oldServers].filter((s) => !newServers.has(s))
const parts: string[] = []
if (removed.length > 0) {
parts.push(
`The following app integrations were disconnected: ${removed.join(', ')}. Their tools are no longer available.`,
)
}
if (added.length > 0) {
parts.push(
`The following app integrations were connected: ${added.join(', ')}. Their tools are now available.`,
)
}
if (parts.length === 0) {
parts.push(
'Connected app integrations changed during this conversation. Use only tools that are currently registered.',
)
}
contextChanges.push(parts.join(' '))
}
// Detect workspace change mid-conversation → rebuild session
if (session && session.workingDir !== request.userWorkingDir) {
logger.info('Workspace changed mid-conversation, rebuilding session', {
conversationId: request.conversationId,
previous: session.workingDir ?? '(none)',
current: request.userWorkingDir ?? '(none)',
})
session = { agent, browserContext, mcpServerKey }
session.agent.messages = previousMessages
sessionStore.set(request.conversationId, session)
const previousWorkingDir = session.workingDir
session = await this.rebuildSession(
session,
request,
agentConfig,
mcpServerKey,
)
if (!request.userWorkingDir) {
contextChanges.push(
'The user disconnected the workspace during this conversation. Filesystem tools (filesystem_read, filesystem_write, filesystem_edit, filesystem_bash, filesystem_grep, filesystem_find, filesystem_ls) are no longer available. Return all output directly in chat. If the user asks for file operations, suggest they select a working directory from the chat toolbar.',
)
} else if (!previousWorkingDir) {
contextChanges.push(
`The user connected a workspace during this conversation. Filesystem tools are now available. Working directory: ${request.userWorkingDir}`,
)
} else {
contextChanges.push(
`The user switched workspace during this conversation. Filesystem tools now use the new working directory: ${request.userWorkingDir}`,
)
}
}
if (!session) {
@@ -141,7 +186,13 @@ export class ChatService {
browserosId: this.deps.browserosId,
aiSdkDevtoolsEnabled: this.deps.aiSdkDevtoolsEnabled,
})
session = { agent, hiddenWindowId, browserContext, mcpServerKey }
session = {
agent,
hiddenWindowId,
browserContext,
mcpServerKey,
workingDir: request.userWorkingDir,
}
sessionStore.set(request.conversationId, session)
}
@@ -175,7 +226,13 @@ export class ChatService {
request.selectedText,
request.selectedTextSource,
)
session.agent.appendUserMessage(userContent)
// Prepend tool-change context when session was rebuilt mid-conversation
const contextPrefix =
contextChanges.length > 0
? `${contextChanges.map((c) => `[Context: ${c}]`).join('\n')}\n\n`
: ''
session.agent.appendUserMessage(contextPrefix + userContent)
return createAgentUIStreamResponse({
agent: session.agent.toolLoopAgent,
@@ -262,22 +319,44 @@ export class ChatService {
})
}
private async rebuildSession(
session: AgentSession,
request: ChatRequest,
agentConfig: ResolvedAgentConfig,
mcpServerKey: string,
): Promise<AgentSession> {
const previousMessages = session.agent.messages
await session.agent.dispose()
this.deps.sessionStore.remove(request.conversationId)
const browserContext = await this.resolvePageIds(request.browserContext)
const agent = await AiSdkAgent.create({
resolvedConfig: agentConfig,
browser: this.deps.browser,
registry: this.deps.registry,
browserContext,
klavisClient: this.deps.klavisClient,
browserosId: this.deps.browserosId,
aiSdkDevtoolsEnabled: this.deps.aiSdkDevtoolsEnabled,
})
const newSession: AgentSession = {
agent,
browserContext,
mcpServerKey,
workingDir: request.userWorkingDir,
}
newSession.agent.messages = sanitizeMessagesForToolset(
previousMessages,
agent.toolNames,
)
this.deps.sessionStore.set(request.conversationId, newSession)
return newSession
}
private buildMcpServerKey(browserContext?: BrowserContext): string {
const managed = browserContext?.enabledMcpServers?.slice().sort() ?? []
const custom =
browserContext?.customMcpServers?.map((s) => s.url).sort() ?? []
return [...managed, ...custom].join(',')
}
private async resolveSessionDir(request: ChatRequest): Promise<string> {
const dir = request.userWorkingDir
? request.userWorkingDir
: path.join(getSessionsDir(), request.conversationId)
await mkdir(dir, { recursive: true })
if (!request.userWorkingDir) {
const now = new Date()
await utimes(dir, now, now).catch(() => {})
}
return dir
}
}
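The context-prefix convention used in the hunk above can be isolated as a small pure function. This is a minimal sketch; `prefixWithContext` is a hypothetical helper name, not part of the actual `ChatService`, which inlines this logic before `appendUserMessage`.

```typescript
// Hypothetical helper: wrap each accumulated context change in a bracketed
// [Context: ...] line and prepend the joined block to the next user message.
function prefixWithContext(
  contextChanges: string[],
  userContent: string,
): string {
  if (contextChanges.length === 0) return userContent
  const prefix = contextChanges.map((c) => `[Context: ${c}]`).join('\n')
  return `${prefix}\n\n${userContent}`
}
```

Keeping the prefix inside the user message (rather than a separate system message) means the change notice travels with the conversation history if the session is rebuilt again.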


@@ -45,6 +45,7 @@ export const ChatRequestSchema = AgentLLMConfigSchema.extend({
userWorkingDir: z.string().min(1).optional(),
supportsImages: z.boolean().optional().default(true),
mode: z.enum(['chat', 'agent']).optional().default('agent'),
origin: z.enum(['sidepanel', 'newtab']).optional().default('sidepanel'),
declinedApps: z.array(z.string()).optional(),
selectedText: z.string().optional(),
selectedTextSource: z


@@ -1,3 +1,4 @@
import { tmpdir } from 'node:os'
import { resolve } from 'node:path'
import type { z } from 'zod'
import type { Browser } from '../browser/browser'
@@ -18,13 +19,19 @@ export type ToolHandler = (
) => Promise<void>
export interface ToolDirectories {
workingDir: string
workingDir?: string
resourcesDir?: string
}
export interface ToolSessionContext {
origin?: 'sidepanel' | 'newtab'
originPageId?: number
}
export type ToolContext = {
browser: Browser
directories: ToolDirectories
session?: ToolSessionContext
}
export function resolveWorkingPath(
@@ -32,7 +39,7 @@ export function resolveWorkingPath(
targetPath: string,
cwd?: string,
): string {
return resolve(cwd ?? ctx.directories.workingDir, targetPath)
return resolve(cwd ?? ctx.directories.workingDir ?? tmpdir(), targetPath)
}
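The fallback chain in the changed `resolveWorkingPath` line reads: explicit `cwd` wins, then the session working directory, then the OS temp dir. A standalone sketch of that chain (simplified to take the directory directly instead of the full `ToolContext`):

```typescript
import { tmpdir } from 'node:os'
import { resolve } from 'node:path'

// Simplified sketch of the fallback: cwd > session workingDir > os.tmpdir().
// The real function takes a ToolContext; this version takes the directory
// directly for illustration.
function resolveWorkingPathSketch(
  workingDir: string | undefined,
  targetPath: string,
  cwd?: string,
): string {
  return resolve(cwd ?? workingDir ?? tmpdir(), targetPath)
}
```

The `tmpdir()` fallback is what lets tools keep functioning when no workspace is connected (`workingDir` is now optional throughout).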
export function defineTool<


@@ -88,6 +88,17 @@ export const navigate_page = defineTool({
return
}
if (
ctx.session?.origin === 'newtab' &&
ctx.session.originPageId !== undefined &&
args.page === ctx.session.originPageId
) {
response.error(
'Cannot navigate the origin tab in new-tab mode — this would destroy the chat UI. Use `new_page` to open a background tab instead.',
)
return
}
switch (args.action) {
case 'url':
await ctx.browser.goto(args.page, args.url as string)
@@ -266,6 +277,17 @@ export const close_page = defineTool({
action: z.literal('close_page'),
}),
handler: async (args, ctx, response) => {
if (
ctx.session?.origin === 'newtab' &&
ctx.session.originPageId !== undefined &&
args.page === ctx.session.originPageId
) {
response.error(
'Cannot close the origin tab in new-tab mode — this would destroy the chat UI.',
)
return
}
await ctx.browser.closePage(args.page)
response.text(`Closed page ${args.page}`)
response.data({ page: args.page, action: 'close_page' })


@@ -1,4 +1,5 @@
import { mkdir, mkdtemp, rename, rm } from 'node:fs/promises'
import { tmpdir } from 'node:os'
import { join } from 'node:path'
import { z } from 'zod'
import { defineTool, resolveWorkingPath } from './framework'
@@ -121,10 +122,9 @@ export const download_file = defineTool({
}),
handler: async (args, ctx, response) => {
const resolvedDir = resolveWorkingPath(ctx, args.path, args.cwd)
await mkdir(ctx.directories.workingDir, { recursive: true })
const tempDir = await mkdtemp(
join(ctx.directories.workingDir, 'browseros-dl-'),
)
const baseDir = ctx.directories.workingDir ?? tmpdir()
await mkdir(baseDir, { recursive: true })
const tempDir = await mkdtemp(join(baseDir, 'browseros-dl-'))
try {
const { filePath, suggestedFilename } =


@@ -0,0 +1,299 @@
/**
* @license
* Copyright 2025 BrowserOS
*
* Message Validation — Test Suite
*
* Tests for sanitizeMessagesForToolset, which strips tool parts from
* carried-over messages when a session is rebuilt with a different toolset
* (e.g., workspace removed or MCP server disconnected mid-conversation).
*
* Without this sanitization, the AI SDK throws a validation error because
* it finds tool parts in the message history that have no matching schema.
*/
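The behavior this suite exercises can be sketched in a few lines. This is an illustrative reimplementation under assumed types, not the actual `message-validation` source; it covers the three properties the tests below check (strip unknown `tool-*` parts, drop messages left empty, return the same reference when nothing changed).

```typescript
// Assumed minimal shapes; the real code uses the AI SDK's UIMessage type.
type Part = { type: string; [k: string]: unknown }
type Message = { id: string; role: string; parts: Part[] }

// Keep a part unless it is a tool part whose tool name is missing from the
// active toolset; drop messages whose parts were all stripped.
function sanitizeSketch(messages: Message[], toolNames: Set<string>): Message[] {
  return messages
    .map((msg) => {
      const parts = msg.parts.filter((p) => {
        if (!p.type.startsWith('tool-')) return true
        return toolNames.has(p.type.slice('tool-'.length))
      })
      // Preserve the original reference when no filtering was needed.
      return parts.length === msg.parts.length ? msg : { ...msg, parts }
    })
    .filter((msg) => msg.parts.length > 0)
}
```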
import { describe, expect, it } from 'bun:test'
import type { UIMessage } from 'ai'
import {
hasMessageContent,
sanitizeMessagesForToolset,
} from '../../src/agent/message-validation'
// ---------------------------------------------------------------------------
// Helpers
// ---------------------------------------------------------------------------
function makeUserMessage(text: string, id?: string): UIMessage {
return {
id: id ?? crypto.randomUUID(),
role: 'user',
parts: [{ type: 'text', text }],
}
}
function makeAssistantMessage(
parts: UIMessage['parts'],
id?: string,
): UIMessage {
return {
id: id ?? crypto.randomUUID(),
role: 'assistant',
parts,
}
}
// ---------------------------------------------------------------------------
// sanitizeMessagesForToolset
// ---------------------------------------------------------------------------
describe('sanitizeMessagesForToolset', () => {
const allTools = new Set([
'navigate_page',
'click',
'take_snapshot',
'filesystem_read',
'filesystem_write',
'memory_search',
])
const noFilesystemTools = new Set([
'navigate_page',
'click',
'take_snapshot',
'memory_search',
])
it('preserves messages with no tool parts', () => {
const messages: UIMessage[] = [
makeUserMessage('Hello'),
makeAssistantMessage([{ type: 'text', text: 'Hi there!' }]),
]
const result = sanitizeMessagesForToolset(messages, noFilesystemTools)
expect(result).toHaveLength(2)
expect(result[0].parts).toHaveLength(1)
expect(result[1].parts).toHaveLength(1)
})
it('preserves tool parts when tool is in the toolset', () => {
const messages: UIMessage[] = [
makeAssistantMessage([
{ type: 'text', text: 'Taking a snapshot...' },
{
type: 'tool-take_snapshot',
toolCallId: 'call-1',
toolName: 'take_snapshot',
state: 'result',
input: { page: 1 },
output: { content: 'snapshot data' },
} as unknown as UIMessage['parts'][number],
]),
]
const result = sanitizeMessagesForToolset(messages, allTools)
expect(result).toHaveLength(1)
expect(result[0].parts).toHaveLength(2)
})
it('strips tool parts when tool is NOT in the toolset', () => {
const messages: UIMessage[] = [
makeAssistantMessage([
{ type: 'text', text: 'Reading file...' },
{
type: 'tool-filesystem_read',
toolCallId: 'call-1',
toolName: 'filesystem_read',
state: 'result',
input: { path: '/tmp/test.txt' },
output: { content: 'file data' },
} as unknown as UIMessage['parts'][number],
]),
]
const result = sanitizeMessagesForToolset(messages, noFilesystemTools)
expect(result).toHaveLength(1)
// Only the text part should remain
expect(result[0].parts).toHaveLength(1)
expect(result[0].parts[0].type).toBe('text')
})
it('strips multiple removed tool parts from same message', () => {
const messages: UIMessage[] = [
makeAssistantMessage([
{ type: 'text', text: 'Working on files...' },
{
type: 'tool-filesystem_read',
toolCallId: 'call-1',
toolName: 'filesystem_read',
state: 'result',
input: { path: '/tmp/a.txt' },
output: {},
} as unknown as UIMessage['parts'][number],
{
type: 'tool-filesystem_write',
toolCallId: 'call-2',
toolName: 'filesystem_write',
state: 'result',
input: { path: '/tmp/b.txt', content: 'data' },
output: {},
} as unknown as UIMessage['parts'][number],
]),
]
const result = sanitizeMessagesForToolset(messages, noFilesystemTools)
expect(result).toHaveLength(1)
expect(result[0].parts).toHaveLength(1)
expect(result[0].parts[0].type).toBe('text')
})
it('keeps browser tool parts while removing filesystem tool parts', () => {
const messages: UIMessage[] = [
makeAssistantMessage([
{
type: 'tool-take_snapshot',
toolCallId: 'call-1',
toolName: 'take_snapshot',
state: 'result',
input: { page: 1 },
output: {},
} as unknown as UIMessage['parts'][number],
{
type: 'tool-filesystem_read',
toolCallId: 'call-2',
toolName: 'filesystem_read',
state: 'result',
input: { path: '/tmp/test.txt' },
output: {},
} as unknown as UIMessage['parts'][number],
]),
]
const result = sanitizeMessagesForToolset(messages, noFilesystemTools)
expect(result).toHaveLength(1)
expect(result[0].parts).toHaveLength(1)
expect((result[0].parts[0] as { type: string }).type).toBe(
'tool-take_snapshot',
)
})
it('removes messages that become empty after stripping', () => {
const messages: UIMessage[] = [
makeUserMessage('Read this file'),
makeAssistantMessage([
{
type: 'tool-filesystem_read',
toolCallId: 'call-1',
toolName: 'filesystem_read',
state: 'result',
input: { path: '/tmp/test.txt' },
output: {},
} as unknown as UIMessage['parts'][number],
]),
]
const result = sanitizeMessagesForToolset(messages, noFilesystemTools)
// The assistant message had only a tool part — after stripping, it's empty
// and should be filtered out by hasMessageContent
expect(result).toHaveLength(1)
expect(result[0].role).toBe('user')
})
it('preserves non-tool part types (reasoning, step-start, file)', () => {
const messages: UIMessage[] = [
makeAssistantMessage([
{ type: 'text', text: 'Let me think...' },
{
type: 'reasoning',
reasoning: 'Analyzing the request',
} as unknown as UIMessage['parts'][number],
{
type: 'step-start',
} as unknown as UIMessage['parts'][number],
]),
]
const result = sanitizeMessagesForToolset(messages, noFilesystemTools)
expect(result).toHaveLength(1)
expect(result[0].parts).toHaveLength(3)
})
it('returns same message references when no filtering needed', () => {
const messages: UIMessage[] = [
makeUserMessage('Hello'),
makeAssistantMessage([{ type: 'text', text: 'Hi!' }]),
]
const result = sanitizeMessagesForToolset(messages, noFilesystemTools)
// Messages that don't need filtering should be the same reference
expect(result[0]).toBe(messages[0])
expect(result[1]).toBe(messages[1])
})
it('handles empty message array', () => {
const result = sanitizeMessagesForToolset([], noFilesystemTools)
expect(result).toHaveLength(0)
})
it('handles empty toolset (all tools removed)', () => {
const messages: UIMessage[] = [
makeAssistantMessage([
{ type: 'text', text: 'Working...' },
{
type: 'tool-navigate_page',
toolCallId: 'call-1',
toolName: 'navigate_page',
state: 'result',
input: {},
output: {},
} as unknown as UIMessage['parts'][number],
]),
]
const result = sanitizeMessagesForToolset(messages, new Set())
expect(result).toHaveLength(1)
expect(result[0].parts).toHaveLength(1)
expect(result[0].parts[0].type).toBe('text')
})
})
// ---------------------------------------------------------------------------
// hasMessageContent (existing function, verify edge cases)
// ---------------------------------------------------------------------------
describe('hasMessageContent', () => {
it('rejects messages with empty parts array', () => {
const msg: UIMessage = {
id: '1',
role: 'assistant',
parts: [],
}
expect(hasMessageContent(msg)).toBe(false)
})
it('rejects messages with only whitespace text', () => {
const msg: UIMessage = {
id: '1',
role: 'assistant',
parts: [{ type: 'text', text: ' \n ' }],
}
expect(hasMessageContent(msg)).toBe(false)
})
it('accepts messages with non-text parts', () => {
const msg: UIMessage = {
id: '1',
role: 'assistant',
parts: [
{
type: 'tool-click',
toolCallId: 'call-1',
toolName: 'click',
state: 'result',
input: {},
output: {},
} as unknown as UIMessage['parts'][number],
],
}
expect(hasMessageContent(msg)).toBe(true)
})
})


@@ -1195,3 +1195,120 @@ describe('nudges', () => {
expect(prompt).toContain('at most once')
})
})
// ---------------------------------------------------------------------------
// 15. NEW-TAB ORIGIN
//
// Why: When the user chats from the new-tab page, the active tab IS the chat
// UI. The agent must never navigate or close it. The prompt must adapt its
// execution and tool-selection sections to prohibit origin tab navigation
// and default all lookups to new_page (background).
// ---------------------------------------------------------------------------
describe('new-tab origin', () => {
/** Build a prompt with newtab origin */
function buildNewTab(overrides?: Partial<BuildSystemPromptOptions>): string {
return buildSystemPrompt({
workspaceDir: '/home/user/workspace',
soulContent: 'Be helpful and concise.',
origin: 'newtab',
...overrides,
})
}
// --- Execution section ---
it('includes New-Tab Origin Rules when origin is newtab', () => {
const prompt = buildNewTab()
expect(prompt).toContain('New-Tab Origin Rules')
expect(prompt).toContain('New Tab page')
expect(prompt).toContain('chat UI itself')
})
it('prohibits navigate_page on active tab in newtab mode', () => {
const prompt = buildNewTab()
expect(prompt).toContain('NEVER call `navigate_page` on the active tab')
})
it('prohibits close_page on active tab in newtab mode', () => {
const prompt = buildNewTab()
expect(prompt).toContain('NEVER call `close_page` on the active tab')
})
it('requires new_page for all browsing in newtab mode', () => {
const prompt = buildNewTab()
expect(prompt).toContain(
'For ALL browsing tasks (including single-page lookups), use `new_page`',
)
})
it('does NOT include single-tab navigate_page guidance in newtab mode', () => {
// The sidepanel prompt says "use navigate_page on the current tab" for
// single-page lookups. This must NOT appear in newtab mode.
const prompt = buildNewTab()
expect(prompt).not.toContain(
'For single-page lookups (e.g., "go to X and read Y"), use `navigate_page` on the current tab',
)
})
it('does NOT include "Stay on the current page" in newtab mode', () => {
const prompt = buildNewTab()
expect(prompt).not.toContain(
'Stay on the current page for single-page tasks',
)
})
it('still includes common execution sections in newtab mode', () => {
// Newtab mode should still have multi-tab workflow, observe-act-verify, etc.
const prompt = buildNewTab()
expect(prompt).toContain('Multi-tab workflow')
expect(prompt).toContain('Observe → Act → Verify')
expect(prompt).toContain('Tab retry discipline')
expect(prompt).toContain('CAPTCHA')
})
// --- Sidepanel (default) should NOT have newtab rules ---
it('does NOT include New-Tab Origin Rules in sidepanel mode', () => {
const prompt = buildRegular({ origin: 'sidepanel' })
expect(prompt).not.toContain('New-Tab Origin Rules')
})
it('does NOT include New-Tab Origin Rules when origin is undefined', () => {
const prompt = buildRegular()
expect(prompt).not.toContain('New-Tab Origin Rules')
})
it('includes single-tab navigate_page guidance in sidepanel mode', () => {
const prompt = buildRegular({ origin: 'sidepanel' })
expect(prompt).toContain(
'For single-page lookups (e.g., "go to X and read Y"), use `navigate_page` on the current tab',
)
})
// --- Tool selection section ---
it('tool selection table uses new_page for lookups in newtab mode', () => {
const prompt = buildNewTab()
expect(prompt).toContain(
'`new_page` (background) → extract data → `close_page`',
)
})
it('tool selection includes reminder about active tab in newtab mode', () => {
const prompt = buildNewTab()
expect(prompt).toContain(
'The active tab is the New Tab chat UI. Never navigate or close it.',
)
})
it('tool selection table uses navigate_page for lookups in sidepanel mode', () => {
const prompt = buildRegular({ origin: 'sidepanel' })
expect(prompt).toContain('`navigate_page` on current tab')
})
it('tool selection does NOT have newtab reminder in sidepanel mode', () => {
const prompt = buildRegular({ origin: 'sidepanel' })
expect(prompt).not.toContain('The active tab is the New Tab chat UI')
})
})


@@ -0,0 +1,270 @@
/**
* New-tab origin navigation guards.
*
* When the chat session originates from the new-tab page, navigate_page and
* close_page must reject attempts to act on the origin tab. These are
* integration tests that run against a real browser to verify the guards
* work end-to-end through executeTool.
*/
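The guard condition both tools repeat can be factored into a single predicate. A minimal sketch; `isProtectedOriginTab` is a hypothetical helper name, not present in the actual tools, which inline the check:

```typescript
// Session context shape from the tool framework.
interface ToolSessionContext {
  origin?: 'sidepanel' | 'newtab'
  originPageId?: number
}

// True only when the session came from the new-tab page AND the target page
// is the known origin tab (the one hosting the chat UI).
function isProtectedOriginTab(
  session: ToolSessionContext | undefined,
  pageId: number,
): boolean {
  return (
    session?.origin === 'newtab' &&
    session.originPageId !== undefined &&
    pageId === session.originPageId
  )
}
```

Note that an undefined session (older clients) and sidepanel sessions both fall through to normal behavior, which is what the backwards-compat test below verifies.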
import { describe, it } from 'bun:test'
import assert from 'node:assert'
import type { ToolContext, ToolDefinition } from '../../src/tools/framework'
import { executeTool } from '../../src/tools/framework'
import { close_page, navigate_page, new_page } from '../../src/tools/navigation'
import type { ToolResult } from '../../src/tools/response'
import { withBrowser } from '../__helpers__/with-browser'
function textOf(result: {
content: { type: string; text?: string }[]
}): string {
return result.content
.filter((c) => c.type === 'text')
.map((c) => c.text)
.join('\n')
}
function structuredOf<T>(result: { structuredContent?: unknown }): T {
assert.ok(result.structuredContent, 'Expected structuredContent')
return result.structuredContent as T
}
describe('new-tab origin navigation guards', () => {
// Helper: execute a tool with newtab session context
function executeWithSession(
ctx: { browser: ToolContext['browser'] },
tool: ToolDefinition,
args: unknown,
session: ToolContext['session'],
): Promise<ToolResult> {
const signal = AbortSignal.timeout(30_000)
return executeTool(
tool,
args,
{
browser: ctx.browser,
directories: { workingDir: process.cwd() },
session,
},
signal,
)
}
// -------------------------------------------------------------------------
// navigate_page guards
// -------------------------------------------------------------------------
it('navigate_page rejects navigation on origin tab in newtab mode', async () => {
await withBrowser(async ({ browser }) => {
// Use a new page as the simulated "origin tab"
const setupResult = await executeTool(
new_page,
{ url: 'about:blank' },
{ browser, directories: { workingDir: process.cwd() } },
AbortSignal.timeout(30_000),
)
const originPageId = structuredOf<{ pageId: number }>(setupResult).pageId
const result = await executeWithSession(
{ browser },
navigate_page,
{ page: originPageId, action: 'url', url: 'https://example.com' },
{ origin: 'newtab', originPageId },
)
assert.ok(result.isError, 'Expected navigate_page to be rejected')
assert.ok(
textOf(result).includes('Cannot navigate the origin tab'),
`Expected origin tab error, got: ${textOf(result)}`,
)
// Cleanup
await executeTool(
close_page,
{ page: originPageId },
{ browser, directories: { workingDir: process.cwd() } },
AbortSignal.timeout(30_000),
)
})
}, 60_000)
it('navigate_page allows navigation on non-origin tab in newtab mode', async () => {
await withBrowser(async ({ browser }) => {
const originResult = await executeTool(
new_page,
{ url: 'about:blank' },
{ browser, directories: { workingDir: process.cwd() } },
AbortSignal.timeout(30_000),
)
const originPageId = structuredOf<{ pageId: number }>(originResult).pageId
// Open a second tab — this is NOT the origin tab
const otherResult = await executeTool(
new_page,
{ url: 'about:blank' },
{ browser, directories: { workingDir: process.cwd() } },
AbortSignal.timeout(30_000),
)
const otherPageId = structuredOf<{ pageId: number }>(otherResult).pageId
const result = await executeWithSession(
{ browser },
navigate_page,
{ page: otherPageId, action: 'url', url: 'https://example.com' },
{ origin: 'newtab', originPageId },
)
assert.ok(
!result.isError,
`Expected success, got error: ${textOf(result)}`,
)
assert.ok(textOf(result).includes('Navigated to'))
// Cleanup
const noSession = { browser, directories: { workingDir: process.cwd() } }
await executeTool(
close_page,
{ page: otherPageId },
noSession,
AbortSignal.timeout(30_000),
)
await executeTool(
close_page,
{ page: originPageId },
noSession,
AbortSignal.timeout(30_000),
)
})
}, 60_000)
it('navigate_page works normally in sidepanel mode', async () => {
await withBrowser(async ({ browser }) => {
const setupResult = await executeTool(
new_page,
{ url: 'about:blank' },
{ browser, directories: { workingDir: process.cwd() } },
AbortSignal.timeout(30_000),
)
const pageId = structuredOf<{ pageId: number }>(setupResult).pageId
const result = await executeWithSession(
{ browser },
navigate_page,
{ page: pageId, action: 'url', url: 'https://example.com' },
{ origin: 'sidepanel', originPageId: pageId },
)
assert.ok(
!result.isError,
`Expected success, got error: ${textOf(result)}`,
)
assert.ok(textOf(result).includes('Navigated to'))
await executeTool(
close_page,
{ page: pageId },
{ browser, directories: { workingDir: process.cwd() } },
AbortSignal.timeout(30_000),
)
})
}, 60_000)
it('navigate_page works when session is undefined (backwards compat)', async () => {
await withBrowser(async ({ browser, execute }) => {
const setupResult = await execute(new_page, { url: 'about:blank' })
const pageId = structuredOf<{ pageId: number }>(setupResult).pageId
// execute() from withBrowser passes no session — simulates old clients
const result = await execute(navigate_page, {
page: pageId,
action: 'url',
url: 'https://example.com',
})
assert.ok(
!result.isError,
`Expected success, got error: ${textOf(result)}`,
)
await execute(close_page, { page: pageId })
})
}, 60_000)
// -------------------------------------------------------------------------
// close_page guards
// -------------------------------------------------------------------------
it('close_page rejects closing origin tab in newtab mode', async () => {
await withBrowser(async ({ browser }) => {
const setupResult = await executeTool(
new_page,
{ url: 'about:blank' },
{ browser, directories: { workingDir: process.cwd() } },
AbortSignal.timeout(30_000),
)
const originPageId = structuredOf<{ pageId: number }>(setupResult).pageId
const result = await executeWithSession(
{ browser },
close_page,
{ page: originPageId },
{ origin: 'newtab', originPageId },
)
assert.ok(result.isError, 'Expected close_page to be rejected')
assert.ok(
textOf(result).includes('Cannot close the origin tab'),
`Expected origin tab error, got: ${textOf(result)}`,
)
// Clean up the page we created (without newtab guard)
await executeTool(
close_page,
{ page: originPageId },
{ browser, directories: { workingDir: process.cwd() } },
AbortSignal.timeout(30_000),
)
})
}, 60_000)
it('close_page allows closing non-origin tab in newtab mode', async () => {
await withBrowser(async ({ browser }) => {
const originResult = await executeTool(
new_page,
{ url: 'about:blank' },
{ browser, directories: { workingDir: process.cwd() } },
AbortSignal.timeout(30_000),
)
const originPageId = structuredOf<{ pageId: number }>(originResult).pageId
const otherResult = await executeTool(
new_page,
{ url: 'about:blank' },
{ browser, directories: { workingDir: process.cwd() } },
AbortSignal.timeout(30_000),
)
const otherPageId = structuredOf<{ pageId: number }>(otherResult).pageId
const result = await executeWithSession(
{ browser },
close_page,
{ page: otherPageId },
{ origin: 'newtab', originPageId },
)
assert.ok(
!result.isError,
`Expected success, got error: ${textOf(result)}`,
)
assert.ok(textOf(result).includes(`Closed page ${otherPageId}`))
// Cleanup origin page
await executeTool(
close_page,
{ page: originPageId },
{ browser, directories: { workingDir: process.cwd() } },
AbortSignal.timeout(30_000),
)
})
}, 60_000)
})


@@ -51,9 +51,9 @@
"@radix-ui/react-use-controllable-state": "^1.2.2",
"@sentry/react": "^10.31.0",
"@sentry/vite-plugin": "^4.6.1",
"@tanstack/query-async-storage-persister": "^5.90.21",
"@tanstack/react-query": "^5.90.19",
"@tanstack/react-query-persist-client": "^5.90.21",
"@tanstack/query-async-storage-persister": "^5.95.2",
"@tanstack/react-query": "^5.95.2",
"@tanstack/react-query-persist-client": "^5.95.2",
"@types/cytoscape": "^3.31.0",
"@types/dompurify": "^3.2.0",
"@webext-core/messaging": "^2.3.0",
@@ -76,8 +76,8 @@
"eventsource-parser": "^3.0.6",
"graphql": "^16.12.0",
"hono": "^4.12.3",
"idb-keyval": "^6.2.2",
"klavis": "^2.15.0",
"localforage": "^1.10.0",
"lucide-react": "^0.562.0",
"motion": "^12.23.24",
"nanoid": "^5.1.6",
@@ -170,7 +170,7 @@
},
"apps/server": {
"name": "@browseros/server",
"version": "0.0.79",
"version": "0.0.80",
"bin": {
"browseros-server": "./src/index.ts",
},
@@ -1780,15 +1780,15 @@
"@tailwindcss/vite": ["@tailwindcss/vite@4.1.18", "", { "dependencies": { "@tailwindcss/node": "4.1.18", "@tailwindcss/oxide": "4.1.18", "tailwindcss": "4.1.18" }, "peerDependencies": { "vite": "^5.2.0 || ^6 || ^7" } }, "sha512-jVA+/UpKL1vRLg6Hkao5jldawNmRo7mQYrZtNHMIVpLfLhDml5nMRUo/8MwoX2vNXvnaXNNMedrMfMugAVX1nA=="],
"@tanstack/query-async-storage-persister": ["@tanstack/query-async-storage-persister@5.90.21", "", { "dependencies": { "@tanstack/query-core": "5.90.19", "@tanstack/query-persist-client-core": "5.91.18" } }, "sha512-edpZzybucsMxGiWOMy24io+5l4Lciw4bgv/N2EXQnSp0exS1siTOQbCAQET8jwStCEnaoEiS8ljChnfmnd2pkw=="],
"@tanstack/query-async-storage-persister": ["@tanstack/query-async-storage-persister@5.95.2", "", { "dependencies": { "@tanstack/query-core": "5.95.2", "@tanstack/query-persist-client-core": "5.95.2" } }, "sha512-ZhPIHH8J833OVZhEWwwdOk0uhY94d9Wgdnq97JoQx4Ui4xx4Dh6e7WPUrjlUWo88Yqi4Ij+T1o/VR7Vlbnkbjw=="],
"@tanstack/query-core": ["@tanstack/query-core@5.90.19", "", {}, "sha512-GLW5sjPVIvH491VV1ufddnfldyVB+teCnpPIvweEfkpRx7CfUmUGhoh9cdcUKBh/KwVxk22aNEDxeTsvmyB/WA=="],
"@tanstack/query-core": ["@tanstack/query-core@5.95.2", "", {}, "sha512-o4T8vZHZET4Bib3jZ/tCW9/7080urD4c+0/AUaYVpIqOsr7y0reBc1oX3ttNaSW5mYyvZHctiQ/UOP2PfdmFEQ=="],
"@tanstack/query-persist-client-core": ["@tanstack/query-persist-client-core@5.91.18", "", { "dependencies": { "@tanstack/query-core": "5.90.19" } }, "sha512-1FNvccVTFZph07dtA/4p5PRAVKfqVLPPxA8BXUoYjPOZP6T4qY1asItVkUFtUr6kBu48i0DBnEEZQLmK82BIFw=="],
"@tanstack/query-persist-client-core": ["@tanstack/query-persist-client-core@5.95.2", "", { "dependencies": { "@tanstack/query-core": "5.95.2" } }, "sha512-Opfj34WZ594YXpEcZEs8WBiyPGrjrKlGILfk/Ss283uwWQ36C5nX3tRY/bBiXmM82KWauUuNvahwGwiyco/8cQ=="],
"@tanstack/react-query": ["@tanstack/react-query@5.90.19", "", { "dependencies": { "@tanstack/query-core": "5.90.19" }, "peerDependencies": { "react": "^18 || ^19" } }, "sha512-qTZRZ4QyTzQc+M0IzrbKHxSeISUmRB3RPGmao5bT+sI6ayxSRhn0FXEnT5Hg3as8SBFcRosrXXRFB+yAcxVxJQ=="],
"@tanstack/react-query": ["@tanstack/react-query@5.95.2", "", { "dependencies": { "@tanstack/query-core": "5.95.2" }, "peerDependencies": { "react": "^18 || ^19" } }, "sha512-/wGkvLj/st5Ud1Q76KF1uFxScV7WeqN1slQx5280ycwAyYkIPGaRZAEgHxe3bjirSd5Zpwkj6zNcR4cqYni/ZA=="],
"@tanstack/react-query-persist-client": ["@tanstack/react-query-persist-client@5.90.21", "", { "dependencies": { "@tanstack/query-persist-client-core": "5.91.18" }, "peerDependencies": { "@tanstack/react-query": "^5.90.19", "react": "^18 || ^19" } }, "sha512-ix9fVeS96QZxaMPRUwf+k6RlNLJxvu0WSjQp9nPiosxRqquxz0tJ5ErMsclZO9Q/jmVhoFm4FKEZ8mfTLBMoiQ=="],
"@tanstack/react-query-persist-client": ["@tanstack/react-query-persist-client@5.95.2", "", { "dependencies": { "@tanstack/query-persist-client-core": "5.95.2" }, "peerDependencies": { "@tanstack/react-query": "^5.95.2", "react": "^18 || ^19" } }, "sha512-i3fvzD8gaLgQyFvRc/+iSUr60aL31tMN+5QM11zdPRg0K9CirIQjHD7WgXFBnD29KJDvcjcv7OrIBaPwZ+H9xw=="],
"@theguild/federation-composition": ["@theguild/federation-composition@0.21.3", "", { "dependencies": { "constant-case": "^3.0.4", "debug": "4.4.3", "json5": "^2.2.3", "lodash.sortby": "^4.7.0" }, "peerDependencies": { "graphql": "^16.0.0" } }, "sha512-+LlHTa4UbRpZBog3ggAxjYIFvdfH3UMvvBUptur19TMWkqU4+n3GmN+mDjejU+dyBXIG27c25RsiQP1HyvM99g=="],
@@ -2960,6 +2960,8 @@
"iconv-lite": ["iconv-lite@0.7.2", "", { "dependencies": { "safer-buffer": ">= 2.1.2 < 3.0.0" } }, "sha512-im9DjEDQ55s9fL4EYzOAv0yMqmMBSZp6G0VvFyTMPKWxiSBHUj9NW/qqLmXUwXrrM7AvqSlTCfvqRb0cM8yYqw=="],
"idb-keyval": ["idb-keyval@6.2.2", "", {}, "sha512-yjD9nARJ/jb1g+CvD0tlhUHOrJ9Sy0P8T9MF3YaLlHnSRpwPfpTX0XIvpmw3gAJUmEu3FiICLBDPXVwyEvrleg=="],
"ieee754": ["ieee754@1.2.1", "", {}, "sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA=="],
"ignore": ["ignore@7.0.5", "", {}, "sha512-Hs59xBNfUIunMFgWAbGX5cq6893IbWg4KnrjbYwX3tx0ztorVgTDA6B2sxf8ejHJ4wz8BqGUMYlnzNBer5NvGg=="],
@@ -3186,7 +3188,7 @@
"lib0": ["lib0@0.2.117", "", { "dependencies": { "isomorphic.js": "^0.2.4" }, "bin": { "0serve": "bin/0serve.js", "0gentesthtml": "bin/gentesthtml.js", "0ecdsa-generate-keypair": "bin/0ecdsa-generate-keypair.js" } }, "sha512-DeXj9X5xDCjgKLU/7RR+/HQEVzuuEUiwldwOGsHK/sfAfELGWEyTcf0x+uOvCvK3O2zPmZePXWL85vtia6GyZw=="],
"lie": ["lie@3.1.1", "", { "dependencies": { "immediate": "~3.0.5" } }, "sha512-RiNhHysUjhrDQntfYSfY4MU24coXXdEOgw9WGcKHNeEwffDYbF//u87M1EWaMGzuFoSbqW0C9C6lEEhDOAswfw=="],
"lie": ["lie@3.3.0", "", { "dependencies": { "immediate": "~3.0.5" } }, "sha512-UaiMJzeWRlEujzAuw5LokY1L5ecNQYZKfmyZ9L7wDHb/p5etKaxXhohBcrw0EYby+G/NA52vRSN4N39dxHAIwQ=="],
"lighthouse-logger": ["lighthouse-logger@2.0.2", "", { "dependencies": { "debug": "^4.4.1", "marky": "^1.2.2" } }, "sha512-vWl2+u5jgOQuZR55Z1WM0XDdrJT6mzMP8zHUct7xTlWhuQs+eV0g+QL0RQdFjT54zVmbhLCP8vIVpy1wGn/gCg=="],
@@ -3232,8 +3234,6 @@
"local-pkg": ["local-pkg@1.1.2", "", { "dependencies": { "mlly": "^1.7.4", "pkg-types": "^2.3.0", "quansync": "^0.2.11" } }, "sha512-arhlxbFRmoQHl33a0Zkle/YWlmNwoyt6QNZEIJcqNbdrsix5Lvc4HyyI3EnwxTYlZYc32EbYrQ8SzEZ7dqgg9A=="],
"localforage": ["localforage@1.10.0", "", { "dependencies": { "lie": "3.1.1" } }, "sha512-14/H1aX7hzBBmmh7sGPd+AOMkkIrHM3Z1PAyGgZigA1H1p5O5ANnMyWzvpAETtG68/dC4pC0ncy3+PPGzXZHPg=="],
"locate-path": ["locate-path@6.0.0", "", { "dependencies": { "p-locate": "^5.0.0" } }, "sha512-iPZK6eYjbxRu3uB4/WZ3EsEIMJFMqAoopl3R+zuq0UjcAm/MO6KCweDgPfP3elTztoKP3KtnVHxTn2NHBSDVUw=="],
"lodash": ["lodash@4.17.23", "", {}, "sha512-LgVTMpQtIopCi79SJeDiP0TfWi5CNEc/L/aRdTh3yIvmZXTnheWpKjSZhnvMl8iXbC1tFg9gdHHDMLoV7CnG+w=="],
@@ -4552,7 +4552,7 @@
"@better-auth/core/zod": ["zod@4.3.5", "", {}, "sha512-k7Nwx6vuWx1IJ9Bjuf4Zt1PEllcwe7cls3VNzm4CQ1/hgtFUK2bRNG3rvnpPUhFjmqJKAKtjV576KnUkHocg/g=="],
"@browseros/agent/@types/bun": ["@types/bun@1.3.10", "", { "dependencies": { "bun-types": "1.3.10" } }, "sha512-0+rlrUrOrTSskibryHbvQkDOWRJwJZqZlxrUs1u4oOoTln8+WIXBPmAuCF35SWB2z4Zl3E84Nl/D0P7803nigQ=="],
"@browseros/agent/@types/bun": ["@types/bun@1.3.11", "", { "dependencies": { "bun-types": "1.3.11" } }, "sha512-5vPne5QvtpjGpsGYXiFyycfpDF2ECyPcTSsFBMa0fraoxiQyMJ3SmuQIGhzPg2WJuWxVBoxWJ2kClYTcw/4fAg=="],
"@browseros/agent/zod": ["zod@4.3.5", "", {}, "sha512-k7Nwx6vuWx1IJ9Bjuf4Zt1PEllcwe7cls3VNzm4CQ1/hgtFUK2bRNG3rvnpPUhFjmqJKAKtjV576KnUkHocg/g=="],
@@ -5104,8 +5104,6 @@
"jest-worker/supports-color": ["supports-color@8.1.1", "", { "dependencies": { "has-flag": "^4.0.0" } }, "sha512-MpUEN2OodtUzxvKQl72cUF7RQ5EiHsGvSsVG0ia9c5RbWGL2CI4C7EpPS8UTBIplnlzZiNuV56w+FuNxy3ty2Q=="],
"jszip/lie": ["lie@3.3.0", "", { "dependencies": { "immediate": "~3.0.5" } }, "sha512-UaiMJzeWRlEujzAuw5LokY1L5ecNQYZKfmyZ9L7wDHb/p5etKaxXhohBcrw0EYby+G/NA52vRSN4N39dxHAIwQ=="],
"jszip/readable-stream": ["readable-stream@2.3.8", "", { "dependencies": { "core-util-is": "~1.0.0", "inherits": "~2.0.3", "isarray": "~1.0.0", "process-nextick-args": "~2.0.0", "safe-buffer": "~5.1.1", "string_decoder": "~1.1.1", "util-deprecate": "~1.0.1" } }, "sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA=="],
"jwa/safe-buffer": ["safe-buffer@5.2.1", "", {}, "sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ=="],
@@ -5306,7 +5304,7 @@
"@aws-crypto/util/@smithy/util-utf8/@smithy/util-buffer-from": ["@smithy/util-buffer-from@2.2.0", "", { "dependencies": { "@smithy/is-array-buffer": "^2.2.0", "tslib": "^2.6.2" } }, "sha512-IJdWBbTcMQ6DA0gdNhh/BwrLkDR+ADW5Kr1aZmd4k3DIF6ezMV4R2NIAmT08wQJ3yUK82thHWmC/TnK/wpMMIA=="],
"@browseros/agent/@types/bun/bun-types": ["bun-types@1.3.10", "", { "dependencies": { "@types/node": "*" } }, "sha512-tcpfCCl6XWo6nCVnpcVrxQ+9AYN1iqMIzgrSKYMB/fjLtV2eyAVEg7AxQJuCq/26R6HpKWykQXuSOq/21RYcbg=="],
"@browseros/agent/@types/bun/bun-types": ["bun-types@1.3.11", "", { "dependencies": { "@types/node": "*" } }, "sha512-1KGPpoxQWl9f6wcZh57LvrPIInQMn2TQ7jsgxqpRzg+l0QPOFvJVH7HmvHo/AiPgwXy+/Thf6Ov3EdVn1vOabg=="],
"@browseros/eval/@aws-sdk/client-s3/@aws-sdk/core": ["@aws-sdk/core@3.973.23", "", { "dependencies": { "@aws-sdk/types": "^3.973.6", "@aws-sdk/xml-builder": "^3.972.15", "@smithy/core": "^3.23.12", "@smithy/node-config-provider": "^4.3.12", "@smithy/property-provider": "^4.2.12", "@smithy/protocol-http": "^5.3.12", "@smithy/signature-v4": "^5.3.12", "@smithy/smithy-client": "^4.12.7", "@smithy/types": "^4.13.1", "@smithy/util-base64": "^4.3.2", "@smithy/util-middleware": "^4.2.12", "@smithy/util-utf8": "^4.2.2", "tslib": "^2.6.2" } }, "sha512-aoJncvD1XvloZ9JLnKqTRL9dBy+Szkryoag9VT+V1TqsuUgIxV9cnBVM/hrDi2vE8bDqLiDR8nirdRcCdtJu0w=="],


@@ -1,7 +1,17 @@
# @browseros-ai/agent-sdk
[![npm version](https://img.shields.io/npm/v/@browseros-ai/agent-sdk)](https://www.npmjs.com/package/@browseros-ai/agent-sdk)
[![License: AGPL v3](https://img.shields.io/badge/License-AGPL%20v3-blue.svg)](../../../../LICENSE)
Browser automation SDK for BrowserOS — navigate, interact, extract data, and verify page state using natural language.
Build automations that describe *what* to do, not *how* to do it. The SDK connects to a running BrowserOS instance and translates natural language instructions into browser actions using your choice of LLM provider.
## Prerequisites
- A running [BrowserOS](https://browseros.com) instance
- An API key for at least one [supported LLM provider](#llm-providers)
## Installation
```bash
@@ -17,7 +27,7 @@ import { Agent } from '@browseros-ai/agent-sdk'
import { z } from 'zod'
const agent = new Agent({
  url: 'http://localhost:3000',
  url: 'http://localhost:9100',
  llm: {
    provider: 'openai',
    apiKey: process.env.OPENAI_API_KEY,
@@ -42,6 +52,40 @@ const { data } = await agent.extract('get all product names and prices', {
const { success, reason } = await agent.verify('user is logged in')
```
## Multi-Step Example
Combine navigation, actions, extraction, and verification for end-to-end automation:
```typescript
import { Agent } from '@browseros-ai/agent-sdk'
import { z } from 'zod'

const agent = new Agent({
  url: 'http://localhost:9100',
  llm: { provider: 'anthropic', apiKey: process.env.ANTHROPIC_API_KEY },
})

// 1. Navigate
await agent.nav('https://news.ycombinator.com')

// 2. Extract data
const { data: stories } = await agent.extract('get the top 5 stories with title, points, and link', {
  schema: z.array(z.object({
    title: z.string(),
    points: z.number(),
    link: z.string(),
  })),
})

// 3. Act on extracted data
await agent.act(`click on the story titled "${stories[0].title}"`)

// 4. Verify the result
const { success } = await agent.verify('the story page or external link has loaded')

console.log({ stories, navigationSuccess: success })
```
## API Reference
### `new Agent(options)`
@@ -105,8 +149,6 @@ const { success, reason } = await agent.verify('the form was submitted successfu
## LLM Providers
Supported providers:
| Provider | Config |
|----------|--------|
| OpenAI | `{ provider: 'openai', apiKey: '...' }` |
@@ -121,11 +163,11 @@ Supported providers:
## Progress Events
Track agent operations:
Track agent operations in real time:
```typescript
const agent = new Agent({
  url: 'http://localhost:3000',
  url: 'http://localhost:9100',
  onProgress: (event) => {
    console.log(`[${event.type}] ${event.message}`)
  },
@@ -154,6 +196,13 @@ try {
}
```
## Links
- [Documentation](https://docs.browseros.com)
- [GitHub](https://github.com/browseros-ai/BrowserOS)
- [Changelog](./CHANGELOG.md)
- [Discord](https://discord.gg/YKwjt5vuKr)
## License
AGPL-3.0-or-later
[AGPL-3.0-or-later](../../../../LICENSE)


@@ -0,0 +1,67 @@
# @browseros/cdp-protocol
Type-safe Chrome DevTools Protocol bindings for BrowserOS.
> **Internal package** — auto-generated TypeScript types and API wrappers for all CDP domains. Used by `@browseros/server` to communicate with Chromium.
## Usage
Import domain types or domain API wrappers using subpath exports:
```typescript
// Import type definitions for a CDP domain
import type { NavigateParams, NavigateReturn } from '@browseros/cdp-protocol/domains/page'
// Import the API wrapper for a domain
import { PageAPI } from '@browseros/cdp-protocol/domain-apis/page'
// Core protocol API
import { ProtocolAPI } from '@browseros/cdp-protocol/protocol-api'
// Factory function
import { createAPI } from '@browseros/cdp-protocol/create-api'
```
## Supported Domains
All standard Chrome DevTools Protocol domains are supported:
| Category | Domains |
|----------|---------|
| **Page & DOM** | Page, DOM, DOMDebugger, DOMSnapshot, DOMStorage, CSS, Overlay |
| **Network** | Network, Fetch, IO, ServiceWorker, CacheStorage |
| **Input & Interaction** | Input, Emulation, DeviceOrientation, DeviceAccess |
| **JavaScript** | Runtime, Debugger, Console, Profiler, HeapProfiler |
| **Browser** | Browser, Target, Inspector, Extensions, PWA |
| **Performance** | Performance, PerformanceTimeline, Tracing, Memory |
| **Media** | Media, WebAudio, Cast |
| **Security** | Security, WebAuthn, FedCm |
| **Storage** | IndexedDB, Storage, FileSystem |
| **Other** | Accessibility, Animation, Audits, Autofill, BackgroundService, BluetoothEmulation, EventBreakpoints, HeadlessExperimental, LayerTree, Log, Preload, Schema, SystemInfo, Tethering |
| **BrowserOS Custom** | Bookmarks, History |
## Structure
```
src/generated/
├── domains/ # Type definitions for each CDP domain
│ ├── page.ts
│ ├── dom.ts
│ ├── network.ts
│ └── ...
├── domain-apis/ # API wrapper classes for each domain
│ ├── page.ts
│ ├── dom.ts
│ ├── network.ts
│ └── ...
├── protocol-api.ts # Unified protocol API
└── create-api.ts # API factory
```
## Regenerating Types
Types are auto-generated from the CDP protocol specification. The generated output lives in `src/generated/` and should not be edited manually.
## License
[AGPL-3.0-or-later](../../../../LICENSE)


@@ -0,0 +1,133 @@
# BrowserOS Browser (Chromium Fork)
Custom Chromium build with AI agent integration, enhanced privacy patches, and native MCP support.
> Based on **Chromium 146.0.7680.31** · Built with Python 3.12+ · Licensed under [AGPL-3.0](../../LICENSE)
## What This Is
This package contains the BrowserOS browser build system — everything needed to fetch Chromium source, apply BrowserOS patches, and produce signed binaries for macOS, Windows, and Linux. The build system is a Python CLI that orchestrates the entire pipeline from source to distributable.
BrowserOS patches add:
- Native AI agent sidebar and new tab integration
- MCP server endpoints baked into the browser
- Enhanced privacy via [ungoogled-chromium](https://github.com/ungoogled-software/ungoogled-chromium) patches
- Custom branding, icons, and entitlements
- Keychain access group management (macOS)
- Sparkle auto-update framework (macOS)
## Prerequisites
| Requirement | Details |
|-------------|---------|
| **Disk space** | ~100 GB for Chromium source + build artifacts |
| **Python** | 3.12+ |
| **macOS** | Xcode + Command Line Tools |
| **Linux** | `build-essential`, `clang`, `lld`, and Chromium's [Linux deps](https://chromium.googlesource.com/chromium/src/+/main/docs/linux/build_instructions.md) |
| **Windows** | Visual Studio 2022, Windows SDK |
## Directory Structure
```
packages/browseros/
├── build/ # Build system (Python CLI)
│ ├── __main__.py # CLI entry point
│ ├── browseros.py # Main app definition
│ ├── modules/
│ │ ├── setup/ # Chromium source fetch and setup
│ │ ├── patches/ # Patch application logic
│ │ ├── apply/ # Apply patches to source tree
│ │ ├── extract/ # Extract patches from modified source
│ │ ├── feature/ # Feature flag management
│ │ ├── package/ # Binary packaging
│ │ ├── sign/ # Code signing (macOS, Windows)
│ │ ├── ota/ # Over-the-air update support
│ │ └── resources/ # Resource management
│ ├── config/ # Build configuration
│ └── features.yaml # Feature flag definitions
├── chromium_patches/ # BrowserOS patches applied to Chromium source
│ ├── chrome/browser/ # Browser UI and feature patches
│ ├── components/ # Component patches (e.g., os_crypt)
│ └── ... # Organized to mirror Chromium source tree
├── chromium_files/ # New files added to Chromium (not patches)
├── series_patches/ # Ordered patch series
├── resources/ # Icons, entitlements, signing resources
│ └── entitlements/ # macOS entitlements (app, helper, GPU, etc.)
├── tools/
│ └── bdev # Developer tool
├── CHROMIUM_VERSION # Pinned Chromium version (MAJOR.MINOR.BUILD.PATCH)
├── BASE_COMMIT # Base Chromium commit hash
├── pyproject.toml # Python project config
└── requirements.txt # Python dependencies
```
## Build System
The `browseros` CLI manages the full build lifecycle:
```bash
# Install the build system
pip install -e .
# Or use uv
uv pip install -e .
```
Key commands:
```bash
browseros setup # Fetch and prepare Chromium source
browseros apply # Apply all patches to Chromium source
browseros build # Build BrowserOS binary
browseros package # Package into distributable (DMG, installer, AppImage)
browseros sign # Code sign the binary (macOS/Windows)
```
## Patch System
BrowserOS applies patches on top of vanilla Chromium. Patches are organized in two directories:
- **`chromium_patches/`** — Individual file patches, organized to mirror the Chromium source tree. Each file here replaces or modifies the corresponding file in Chromium.
- **`series_patches/`** — Ordered patch series applied sequentially.
### Adding a New Patch
1. Make your changes in the Chromium source tree
2. Use `browseros extract` to pull changes back into patch format
3. Place the patch in the appropriate directory mirroring Chromium's structure
4. Test with a full `browseros apply && browseros build` cycle
### Chromium Version Pinning
The exact Chromium version is pinned in `CHROMIUM_VERSION`:
```
MAJOR=146
MINOR=0
BUILD=7680
PATCH=31
```
To update the base Chromium version, update this file and `BASE_COMMIT`, then resolve any patch conflicts.
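The key-value layout above maps directly to the dotted version string Chromium uses (`146.0.7680.31`). As a minimal sketch of how a reader of this file could work — `parseChromiumVersion` is a hypothetical helper, not part of the actual build system:

```go
package main

import (
	"fmt"
	"strings"
)

// parseChromiumVersion joins the MAJOR/MINOR/BUILD/PATCH keys of a
// CHROMIUM_VERSION file into Chromium's dotted version form.
func parseChromiumVersion(contents string) string {
	values := map[string]string{}
	for _, line := range strings.Split(contents, "\n") {
		// Each non-empty line is KEY=VALUE.
		key, value, ok := strings.Cut(strings.TrimSpace(line), "=")
		if ok {
			values[key] = value
		}
	}
	parts := []string{}
	for _, key := range []string{"MAJOR", "MINOR", "BUILD", "PATCH"} {
		parts = append(parts, values[key])
	}
	return strings.Join(parts, ".")
}

func main() {
	fmt.Println(parseChromiumVersion("MAJOR=146\nMINOR=0\nBUILD=7680\nPATCH=31\n"))
	// → 146.0.7680.31
}
```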
## Signing (macOS)
macOS builds require code signing for Keychain access, Gatekeeper, and notarization:
- Entitlements are in `resources/entitlements/` (app, helper, GPU, renderer, etc.)
- Designated requirements pin to Team ID for Keychain persistence across updates
- The signing module is at `build/modules/sign/macos.py`
## Feature Flags
Feature flags are defined in `features.yaml` and control which BrowserOS-specific features are compiled into the build. The feature module (`build/modules/feature/`) manages flag resolution at build time.
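For orientation, a flag entry in `features.yaml` might look roughly like the following. This is an illustrative sketch only — the actual flag names and schema live in `features.yaml` itself:

```yaml
# Hypothetical shape — check features.yaml for the real schema
features:
  agent_sidebar:
    enabled: true
    platforms: [mac, linux, windows]
```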
## Related Resources
- [Chromium Build Instructions](https://chromium.googlesource.com/chromium/src/+/main/docs/linux/build_instructions.md)
- [ungoogled-chromium](https://github.com/ungoogled-software/ungoogled-chromium) — upstream privacy patches
- [BrowserOS Agent Platform](../browseros-agent/) — the TypeScript/Go agent system that runs inside the browser


@@ -1,5 +0,0 @@
bros
bros-linux-amd64
bros-linux-arm64
bros-darwin-amd64
bros-darwin-arm64


@@ -1,27 +1,27 @@
BINARY := bdev
VERSION := $(shell git describe --tags --always --dirty 2>/dev/null || echo "dev")
LDFLAGS := -ldflags "-X main.version=$(VERSION)"
PREFIX ?= /usr/local/bin
VERSION ?= dev
.PHONY: build install clean test
.PHONY: build install clean test fmt
build:
	go build $(LDFLAGS) -o $(BINARY) .
	go build -ldflags "-X github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/cmd.Version=$(VERSION)" -o $(BINARY) .
install:
	go install $(LDFLAGS) .
clean:
	rm -f $(BINARY)
install: build
	mkdir -p $(PREFIX)
	cp $(BINARY) $(PREFIX)/$(BINARY)
ifneq ($(shell uname -s),Darwin)
	@echo "Skipping codesign on non-macOS host"
else
	codesign --force --sign - $(PREFIX)/$(BINARY)
endif
	@echo "Installed $(BINARY) to $(PREFIX)/$(BINARY)"
test:
	go test ./...
build-linux:
	GOOS=linux GOARCH=amd64 go build $(LDFLAGS) -o $(BINARY)-linux-amd64 .
fmt:
	gofmt -w $$(find . -name '*.go' -not -path './vendor/*')
build-linux-arm:
	GOOS=linux GOARCH=arm64 go build $(LDFLAGS) -o $(BINARY)-linux-arm64 .
build-darwin:
	GOOS=darwin GOARCH=amd64 go build $(LDFLAGS) -o $(BINARY)-darwin-amd64 .
	GOOS=darwin GOARCH=arm64 go build $(LDFLAGS) -o $(BINARY)-darwin-arm64 .
clean:
	rm -f $(BINARY)
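The `-ldflags "-X …/cmd.Version=$(VERSION)"` flag in the new build rule works by overwriting a package-level string variable at link time. A minimal standalone sketch of the pattern — using `main.Version` rather than the real `cmd.Version` path so it compiles as a single file:

```go
package main

import "fmt"

// Version defaults to "dev" and is replaced at link time, e.g.:
//   go build -ldflags "-X main.Version=v1.2.3" .
var Version = "dev"

func main() {
	fmt.Println("version:", Version)
}
```

Without the ldflags override, the binary reports `version: dev`.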


@@ -0,0 +1,32 @@
package cmd

import (
	"fmt"

	"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/engine"
	"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/resolve"
	"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/ui"
	"github.com/spf13/cobra"
)

func init() {
	command := &cobra.Command{
		Use:         "abort",
		Annotations: map[string]string{"group": "Conflict:"},
		Short:       "Abort conflict resolution and roll the pending files back",
		Args:        cobra.NoArgs,
		RunE: func(cmd *cobra.Command, args []string) error {
			ws, err := resolve.FindActive(appState.Registry, appState.CWD)
			if err != nil {
				return err
			}
			if err := engine.Abort(cmd.Context(), ws); err != nil {
				return err
			}
			return renderResult(map[string]any{"workspace": ws.Name, "aborted": true}, func() {
				fmt.Println(ui.Warning(fmt.Sprintf("Aborted conflict resolution for %s", ws.Name)))
			})
		},
	}
	rootCmd.AddCommand(command)
}


@@ -0,0 +1,42 @@
package cmd

import (
	"fmt"

	"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/ui"
	"github.com/spf13/cobra"
)

func init() {
	var patchesRepo string

	command := &cobra.Command{
		Use:         "add <name> <path>",
		Aliases:     []string{"register"},
		Annotations: map[string]string{"group": "Workspace:"},
		Short:       "Register a Chromium checkout as a workspace",
		Args:        cobra.ExactArgs(2),
		RunE: func(cmd *cobra.Command, args []string) error {
			if err := ensureRepoConfigured(patchesRepo); err != nil {
				return err
			}
			entry, err := appState.Registry.Add(args[0], args[1])
			if err != nil {
				return err
			}
			if err := appState.Save(); err != nil {
				return err
			}
			return renderResult(map[string]any{
				"workspace":    entry,
				"patches_repo": appState.Config.PatchesRepo,
			}, func() {
				fmt.Println(ui.Success("Registered workspace"))
				fmt.Printf("%s %s\n", ui.Muted("name:"), entry.Name)
				fmt.Printf("%s %s\n", ui.Muted("path:"), entry.Path)
				fmt.Printf("%s %s\n", ui.Muted("repo:"), appState.Config.PatchesRepo)
			})
		},
	}
	command.Flags().StringVar(&patchesRepo, "patches-repo", "", "Path to packages/browseros")
	rootCmd.AddCommand(command)
}


@@ -0,0 +1,65 @@
package cmd

import (
	"fmt"

	"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/engine"
	"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/ui"
	"github.com/spf13/cobra"
)

func init() {
	var src string
	var reset bool
	var changed string
	var rangeEnd string

	command := &cobra.Command{
		Use:         "apply [workspace] [-- files...]",
		Annotations: map[string]string{"group": "Core:"},
		Short:       "Apply repo patches to a workspace",
		Args:        cobra.ArbitraryArgs,
		RunE: func(cmd *cobra.Command, args []string) error {
			positional, filters := splitWorkspaceAndFilters(cmd, args)
			if len(positional) > 1 {
				return fmt.Errorf("expected at most one workspace name")
			}
			ws, err := resolveWorkspace(positional, src)
			if err != nil {
				return err
			}
			info, err := repoInfo()
			if err != nil {
				return err
			}
			result, err := engine.Apply(cmd.Context(), engine.ApplyOptions{
				Workspace:  ws,
				Repo:       info,
				Reset:      reset,
				ChangedRef: changed,
				RangeEnd:   rangeEnd,
				Filters:    filters,
			})
			if err != nil {
				return err
			}
			return renderResult(result, func() {
				fmt.Println(ui.Title(fmt.Sprintf("Applied patches to %s", ws.Name)))
				fmt.Printf("%s %s\n", ui.Muted("mode:"), result.Mode)
				fmt.Printf("%s %d\n", ui.Muted("applied:"), len(result.Applied))
				fmt.Printf("%s %d\n", ui.Muted("orphaned:"), len(result.Orphaned))
				if len(result.Conflicts) > 0 {
					fmt.Println(ui.Warning("Conflicts detected"))
					for _, conflict := range result.Conflicts {
						fmt.Printf("  %s\n", conflict.ChromiumPath)
					}
					fmt.Println(ui.Hint(`Run "bdev continue" after fixing the current conflict.`))
				}
			})
		},
	}
	command.Flags().StringVar(&src, "src", "", "Chromium checkout path to operate on directly")
	command.Flags().BoolVar(&reset, "reset", false, "Reset patched files to BASE_COMMIT before applying")
	command.Flags().StringVar(&changed, "changed", "", "Apply only patches changed in the given repo commit")
	command.Flags().StringVar(&rangeEnd, "range-end", "", "End revision when using --changed as a range start")
	rootCmd.AddCommand(command)
}


@@ -1,151 +0,0 @@
package cmd

import (
	"fmt"
	"os"
	"path/filepath"
	"time"

	"bdev/internal/config"
	"bdev/internal/engine"
	"bdev/internal/git"
	"bdev/internal/log"
	"bdev/internal/ui"

	"github.com/spf13/cobra"
)

var cloneCmd = &cobra.Command{
	Use:   "clone",
	Short: "Fresh-apply all patches (for CI/new checkouts)",
	Long: `Apply all patches from the patches repository onto the current
Chromium checkout. Used for CI builds and new checkout setup.

Unlike pull, clone does not compare existing state — it applies everything.`,
	RunE: runClone,
}

var (
	clonePatchesRepo string
	cloneVerifyBase  bool
	cloneClean       bool
	cloneDryRun      bool
	cloneName        string
)

func init() {
	cloneCmd.Flags().StringVar(&clonePatchesRepo, "patches-repo", "", "path to BrowserOS packages directory")
	cloneCmd.Flags().BoolVar(&cloneVerifyBase, "verify-base", false, "fail if HEAD != BASE_COMMIT")
	cloneCmd.Flags().BoolVar(&cloneClean, "clean", false, "reset all modified files to BASE before applying")
	cloneCmd.Flags().BoolVar(&cloneDryRun, "dry-run", false, "show what would be applied")
	cloneCmd.Flags().StringVar(&cloneName, "name", "", "checkout name (default: directory name)")
	rootCmd.AddCommand(cloneCmd)
}

func runClone(cmd *cobra.Command, args []string) error {
	cwd, err := os.Getwd()
	if err != nil {
		return fmt.Errorf("getting cwd: %w", err)
	}

	// Try loading existing context, or create one from flags
	ctx, err := config.LoadContext()
	if err != nil {
		// No existing .bros/ — need --patches-repo
		if clonePatchesRepo == "" {
			return fmt.Errorf("no .bros/ found and --patches-repo not specified")
		}
		patchesRepo, err := filepath.Abs(clonePatchesRepo)
		if err != nil {
			return fmt.Errorf("resolving patches repo: %w", err)
		}
		baseCommit, err := config.ReadBaseCommit(patchesRepo)
		if err != nil {
			return err
		}
		name := cloneName
		if name == "" {
			name = filepath.Base(cwd)
		}
		brosDir := filepath.Join(cwd, config.BrosDirName)
		cfg := &config.Config{
			Name:        name,
			PatchesRepo: patchesRepo,
		}
		if !cloneDryRun {
			if err := config.WriteConfig(brosDir, cfg); err != nil {
				return err
			}
			_ = os.MkdirAll(filepath.Join(brosDir, "logs"), 0o755)
		}
		chromiumVersion, _ := config.ReadChromiumVersion(patchesRepo)
		ctx = &config.Context{
			Config:          cfg,
			State:           &config.State{},
			ChromiumDir:     cwd,
			BrosDir:         brosDir,
			PatchesRepo:     patchesRepo,
			PatchesDir:      filepath.Join(patchesRepo, "chromium_patches"),
			BaseCommit:      baseCommit,
			ChromiumVersion: chromiumVersion,
		}
	}

	if cloneDryRun {
		fmt.Println(ui.MutedStyle.Render("dry run — no files will be modified"))
		fmt.Println()
	}

	opts := engine.CloneOpts{
		VerifyBase: cloneVerifyBase,
		Clean:      cloneClean,
		DryRun:     cloneDryRun,
	}
	result, err := engine.Clone(ctx, opts)
	if err != nil {
		return err
	}

	// Reuse pull rendering
	fmt.Println(ui.TitleStyle.Render("bdev clone"))
	fmt.Println()
	fmt.Printf(" %s %d patches applied\n",
		ui.SuccessStyle.Render("+"), len(result.Applied))
	if len(result.Conflicts) > 0 {
		fmt.Printf(" %s %d conflicts\n",
			ui.ErrorStyle.Render("x"), len(result.Conflicts))
	}
	if len(result.Deleted) > 0 {
		fmt.Printf(" %s %d files deleted\n",
			ui.DeletedPrefix, len(result.Deleted))
	}
	if len(result.Conflicts) > 0 {
		fmt.Print(ui.RenderConflictReport(result.Conflicts))
	}

	if !cloneDryRun {
		repoRev, _ := git.HeadRev(ctx.PatchesRepo)
		ctx.State.LastPull = &config.SyncEvent{
			PatchesRepoRev: repoRev,
			BaseCommit:     ctx.BaseCommit,
			Timestamp:      time.Now(),
			FileCount:      len(result.Applied) + len(result.Deleted),
		}
		_ = config.WriteState(ctx.BrosDir, ctx.State)

		logger := log.New(ctx.BrosDir)
		_ = logger.LogClone(ctx.BaseCommit, result)
	}

	if len(result.Conflicts) > 0 {
		return fmt.Errorf("%d conflicts — see above for details", len(result.Conflicts))
	}
	return nil
}


@@ -0,0 +1,49 @@
package cmd

import (
	"fmt"

	"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/repo"
	"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/workspace"
	"github.com/spf13/cobra"
)

func repoInfo() (*repo.Info, error) {
	return appState.RepoInfo()
}

func resolveWorkspace(positional []string, src string) (workspace.Entry, error) {
	name := ""
	if len(positional) > 0 {
		name = positional[0]
	}
	return appState.ResolveWorkspace(name, src)
}

func splitWorkspaceAndFilters(cmd *cobra.Command, args []string) ([]string, []string) {
	atDash := cmd.ArgsLenAtDash()
	if atDash == -1 {
		return args, nil
	}
	return args[:atDash], args[atDash:]
}

func ensureRepoConfigured(override string) error {
	if override == "" && appState.Config.PatchesRepo != "" {
		return nil
	}
	root := override
	if root == "" {
		discovered, err := repo.Discover(appState.CWD)
		if err != nil {
			return fmt.Errorf(`unable to discover patches repo; pass --patches-repo or run from packages/browseros`)
		}
		root = discovered
	}
	info, err := repo.Load(root)
	if err != nil {
		return err
	}
	appState.Config.PatchesRepo = info.Root
	return nil
}


@@ -0,0 +1,39 @@
package cmd

import (
	"fmt"

	"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/engine"
	"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/resolve"
	"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/ui"
	"github.com/spf13/cobra"
)

func init() {
	command := &cobra.Command{
		Use:         "continue",
		Annotations: map[string]string{"group": "Conflict:"},
		Short:       "Advance to the next conflict after fixing the current one",
		Args:        cobra.NoArgs,
		RunE: func(cmd *cobra.Command, args []string) error {
			ws, err := resolve.FindActive(appState.Registry, appState.CWD)
			if err != nil {
				return err
			}
			result, err := engine.Continue(cmd.Context(), ws)
			if err != nil {
				return err
			}
			return renderResult(result, func() {
				fmt.Println(ui.Success(fmt.Sprintf("Advanced conflict resolution for %s", ws.Name)))
				if len(result.Conflicts) > 0 {
					fmt.Println(ui.Warning("Next conflict"))
					for _, conflict := range result.Conflicts {
						fmt.Printf("  %s\n", conflict.ChromiumPath)
					}
				}
			})
		},
	}
	rootCmd.AddCommand(command)
}


@@ -2,113 +2,52 @@ package cmd
import (
	"fmt"
	"strings"
	"bdev/internal/config"
	"bdev/internal/git"
	"bdev/internal/patch"
	"bdev/internal/ui"
	"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/engine"
	"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/ui"
	"github.com/spf13/cobra"
)
var diffCmd = &cobra.Command{
	Use:   "diff",
	Short: "Preview what push or pull would do",
	RunE:  runDiff,
}
var diffDirection string
func init() {
	diffCmd.Flags().StringVar(&diffDirection, "direction", "push", "\"push\" or \"pull\"")
	rootCmd.AddCommand(diffCmd)
	var src string
	command := &cobra.Command{
		Use:         "diff [workspace]",
		Annotations: map[string]string{"group": "Core:"},
		Short:       "Preview patch differences for a workspace",
		Args:        cobra.MaximumNArgs(1),
		RunE: func(cmd *cobra.Command, args []string) error {
			ws, err := resolveWorkspace(args, src)
			if err != nil {
				return err
			}
			info, err := repoInfo()
			if err != nil {
				return err
			}
			status, err := engine.InspectWorkspace(cmd.Context(), ws, info)
			if err != nil {
				return err
			}
			return renderResult(status, func() {
				fmt.Println(ui.Title(fmt.Sprintf("%s patch diff", ws.Name)))
				printGroup("Needs apply", status.NeedsApply)
				printGroup("Needs update", status.NeedsUpdate)
				printGroup("Orphaned", status.Orphaned)
			})
		},
	}
	command.Flags().StringVar(&src, "src", "", "Chromium checkout path to operate on directly")
	rootCmd.AddCommand(command)
}
func runDiff(cmd *cobra.Command, args []string) error {
	ctx, err := config.LoadContext()
	if err != nil {
		return err
func printGroup(title string, items []string) {
	if len(items) == 0 {
		fmt.Printf("%s %s\n", ui.Muted(title+":"), ui.Muted("none"))
		return
	}
	switch diffDirection {
	case "push":
		return diffPush(ctx)
	case "pull":
		return diffPull(ctx)
	default:
		return fmt.Errorf("invalid direction %q — use \"push\" or \"pull\"", diffDirection)
	fmt.Printf("%s\n", ui.Header(title+":"))
	for _, item := range items {
		fmt.Printf("  %s\n", strings.TrimSpace(item))
	}
}
func diffPush(ctx *config.Context) error {
	nameStatus, err := git.DiffNameStatus(ctx.ChromiumDir, ctx.BaseCommit)
	if err != nil {
		return err
	}
	if len(nameStatus) == 0 {
		fmt.Println(ui.MutedStyle.Render("No local changes to push."))
		return nil
	}
	fmt.Println(ui.TitleStyle.Render("bdev diff --direction push"))
	fmt.Println()
	for path, op := range nameStatus {
		prefix := ui.ModifiedPrefix
		switch op {
		case patch.OpAdded:
			prefix = ui.AddedPrefix
		case patch.OpDeleted:
			prefix = ui.DeletedPrefix
		}
		fmt.Printf(" %s %s\n", prefix, path)
	}
	fmt.Println()
	fmt.Println(ui.MutedStyle.Render(fmt.Sprintf("%d files would be pushed", len(nameStatus))))
	return nil
}
func diffPull(ctx *config.Context) error {
	repoPatchSet, err := patch.ReadPatchSet(ctx.PatchesDir)
	if err != nil {
		return err
	}
	diffOutput, err := git.DiffFull(ctx.ChromiumDir, ctx.BaseCommit)
	if err != nil {
		return err
	}
	localPatchSet, err := patch.ParseUnifiedDiff(diffOutput)
	if err != nil {
		return err
	}
	delta := patch.Compare(localPatchSet, repoPatchSet)
	total := len(delta.NeedsUpdate) + len(delta.NeedsApply)
	if total == 0 && len(delta.Deleted) == 0 {
		fmt.Println(ui.MutedStyle.Render("Already up to date."))
		return nil
	}
	fmt.Println(ui.TitleStyle.Render("bdev diff --direction pull"))
	fmt.Println()
	for _, f := range delta.NeedsUpdate {
		fmt.Printf(" %s %s %s\n", ui.ModifiedPrefix, f, ui.MutedStyle.Render("(update)"))
	}
	for _, f := range delta.NeedsApply {
		fmt.Printf(" %s %s %s\n", ui.AddedPrefix, f, ui.MutedStyle.Render("(new)"))
	}
	for _, f := range delta.Deleted {
		fmt.Printf(" %s %s %s\n", ui.DeletedPrefix, f, ui.MutedStyle.Render("(delete)"))
	}
	fmt.Println()
	fmt.Println(ui.MutedStyle.Render(fmt.Sprintf("%d files would be changed", total+len(delta.Deleted))))
	return nil
}

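The pull preview above classifies files by comparing the locally derived patch set against the repo's patch set via `patch.Compare`. That package is not shown in this diff, so here is a minimal standalone sketch of the three-way classification, assuming patch sets can be modeled as `map[string]string` of path to content (the real types will differ):

```go
package main

import "fmt"

// delta mirrors the shape consumed by diffPull: files whose patch needs
// updating, files to newly apply, and files to delete locally.
// Field names match diffPull's usage; everything else is illustrative.
type delta struct {
	NeedsUpdate []string
	NeedsApply  []string
	Deleted     []string
}

// compare classifies repo patches against the local diff: present in
// both but different -> update; only in repo -> apply; only local -> delete.
func compare(local, repo map[string]string) delta {
	var d delta
	for path, content := range repo {
		localContent, ok := local[path]
		switch {
		case !ok:
			d.NeedsApply = append(d.NeedsApply, path)
		case localContent != content:
			d.NeedsUpdate = append(d.NeedsUpdate, path)
		}
	}
	for path := range local {
		if _, ok := repo[path]; !ok {
			d.Deleted = append(d.Deleted, path)
		}
	}
	return d
}

func main() {
	local := map[string]string{"a.cc": "v1", "b.cc": "v1"}
	repo := map[string]string{"a.cc": "v2", "c.cc": "v1"}
	d := compare(local, repo)
	fmt.Println(len(d.NeedsUpdate), len(d.NeedsApply), len(d.Deleted)) // prints: 1 1 1
}
```

With this classification, `total == 0 && len(delta.Deleted) == 0` is exactly the "Already up to date" case in `diffPull`.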
View File

@@ -0,0 +1,73 @@
package cmd
import (
"fmt"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/engine"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/ui"
"github.com/spf13/cobra"
)
func init() {
var src string
var commit string
var rangeMode bool
var squash bool
var base string
command := &cobra.Command{
Use: "extract [workspace] [--range <start> <end>] [-- files...]",
Annotations: map[string]string{"group": "Core:"},
Short: "Extract workspace changes back to chromium_patches",
Args: cobra.ArbitraryArgs,
RunE: func(cmd *cobra.Command, args []string) error {
positional, filters := splitWorkspaceAndFilters(cmd, args)
workspaceArgs := positional
rangeStart := ""
rangeEnd := ""
if rangeMode {
if len(positional) < 2 || len(positional) > 3 {
return fmt.Errorf(`range mode expects "bdev extract [workspace] --range <start> <end>"`)
}
rangeStart = positional[len(positional)-2]
rangeEnd = positional[len(positional)-1]
workspaceArgs = positional[:len(positional)-2]
}
if len(workspaceArgs) > 1 {
return fmt.Errorf("expected at most one workspace name")
}
ws, err := resolveWorkspace(workspaceArgs, src)
if err != nil {
return err
}
info, err := repoInfo()
if err != nil {
return err
}
result, err := engine.Extract(cmd.Context(), engine.ExtractOptions{
Workspace: ws,
Repo: info,
Commit: commit,
RangeStart: rangeStart,
RangeEnd: rangeEnd,
Squash: squash,
Base: base,
Filters: filters,
})
if err != nil {
return err
}
return renderResult(result, func() {
fmt.Println(ui.Title(fmt.Sprintf("Extracted patches from %s", ws.Name)))
fmt.Printf("%s %s\n", ui.Muted("mode:"), result.Mode)
fmt.Printf("%s %d\n", ui.Muted("written:"), len(result.Written))
fmt.Printf("%s %d\n", ui.Muted("deleted:"), len(result.Deleted))
})
},
}
command.Flags().StringVar(&src, "src", "", "Chromium checkout path to operate on directly")
command.Flags().StringVar(&commit, "commit", "", "Extract from a single commit")
command.Flags().BoolVar(&rangeMode, "range", false, "Extract from a commit range")
command.Flags().BoolVar(&squash, "squash", false, "Squash a range into a cumulative diff")
command.Flags().StringVar(&base, "base", "", "Override BASE_COMMIT for extraction")
rootCmd.AddCommand(command)
}

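The extract command leans on `splitWorkspaceAndFilters(cmd, args)` to separate the workspace name from post-`--` file filters; that helper is defined elsewhere in the package and not shown here. A hypothetical stand-in for the splitting convention (the real helper likely uses cobra's `ArgsLenAtDash` rather than scanning for a literal `--`):

```go
package main

import "fmt"

// splitAtDash separates positional arguments from file filters, assuming
// the same convention as splitWorkspaceAndFilters: everything after the
// first "--" is a filter path. Illustrative stand-in only.
func splitAtDash(args []string) (positional, filters []string) {
	for i, a := range args {
		if a == "--" {
			return args[:i], args[i+1:]
		}
	}
	return args, nil
}

func main() {
	pos, filt := splitAtDash([]string{"my-ws", "--", "chrome/a.cc", "base/b.cc"})
	fmt.Println(pos, filt)
}
```

Under this convention, `bdev extract my-ws --range r1 r2 -- chrome/a.cc` yields positional `[my-ws r1 r2]` (from which the range bounds are popped) and filters `[chrome/a.cc]`.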
View File

@@ -1,115 +0,0 @@
package cmd
import (
"fmt"
"os"
"path/filepath"
"bdev/internal/config"
"bdev/internal/git"
"bdev/internal/ui"
"github.com/spf13/cobra"
)
var initCmd = &cobra.Command{
Use: "init",
Short: "Initialize a Chromium checkout for bdev",
Long: "Sets up a .bros/ directory in the current Chromium checkout,\nlinking it to a BrowserOS patches repository.",
RunE: runInit,
}
var (
initPatchesRepo string
initName string
)
func init() {
initCmd.Flags().StringVar(&initPatchesRepo, "patches-repo", "", "path to BrowserOS packages directory (required)")
initCmd.Flags().StringVar(&initName, "name", "", "human name for this checkout (default: directory name)")
_ = initCmd.MarkFlagRequired("patches-repo")
rootCmd.AddCommand(initCmd)
}
func runInit(cmd *cobra.Command, args []string) error {
cwd, err := os.Getwd()
if err != nil {
return fmt.Errorf("getting cwd: %w", err)
}
if !config.LooksLikeChromium(cwd) {
return fmt.Errorf("current directory does not look like a Chromium checkout (missing chrome/, base/, or .git/)")
}
brosDir := filepath.Join(cwd, config.BrosDirName)
if _, err := os.Stat(filepath.Join(brosDir, "config.yaml")); err == nil {
return fmt.Errorf(".bros/config.yaml already exists — checkout already initialized")
}
patchesRepo, err := filepath.Abs(initPatchesRepo)
if err != nil {
return fmt.Errorf("resolving patches repo path: %w", err)
}
patchesDir := filepath.Join(patchesRepo, "chromium_patches")
if _, err := os.Stat(patchesDir); err != nil {
return fmt.Errorf("chromium_patches/ not found in %s", patchesRepo)
}
baseCommit, err := config.ReadBaseCommit(patchesRepo)
if err != nil {
return err
}
if !git.CommitExists(cwd, baseCommit) {
return fmt.Errorf("BASE_COMMIT %s not found in this checkout's git history", baseCommit)
}
name := initName
if name == "" {
name = filepath.Base(cwd)
}
cfg := &config.Config{
Name: name,
PatchesRepo: patchesRepo,
}
if err := config.WriteConfig(brosDir, cfg); err != nil {
return err
}
// Create logs directory
if err := os.MkdirAll(filepath.Join(brosDir, "logs"), 0o755); err != nil {
return fmt.Errorf("creating logs directory: %w", err)
}
chromiumVersion, _ := config.ReadChromiumVersion(patchesRepo)
// Count existing patches
patchCount := 0
_ = filepath.Walk(patchesDir, func(path string, info os.FileInfo, err error) error {
if err != nil {
return nil
}
if !info.IsDir() {
patchCount++
}
return nil
})
fmt.Println(ui.TitleStyle.Render("bdev init"))
fmt.Println()
fmt.Printf(" %s %s\n", ui.LabelStyle.Render("Checkout:"), ui.ValueStyle.Render(name))
fmt.Printf(" %s %s\n", ui.LabelStyle.Render("Directory:"), cwd)
fmt.Printf(" %s %s\n", ui.LabelStyle.Render("Patches repo:"), patchesRepo)
fmt.Printf(" %s %s\n", ui.LabelStyle.Render("Base commit:"), baseCommit[:min(12, len(baseCommit))])
if chromiumVersion != "" {
fmt.Printf(" %s %s\n", ui.LabelStyle.Render("Chromium:"), chromiumVersion)
}
fmt.Printf(" %s %d files\n", ui.LabelStyle.Render("Patches:"), patchCount)
fmt.Println()
fmt.Println(ui.SuccessStyle.Render("Initialized .bros/config.yaml"))
fmt.Println(ui.MutedStyle.Render("Run 'bdev pull' to apply patches, or 'bdev push' to extract."))
return nil
}

View File

@@ -0,0 +1,49 @@
package cmd
import (
"fmt"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/engine"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/ui"
"github.com/spf13/cobra"
)
func init() {
command := &cobra.Command{
Use: "list",
Aliases: []string{"ls"},
Annotations: map[string]string{"group": "Workspace:"},
Short: "List registered workspaces and their sync state",
Args: cobra.NoArgs,
RunE: func(cmd *cobra.Command, args []string) error {
if len(appState.Registry.Workspaces) == 0 {
return renderResult(map[string]any{"workspaces": []any{}}, func() {
fmt.Println("No workspaces registered. Run `bdev add <name> <path>`.")
})
}
info, err := repoInfo()
if err != nil {
return err
}
rows := make([][]string, 0, len(appState.Registry.Workspaces))
statuses := make([]*engine.WorkspaceStatus, 0, len(appState.Registry.Workspaces))
for _, ws := range appState.Registry.Workspaces {
status, err := engine.InspectWorkspace(cmd.Context(), ws, info)
if err != nil {
return err
}
statuses = append(statuses, status)
rows = append(rows, []string{
ws.Name,
status.SyncState,
fmt.Sprintf("%d/%d/%d", len(status.UpToDate), len(status.NeedsUpdate), len(status.Orphaned)),
ws.Path,
})
}
return renderResult(map[string]any{"workspaces": statuses}, func() {
fmt.Println(ui.RenderTable([]string{"NAME", "STATE", "PATCHES", "PATH"}, rows))
})
},
}
rootCmd.AddCommand(command)
}

View File

@@ -0,0 +1,41 @@
package cmd
import (
"fmt"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/engine"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/ui"
"github.com/spf13/cobra"
)
func init() {
var message string
command := &cobra.Command{
Use: "publish [remote]",
Annotations: map[string]string{"group": "Remote:"},
Short: "Commit and push chromium_patches to a remote",
Args: cobra.MaximumNArgs(1),
RunE: func(cmd *cobra.Command, args []string) error {
info, err := repoInfo()
if err != nil {
return err
}
remote := "origin"
if len(args) == 1 {
remote = args[0]
}
result, err := engine.Publish(cmd.Context(), info, remote, message)
if err != nil {
return err
}
return renderResult(result, func() {
fmt.Println(ui.Success("Published chromium_patches"))
fmt.Printf("%s %s\n", ui.Muted("remote:"), result.Remote)
fmt.Printf("%s %s\n", ui.Muted("branch:"), result.Branch)
fmt.Printf("%s %s\n", ui.Muted("message:"), result.Message)
})
},
}
command.Flags().StringVarP(&message, "message", "m", "", "Commit message for the patch publish commit")
rootCmd.AddCommand(command)
}

View File

@@ -1,142 +0,0 @@
package cmd
import (
"fmt"
"time"
"bdev/internal/config"
"bdev/internal/engine"
"bdev/internal/git"
"bdev/internal/log"
"bdev/internal/ui"
"github.com/spf13/cobra"
)
var pullCmd = &cobra.Command{
Use: "pull [remote] [-- file1 file2 ...]",
Short: "Pull patches from repo to checkout",
Long: `Apply patches from the patches repository to the current Chromium
checkout. Use an optional remote (for example: 'bdev pull origin')
to fetch/rebase the patches repo before applying changes locally.`,
RunE: runPull,
}
var (
pullDryRun bool
pullRemote string
pullNoSync bool
pullRebase bool
pullKeepLocalOnly bool
)
func init() {
pullCmd.Flags().BoolVar(&pullDryRun, "dry-run", false, "show what would change")
pullCmd.Flags().StringVar(&pullRemote, "remote", "", "patches repo remote to sync before pull")
pullCmd.Flags().BoolVar(&pullNoSync, "no-sync", false, "skip syncing patches repo from remote")
pullCmd.Flags().BoolVar(&pullRebase, "rebase", true, "use git pull --rebase when syncing remote")
pullCmd.Flags().BoolVar(&pullKeepLocalOnly, "keep-local-only", true, "keep local-only checkout changes that are not in patches repo")
rootCmd.AddCommand(pullCmd)
}
func runPull(cmd *cobra.Command, args []string) error {
ctx, err := config.LoadContext()
if err != nil {
return err
}
activity := ui.NewActivity(verbose)
remote, files, err := resolveRemoteAndFiles(ctx.PatchesRepo, args, pullRemote)
if err != nil {
return err
}
shouldSync := remote != "" && !pullNoSync && !pullDryRun
if shouldSync {
dirty, err := git.IsDirty(ctx.PatchesRepo)
if err != nil {
return err
}
if dirty {
return fmt.Errorf("patches repo has local changes; commit/stash before syncing remote %q", remote)
}
activity.Step("syncing patches repo from remote %q", remote)
beforeRev, _ := git.HeadRev(ctx.PatchesRepo)
if err := git.Fetch(ctx.PatchesRepo, remote); err != nil {
return err
}
branch, detached, err := git.CurrentBranch(ctx.PatchesRepo)
if err != nil {
return err
}
if detached {
activity.Warn("patches repo is in detached HEAD; fetched remote but skipped pull/rebase")
} else {
if err := git.Pull(ctx.PatchesRepo, remote, branch, pullRebase); err != nil {
return err
}
}
afterRev, _ := git.HeadRev(ctx.PatchesRepo)
if beforeRev != "" && afterRev != "" && beforeRev != afterRev {
activity.Success("patches repo advanced %s -> %s", shortRev(beforeRev), shortRev(afterRev))
} else {
activity.Info("patches repo already up to date")
}
ctx, err = config.LoadContext()
if err != nil {
return err
}
} else if remote != "" && pullDryRun {
activity.Info("dry run enabled — skipping remote sync")
} else if remote != "" && pullNoSync {
activity.Info("remote %q provided, but sync is disabled via --no-sync", remote)
}
opts := engine.PullOpts{
DryRun: pullDryRun,
Files: files,
KeepLocalOnly: pullKeepLocalOnly,
}
if pullDryRun {
activity.Info("dry run enabled — no files will be modified")
activity.Divider()
}
activity.Step("computing patch delta and applying updates")
result, err := engine.Pull(ctx, opts)
if err != nil {
return err
}
fmt.Print(ui.RenderPullResult(result))
if len(result.Conflicts) > 0 {
fmt.Print(ui.RenderConflictReport(result.Conflicts))
}
if !pullDryRun {
repoRev, _ := git.HeadRev(ctx.PatchesRepo)
ctx.State.LastPull = &config.SyncEvent{
PatchesRepoRev: repoRev,
BaseCommit: ctx.BaseCommit,
Timestamp: time.Now(),
FileCount: len(result.Applied) + len(result.Deleted) + len(result.Reverted) + len(result.LocalOnly) + len(result.Skipped),
}
_ = config.WriteState(ctx.BrosDir, ctx.State)
logger := log.New(ctx.BrosDir)
_ = logger.LogPull(ctx.BaseCommit, repoRev, result)
}
if len(result.Conflicts) > 0 {
return fmt.Errorf("%d conflicts — see above for details", len(result.Conflicts))
}
return nil
}

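The remote-sync gating in `runPull` reduces to a single predicate: sync only when a remote was resolved, `--no-sync` is off, and this is not a dry run. Extracted as a function for clarity (the original inlines it as `shouldSync`):

```go
package main

import "fmt"

// shouldSync reproduces the gate in runPull: a resolved remote is
// required, and both --no-sync and --dry-run suppress the sync.
func shouldSync(remote string, noSync, dryRun bool) bool {
	return remote != "" && !noSync && !dryRun
}

func main() {
	fmt.Println(shouldSync("origin", false, false)) // true: normal remote pull
	fmt.Println(shouldSync("origin", false, true))  // false: dry run skips sync
	fmt.Println(shouldSync("", false, false))       // false: no remote resolved
}
```

The two `else if` branches after the sync block exist precisely to explain the suppressed cases to the user.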
View File

@@ -1,238 +0,0 @@
package cmd
import (
"fmt"
"time"
"bdev/internal/config"
"bdev/internal/engine"
"bdev/internal/git"
"bdev/internal/log"
"bdev/internal/patch"
"bdev/internal/ui"
"github.com/spf13/cobra"
)
var pushCmd = &cobra.Command{
Use: "push [remote] [-- file1 file2 ...]",
Short: "Push local changes to patches repo",
Long: `Extract diffs from the current Chromium checkout and write them
to the patches repository. When a remote is provided (for example:
'bdev push origin'), bdev commits patch changes and pushes upstream.`,
RunE: runPush,
}
var (
pushDryRun bool
pushRemote string
pushNoSync bool
pushRebase bool
pushMessage string
)
func init() {
pushCmd.Flags().BoolVar(&pushDryRun, "dry-run", false, "show what would be pushed")
pushCmd.Flags().StringVar(&pushRemote, "remote", "", "patches repo remote to publish to")
pushCmd.Flags().BoolVar(&pushNoSync, "no-sync", false, "skip syncing patches repo from remote before publish")
pushCmd.Flags().BoolVar(&pushRebase, "rebase", true, "use git pull --rebase when syncing before publish")
pushCmd.Flags().StringVarP(&pushMessage, "message", "m", "", "commit message when publishing to remote")
rootCmd.AddCommand(pushCmd)
}
func runPush(cmd *cobra.Command, args []string) error {
ctx, err := config.LoadContext()
if err != nil {
return err
}
activity := ui.NewActivity(verbose)
remote, files, err := resolveRemoteAndFiles(ctx.PatchesRepo, args, pushRemote)
if err != nil {
return err
}
shouldPublish := remote != "" && !pushDryRun
if shouldPublish {
dirty, err := git.IsDirty(ctx.PatchesRepo)
if err != nil {
return err
}
if dirty {
return fmt.Errorf("patches repo has local changes; commit/stash before publishing to remote %q", remote)
}
}
if shouldPublish && !pushNoSync {
if err := syncPatchesRepo(activity, ctx.PatchesRepo, remote, pushRebase); err != nil {
return err
}
}
if remote != "" && pushDryRun {
activity.Info("dry run enabled — skipping remote sync and publish")
}
opts := engine.PushOpts{
DryRun: pushDryRun,
Files: files,
}
if pushDryRun {
activity.Info("dry run enabled — no patch files will be written")
activity.Divider()
}
activity.Step("extracting checkout changes into patches")
result, err := engine.Push(ctx, opts)
if err != nil {
return err
}
renderPushResult(result, pushDryRun)
if !pushDryRun {
if remote != "" {
if err := publishPatchChanges(activity, ctx, remote, result, pushMessage); err != nil {
return err
}
}
// Update state
repoRev, _ := git.HeadRev(ctx.PatchesRepo)
ctx.State.LastPush = &config.SyncEvent{
PatchesRepoRev: repoRev,
Timestamp: time.Now(),
FileCount: result.Total() + len(result.Stale),
}
_ = config.WriteState(ctx.BrosDir, ctx.State)
// Activity log
logger := log.New(ctx.BrosDir)
_ = logger.LogPush(ctx.BaseCommit, result)
}
return nil
}
func syncPatchesRepo(activity *ui.Activity, patchesRepo, remote string, rebase bool) error {
activity.Step("syncing patches repo from remote %q", remote)
beforeRev, _ := git.HeadRev(patchesRepo)
if err := git.Fetch(patchesRepo, remote); err != nil {
return err
}
branch, detached, err := git.CurrentBranch(patchesRepo)
if err != nil {
return err
}
if detached {
return fmt.Errorf("patches repo is in detached HEAD; cannot sync for publish")
}
if err := git.Pull(patchesRepo, remote, branch, rebase); err != nil {
return err
}
afterRev, _ := git.HeadRev(patchesRepo)
if beforeRev != "" && afterRev != "" && beforeRev != afterRev {
activity.Success("patches repo advanced %s -> %s", shortRev(beforeRev), shortRev(afterRev))
} else {
activity.Info("patches repo already up to date")
}
return nil
}
func publishPatchChanges(
activity *ui.Activity,
ctx *config.Context,
remote string,
result *patch.PushResult,
commitMessage string,
) error {
dirty, err := git.IsDirty(ctx.PatchesRepo, "chromium_patches")
if err != nil {
return err
}
if !dirty {
activity.Info("no patch repository changes to commit")
return nil
}
branch, detached, err := git.CurrentBranch(ctx.PatchesRepo)
if err != nil {
return err
}
if detached {
return fmt.Errorf("patches repo is in detached HEAD; cannot publish")
}
message := commitMessage
if message == "" {
message = fmt.Sprintf(
"bdev push: %s (%d modified, %d added, %d deleted, %d stale)",
ctx.Config.Name,
len(result.Modified),
len(result.Added),
len(result.Deleted),
len(result.Stale),
)
}
activity.Step("committing patch changes to %s", branch)
if err := git.Add(ctx.PatchesRepo, "chromium_patches"); err != nil {
return err
}
if err := git.Commit(ctx.PatchesRepo, message); err != nil {
return err
}
activity.Success("created patch commit")
activity.Step("pushing patch commit to %s/%s", remote, branch)
if err := git.Push(ctx.PatchesRepo, remote, branch); err != nil {
return err
}
activity.Success("remote publish complete")
return nil
}
func renderPushResult(r *patch.PushResult, dryRun bool) {
if r.Total() == 0 && len(r.Stale) == 0 {
fmt.Println(ui.MutedStyle.Render("Nothing to push — checkout matches patches repo."))
return
}
verb := "Pushed"
if dryRun {
verb = "Would push"
}
fmt.Println(ui.TitleStyle.Render("bdev push"))
fmt.Println()
for _, f := range r.Added {
fmt.Printf(" %s %s\n", ui.AddedPrefix, f)
}
for _, f := range r.Modified {
fmt.Printf(" %s %s\n", ui.ModifiedPrefix, f)
}
for _, f := range r.Deleted {
fmt.Printf(" %s %s\n", ui.DeletedPrefix, f)
}
for _, f := range r.Stale {
fmt.Printf(" %s %s\n", ui.SkippedPrefix, ui.MutedStyle.Render(f+" (stale, removed)"))
}
fmt.Println()
summary := fmt.Sprintf("%s %d patches", verb, r.Total())
detail := fmt.Sprintf(" (%d modified, %d added, %d deleted)",
len(r.Modified), len(r.Added), len(r.Deleted))
fmt.Print(ui.SuccessStyle.Render(summary))
fmt.Println(ui.MutedStyle.Render(detail))
if len(r.Stale) > 0 {
fmt.Println(ui.MutedStyle.Render(fmt.Sprintf("Cleaned %d stale patches", len(r.Stale))))
}
}

View File

@@ -0,0 +1,33 @@
package cmd
import (
"fmt"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/ui"
"github.com/spf13/cobra"
)
func init() {
command := &cobra.Command{
Use: "remove <name>",
Aliases: []string{"rm"},
Annotations: map[string]string{"group": "Workspace:"},
Short: "Unregister a workspace",
Args: cobra.ExactArgs(1),
RunE: func(cmd *cobra.Command, args []string) error {
entry, err := appState.Registry.Remove(args[0])
if err != nil {
return err
}
if err := appState.Save(); err != nil {
return err
}
return renderResult(map[string]any{"workspace": entry}, func() {
fmt.Println(ui.Success("Removed workspace"))
fmt.Printf("%s %s\n", ui.Muted("name:"), entry.Name)
fmt.Printf("%s %s\n", ui.Muted("path:"), entry.Path)
})
},
}
rootCmd.AddCommand(command)
}

View File

@@ -1,31 +1,129 @@
package cmd
import (
"encoding/json"
"fmt"
"os"
"strings"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/app"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/ui"
"github.com/spf13/cobra"
)
var Version = "dev"
var (
jsonOut bool
verbose bool
version string
appState *app.App
)
var groupOrder = []string{
"Workspace:",
"Core:",
"Conflict:",
"Remote:",
}
func helpHeader(s string) string { return ui.Header(s) }
func helpCmdCol(s string) string { return ui.Command(s) }
func helpHint(s string) string { return ui.Hint(s) }
func helpAliases(aliases []string) string {
return ui.Aliases(aliases)
}
func groupedHelp(cmd *cobra.Command) string {
groups := map[string][]*cobra.Command{}
for _, child := range cmd.Commands() {
if !child.IsAvailableCommand() && child.Name() != "help" {
continue
}
group := child.Annotations["group"]
if group == "" {
group = "Core:"
}
groups[group] = append(groups[group], child)
}
var builder strings.Builder
for _, group := range groupOrder {
commands, ok := groups[group]
if !ok {
continue
}
builder.WriteString("\n" + helpHeader(group) + "\n")
for _, child := range commands {
line := " " + helpCmdCol(fmt.Sprintf("%-14s", child.Name())) + " " + child.Short
if len(child.Aliases) > 0 {
line += " " + helpAliases(child.Aliases)
}
builder.WriteString(line + "\n")
}
}
return builder.String()
}
const usageTemplate = `{{helpHeader "Usage:"}}{{if .Runnable}}
{{.UseLine}}{{end}}{{if .HasAvailableSubCommands}}
{{.CommandPath}} [command]{{end}}{{if gt (len .Aliases) 0}}
{{helpHeader "Aliases:"}}
{{.NameAndAliases}}{{end}}{{if .HasExample}}
{{helpHeader "Examples:"}}
{{.Example}}{{end}}{{if .HasAvailableSubCommands}}
{{groupedHelp .}}{{end}}{{if .HasAvailableLocalFlags}}
{{helpHeader "Flags:"}}
{{.LocalFlags.FlagUsages | trimTrailingWhitespaces}}{{end}}{{if .HasAvailableInheritedFlags}}
{{helpHeader "Global Flags:"}}
{{.InheritedFlags.FlagUsages | trimTrailingWhitespaces}}{{end}}{{if .HasAvailableSubCommands}}
{{helpHint (printf "Use \"%s [command] --help\" for more information." .CommandPath)}}{{end}}
`
var rootCmd = &cobra.Command{
Use: "bdev",
Short: "Workspace-centric BrowserOS patch tooling for Chromium checkouts",
Version: Version,
SilenceUsage: true,
SilenceErrors: true,
PersistentPreRunE: func(cmd *cobra.Command, args []string) error {
var err error
appState, err = app.Load(jsonOut, verbose, "")
return err
},
RunE: func(cmd *cobra.Command, args []string) error {
return cmd.Help()
},
}
func init() {
cobra.AddTemplateFunc("helpHeader", helpHeader)
cobra.AddTemplateFunc("helpCmdCol", helpCmdCol)
cobra.AddTemplateFunc("helpAliases", helpAliases)
cobra.AddTemplateFunc("helpHint", helpHint)
cobra.AddTemplateFunc("groupedHelp", groupedHelp)
rootCmd.SetUsageTemplate(usageTemplate)
rootCmd.PersistentFlags().BoolVar(&jsonOut, "json", false, "Emit JSON output")
rootCmd.PersistentFlags().BoolVarP(&verbose, "verbose", "v", false, "Enable verbose output")
rootCmd.CompletionOptions.DisableDefaultCmd = true
}
func SetVersion(v string) {
version = v
rootCmd.Version = v
}
func Execute() {
if err := rootCmd.Execute(); err != nil {
fmt.Fprintln(os.Stderr, err)
os.Exit(1)
}
}
func renderResult(data any, human func()) error {
if jsonOut {
encoder := json.NewEncoder(os.Stdout)
encoder.SetIndent("", " ")
return encoder.Encode(data)
}
human()
return nil
}

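`renderResult` centralizes the `--json` switch: every subcommand hands it a serializable result plus a closure for human-readable output, so JSON mode never interleaves with styled text. A self-contained sketch of the same pattern (here `jsonOut` is a plain variable rather than a cobra persistent flag):

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// jsonOut would normally be bound to the --json persistent flag.
var jsonOut = true

// renderResult matches the helper in root.go: JSON mode encodes the
// data with two-space indentation; otherwise the human closure runs.
func renderResult(data any, human func()) error {
	if jsonOut {
		encoder := json.NewEncoder(os.Stdout)
		encoder.SetIndent("", "  ")
		return encoder.Encode(data)
	}
	human()
	return nil
}

func main() {
	_ = renderResult(map[string]any{"workspaces": []string{"a"}}, func() {
		fmt.Println("1 workspace registered")
	})
}
```

One consequence of this design: a subcommand's human renderer can print freely, while the JSON path always emits exactly one document to stdout.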
View File

@@ -0,0 +1,33 @@
package cmd
import (
"fmt"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/engine"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/resolve"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/ui"
"github.com/spf13/cobra"
)
func init() {
command := &cobra.Command{
Use: "skip",
Annotations: map[string]string{"group": "Conflict:"},
Short: "Skip the current conflict and move to the next one",
Args: cobra.NoArgs,
RunE: func(cmd *cobra.Command, args []string) error {
ws, err := resolve.FindActive(appState.Registry, appState.CWD)
if err != nil {
return err
}
result, err := engine.Skip(cmd.Context(), ws)
if err != nil {
return err
}
return renderResult(result, func() {
fmt.Println(ui.Warning(fmt.Sprintf("Skipped current conflict in %s", ws.Name)))
})
},
}
rootCmd.AddCommand(command)
}

View File

@@ -1,96 +1,45 @@
package cmd
import (
"encoding/json"
"fmt"
"bdev/internal/config"
"bdev/internal/engine"
"bdev/internal/ui"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/engine"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/ui"
"github.com/spf13/cobra"
)
var statusCmd = &cobra.Command{
Use: "status",
Short: "Show sync state between checkout and patches repo",
RunE: runStatus,
}
var (
statusJSON bool
statusFiles bool
)
func init() {
statusCmd.Flags().BoolVar(&statusJSON, "json", false, "output as JSON")
statusCmd.Flags().BoolVar(&statusFiles, "files", false, "list individual files per category")
rootCmd.AddCommand(statusCmd)
}
func runStatus(cmd *cobra.Command, args []string) error {
ctx, err := config.LoadContext()
if err != nil {
return err
}
result, err := engine.Status(ctx, statusFiles)
if err != nil {
return err
}
if statusJSON {
data, err := json.MarshalIndent(result, "", " ")
if err != nil {
return err
}
fmt.Println(string(data))
return nil
}
renderStatus(result)
return nil
}
func renderStatus(r *engine.StatusResult) {
fmt.Println(ui.TitleStyle.Render("bdev status"))
fmt.Println()
fmt.Printf(" %s %s\n", ui.LabelStyle.Render("Checkout:"), ui.ValueStyle.Render(r.CheckoutName))
fmt.Printf(" %s %s\n", ui.LabelStyle.Render("Base commit:"), r.BaseCommit[:min(12, len(r.BaseCommit))])
if r.ChromiumVersion != "" {
fmt.Printf(" %s %s\n", ui.LabelStyle.Render("Chromium:"), r.ChromiumVersion)
}
fmt.Printf(" %s %s\n", ui.LabelStyle.Render("Patches repo:"), r.PatchesRepo)
fmt.Println()
fmt.Println(" Sync status:")
if r.Ahead > 0 {
fmt.Printf(" %s %s\n",
ui.WarningStyle.Render(fmt.Sprintf("ahead: %3d files", r.Ahead)),
ui.MutedStyle.Render("(local changes not in patches repo)"))
}
if r.Behind > 0 {
fmt.Printf(" %s %s\n",
ui.WarningStyle.Render(fmt.Sprintf("behind: %3d files", r.Behind)),
ui.MutedStyle.Render("(patches in repo not applied locally)"))
}
fmt.Printf(" %s\n",
ui.SuccessStyle.Render(fmt.Sprintf("synced: %3d files", r.Synced)))
if len(r.AheadFiles) > 0 {
fmt.Println()
fmt.Println(" Ahead files:")
for _, f := range r.AheadFiles {
fmt.Printf(" %s %s\n", ui.AddedPrefix, f)
}
}
if len(r.BehindFiles) > 0 {
fmt.Println()
fmt.Println(" Behind files:")
for _, f := range r.BehindFiles {
fmt.Printf(" %s %s\n", ui.WarningStyle.Render(">"), f)
}
}
}
func init() {
var src string
command := &cobra.Command{
Use: "status [workspace]",
Annotations: map[string]string{"group": "Core:"},
Short: "Show workspace sync state",
Args: cobra.MaximumNArgs(1),
RunE: func(cmd *cobra.Command, args []string) error {
ws, err := resolveWorkspace(args, src)
if err != nil {
return err
}
info, err := repoInfo()
if err != nil {
return err
}
status, err := engine.InspectWorkspace(cmd.Context(), ws, info)
if err != nil {
return err
}
return renderResult(status, func() {
fmt.Println(ui.Title(fmt.Sprintf("%s (%s)", ws.Name, status.SyncState)))
fmt.Printf("%s %s\n", ui.Muted("path:"), ws.Path)
fmt.Printf("%s %s\n", ui.Muted("repo head:"), status.RepoHead)
fmt.Printf("%s %s\n", ui.Muted("last sync:"), status.LastSyncRev)
fmt.Printf("%s %s\n", ui.Muted("last apply:"), status.LastApplyRev)
fmt.Printf("%s %d\n", ui.Muted("needs apply:"), len(status.NeedsApply))
fmt.Printf("%s %d\n", ui.Muted("needs update:"), len(status.NeedsUpdate))
fmt.Printf("%s %d\n", ui.Muted("orphaned:"), len(status.Orphaned))
})
},
}
command.Flags().StringVar(&src, "src", "", "Chromium checkout path to operate on directly")
rootCmd.AddCommand(command)
}

View File

@@ -0,0 +1,58 @@
package cmd
import (
"fmt"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/engine"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/ui"
"github.com/spf13/cobra"
)
func init() {
var src string
var rebase bool
var remote string
command := &cobra.Command{
Use: "sync [workspace]",
Annotations: map[string]string{"group": "Core:"},
Short: "Sync a workspace with the latest patch repo state",
Args: cobra.MaximumNArgs(1),
RunE: func(cmd *cobra.Command, args []string) error {
ws, err := resolveWorkspace(args, src)
if err != nil {
return err
}
info, err := repoInfo()
if err != nil {
return err
}
result, err := engine.Sync(cmd.Context(), engine.SyncOptions{
Workspace: ws,
Repo: info,
Remote: remote,
Rebase: rebase,
})
if err != nil {
return err
}
return renderResult(result, func() {
fmt.Println(ui.Title(fmt.Sprintf("Synced %s", ws.Name)))
fmt.Printf("%s %s\n", ui.Muted("repo head:"), result.RepoHead)
fmt.Printf("%s %d\n", ui.Muted("applied:"), len(result.Applied))
if result.StashRef != "" {
fmt.Printf("%s %s\n", ui.Muted("stash:"), result.StashRef)
}
if len(result.Conflicts) > 0 {
fmt.Println(ui.Warning("Conflicts detected"))
for _, conflict := range result.Conflicts {
fmt.Printf(" %s\n", conflict)
}
}
})
},
}
command.Flags().StringVar(&src, "src", "", "Chromium checkout path to operate on directly")
command.Flags().BoolVar(&rebase, "rebase", false, "Re-apply stashed local changes after syncing")
command.Flags().StringVar(&remote, "remote", "origin", "Remote to pull from")
rootCmd.AddCommand(command)
}

View File

@@ -1,44 +0,0 @@
package cmd
import (
"fmt"
"strings"
"bdev/internal/git"
)
func resolveRemoteAndFiles(repoDir string, args []string, explicitRemote string) (string, []string, error) {
remote := strings.TrimSpace(explicitRemote)
if remote != "" {
hasRemote, err := git.HasRemote(repoDir, remote)
if err != nil {
return "", nil, fmt.Errorf("resolving remote %q: %w", remote, err)
}
if !hasRemote {
return "", nil, fmt.Errorf("remote %q not found in patches repo", remote)
}
return remote, args, nil
}
if len(args) == 0 {
return "", nil, nil
}
hasRemote, err := git.HasRemote(repoDir, args[0])
if err != nil {
return "", nil, fmt.Errorf("resolving remote %q: %w", args[0], err)
}
if hasRemote {
return args[0], args[1:], nil
}
return "", args, nil
}
func shortRev(rev string) string {
rev = strings.TrimSpace(rev)
if len(rev) <= 12 {
return rev
}
return rev[:12]
}

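`resolveRemoteAndFiles` disambiguates `bdev push origin file.cc` from `bdev push file.cc` by asking git whether the first positional argument names a configured remote. The heuristic can be isolated with `git.HasRemote` swapped for an injected predicate (the injected function is illustrative, not the package's API):

```go
package main

import "fmt"

// resolveArgs mirrors the fallback branch of resolveRemoteAndFiles:
// if the first positional argument is a known remote, treat it as the
// remote and the rest as file filters; otherwise everything is a file.
// hasRemote stands in for git.HasRemote.
func resolveArgs(args []string, hasRemote func(string) bool) (remote string, files []string) {
	if len(args) == 0 {
		return "", nil
	}
	if hasRemote(args[0]) {
		return args[0], args[1:]
	}
	return "", args
}

func main() {
	known := func(name string) bool { return name == "origin" }
	remote, files := resolveArgs([]string{"origin", "content/foo.cc"}, known)
	fmt.Println(remote, files)
	remote, files = resolveArgs([]string{"content/foo.cc"}, known)
	fmt.Println(remote, files)
}
```

The explicit `--remote` path in the real helper is stricter: an unknown explicit remote is an error rather than being reinterpreted as a file, which is exactly what `TestResolveRemoteAndFilesUnknownExplicitRemote` below verifies.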
View File

@@ -1,59 +0,0 @@
package cmd
import (
"os"
"os/exec"
"path/filepath"
"reflect"
"testing"
)
func TestResolveRemoteAndFiles(t *testing.T) {
t.Parallel()
repo := initRemoteRepo(t)
remote, files, err := resolveRemoteAndFiles(repo, []string{"origin", "content/foo.cc"}, "")
if err != nil {
t.Fatalf("resolveRemoteAndFiles: %v", err)
}
if remote != "origin" {
t.Fatalf("expected origin, got %q", remote)
}
if !reflect.DeepEqual(files, []string{"content/foo.cc"}) {
t.Fatalf("unexpected files: %#v", files)
}
}
func TestResolveRemoteAndFilesUnknownExplicitRemote(t *testing.T) {
t.Parallel()
repo := initRemoteRepo(t)
if _, _, err := resolveRemoteAndFiles(repo, nil, "missing"); err == nil {
t.Fatalf("expected error for unknown explicit remote")
}
}
func initRemoteRepo(t *testing.T) string {
t.Helper()
dir := filepath.Join(t.TempDir(), "patches")
if err := os.MkdirAll(dir, 0o755); err != nil {
t.Fatalf("mkdir: %v", err)
}
runGitCmd(t, dir, "init")
runGitCmd(t, dir, "remote", "add", "origin", "https://example.com/org/repo.git")
return dir
}
func runGitCmd(t *testing.T, dir string, args ...string) string {
t.Helper()
cmd := exec.Command("git", args...)
cmd.Dir = dir
out, err := cmd.CombinedOutput()
if err != nil {
t.Fatalf("git %v failed: %v\n%s", args, err, string(out))
}
return string(out)
}

View File

@@ -1,40 +0,0 @@
package e2e
import (
"fmt"
"os"
"os/exec"
"path/filepath"
"runtime"
"testing"
)
var bdevBinary string
func TestMain(m *testing.M) {
tmpDir, err := os.MkdirTemp("", "bdev-e2e-bin-*")
if err != nil {
fmt.Fprintf(os.Stderr, "failed to create temp dir: %v\n", err)
os.Exit(1)
}
_, file, _, ok := runtime.Caller(0)
if !ok {
fmt.Fprintln(os.Stderr, "failed to resolve e2e test path")
os.Exit(1)
}
moduleDir := filepath.Clean(filepath.Join(filepath.Dir(file), ".."))
bdevBinary = filepath.Join(tmpDir, "bdev-e2e")
build := exec.Command("go", "build", "-o", bdevBinary, ".")
build.Dir = moduleDir
if out, err := build.CombinedOutput(); err != nil {
fmt.Fprintf(os.Stderr, "failed to build bdev binary: %v\n%s\n", err, string(out))
os.Exit(1)
}
code := m.Run()
_ = os.RemoveAll(tmpDir)
os.Exit(code)
}

View File

@@ -1,244 +0,0 @@
package e2e
import (
"encoding/json"
"os"
"os/exec"
"path/filepath"
"strings"
"testing"
)
type scenario struct {
root string
baseCommit string
patchesRemote string
patchesRepo string
chromiumA string
chromiumB string
trackedPath string
newPath string
}
type statusJSON struct {
Ahead int
Behind int
Synced int
}
func TestBdevOperationsE2E(t *testing.T) {
env := setupScenario(t)
runBdev(t, env.chromiumA, "init", "--patches-repo", env.patchesRepo, "--name", "checkout-a")
statusBefore := readStatus(t, env.chromiumA)
if statusBefore.Behind == 0 {
t.Fatalf("expected checkout-a to be behind before pull, got %#v", statusBefore)
}
pullPreview := runBdev(t, env.chromiumA, "diff", "--direction", "pull")
assertContains(t, pullPreview, env.trackedPath)
runBdev(t, env.chromiumA, "pull", "--no-sync")
assertFileContains(t, filepath.Join(env.chromiumA, env.trackedPath), "patch-v1")
statusAfterPull := readStatus(t, env.chromiumA)
if statusAfterPull.Behind != 0 || statusAfterPull.Synced == 0 {
t.Fatalf("unexpected status after pull: %#v", statusAfterPull)
}
writeFile(t, filepath.Join(env.chromiumA, "base", ".keep"), "my-local-work\n")
pullAgain := runBdev(t, env.chromiumA, "pull", "--no-sync", "base/.keep")
assertContains(t, pullAgain, "local-only, kept")
assertFileContains(t, filepath.Join(env.chromiumA, "base", ".keep"), "my-local-work")
runGit(t, env.chromiumA, "checkout", env.baseCommit, "--", "base/.keep")
writeFile(t, filepath.Join(env.chromiumA, env.trackedPath), "patch-v2\n")
writeFile(t, filepath.Join(env.chromiumA, env.newPath), "brand-new\n")
pushPreview := runBdev(t, env.chromiumA, "diff", "--direction", "push")
assertContains(t, pushPreview, env.trackedPath)
runBdev(t, env.chromiumA, "push", "--no-sync", env.trackedPath, env.newPath)
assertFileContains(t, filepath.Join(env.patchesRepo, "chromium_patches", env.newPath), "diff --git")
// Keep the patches repo clean before remote-aware publish flow.
commitRepo(t, env.patchesRepo, "chore: e2e checkpoint after push --no-sync")
writeFile(t, filepath.Join(env.chromiumA, env.trackedPath), "patch-v3\n")
publish := runBdev(t, env.chromiumA, "push", "origin", "-m", "e2e: publish patch-v3", env.trackedPath)
assertContains(t, publish, "remote publish complete")
mirror := filepath.Join(env.root, "mirror")
runGit(t, env.root, "clone", env.patchesRemote, mirror)
assertFileContains(t, filepath.Join(mirror, "chromium_patches", env.trackedPath), "patch-v3")
collab := filepath.Join(env.root, "collab")
runGit(t, env.root, "clone", env.patchesRemote, collab)
configRepo(t, collab)
diffV4 := buildDiffFromBase(t, env.chromiumA, env.baseCommit, env.trackedPath, "patch-v4\n")
writeFile(t, filepath.Join(collab, "chromium_patches", env.trackedPath), diffV4)
commitRepo(t, collab, "feat: remote patch-v4 update")
branch := strings.TrimSpace(runGit(t, collab, "symbolic-ref", "--short", "HEAD"))
runGit(t, collab, "push", "origin", "HEAD:"+branch)
runBdev(t, env.chromiumA, "pull", "origin")
assertFileContains(t, filepath.Join(env.chromiumA, env.trackedPath), "patch-v4")
runBdev(t, env.chromiumB, "clone", "--patches-repo", env.patchesRepo, "--verify-base", "--clean", "--name", "checkout-b")
assertFileContains(t, filepath.Join(env.chromiumB, env.trackedPath), "patch-v4")
statusB := readStatus(t, env.chromiumB)
if statusB.Ahead != 0 || statusB.Synced == 0 {
t.Fatalf("expected checkout-b to have clean/synced clone state, got %#v", statusB)
}
}
func setupScenario(t *testing.T) *scenario {
t.Helper()
root := t.TempDir()
patchesRemote := filepath.Join(root, "patches-remote.git")
chromiumA := filepath.Join(root, "chromium-a")
chromiumB := filepath.Join(root, "chromium-b")
patchesRepo := filepath.Join(root, "patches")
trackedPath := filepath.ToSlash(filepath.Join("chrome", "app", "test.txt"))
newPath := filepath.ToSlash(filepath.Join("chrome", "browser", "new_file.txt"))
runGit(t, root, "init", "--bare", patchesRemote)
setupChromiumRepo(t, chromiumA)
writeFile(t, filepath.Join(chromiumA, trackedPath), "base\n")
runGit(t, chromiumA, "add", "-A")
runGit(t, chromiumA, "commit", "-m", "base")
baseCommit := strings.TrimSpace(runGit(t, chromiumA, "rev-parse", "HEAD"))
diffV1 := buildDiffFromBase(t, chromiumA, baseCommit, trackedPath, "patch-v1\n")
runGit(t, root, "clone", patchesRemote, patchesRepo)
configRepo(t, patchesRepo)
writeFile(t, filepath.Join(patchesRepo, "BASE_COMMIT"), baseCommit+"\n")
writeFile(t, filepath.Join(patchesRepo, "CHROMIUM_VERSION"), "MAJOR=145\nMINOR=0\nBUILD=7632\nPATCH=45\n")
writeFile(t, filepath.Join(patchesRepo, "chromium_patches", trackedPath), diffV1)
commitRepo(t, patchesRepo, "seed patches")
branch := strings.TrimSpace(runGit(t, patchesRepo, "symbolic-ref", "--short", "HEAD"))
runGit(t, patchesRepo, "push", "-u", "origin", "HEAD:"+branch)
runGit(t, root, "clone", chromiumA, chromiumB)
configRepo(t, chromiumB)
return &scenario{
root: root,
baseCommit: baseCommit,
patchesRemote: patchesRemote,
patchesRepo: patchesRepo,
chromiumA: chromiumA,
chromiumB: chromiumB,
trackedPath: trackedPath,
newPath: newPath,
}
}
func setupChromiumRepo(t *testing.T, dir string) {
t.Helper()
if err := os.MkdirAll(filepath.Join(dir, "chrome"), 0o755); err != nil {
t.Fatalf("mkdir chrome: %v", err)
}
if err := os.MkdirAll(filepath.Join(dir, "base"), 0o755); err != nil {
t.Fatalf("mkdir base: %v", err)
}
writeFile(t, filepath.Join(dir, "base", ".keep"), "marker\n")
runGit(t, dir, "init")
configRepo(t, dir)
}
func buildDiffFromBase(t *testing.T, repo, base, relPath, content string) string {
t.Helper()
abs := filepath.Join(repo, relPath)
original := mustRead(t, abs)
writeFile(t, abs, content)
diff := runGit(t, repo, "diff", "--full-index", base, "--", relPath)
writeFile(t, abs, original)
if strings.TrimSpace(diff) == "" {
t.Fatalf("expected non-empty diff for %s", relPath)
}
return diff
}
func readStatus(t *testing.T, chromiumDir string) statusJSON {
t.Helper()
raw := runBdev(t, chromiumDir, "status", "--json")
var s statusJSON
if err := json.Unmarshal([]byte(raw), &s); err != nil {
t.Fatalf("failed to parse status json: %v\nraw=%s", err, raw)
}
return s
}
func runBdev(t *testing.T, dir string, args ...string) string {
t.Helper()
cmd := exec.Command(bdevBinary, args...)
cmd.Dir = dir
out, err := cmd.CombinedOutput()
if err != nil {
t.Fatalf("bdev %v failed: %v\n%s", args, err, string(out))
}
return string(out)
}
func runGit(t *testing.T, dir string, args ...string) string {
t.Helper()
cmd := exec.Command("git", args...)
cmd.Dir = dir
out, err := cmd.CombinedOutput()
if err != nil {
t.Fatalf("git %v failed: %v\n%s", args, err, string(out))
}
return string(out)
}
func commitRepo(t *testing.T, dir, message string) {
t.Helper()
runGit(t, dir, "add", "-A")
runGit(t, dir, "commit", "-m", message)
}
func configRepo(t *testing.T, dir string) {
t.Helper()
runGit(t, dir, "config", "user.email", "bdev-e2e@example.com")
runGit(t, dir, "config", "user.name", "bdev e2e")
}
func writeFile(t *testing.T, path, content string) {
t.Helper()
if err := os.MkdirAll(filepath.Dir(path), 0o755); err != nil {
t.Fatalf("mkdir %s: %v", path, err)
}
if err := os.WriteFile(path, []byte(content), 0o644); err != nil {
t.Fatalf("write %s: %v", path, err)
}
}
func mustRead(t *testing.T, path string) string {
t.Helper()
data, err := os.ReadFile(path)
if err != nil {
t.Fatalf("read %s: %v", path, err)
}
return string(data)
}
func assertContains(t *testing.T, output, want string) {
t.Helper()
if !strings.Contains(output, want) {
t.Fatalf("expected output to contain %q\noutput:\n%s", want, output)
}
}
func assertFileContains(t *testing.T, path, want string) {
t.Helper()
content := mustRead(t, path)
if !strings.Contains(content, want) {
t.Fatalf("expected %s to contain %q\ncontent:\n%s", path, want, content)
}
}

View File

@@ -1,11 +1,10 @@
module bdev
module github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev
go 1.25.7
go 1.25.0
require (
github.com/charmbracelet/lipgloss v1.1.0
github.com/spf13/cobra v1.10.2
golang.org/x/sync v0.19.0
gopkg.in/yaml.v3 v3.0.1
)

View File

@@ -1,5 +1,7 @@
github.com/aymanbagabas/go-osc52/v2 v2.0.1 h1:HwpRHbFMcZLEVr42D4p7XBqjyuxQH5SMiErDT4WkJ2k=
github.com/aymanbagabas/go-osc52/v2 v2.0.1/go.mod h1:uYgXzlJ7ZpABp8OJ+exZzJJhRNQ2ASbcXHWsFqH8hp8=
github.com/aymanbagabas/go-udiff v0.2.0 h1:TK0fH4MteXUDspT88n8CKzvK0X9O2xu9yQjWpi6yML8=
github.com/aymanbagabas/go-udiff v0.2.0/go.mod h1:RE4Ex0qsGkTAJoQdQQCA0uG+nAzJO/pI/QwceO5fgrA=
github.com/charmbracelet/colorprofile v0.2.3-0.20250311203215-f60798e515dc h1:4pZI35227imm7yK2bGPcfpFEmuY1gc2YSTShr4iJBfs=
github.com/charmbracelet/colorprofile v0.2.3-0.20250311203215-f60798e515dc/go.mod h1:X4/0JoqgTIPSFcRA/P6INZzIuyqdFY5rm8tb41s9okk=
github.com/charmbracelet/lipgloss v1.1.0 h1:vYXsiLHVkK7fp74RkV7b2kq9+zDLoEU4MZoFqR/noCY=
@@ -8,6 +10,8 @@ github.com/charmbracelet/x/ansi v0.8.0 h1:9GTq3xq9caJW8ZrBTe0LIe2fvfLR/bYXKTx2ll
github.com/charmbracelet/x/ansi v0.8.0/go.mod h1:wdYl/ONOLHLIVmQaxbIYEC/cRKOQyjTkowiI4blgS9Q=
github.com/charmbracelet/x/cellbuf v0.0.13-0.20250311204145-2c3ea96c31dd h1:vy0GVL4jeHEwG5YOXDmi86oYw2yuYUGqz6a8sLwg0X8=
github.com/charmbracelet/x/cellbuf v0.0.13-0.20250311204145-2c3ea96c31dd/go.mod h1:xe0nKWGd3eJgtqZRaN9RjMtK7xUYchjzPr7q6kcvCCs=
github.com/charmbracelet/x/exp/golden v0.0.0-20240806155701-69247e0abc2a h1:G99klV19u0QnhiizODirwVksQB91TJKV/UaTnACcG30=
github.com/charmbracelet/x/exp/golden v0.0.0-20240806155701-69247e0abc2a/go.mod h1:wDlXFlCrmJ8J+swcL/MnGUuYnqgQdW9rhSD61oNMb6U=
github.com/charmbracelet/x/term v0.2.1 h1:AQeHeLZ1OqSXhrAWpYUtZyX1T3zVxfpZuEQMIQaGIAQ=
github.com/charmbracelet/x/term v0.2.1/go.mod h1:oQ4enTYFV7QN4m0i9mzHrViD7TQKvNEEkHUMCmsxdUg=
github.com/cpuguy83/go-md2man/v2 v2.0.6/go.mod h1:oOW0eioCTA6cOiMLiUPZOpcVxMig6NIQQ7OS05n1F4g=
@@ -34,8 +38,6 @@ github.com/xo/terminfo v0.0.0-20220910002029-abceb7e1c41e/go.mod h1:RbqR21r5mrJu
go.yaml.in/yaml/v3 v3.0.4/go.mod h1:DhzuOOF2ATzADvBadXxruRBLzYTpT36CKvDb3+aBEFg=
golang.org/x/exp v0.0.0-20220909182711-5c715a9e8561 h1:MDc5xs78ZrZr3HMQugiXOAkSZtfTpbJLDr/lwfgO53E=
golang.org/x/exp v0.0.0-20220909182711-5c715a9e8561/go.mod h1:cyybsKvd6eL0RnXn6p/Grxp8F5bW7iYuBgsNCOHpMYE=
golang.org/x/sync v0.19.0 h1:vV+1eWNmZ5geRlYjzm2adRgW2/mcpevXNg50YZtPCE4=
golang.org/x/sync v0.19.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.30.0 h1:QjkSwP/36a20jFYWkSue1YwXzLmsV5Gfq7Eiy72C1uc=
golang.org/x/sys v0.30.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=

View File

@@ -0,0 +1,67 @@
package app
import (
"fmt"
"os"
"path/filepath"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/repo"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/workspace"
)
type App struct {
JSON bool
Verbose bool
CWD string
Config *workspace.Config
Registry *workspace.Registry
}
func Load(jsonOut bool, verbose bool, cwd string) (*App, error) {
if cwd == "" {
var err error
cwd, err = os.Getwd()
if err != nil {
return nil, err
}
}
cfg, err := workspace.LoadConfig()
if err != nil {
return nil, err
}
reg, err := workspace.LoadRegistry()
if err != nil {
return nil, err
}
return &App{
JSON: jsonOut,
Verbose: verbose,
CWD: filepath.Clean(cwd),
Config: cfg,
Registry: reg,
}, nil
}
func (a *App) Save() error {
if err := workspace.SaveConfig(a.Config); err != nil {
return err
}
return workspace.SaveRegistry(a.Registry)
}
func (a *App) ResolveWorkspace(name string, src string) (workspace.Entry, error) {
return workspace.Resolve(a.Registry, name, a.CWD, src)
}
func (a *App) RepoInfo() (*repo.Info, error) {
if a.Config.PatchesRepo == "" {
discovered, err := repo.Discover(a.CWD)
if err != nil {
return nil, fmt.Errorf(
`patches repo is not configured; run "bdev add <name> <path> --patches-repo <repo>" from the browseros repo once`,
)
}
return repo.Load(discovered)
}
return repo.Load(a.Config.PatchesRepo)
}

View File

@@ -1,57 +0,0 @@
package config
import (
"fmt"
"os"
"path/filepath"
"strings"
)
func ReadBaseCommit(patchesRepo string) (string, error) {
path := filepath.Join(patchesRepo, "BASE_COMMIT")
data, err := os.ReadFile(path)
if err != nil {
if os.IsNotExist(err) {
return "", fmt.Errorf("BASE_COMMIT not found in %s — create it with the Chromium commit hash", patchesRepo)
}
return "", fmt.Errorf("reading BASE_COMMIT: %w", err)
}
commit := strings.TrimSpace(string(data))
if commit == "" {
return "", fmt.Errorf("BASE_COMMIT is empty in %s", patchesRepo)
}
return commit, nil
}
func ReadChromiumVersion(patchesRepo string) (string, error) {
path := filepath.Join(patchesRepo, "CHROMIUM_VERSION")
data, err := os.ReadFile(path)
if err != nil {
if os.IsNotExist(err) {
return "", nil
}
return "", fmt.Errorf("reading CHROMIUM_VERSION: %w", err)
}
vars := make(map[string]string)
for _, line := range strings.Split(string(data), "\n") {
line = strings.TrimSpace(line)
if line == "" {
continue
}
parts := strings.SplitN(line, "=", 2)
if len(parts) == 2 {
vars[strings.TrimSpace(parts[0])] = strings.TrimSpace(parts[1])
}
}
major := vars["MAJOR"]
minor := vars["MINOR"]
build := vars["BUILD"]
patch := vars["PATCH"]
if major == "" {
return "", nil
}
return fmt.Sprintf("%s.%s.%s.%s", major, minor, build, patch), nil
}

View File

@@ -1,41 +0,0 @@
package config
import (
"fmt"
"os"
"path/filepath"
"gopkg.in/yaml.v3"
)
type Config struct {
Name string `yaml:"name"`
PatchesRepo string `yaml:"patches_repo"`
}
func ReadConfig(brosDir string) (*Config, error) {
data, err := os.ReadFile(filepath.Join(brosDir, "config.yaml"))
if err != nil {
return nil, fmt.Errorf("reading config: %w", err)
}
var cfg Config
if err := yaml.Unmarshal(data, &cfg); err != nil {
return nil, fmt.Errorf("parsing config.yaml: %w", err)
}
return &cfg, nil
}
func WriteConfig(brosDir string, cfg *Config) error {
if err := os.MkdirAll(brosDir, 0o755); err != nil {
return fmt.Errorf("creating .bros directory: %w", err)
}
data, err := yaml.Marshal(cfg)
if err != nil {
return fmt.Errorf("marshaling config: %w", err)
}
path := filepath.Join(brosDir, "config.yaml")
if err := os.WriteFile(path, data, 0o644); err != nil {
return fmt.Errorf("writing config.yaml: %w", err)
}
return nil
}

View File

@@ -1,101 +0,0 @@
package config
import (
"fmt"
"os"
"path/filepath"
)
const BrosDirName = ".bros"
// Context holds everything needed for an operation.
type Context struct {
Config *Config
State *State
ChromiumDir string // Absolute path to chromium checkout (parent of .bros/)
BrosDir string // Absolute path to .bros/
PatchesRepo string // Absolute path to patches repo root
PatchesDir string // Absolute path to chromium_patches/
BaseCommit string
ChromiumVersion string
}
// FindBrosDir walks up from cwd to find the nearest .bros/ directory.
func FindBrosDir() (string, error) {
dir, err := os.Getwd()
if err != nil {
return "", fmt.Errorf("getting cwd: %w", err)
}
for {
candidate := filepath.Join(dir, BrosDirName)
if info, err := os.Stat(candidate); err == nil && info.IsDir() {
return dir, nil
}
parent := filepath.Dir(dir)
if parent == dir {
break
}
dir = parent
}
return "", fmt.Errorf("not a bdev checkout (no .bros/ found in any parent directory)")
}
// LoadContext loads config, state, and patches repo info.
func LoadContext() (*Context, error) {
chromiumDir, err := FindBrosDir()
if err != nil {
return nil, err
}
brosDir := filepath.Join(chromiumDir, BrosDirName)
cfg, err := ReadConfig(brosDir)
if err != nil {
return nil, fmt.Errorf("loading config: %w", err)
}
state, err := ReadState(brosDir)
if err != nil {
return nil, fmt.Errorf("loading state: %w", err)
}
patchesRepo := cfg.PatchesRepo
if !filepath.IsAbs(patchesRepo) {
patchesRepo = filepath.Join(chromiumDir, patchesRepo)
}
patchesDir := filepath.Join(patchesRepo, "chromium_patches")
if _, err := os.Stat(patchesDir); err != nil {
return nil, fmt.Errorf("patches directory not found: %s", patchesDir)
}
baseCommit, err := ReadBaseCommit(patchesRepo)
if err != nil {
return nil, err
}
chromiumVersion, _ := ReadChromiumVersion(patchesRepo)
return &Context{
Config: cfg,
State: state,
ChromiumDir: chromiumDir,
BrosDir: brosDir,
PatchesRepo: patchesRepo,
PatchesDir: patchesDir,
BaseCommit: baseCommit,
ChromiumVersion: chromiumVersion,
}, nil
}
// LooksLikeChromium checks if a directory looks like a Chromium source tree.
func LooksLikeChromium(dir string) bool {
markers := []string{"chrome", "base", ".git"}
for _, m := range markers {
if _, err := os.Stat(filepath.Join(dir, m)); err != nil {
return false
}
}
return true
}

View File

@@ -1,50 +0,0 @@
package config
import (
"fmt"
"os"
"path/filepath"
"time"
"gopkg.in/yaml.v3"
)
type State struct {
LastPull *SyncEvent `yaml:"last_pull,omitempty"`
LastPush *SyncEvent `yaml:"last_push,omitempty"`
}
type SyncEvent struct {
PatchesRepoRev string `yaml:"patches_repo_rev"`
BaseCommit string `yaml:"base_commit,omitempty"`
Timestamp time.Time `yaml:"timestamp"`
FileCount int `yaml:"file_count"`
}
func ReadState(brosDir string) (*State, error) {
path := filepath.Join(brosDir, "state.yaml")
data, err := os.ReadFile(path)
if err != nil {
if os.IsNotExist(err) {
return &State{}, nil
}
return nil, fmt.Errorf("reading state: %w", err)
}
var s State
if err := yaml.Unmarshal(data, &s); err != nil {
return nil, fmt.Errorf("parsing state.yaml: %w", err)
}
return &s, nil
}
func WriteState(brosDir string, s *State) error {
data, err := yaml.Marshal(s)
if err != nil {
return fmt.Errorf("marshaling state: %w", err)
}
path := filepath.Join(brosDir, "state.yaml")
if err := os.WriteFile(path, data, 0o644); err != nil {
return fmt.Errorf("writing state.yaml: %w", err)
}
return nil
}

View File

@@ -0,0 +1,410 @@
package engine
import (
"context"
"fmt"
"os"
"path/filepath"
"strings"
"time"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/git"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/patch"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/repo"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/resolve"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/workspace"
)
type ApplyOptions struct {
Workspace workspace.Entry
Repo *repo.Info
Reset bool
ChangedRef string
RangeEnd string
Filters []string
Mode string
}
type ApplyResult struct {
Workspace string `json:"workspace"`
Mode string `json:"mode"`
BaseCommit string `json:"base_commit"`
RepoRev string `json:"repo_rev"`
Applied []string `json:"applied"`
ResetPaths []string `json:"reset_paths"`
Orphaned []string `json:"orphaned,omitempty"`
Conflicts []resolve.Operation `json:"conflicts,omitempty"`
}
func Apply(ctx context.Context, opts ApplyOptions) (*ApplyResult, error) {
repoRev, err := git.HeadRev(ctx, opts.Repo.Root)
if err != nil {
return nil, err
}
ops, orphaned, err := buildApplyOperations(ctx, opts)
if err != nil {
return nil, err
}
result := &ApplyResult{
Workspace: opts.Workspace.Name,
Mode: applyMode(opts),
BaseCommit: opts.Repo.BaseCommit,
RepoRev: repoRev,
Orphaned: orphaned,
}
if len(ops) == 0 {
if err := markApplyComplete(opts.Workspace.Path, opts.Repo.BaseCommit, repoRev); err != nil {
return nil, err
}
if err := clearResolveState(opts.Workspace.Path); err != nil {
return nil, err
}
return result, nil
}
next, err := applyOperationRange(ctx, opts.Workspace, opts.Repo, ops, 0, nil, nil, result)
if err != nil {
return nil, err
}
if next < len(ops) {
return result, nil
}
if err := markApplyComplete(opts.Workspace.Path, opts.Repo.BaseCommit, repoRev); err != nil {
return nil, err
}
if err := clearResolveState(opts.Workspace.Path); err != nil {
return nil, err
}
return result, nil
}
func Continue(ctx context.Context, ws workspace.Entry) (*ApplyResult, error) {
state, err := resolve.Load(ws.Path)
if err != nil {
return nil, err
}
repoInfo, err := repo.Load(state.RepoRoot)
if err != nil {
return nil, err
}
current, err := state.CurrentOperation()
if err != nil {
return nil, err
}
if err := verifyResolved(ctx, ws.Path, repoInfo, current, state.BaseCommit); err != nil {
return nil, err
}
state.Resolved = append(state.Resolved, current.ChromiumPath)
result := &ApplyResult{
Workspace: ws.Name,
Mode: state.Mode,
BaseCommit: state.BaseCommit,
RepoRev: state.RepoRev,
Applied: append([]string{}, state.Resolved...),
Conflicts: nil,
}
next, err := applyOperationRange(ctx, ws, repoInfo, state.Operations, state.Current+1, state.Resolved, state.Skipped, result)
if err != nil {
return nil, err
}
if next >= len(state.Operations) && len(result.Conflicts) == 0 {
if err := markApplyComplete(ws.Path, state.BaseCommit, state.RepoRev); err != nil {
return nil, err
}
if err := resolve.Delete(ws.Path); err != nil {
return nil, err
}
}
return result, nil
}
func Skip(ctx context.Context, ws workspace.Entry) (*ApplyResult, error) {
state, err := resolve.Load(ws.Path)
if err != nil {
return nil, err
}
repoInfo, err := repo.Load(state.RepoRoot)
if err != nil {
return nil, err
}
current, err := state.CurrentOperation()
if err != nil {
return nil, err
}
state.Skipped = append(state.Skipped, current.ChromiumPath)
result := &ApplyResult{
Workspace: ws.Name,
Mode: state.Mode,
BaseCommit: state.BaseCommit,
RepoRev: state.RepoRev,
Applied: append([]string{}, state.Resolved...),
}
next, err := applyOperationRange(ctx, ws, repoInfo, state.Operations, state.Current+1, state.Resolved, state.Skipped, result)
if err != nil {
return nil, err
}
if next >= len(state.Operations) && len(result.Conflicts) == 0 {
if err := markApplyComplete(ws.Path, state.BaseCommit, state.RepoRev); err != nil {
return nil, err
}
if err := resolve.Delete(ws.Path); err != nil {
return nil, err
}
}
return result, nil
}
func Abort(ctx context.Context, ws workspace.Entry) error {
state, err := resolve.Load(ws.Path)
if err != nil {
return err
}
for idx := 0; idx < len(state.Operations); idx++ {
op := state.Operations[idx]
if op.OldPath != "" {
if err := git.ResetPathToCommit(ctx, ws.Path, state.BaseCommit, op.OldPath); err != nil {
return err
}
}
if err := git.ResetPathToCommit(ctx, ws.Path, state.BaseCommit, op.ChromiumPath); err != nil {
return err
}
if op.RejectPath != "" {
_ = os.Remove(op.RejectPath)
}
}
workspaceState, err := workspace.LoadState(ws.Path)
if err != nil {
return err
}
if err := resolve.Delete(ws.Path); err != nil {
return err
}
if workspaceState.PendingStash == "" {
return nil
}
if err := git.StashPop(ctx, ws.Path, workspaceState.PendingStash); err != nil {
return err
}
workspaceState.PendingStash = ""
return workspace.SaveState(ws.Path, workspaceState)
}
func buildApplyOperations(ctx context.Context, opts ApplyOptions) ([]resolve.Operation, []string, error) {
repoSet, err := patch.LoadRepoPatchSet(opts.Repo.PatchesDir, opts.Filters)
if err != nil {
return nil, nil, err
}
switch {
case opts.Reset:
return operationsFromPatchSet(repoSet), nil, nil
case opts.ChangedRef != "":
changes, err := repoPatchChanges(ctx, opts.Repo, opts.ChangedRef, opts.RangeEnd)
if err != nil {
return nil, nil, err
}
return operationsFromChanges(repoSet, changes, opts.Filters), nil, nil
default:
localSet, err := patch.BuildWorkingTreePatchSet(ctx, opts.Workspace.Path, opts.Repo.BaseCommit, opts.Filters)
if err != nil {
return nil, nil, err
}
var ops []resolve.Operation
var orphaned []string
for _, delta := range patch.Compare(repoSet, localSet) {
switch delta.Kind {
case patch.NeedsApply, patch.NeedsUpdate:
ops = append(ops, operationFromPatch(*delta.Repo))
case patch.Orphaned:
orphaned = append(orphaned, delta.Path)
}
}
return ops, orphaned, nil
}
}
func applyMode(opts ApplyOptions) string {
switch {
case opts.Mode != "":
return opts.Mode
case opts.Reset:
return "reset"
case opts.ChangedRef != "":
return "changed"
default:
return "incremental"
}
}
func applyOperationRange(
ctx context.Context,
ws workspace.Entry,
repoInfo *repo.Info,
ops []resolve.Operation,
start int,
resolved []string,
skipped []string,
result *ApplyResult,
) (int, error) {
repoSet, err := patch.LoadRepoPatchSet(repoInfo.PatchesDir, nil)
if err != nil {
return start, err
}
for idx := start; idx < len(ops); idx++ {
op := ops[idx]
result.ResetPaths = append(result.ResetPaths, op.ChromiumPath)
if op.OldPath != "" {
if err := git.ResetPathToCommit(ctx, ws.Path, repoInfo.BaseCommit, op.OldPath); err != nil {
return idx, err
}
}
if err := git.ResetPathToCommit(ctx, ws.Path, repoInfo.BaseCommit, op.ChromiumPath); err != nil {
return idx, err
}
patchFile, ok := repoSet[op.ChromiumPath]
if ok {
if err := applySingleOperation(ctx, ws.Path, patchFile); err != nil {
op.RejectPath = rejectPath(ws.Path, op)
op.Message = err.Error()
ops[idx] = op
state := &resolve.State{
Workspace: ws.Path,
RepoRoot: repoInfo.Root,
BaseCommit: repoInfo.BaseCommit,
RepoRev: result.RepoRev,
Mode: result.Mode,
Current: idx,
Operations: ops,
Resolved: append([]string{}, resolved...),
Skipped: append([]string{}, skipped...),
}
if err := resolve.Save(ws.Path, state); err != nil {
return idx, err
}
result.Conflicts = []resolve.Operation{op}
return idx, nil
}
} else if op.Op == patch.OpDelete {
if err := os.RemoveAll(filepath.Join(ws.Path, filepath.FromSlash(op.ChromiumPath))); err != nil {
return idx, err
}
}
result.Applied = append(result.Applied, op.ChromiumPath)
}
return len(ops), nil
}
func applySingleOperation(ctx context.Context, workspacePath string, patchFile patch.FilePatch) error {
switch {
case patchFile.Op == patch.OpDelete:
return os.RemoveAll(filepath.Join(workspacePath, filepath.FromSlash(patchFile.Path)))
case patchFile.IsPureRename():
from := filepath.Join(workspacePath, filepath.FromSlash(patchFile.OldPath))
to := filepath.Join(workspacePath, filepath.FromSlash(patchFile.Path))
if err := os.MkdirAll(filepath.Dir(to), 0o755); err != nil {
return err
}
return os.Rename(from, to)
case patchFile.Op == patch.OpBinary && len(patchFile.Content) == 0:
return fmt.Errorf("binary markers are not directly applicable: %s", patchFile.Path)
default:
_, err := git.ApplyPatch(ctx, workspacePath, patchFile.Content)
return err
}
}
func verifyResolved(ctx context.Context, workspacePath string, repoInfo *repo.Info, op resolve.Operation, base string) error {
repoSet, err := patch.LoadRepoPatchSet(repoInfo.PatchesDir, []string{op.ChromiumPath})
if err != nil {
return err
}
localSet, err := patch.BuildWorkingTreePatchSet(ctx, workspacePath, base, []string{op.ChromiumPath})
if err != nil {
return err
}
for _, delta := range patch.Compare(repoSet, localSet) {
if delta.Path == op.ChromiumPath && delta.Kind == patch.UpToDate {
if op.RejectPath != "" {
_ = os.Remove(op.RejectPath)
}
return nil
}
}
return fmt.Errorf("current conflict is not resolved yet for %s", op.ChromiumPath)
}
func operationFromPatch(p patch.FilePatch) resolve.Operation {
return resolve.Operation{
ChromiumPath: p.Path,
PatchRel: p.Path,
Op: p.Op,
OldPath: p.OldPath,
}
}
func operationsFromPatchSet(set patch.PatchSet) []resolve.Operation {
paths := patch.ScopeFromSet(set)
ops := make([]resolve.Operation, 0, len(paths))
for _, rel := range paths {
ops = append(ops, operationFromPatch(set[rel]))
}
return ops
}
func operationsFromChanges(repoSet patch.PatchSet, changes []git.FileChange, filters []string) []resolve.Operation {
var ops []resolve.Operation
for _, change := range changes {
rel := normalizeChangedPatchPath(change.Path)
if !patch.PathMatches(rel, filters) {
continue
}
if patchFile, ok := repoSet[rel]; ok {
ops = append(ops, operationFromPatch(patchFile))
continue
}
ops = append(ops, resolve.Operation{
ChromiumPath: rel,
PatchRel: rel,
Op: patch.OpDelete,
OldPath: normalizeChangedPatchPath(change.OldPath),
})
}
return ops
}
func repoPatchChanges(ctx context.Context, repoInfo *repo.Info, ref string, rangeEnd string) ([]git.FileChange, error) {
pathspecs := []string{"chromium_patches"}
if rangeEnd == "" {
return git.DiffTreeNameStatus(ctx, repoInfo.Root, ref, pathspecs)
}
return git.DiffNameStatusBetween(ctx, repoInfo.Root, ref, rangeEnd, pathspecs)
}
func rejectPath(workspacePath string, op resolve.Operation) string {
candidate := patch.RejectPath(workspacePath, op.ChromiumPath)
if _, err := os.Stat(candidate); err == nil {
return candidate
}
return ""
}
func normalizeChangedPatchPath(path string) string {
return strings.TrimPrefix(patch.NormalizeChromiumPath(path), "chromium_patches/")
}
func clearResolveState(workspacePath string) error {
if resolve.Exists(workspacePath) {
return resolve.Delete(workspacePath)
}
return nil
}
func markApplyComplete(workspacePath string, baseCommit string, repoRev string) error {
state, err := workspace.LoadState(workspacePath)
if err != nil {
return err
}
state.BaseCommit = baseCommit
state.LastApplyRev = repoRev
state.LastApplyAt = time.Now().UTC()
return workspace.SaveState(workspacePath, state)
}
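Apply, Continue, and Skip share one pattern: walk the operation list from a saved cursor, stop at the first conflict, persist the index, and resume from Current+1 later. An in-memory sketch of that resume pattern, with illustrative names; the real engine persists its cursor and operation list through resolve.Save:

```go
package main

import "fmt"

// cursor stands in for resolve.State: the index of the stalled operation
// plus the paths applied so far. (Illustrative, not the engine's types.)
type cursor struct {
	current int
	applied []string
}

// applyRange mirrors applyOperationRange: process ops in order from start,
// and on the first failing op record the cursor and return false so a later
// call can resume at current+1 (Continue) after the conflict is resolved.
func applyRange(ops []string, start int, fails map[string]bool, c *cursor) bool {
	for i := start; i < len(ops); i++ {
		if fails[ops[i]] {
			c.current = i
			return false
		}
		c.applied = append(c.applied, ops[i])
	}
	return true
}

func main() {
	ops := []string{"a.txt", "b.txt", "c.txt"}
	var c cursor
	done := applyRange(ops, 0, map[string]bool{"b.txt": true}, &c)
	fmt.Println(done, c.current) // false 1: stalled on the conflict
	done = applyRange(ops, c.current+1, nil, &c)
	fmt.Println(done, c.applied) // true [a.txt c.txt]: b.txt was resolved out of band
}
```

Skip is the same resumption with the stalled path appended to Skipped instead of Resolved, and Abort walks the recorded operations backwards via ResetPathToCommit rather than resuming at all.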

View File

@@ -1,120 +0,0 @@
package engine
import (
"fmt"
"os"
"path/filepath"
"bdev/internal/config"
"bdev/internal/git"
"bdev/internal/patch"
)
type CloneOpts struct {
VerifyBase bool
Clean bool
DryRun bool
}
func Clone(ctx *config.Context, opts CloneOpts) (*patch.PullResult, error) {
result := &patch.PullResult{}
// Verify HEAD matches BASE if requested
if opts.VerifyBase {
head, err := git.RevParse(ctx.ChromiumDir, "HEAD")
if err != nil {
return nil, fmt.Errorf("clone: getting HEAD: %w", err)
}
base, err := git.RevParse(ctx.ChromiumDir, ctx.BaseCommit)
if err != nil {
return nil, fmt.Errorf("clone: resolving BASE_COMMIT %s: %w", ctx.BaseCommit, err)
}
if head != base {
return nil, fmt.Errorf("clone: HEAD (%s) does not match BASE_COMMIT (%s) — use --verify-base=false to skip", head[:12], base[:12])
}
}
// Clean: reset all modified files to base before applying
if opts.Clean && !opts.DryRun {
nameStatus, err := git.DiffNameStatus(ctx.ChromiumDir, ctx.BaseCommit)
if err != nil {
return nil, fmt.Errorf("clone: discovering local changes: %w", err)
}
if len(nameStatus) > 0 {
var checkoutFiles []string
for path := range nameStatus {
if git.FileExistsInCommit(ctx.ChromiumDir, ctx.BaseCommit, path) {
checkoutFiles = append(checkoutFiles, path)
} else {
// File doesn't exist in base — remove it
_ = os.Remove(filepath.Join(ctx.ChromiumDir, path))
}
}
if len(checkoutFiles) > 0 {
if err := git.CheckoutFiles(ctx.ChromiumDir, ctx.BaseCommit, checkoutFiles); err != nil {
return nil, fmt.Errorf("clone: resetting to base: %w", err)
}
}
}
}
// Read all patches from repo
repoPatchSet, err := patch.ReadPatchSet(ctx.PatchesDir)
if err != nil {
return nil, fmt.Errorf("clone: reading patches: %w", err)
}
if opts.DryRun {
for path, fp := range repoPatchSet.Patches {
if fp.Op == patch.OpDeleted {
result.Deleted = append(result.Deleted, path)
} else {
result.Applied = append(result.Applied, path)
}
}
return result, nil
}
// Apply all patches
for path, fp := range repoPatchSet.Patches {
switch fp.Op {
case patch.OpDeleted:
target := filepath.Join(ctx.ChromiumDir, path)
if _, err := os.Stat(target); err == nil {
if err := os.Remove(target); err != nil {
return nil, fmt.Errorf("clone: deleting %s: %w", path, err)
}
result.Deleted = append(result.Deleted, path)
}
case patch.OpBinary:
// Skip binary files with no content
continue
default:
if fp.Content == nil {
continue
}
// Remove existing file if it's not in BASE (untracked new-file).
// git diff can't see untracked files, so --clean misses them.
if !git.FileExistsInCommit(ctx.ChromiumDir, ctx.BaseCommit, path) {
_ = os.Remove(filepath.Join(ctx.ChromiumDir, path))
}
patchFile := filepath.Join(ctx.PatchesDir, path)
conflict, err := git.Apply(ctx.ChromiumDir, fp.Content, patchFile)
if err != nil {
return nil, fmt.Errorf("clone: applying %s: %w", path, err)
}
if conflict != nil {
conflict.File = path
conflict.RejectFile = path + ".rej"
result.Conflicts = append(result.Conflicts, *conflict)
} else {
result.Applied = append(result.Applied, path)
}
}
}
return result, nil
}

View File

@@ -1,50 +0,0 @@
package engine
import (
"path/filepath"
"strings"
"testing"
)
func TestCloneCleanAppliesPatchAndResetsLocalChanges(t *testing.T) {
t.Parallel()
ctx := setupPullFixture(t)
result, err := Clone(ctx, CloneOpts{
Clean: true,
})
if err != nil {
t.Fatalf("Clone: %v", err)
}
if !contains(result.Applied, "foo.txt") {
t.Fatalf("expected foo.txt to be applied, got %#v", result.Applied)
}
foo := mustRead(t, filepath.Join(ctx.ChromiumDir, "foo.txt"))
if strings.TrimSpace(foo) != "repo-version" {
t.Fatalf("expected foo.txt to match patch repo, got %q", foo)
}
orphan := mustRead(t, filepath.Join(ctx.ChromiumDir, "orphan.txt"))
if strings.TrimSpace(orphan) != "orphan-base" {
t.Fatalf("expected orphan.txt to be reset during clean clone, got %q", orphan)
}
}
func TestCloneVerifyBaseRejectsMismatchedHead(t *testing.T) {
t.Parallel()
ctx := setupPullFixture(t)
writeFile(t, filepath.Join(ctx.ChromiumDir, "post_base.txt"), "new commit\n")
runGit(t, ctx.ChromiumDir, "add", "post_base.txt")
runGit(t, ctx.ChromiumDir, "commit", "-m", "move head")
_, err := Clone(ctx, CloneOpts{VerifyBase: true})
if err == nil {
t.Fatalf("expected verify-base failure when HEAD diverges from BASE_COMMIT")
}
if !strings.Contains(err.Error(), "does not match BASE_COMMIT") {
t.Fatalf("unexpected error: %v", err)
}
}


@@ -0,0 +1,220 @@
package engine
import (
"context"
"os"
"os/exec"
"path/filepath"
"strings"
"testing"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/git"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/patch"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/repo"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/resolve"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/workspace"
)
func TestAbortRevertsAppliedOpsAndRestoresPendingStash(t *testing.T) {
ctx := context.Background()
workspacePath := initGitRepo(t)
writeFile(t, filepath.Join(workspacePath, "a.txt"), "a\n")
writeFile(t, filepath.Join(workspacePath, "b.txt"), "b\n")
writeFile(t, filepath.Join(workspacePath, "local.txt"), "local\n")
runGit(t, workspacePath, "add", "a.txt", "b.txt", "local.txt")
runGit(t, workspacePath, "commit", "-m", "base")
baseCommit := gitOutput(t, workspacePath, "rev-parse", "HEAD")
writeFile(t, filepath.Join(workspacePath, "local.txt"), "local changed\n")
runGit(t, workspacePath, "stash", "push", "-m", "test stash", "-u", "--", "local.txt")
stashRef := gitOutput(t, workspacePath, "stash", "list", "-1", "--format=%gd")
if stashRef == "" {
t.Fatalf("expected stash ref")
}
if err := workspace.SaveState(workspacePath, &workspace.State{
Version: 1,
Workspace: workspacePath,
BaseCommit: baseCommit,
PendingStash: stashRef,
}); err != nil {
t.Fatalf("SaveState: %v", err)
}
writeFile(t, filepath.Join(workspacePath, "a.txt"), "applied\n")
writeFile(t, filepath.Join(workspacePath, "b.txt"), "conflict\n")
if err := resolve.Save(workspacePath, &resolve.State{
Workspace: workspacePath,
RepoRoot: workspacePath,
BaseCommit: baseCommit,
Current: 1,
Operations: []resolve.Operation{
{ChromiumPath: "a.txt", PatchRel: "a.txt", Op: patch.OpModify},
{ChromiumPath: "b.txt", PatchRel: "b.txt", Op: patch.OpModify},
},
}); err != nil {
t.Fatalf("resolve.Save: %v", err)
}
if err := Abort(ctx, workspace.Entry{Name: "ws", Path: workspacePath}); err != nil {
t.Fatalf("Abort: %v", err)
}
assertFile(t, filepath.Join(workspacePath, "a.txt"), "a\n")
assertFile(t, filepath.Join(workspacePath, "b.txt"), "b\n")
assertFile(t, filepath.Join(workspacePath, "local.txt"), "local changed\n")
if resolve.Exists(workspacePath) {
t.Fatalf("expected resolve state to be removed")
}
state, err := workspace.LoadState(workspacePath)
if err != nil {
t.Fatalf("LoadState: %v", err)
}
if state.PendingStash != "" {
t.Fatalf("expected pending stash cleared, got %q", state.PendingStash)
}
}
func TestPublishReturnsHelpfulErrorWhenNothingChanged(t *testing.T) {
ctx := context.Background()
repoRoot := initGitRepo(t)
writeFile(t, filepath.Join(repoRoot, "BASE_COMMIT"), "base123\n")
writeFile(t, filepath.Join(repoRoot, "chromium_patches", ".gitkeep"), "")
runGit(t, repoRoot, "add", "BASE_COMMIT", "chromium_patches/.gitkeep")
runGit(t, repoRoot, "commit", "-m", "repo init")
repoInfo, err := repo.Load(repoRoot)
if err != nil {
t.Fatalf("repo.Load: %v", err)
}
if _, err := Publish(ctx, repoInfo, "", ""); err == nil || !strings.Contains(err.Error(), "nothing to publish") {
t.Fatalf("expected helpful no-op error, got %v", err)
}
}
func TestOperationsFromChangesNormalizesOldPath(t *testing.T) {
ops := operationsFromChanges(nil, []git.FileChange{{
Status: "R",
Path: "chromium_patches/chrome/new.cc",
OldPath: "chromium_patches/chrome/old.cc",
}}, nil)
if len(ops) != 1 {
t.Fatalf("expected 1 operation, got %d", len(ops))
}
if ops[0].ChromiumPath != "chrome/new.cc" {
t.Fatalf("unexpected chromium path: %q", ops[0].ChromiumPath)
}
if ops[0].OldPath != "chrome/old.cc" {
t.Fatalf("unexpected old path: %q", ops[0].OldPath)
}
}
func TestSyncClearsPendingStashAfterSuccessfulNonRebaseRun(t *testing.T) {
ctx := context.Background()
workspacePath := initGitRepo(t)
writeFile(t, filepath.Join(workspacePath, "chrome", "browser.cc"), "base\n")
runGit(t, workspacePath, "add", "chrome/browser.cc")
runGit(t, workspacePath, "commit", "-m", "workspace base")
baseCommit := gitOutput(t, workspacePath, "rev-parse", "HEAD")
remoteRepo := t.TempDir()
runGit(t, remoteRepo, "init", "--bare")
repoRoot := initGitRepo(t)
if err := os.MkdirAll(filepath.Join(repoRoot, "chromium_patches"), 0o755); err != nil {
t.Fatalf("MkdirAll: %v", err)
}
writeFile(t, filepath.Join(repoRoot, "BASE_COMMIT"), baseCommit+"\n")
runGit(t, repoRoot, "add", "BASE_COMMIT")
runGit(t, repoRoot, "commit", "-m", "patch repo init")
runGit(t, repoRoot, "remote", "add", "origin", remoteRepo)
runGit(t, repoRoot, "push", "-u", "origin", "HEAD")
repoHead := gitOutput(t, repoRoot, "rev-parse", "HEAD")
repoInfo, err := repo.Load(repoRoot)
if err != nil {
t.Fatalf("repo.Load: %v", err)
}
if err := workspace.SaveState(workspacePath, &workspace.State{
Version: 1,
Workspace: workspacePath,
BaseCommit: baseCommit,
LastSyncRev: repoHead,
PendingStash: "stash@{42}",
}); err != nil {
t.Fatalf("SaveState: %v", err)
}
result, err := Sync(ctx, SyncOptions{
Workspace: workspace.Entry{Name: "ws", Path: workspacePath},
Repo: repoInfo,
Remote: "origin",
Rebase: false,
})
if err != nil {
t.Fatalf("Sync: %v", err)
}
if result.StashRef != "" {
t.Fatalf("expected no new stash ref, got %q", result.StashRef)
}
state, err := workspace.LoadState(workspacePath)
if err != nil {
t.Fatalf("LoadState: %v", err)
}
if state.PendingStash != "" {
t.Fatalf("expected pending stash to be cleared, got %q", state.PendingStash)
}
}
func initGitRepo(t *testing.T) string {
t.Helper()
dir := t.TempDir()
runGit(t, dir, "init")
runGit(t, dir, "config", "user.name", "Test User")
runGit(t, dir, "config", "user.email", "test@example.com")
return dir
}
func runGit(t *testing.T, dir string, args ...string) {
t.Helper()
cmd := exec.Command("git", args...)
cmd.Dir = dir
output, err := cmd.CombinedOutput()
if err != nil {
t.Fatalf("git %s: %v\n%s", strings.Join(args, " "), err, string(output))
}
}
func gitOutput(t *testing.T, dir string, args ...string) string {
t.Helper()
cmd := exec.Command("git", args...)
cmd.Dir = dir
output, err := cmd.CombinedOutput()
if err != nil {
t.Fatalf("git %s: %v\n%s", strings.Join(args, " "), err, string(output))
}
return strings.TrimSpace(string(output))
}
func writeFile(t *testing.T, path string, body string) {
t.Helper()
if err := os.MkdirAll(filepath.Dir(path), 0o755); err != nil {
t.Fatalf("MkdirAll: %v", err)
}
if err := os.WriteFile(path, []byte(body), 0o644); err != nil {
t.Fatalf("WriteFile: %v", err)
}
}
func assertFile(t *testing.T, path string, want string) {
t.Helper()
data, err := os.ReadFile(path)
if err != nil {
t.Fatalf("ReadFile %s: %v", path, err)
}
if string(data) != want {
t.Fatalf("unexpected file contents for %s: got %q want %q", path, string(data), want)
}
}


@@ -0,0 +1,119 @@
package engine
import (
"context"
"time"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/git"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/patch"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/repo"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/workspace"
)
type ExtractOptions struct {
Workspace workspace.Entry
Repo *repo.Info
Commit string
RangeStart string
RangeEnd string
Squash bool
Base string
Filters []string
}
type ExtractResult struct {
Workspace string `json:"workspace"`
Mode string `json:"mode"`
BaseCommit string `json:"base_commit"`
Written []string `json:"written"`
Deleted []string `json:"deleted"`
}
func Extract(ctx context.Context, opts ExtractOptions) (*ExtractResult, error) {
base := opts.Base
if base == "" {
base = opts.Repo.BaseCommit
}
var (
set patch.PatchSet
scope []string
err error
mode string
)
switch {
case opts.Commit != "":
mode = "commit"
set, err = patch.BuildCommitPatchSet(ctx, opts.Workspace.Path, opts.Commit, opts.Base, opts.Filters)
if err == nil {
if opts.Base != "" {
changes, err := git.DiffTreeNameStatus(ctx, opts.Workspace.Path, opts.Commit, opts.Filters)
if err != nil {
return nil, err
}
scope = changedScope(changes)
} else {
scope = patch.ScopeFromSet(set)
}
}
case opts.RangeStart != "" && opts.RangeEnd != "":
mode = "range"
set, err = patch.BuildRangePatchSet(ctx, opts.Workspace.Path, opts.RangeStart, opts.RangeEnd, opts.Base, opts.Squash, opts.Filters)
if err == nil {
if opts.Base != "" || opts.Squash {
changes, err := git.DiffNameStatusBetween(ctx, opts.Workspace.Path, opts.RangeStart, opts.RangeEnd, opts.Filters)
if err != nil {
return nil, err
}
scope = changedScope(changes)
} else {
scope = patch.ScopeFromSet(set)
}
}
default:
mode = "working-tree"
set, err = patch.BuildWorkingTreePatchSet(ctx, opts.Workspace.Path, base, opts.Filters)
if err == nil && len(opts.Filters) > 0 {
scope = opts.Filters
}
}
if err != nil {
return nil, err
}
written, deleted, err := patch.WriteRepoPatchSet(opts.Repo.PatchesDir, set, scope)
if err != nil {
return nil, err
}
state, err := workspace.LoadState(opts.Workspace.Path)
if err != nil {
return nil, err
}
head, err := git.HeadRev(ctx, opts.Workspace.Path)
if err != nil {
return nil, err
}
state.BaseCommit = opts.Repo.BaseCommit
state.LastExtractRev = head
state.LastExtractAt = time.Now().UTC()
if err := workspace.SaveState(opts.Workspace.Path, state); err != nil {
return nil, err
}
return &ExtractResult{
Workspace: opts.Workspace.Name,
Mode: mode,
BaseCommit: base,
Written: written,
Deleted: deleted,
}, nil
}
func changedScope(changes []git.FileChange) []string {
scope := make([]string, 0, len(changes))
for _, change := range changes {
rel := patch.NormalizeChromiumPath(change.Path)
if patch.IsInternalPath(rel) {
continue
}
scope = append(scope, rel)
}
return scope
}
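`changedScope` above narrows the patch write to the files a commit or range actually touched, filtering out internal bookkeeping paths. A standalone sketch of the same filtering (the `FileChange` struct and the `.bdev/` internal-path prefix are assumptions standing in for `git.FileChange` and `patch.IsInternalPath`):

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// FileChange is a hypothetical stand-in for git.FileChange.
type FileChange struct {
	Status string
	Path   string
}

// changedScope normalizes each changed path and drops internal
// bookkeeping paths so they never widen the patch scope.
func changedScope(changes []FileChange) []string {
	scope := make([]string, 0, len(changes))
	for _, change := range changes {
		rel := filepath.ToSlash(change.Path)
		// Stand-in for patch.IsInternalPath: skip tool-internal files.
		if strings.HasPrefix(rel, ".bdev/") {
			continue
		}
		scope = append(scope, rel)
	}
	return scope
}

func main() {
	fmt.Println(changedScope([]FileChange{
		{Status: "M", Path: "chrome/browser.cc"},
		{Status: "A", Path: ".bdev/state.json"},
	}))
}
```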


@@ -0,0 +1,45 @@
package engine
import (
"context"
"fmt"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/git"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/repo"
)
type PublishResult struct {
Remote string `json:"remote"`
Branch string `json:"branch"`
Message string `json:"message"`
}
func Publish(ctx context.Context, repoInfo *repo.Info, remote string, message string) (*PublishResult, error) {
if remote == "" {
remote = "origin"
}
if message == "" {
message = "chore: update chromium patches"
}
dirty, err := git.IsDirtyPaths(ctx, repoInfo.Root, []string{"chromium_patches"})
if err != nil {
return nil, err
}
if !dirty {
return nil, fmt.Errorf("nothing to publish: chromium_patches has no uncommitted changes")
}
if err := git.AddPaths(ctx, repoInfo.Root, []string{"chromium_patches"}); err != nil {
return nil, err
}
if err := git.Commit(ctx, repoInfo.Root, message); err != nil {
return nil, err
}
branch, err := git.CurrentBranch(ctx, repoInfo.Root)
if err != nil {
return nil, err
}
if err := git.Push(ctx, repoInfo.Root, remote, branch); err != nil {
return nil, err
}
return &PublishResult{Remote: remote, Branch: branch, Message: message}, nil
}


@@ -1,345 +0,0 @@
package engine
import (
"fmt"
"os"
"path/filepath"
"sort"
"strings"
"bdev/internal/config"
"bdev/internal/git"
"bdev/internal/patch"
)
type PullOpts struct {
DryRun bool
Files []string
KeepLocalOnly bool
}
func Pull(ctx *config.Context, opts PullOpts) (*patch.PullResult, error) {
repoPatchSet, err := patch.ReadPatchSet(ctx.PatchesDir)
if err != nil {
return nil, fmt.Errorf("pull: reading repo patches: %w", err)
}
repoHead, err := git.HeadRev(ctx.PatchesRepo)
if err != nil {
return nil, fmt.Errorf("pull: reading patches repo HEAD: %w", err)
}
incrementalPaths, shouldUseIncremental, err := resolveIncrementalPaths(ctx, repoHead, opts.Files)
if err != nil {
return nil, fmt.Errorf("pull: resolving incremental scope: %w", err)
}
if shouldUseIncremental {
result, err := incrementalPull(ctx, repoPatchSet, incrementalPaths, opts.DryRun)
if err != nil {
return nil, err
}
sortPullResult(result)
return result, nil
}
result, err := fullPull(ctx, repoPatchSet, opts)
if err != nil {
return nil, err
}
sortPullResult(result)
return result, nil
}
func resolveIncrementalPaths(ctx *config.Context, repoHead string, filesFilter []string) ([]string, bool, error) {
if len(filesFilter) > 0 {
return nil, false, nil
}
if ctx.State == nil || ctx.State.LastPull == nil {
return nil, false, nil
}
lastPull := ctx.State.LastPull
if strings.TrimSpace(lastPull.PatchesRepoRev) == "" {
return nil, false, nil
}
if lastPull.BaseCommit != ctx.BaseCommit {
return nil, false, nil
}
if !git.CommitExists(ctx.PatchesRepo, lastPull.PatchesRepoRev) {
return nil, false, nil
}
if lastPull.PatchesRepoRev == repoHead {
return []string{}, true, nil
}
repoPaths, err := git.DiffChangedPathsBetween(
ctx.PatchesRepo,
lastPull.PatchesRepoRev,
repoHead,
"chromium_patches",
)
if err != nil {
return nil, false, err
}
seen := make(map[string]bool)
for _, repoPath := range repoPaths {
chromiumPath, ok := normalizeRepoPatchPath(repoPath)
if !ok {
continue
}
seen[chromiumPath] = true
}
paths := make([]string, 0, len(seen))
for p := range seen {
paths = append(paths, p)
}
sort.Strings(paths)
return paths, true, nil
}
func normalizeRepoPatchPath(repoPath string) (string, bool) {
p := filepath.ToSlash(strings.TrimSpace(repoPath))
if !strings.HasPrefix(p, "chromium_patches/") {
return "", false
}
chromiumPath := strings.TrimPrefix(p, "chromium_patches/")
chromiumPath = strings.TrimSuffix(chromiumPath, ".deleted")
chromiumPath = strings.TrimSuffix(chromiumPath, ".binary")
chromiumPath = strings.TrimSuffix(chromiumPath, ".rename")
if chromiumPath == "" {
return "", false
}
return chromiumPath, true
}
func incrementalPull(
ctx *config.Context,
repoPatchSet *patch.PatchSet,
paths []string,
dryRun bool,
) (*patch.PullResult, error) {
result := &patch.PullResult{}
for _, path := range paths {
repoPatch, exists := repoPatchSet.Patches[path]
if !exists {
if !dryRun {
if err := resetPathToBase(ctx, path); err != nil {
return nil, fmt.Errorf("pull: reverting removed patch %s: %w", path, err)
}
}
result.Reverted = append(result.Reverted, path)
continue
}
switch repoPatch.Op {
case patch.OpDeleted:
if !dryRun {
if err := deletePath(ctx, path); err != nil {
return nil, err
}
}
result.Deleted = append(result.Deleted, path)
case patch.OpBinary:
result.Skipped = append(result.Skipped, path)
default:
if !dryRun {
if err := resetPathToBase(ctx, path); err != nil {
return nil, fmt.Errorf("pull: resetting %s to base: %w", path, err)
}
if err := applyRepoPatch(ctx, repoPatch, path, result); err != nil {
return nil, err
}
} else {
result.Applied = append(result.Applied, path)
}
}
}
return result, nil
}
func fullPull(ctx *config.Context, repoPatchSet *patch.PatchSet, opts PullOpts) (*patch.PullResult, error) {
result := &patch.PullResult{}
diffOutput, err := git.DiffFull(ctx.ChromiumDir, ctx.BaseCommit)
if err != nil {
return nil, fmt.Errorf("pull: reading local diffs: %w", err)
}
localPatchSet, err := patch.ParseUnifiedDiff(diffOutput)
if err != nil {
return nil, fmt.Errorf("pull: parsing local diffs: %w", err)
}
delta := patch.Compare(localPatchSet, repoPatchSet)
if len(opts.Files) > 0 {
delta = filterDelta(delta, opts.Files)
}
if opts.DryRun {
result.Applied = append(delta.NeedsUpdate, delta.NeedsApply...)
result.Skipped = delta.UpToDate
result.Deleted = delta.Deleted
if opts.KeepLocalOnly {
result.LocalOnly = delta.Orphaned
} else {
result.Reverted = delta.Orphaned
}
return result, nil
}
filesToReset := make([]string, 0, len(delta.NeedsUpdate)+len(delta.Orphaned))
filesToReset = append(filesToReset, delta.NeedsUpdate...)
if !opts.KeepLocalOnly {
filesToReset = append(filesToReset, delta.Orphaned...)
}
for _, path := range filesToReset {
if err := resetPathToBase(ctx, path); err != nil {
return nil, fmt.Errorf("pull: resetting %s to base: %w", path, err)
}
}
if opts.KeepLocalOnly {
result.LocalOnly = append(result.LocalOnly, delta.Orphaned...)
} else {
result.Reverted = append(result.Reverted, delta.Orphaned...)
}
filesToApply := make([]string, 0, len(delta.NeedsUpdate)+len(delta.NeedsApply))
filesToApply = append(filesToApply, delta.NeedsUpdate...)
filesToApply = append(filesToApply, delta.NeedsApply...)
for _, path := range filesToApply {
repoPatch, ok := repoPatchSet.Patches[path]
if !ok || repoPatch.Op == patch.OpDeleted || repoPatch.Op == patch.OpBinary {
continue
}
if err := applyRepoPatch(ctx, repoPatch, path, result); err != nil {
return nil, err
}
}
for _, path := range delta.Deleted {
if err := deletePath(ctx, path); err != nil {
return nil, err
}
result.Deleted = append(result.Deleted, path)
}
result.Skipped = delta.UpToDate
return result, nil
}
func applyRepoPatch(
ctx *config.Context,
repoPatch *patch.FilePatch,
path string,
result *patch.PullResult,
) error {
patchContent := repoPatch.Content
patchFile := filepath.Join(ctx.PatchesDir, path)
if len(patchContent) == 0 {
onDiskContent, err := os.ReadFile(patchFile)
if err == nil {
patchContent = onDiskContent
}
}
if len(patchContent) == 0 {
result.Skipped = append(result.Skipped, path)
return nil
}
if !git.FileExistsInCommit(ctx.ChromiumDir, ctx.BaseCommit, path) {
_ = os.Remove(filepath.Join(ctx.ChromiumDir, path))
}
conflict, err := git.Apply(ctx.ChromiumDir, patchContent, patchFile)
if err != nil {
return fmt.Errorf("pull: applying %s: %w", path, err)
}
if conflict != nil {
conflict.File = path
conflict.RejectFile = path + ".rej"
result.Conflicts = append(result.Conflicts, *conflict)
} else {
result.Applied = append(result.Applied, path)
}
return nil
}
func resetPathToBase(ctx *config.Context, chromiumPath string) error {
if git.FileExistsInCommit(ctx.ChromiumDir, ctx.BaseCommit, chromiumPath) {
return git.CheckoutFiles(ctx.ChromiumDir, ctx.BaseCommit, []string{chromiumPath})
}
target := filepath.Join(ctx.ChromiumDir, chromiumPath)
if err := os.Remove(target); err != nil && !os.IsNotExist(err) {
return err
}
return nil
}
func deletePath(ctx *config.Context, chromiumPath string) error {
target := filepath.Join(ctx.ChromiumDir, chromiumPath)
if err := os.Remove(target); err != nil && !os.IsNotExist(err) {
return fmt.Errorf("pull: deleting %s: %w", chromiumPath, err)
}
return nil
}
func filterDelta(d *patch.Delta, files []string) *patch.Delta {
fileSet := make(map[string]bool)
for _, f := range files {
fileSet[f] = true
}
filtered := &patch.Delta{}
for _, f := range d.NeedsUpdate {
if fileSet[f] {
filtered.NeedsUpdate = append(filtered.NeedsUpdate, f)
}
}
for _, f := range d.NeedsApply {
if fileSet[f] {
filtered.NeedsApply = append(filtered.NeedsApply, f)
}
}
for _, f := range d.UpToDate {
if fileSet[f] {
filtered.UpToDate = append(filtered.UpToDate, f)
}
}
for _, f := range d.Orphaned {
if fileSet[f] {
filtered.Orphaned = append(filtered.Orphaned, f)
}
}
for _, f := range d.Deleted {
if fileSet[f] {
filtered.Deleted = append(filtered.Deleted, f)
}
}
return filtered
}
func sortPullResult(result *patch.PullResult) {
sort.Strings(result.Applied)
sort.Strings(result.Skipped)
sort.Strings(result.Reverted)
sort.Strings(result.LocalOnly)
sort.Strings(result.Deleted)
sort.Slice(result.Conflicts, func(i, j int) bool {
return result.Conflicts[i].File < result.Conflicts[j].File
})
}
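The `normalizeRepoPatchPath` helper in this file maps a filename inside the patches repo back to the chromium-relative path it describes, trimming the op-marker suffixes. It can be exercised in isolation (this sketch reproduces the helper with the same suffix order, so it is not the authoritative implementation):

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// normalizeRepoPatchPath mirrors the pull.go helper: it maps a path in
// the patches repo (e.g. "chromium_patches/chrome/old.cc.deleted") back
// to the chromium-relative path it describes ("chrome/old.cc").
func normalizeRepoPatchPath(repoPath string) (string, bool) {
	p := filepath.ToSlash(strings.TrimSpace(repoPath))
	if !strings.HasPrefix(p, "chromium_patches/") {
		return "", false
	}
	chromiumPath := strings.TrimPrefix(p, "chromium_patches/")
	// Trim op-marker suffixes in the same order as the original helper.
	for _, suffix := range []string{".deleted", ".binary", ".rename"} {
		chromiumPath = strings.TrimSuffix(chromiumPath, suffix)
	}
	if chromiumPath == "" {
		return "", false
	}
	return chromiumPath, true
}

func main() {
	for _, in := range []string{
		"chromium_patches/chrome/browser.cc",
		"chromium_patches/chrome/old.cc.deleted",
		"unrelated/file.txt",
	} {
		out, ok := normalizeRepoPatchPath(in)
		fmt.Printf("%s -> %q ok=%v\n", in, out, ok)
	}
}
```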


@@ -1,156 +0,0 @@
package engine
import (
"os"
"os/exec"
"path/filepath"
"strings"
"testing"
"bdev/internal/config"
)
func TestPullKeepsLocalOnlyFilesByDefault(t *testing.T) {
t.Parallel()
ctx := setupPullFixture(t)
result, err := Pull(ctx, PullOpts{KeepLocalOnly: true})
if err != nil {
t.Fatalf("Pull: %v", err)
}
if !contains(result.Applied, "foo.txt") {
t.Fatalf("expected foo.txt to be applied, got %#v", result.Applied)
}
if !contains(result.LocalOnly, "orphan.txt") {
t.Fatalf("expected orphan.txt in local-only list, got %#v", result.LocalOnly)
}
if contains(result.Reverted, "orphan.txt") {
t.Fatalf("orphan.txt should not be reverted when KeepLocalOnly=true")
}
fooContent := mustRead(t, filepath.Join(ctx.ChromiumDir, "foo.txt"))
if strings.TrimSpace(fooContent) != "repo-version" {
t.Fatalf("unexpected foo.txt content: %q", fooContent)
}
orphanContent := mustRead(t, filepath.Join(ctx.ChromiumDir, "orphan.txt"))
if strings.TrimSpace(orphanContent) != "local-only-change" {
t.Fatalf("orphan.txt should be preserved, got %q", orphanContent)
}
}
func TestPullRevertsLocalOnlyWhenDisabled(t *testing.T) {
t.Parallel()
ctx := setupPullFixture(t)
result, err := Pull(ctx, PullOpts{KeepLocalOnly: false})
if err != nil {
t.Fatalf("Pull: %v", err)
}
if !contains(result.Reverted, "orphan.txt") {
t.Fatalf("expected orphan.txt to be reverted, got %#v", result.Reverted)
}
if contains(result.LocalOnly, "orphan.txt") {
t.Fatalf("orphan.txt should not be local-only when KeepLocalOnly=false")
}
orphanContent := mustRead(t, filepath.Join(ctx.ChromiumDir, "orphan.txt"))
if strings.TrimSpace(orphanContent) != "orphan-base" {
t.Fatalf("orphan.txt should be reset to base, got %q", orphanContent)
}
}
func setupPullFixture(t *testing.T) *config.Context {
t.Helper()
root := t.TempDir()
chromiumDir := filepath.Join(root, "chromium")
patchesRepo := filepath.Join(root, "patches")
if err := os.MkdirAll(chromiumDir, 0o755); err != nil {
t.Fatalf("mkdir chromium: %v", err)
}
if err := os.MkdirAll(patchesRepo, 0o755); err != nil {
t.Fatalf("mkdir patches: %v", err)
}
initRepo(t, chromiumDir)
writeFile(t, filepath.Join(chromiumDir, "foo.txt"), "base\n")
writeFile(t, filepath.Join(chromiumDir, "orphan.txt"), "orphan-base\n")
runGit(t, chromiumDir, "add", "foo.txt", "orphan.txt")
runGit(t, chromiumDir, "commit", "-m", "base")
baseCommit := strings.TrimSpace(runGit(t, chromiumDir, "rev-parse", "HEAD"))
writeFile(t, filepath.Join(chromiumDir, "foo.txt"), "repo-version\n")
patchDiff := runGit(t, chromiumDir, "diff", "--full-index", baseCommit, "--", "foo.txt")
if strings.TrimSpace(patchDiff) == "" {
t.Fatalf("expected patch diff for foo.txt")
}
runGit(t, chromiumDir, "checkout", baseCommit, "--", "foo.txt")
writeFile(t, filepath.Join(chromiumDir, "orphan.txt"), "local-only-change\n")
initRepo(t, patchesRepo)
writeFile(t, filepath.Join(patchesRepo, "BASE_COMMIT"), baseCommit+"\n")
writeFile(t, filepath.Join(patchesRepo, "chromium_patches", "foo.txt"), patchDiff)
runGit(t, patchesRepo, "add", ".")
runGit(t, patchesRepo, "commit", "-m", "seed patch repo")
return &config.Context{
Config: &config.Config{Name: "test-checkout", PatchesRepo: patchesRepo},
State: &config.State{},
ChromiumDir: chromiumDir,
BrosDir: filepath.Join(chromiumDir, ".bros"),
PatchesRepo: patchesRepo,
PatchesDir: filepath.Join(patchesRepo, "chromium_patches"),
BaseCommit: baseCommit,
}
}
func initRepo(t *testing.T, dir string) {
t.Helper()
runGit(t, dir, "init")
runGit(t, dir, "config", "user.email", "bdev-test@example.com")
runGit(t, dir, "config", "user.name", "bdev test")
}
func runGit(t *testing.T, dir string, args ...string) string {
t.Helper()
cmd := exec.Command("git", args...)
cmd.Dir = dir
out, err := cmd.CombinedOutput()
if err != nil {
t.Fatalf("git %v failed: %v\n%s", args, err, string(out))
}
return string(out)
}
func contains(items []string, target string) bool {
for _, item := range items {
if item == target {
return true
}
}
return false
}
func writeFile(t *testing.T, path, content string) {
t.Helper()
if err := os.MkdirAll(filepath.Dir(path), 0o755); err != nil {
t.Fatalf("mkdir %s: %v", path, err)
}
if err := os.WriteFile(path, []byte(content), 0o644); err != nil {
t.Fatalf("write %s: %v", path, err)
}
}
func mustRead(t *testing.T, path string) string {
t.Helper()
data, err := os.ReadFile(path)
if err != nil {
t.Fatalf("read %s: %v", path, err)
}
return string(data)
}


@@ -1,147 +0,0 @@
package engine
import (
"fmt"
"sort"
"bdev/internal/config"
"bdev/internal/git"
"bdev/internal/patch"
)
type PushOpts struct {
DryRun bool
Files []string
}
func Push(ctx *config.Context, opts PushOpts) (*patch.PushResult, error) {
result := &patch.PushResult{}
// Phase 1: Discover changed files (working tree vs BASE)
nameStatus, err := git.DiffNameStatus(ctx.ChromiumDir, ctx.BaseCommit)
if err != nil {
return nil, fmt.Errorf("push: discovering changes: %w", err)
}
untracked, err := git.UntrackedFiles(ctx.ChromiumDir)
if err != nil {
return nil, fmt.Errorf("push: discovering untracked files: %w", err)
}
untrackedSet := make(map[string]bool, len(untracked))
for _, path := range untracked {
untrackedSet[path] = true
if _, exists := nameStatus[path]; !exists {
nameStatus[path] = patch.OpAdded
}
}
if len(nameStatus) == 0 {
return result, nil
}
// Filter to requested files if specified
if len(opts.Files) > 0 {
filtered := make(map[string]patch.FileOp)
for _, f := range opts.Files {
if op, ok := nameStatus[f]; ok {
filtered[f] = op
}
}
nameStatus = filtered
}
if len(nameStatus) == 0 {
return result, nil
}
// Phase 2: Generate patches
var diffOutput []byte
files := make([]string, 0, len(nameStatus))
for f := range nameStatus {
files = append(files, f)
}
sort.Strings(files)
diffOutput, err = git.DiffFiles(ctx.ChromiumDir, ctx.BaseCommit, files)
if err != nil {
return nil, fmt.Errorf("push: generating diffs: %w", err)
}
for _, file := range files {
if !untrackedSet[file] {
continue
}
noIndexDiff, err := git.DiffNoIndexFile(ctx.ChromiumDir, file)
if err != nil {
return nil, fmt.Errorf("push: generating no-index diff for %s: %w", file, err)
}
if len(noIndexDiff) == 0 {
continue
}
if len(diffOutput) > 0 && diffOutput[len(diffOutput)-1] != '\n' {
diffOutput = append(diffOutput, '\n')
}
diffOutput = append(diffOutput, noIndexDiff...)
}
patchSet, err := patch.ParseUnifiedDiff(diffOutput)
if err != nil {
return nil, fmt.Errorf("push: parsing diffs: %w", err)
}
patchSet.Base = ctx.BaseCommit
// Merge in deletions that may be missing from the parsed diff output
for path, op := range nameStatus {
if op == patch.OpDeleted {
if _, exists := patchSet.Patches[path]; !exists {
patchSet.Patches[path] = &patch.FilePatch{
Path: path,
Op: patch.OpDeleted,
}
}
}
}
// Classify results for reporting
existingPatches := make(map[string]bool)
if existing, err := patch.ReadPatchFiles(ctx.PatchesDir); err == nil {
for p := range existing {
existingPatches[p] = true
}
}
for path, fp := range patchSet.Patches {
switch fp.Op {
case patch.OpDeleted:
result.Deleted = append(result.Deleted, path)
case patch.OpAdded:
result.Added = append(result.Added, path)
default:
if existingPatches[path] {
result.Modified = append(result.Modified, path)
} else {
result.Added = append(result.Added, path)
}
}
}
sort.Strings(result.Modified)
sort.Strings(result.Added)
sort.Strings(result.Deleted)
// Phase 3: Write patches
if err := patch.WritePatchSet(ctx.PatchesDir, patchSet, opts.DryRun); err != nil {
return nil, fmt.Errorf("push: writing patches: %w", err)
}
// Phase 4: Stale cleanup
if !opts.DryRun && len(opts.Files) == 0 {
stale, err := patch.RemoveStale(ctx.PatchesDir, patchSet, opts.DryRun)
if err != nil {
return nil, fmt.Errorf("push: stale cleanup: %w", err)
}
result.Stale = stale
sort.Strings(result.Stale)
}
return result, nil
}
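Phase 2 of `Push` above concatenates per-file no-index diffs onto the main diff output, inserting a newline first so the next `diff --git` header starts on its own line. That guard can be sketched as a small helper (hypothetical, not part of the real code):

```go
package main

import "fmt"

// appendDiff mirrors the concatenation guard in Push: before appending
// the next diff chunk, ensure the accumulated output ends with a newline
// so the following "diff --git" header starts on its own line.
func appendDiff(out, next []byte) []byte {
	if len(next) == 0 {
		return out
	}
	if len(out) > 0 && out[len(out)-1] != '\n' {
		out = append(out, '\n')
	}
	return append(out, next...)
}

func main() {
	out := appendDiff([]byte("diff --git a/x b/x"), []byte("diff --git a/y b/y\n"))
	fmt.Printf("%q\n", out)
}
```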


@@ -1,43 +0,0 @@
package engine
import (
"path/filepath"
"strings"
"testing"
)
func TestPushIncludesUntrackedFiles(t *testing.T) {
t.Parallel()
ctx := setupPullFixture(t)
if err := resetPathToBase(ctx, "orphan.txt"); err != nil {
t.Fatalf("reset orphan.txt: %v", err)
}
writeFile(t, filepath.Join(ctx.ChromiumDir, "foo.txt"), "repo-version-v2\n")
writeFile(t, filepath.Join(ctx.ChromiumDir, "new", "file.txt"), "brand-new\n")
dryRun, err := Push(ctx, PushOpts{DryRun: true})
if err != nil {
t.Fatalf("Push dry-run: %v", err)
}
if !contains(dryRun.Modified, "foo.txt") {
t.Fatalf("expected foo.txt in modified set, got %#v", dryRun.Modified)
}
if !contains(dryRun.Added, "new/file.txt") {
t.Fatalf("expected new/file.txt in added set, got %#v", dryRun.Added)
}
result, err := Push(ctx, PushOpts{})
if err != nil {
t.Fatalf("Push: %v", err)
}
if !contains(result.Added, "new/file.txt") {
t.Fatalf("expected new/file.txt in added result, got %#v", result.Added)
}
patchContent := mustRead(t, filepath.Join(ctx.PatchesDir, "new", "file.txt"))
if !strings.Contains(patchContent, "diff --git a/new/file.txt b/new/file.txt") {
t.Fatalf("unexpected patch content for untracked file:\n%s", patchContent)
}
}


@@ -1,62 +1,85 @@
package engine
import (
"fmt"
"context"
"bdev/internal/config"
"bdev/internal/git"
"bdev/internal/patch"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/git"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/patch"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/repo"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/resolve"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/workspace"
)
type StatusResult struct {
CheckoutName string
BaseCommit string
ChromiumVersion string
PatchesRepo string
Ahead int
Behind int
Synced int
AheadFiles []string
BehindFiles []string
SyncedFiles []string
type WorkspaceStatus struct {
Workspace workspace.Entry `json:"workspace"`
RepoHead string `json:"repo_head"`
BaseCommit string `json:"base_commit"`
LastApplyRev string `json:"last_apply_rev,omitempty"`
LastSyncRev string `json:"last_sync_rev,omitempty"`
LastExtractRev string `json:"last_extract_rev,omitempty"`
ActiveResolve bool `json:"active_resolve"`
NeedsApply []string `json:"needs_apply"`
NeedsUpdate []string `json:"needs_update"`
Orphaned []string `json:"orphaned"`
UpToDate []string `json:"up_to_date"`
SyncState string `json:"sync_state"`
}
func Status(ctx *config.Context, showFiles bool) (*StatusResult, error) {
result := &StatusResult{
CheckoutName: ctx.Config.Name,
BaseCommit: ctx.BaseCommit,
ChromiumVersion: ctx.ChromiumVersion,
PatchesRepo: ctx.PatchesRepo,
}
func InspectWorkspace(ctx context.Context, ws workspace.Entry, repoInfo *repo.Info) (*WorkspaceStatus, error) {
head, err := git.HeadRev(ctx, repoInfo.Root)
if err != nil {
return nil, err
}
state, err := workspace.LoadState(ws.Path)
if err != nil {
return nil, err
}
repoSet, err := patch.LoadRepoPatchSet(repoInfo.PatchesDir, nil)
if err != nil {
return nil, err
}
localSet, err := patch.BuildWorkingTreePatchSet(ctx, ws.Path, repoInfo.BaseCommit, nil)
if err != nil {
return nil, err
}
status := &WorkspaceStatus{
Workspace: ws,
RepoHead: head,
BaseCommit: repoInfo.BaseCommit,
LastApplyRev: state.LastApplyRev,
LastSyncRev: state.LastSyncRev,
LastExtractRev: state.LastExtractRev,
ActiveResolve: resolve.Exists(ws.Path),
}
for _, delta := range patch.Compare(repoSet, localSet) {
switch delta.Kind {
case patch.NeedsApply:
status.NeedsApply = append(status.NeedsApply, delta.Path)
case patch.NeedsUpdate:
status.NeedsUpdate = append(status.NeedsUpdate, delta.Path)
case patch.Orphaned:
status.Orphaned = append(status.Orphaned, delta.Path)
case patch.UpToDate:
status.UpToDate = append(status.UpToDate, delta.Path)
}
}
status.SyncState = inferSyncState(status)
return status, nil
}
func inferSyncState(status *WorkspaceStatus) string {
switch {
case status.ActiveResolve:
return "conflicted"
case status.LastSyncRev == "":
return "never-synced"
case status.LastSyncRev != status.RepoHead:
return "needs-sync"
case len(status.NeedsApply) > 0:
return "drifted"
case len(status.NeedsUpdate) > 0 || len(status.Orphaned) > 0:
return "local-changes"
default:
return "synced"
}
}
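The precedence in inferSyncState (an active conflict resolution wins, then sync recency, then local drift) can be exercised in isolation. The sketch below is a standalone assumption-laden version: wsStatus is a hypothetical stand-in for the real WorkspaceStatus, carrying only the fields the switch consults.

```go
package main

import "fmt"

// wsStatus is a minimal stand-in for WorkspaceStatus (assumption: the real
// struct carries more fields; only these drive inferSyncState).
type wsStatus struct {
	ActiveResolve bool
	LastSyncRev   string
	RepoHead      string
	NeedsApply    []string
	NeedsUpdate   []string
	Orphaned      []string
}

// inferSyncState reproduces the switch from the diff above verbatim.
func inferSyncState(s wsStatus) string {
	switch {
	case s.ActiveResolve:
		return "conflicted"
	case s.LastSyncRev == "":
		return "never-synced"
	case s.LastSyncRev != s.RepoHead:
		return "needs-sync"
	case len(s.NeedsApply) > 0:
		return "drifted"
	case len(s.NeedsUpdate) > 0 || len(s.Orphaned) > 0:
		return "local-changes"
	default:
		return "synced"
	}
}

func main() {
	// An active resolve session wins over everything else.
	fmt.Println(inferSyncState(wsStatus{ActiveResolve: true})) // conflicted
	// Same rev as repo HEAD, but local edits pending.
	fmt.Println(inferSyncState(wsStatus{LastSyncRev: "abc", RepoHead: "abc", NeedsUpdate: []string{"f"}})) // local-changes
}
```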


@@ -1,23 +0,0 @@
package engine
import "testing"
func TestStatusReflectsAheadBehindAndSynced(t *testing.T) {
t.Parallel()
ctx := setupPullFixture(t)
before, err := Status(ctx, true)
if err != nil {
t.Fatalf("Status before pull: %v", err)
}
if before.Ahead != 1 || before.Behind != 1 || before.Synced != 0 {
t.Fatalf("unexpected status before pull: ahead=%d behind=%d synced=%d", before.Ahead, before.Behind, before.Synced)
}
if !contains(before.AheadFiles, "orphan.txt") {
t.Fatalf("expected orphan.txt in ahead files, got %#v", before.AheadFiles)
}
if !contains(before.BehindFiles, "foo.txt") {
t.Fatalf("expected foo.txt in behind files, got %#v", before.BehindFiles)
}
}


@@ -0,0 +1,130 @@
package engine
import (
"context"
"fmt"
"time"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/git"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/repo"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/workspace"
)
type SyncOptions struct {
Workspace workspace.Entry
Repo *repo.Info
Remote string
Rebase bool
}
type SyncResult struct {
Workspace string `json:"workspace"`
Remote string `json:"remote"`
RepoHead string `json:"repo_head"`
StashRef string `json:"stash_ref,omitempty"`
Rebased bool `json:"rebased"`
Fallback bool `json:"fallback"`
Applied []string `json:"applied,omitempty"`
Conflicts []string `json:"conflicts,omitempty"`
}
func Sync(ctx context.Context, opts SyncOptions) (*SyncResult, error) {
if opts.Remote == "" {
opts.Remote = "origin"
}
dirty, err := git.IsDirty(ctx, opts.Repo.Root)
if err != nil {
return nil, err
}
if dirty {
return nil, fmt.Errorf("patches repo has uncommitted changes; commit or stash them before syncing")
}
branch, err := git.CurrentBranch(ctx, opts.Repo.Root)
if err != nil {
return nil, err
}
if err := git.PullRebase(ctx, opts.Repo.Root, opts.Remote, branch); err != nil {
return nil, err
}
head, err := git.HeadRev(ctx, opts.Repo.Root)
if err != nil {
return nil, err
}
state, err := workspace.LoadState(opts.Workspace.Path)
if err != nil {
return nil, err
}
result := &SyncResult{
Workspace: opts.Workspace.Name,
Remote: opts.Remote,
RepoHead: head,
Rebased: opts.Rebase,
}
status, err := InspectWorkspace(ctx, opts.Workspace, opts.Repo)
if err != nil {
return nil, err
}
divergent := append([]string{}, status.NeedsUpdate...)
divergent = append(divergent, status.Orphaned...)
if len(divergent) > 0 {
stashRef, err := git.StashPush(ctx, opts.Workspace.Path, "bdev sync stash", true, divergent)
if err != nil {
return nil, err
}
result.StashRef = stashRef
state.PendingStash = stashRef
if err := workspace.SaveState(opts.Workspace.Path, state); err != nil {
return nil, err
}
}
if state.LastSyncRev == "" || (state.BaseCommit != "" && state.BaseCommit != opts.Repo.BaseCommit) {
result.Fallback = true
applyResult, err := Apply(ctx, ApplyOptions{
Workspace: opts.Workspace,
Repo: opts.Repo,
Reset: true,
Mode: "sync-reset",
})
if err != nil {
return nil, err
}
result.Applied = applyResult.Applied
if len(applyResult.Conflicts) > 0 {
for _, conflict := range applyResult.Conflicts {
result.Conflicts = append(result.Conflicts, conflict.ChromiumPath)
}
return result, nil
}
} else {
applyResult, err := Apply(ctx, ApplyOptions{
Workspace: opts.Workspace,
Repo: opts.Repo,
ChangedRef: state.LastSyncRev,
RangeEnd: head,
Mode: "sync",
})
if err != nil {
return nil, err
}
result.Applied = applyResult.Applied
if len(applyResult.Conflicts) > 0 {
for _, conflict := range applyResult.Conflicts {
result.Conflicts = append(result.Conflicts, conflict.ChromiumPath)
}
return result, nil
}
}
if opts.Rebase && result.StashRef != "" {
if err := git.StashPop(ctx, opts.Workspace.Path, result.StashRef); err != nil {
return nil, err
}
}
state.PendingStash = ""
state.BaseCommit = opts.Repo.BaseCommit
state.LastSyncRev = head
state.LastSyncAt = time.Now().UTC()
if err := workspace.SaveState(opts.Workspace.Path, state); err != nil {
return nil, err
}
return result, nil
}
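Sync falls back to a full reset-apply when the workspace has never synced, or when the base commit recorded in its state no longer matches the repo's current base. That guard can be isolated as a small predicate; needsFullReset is a hypothetical helper written for illustration, not part of the CLI.

```go
package main

import "fmt"

// needsFullReset mirrors the guard in Sync: do a reset-apply on first sync,
// or when the workspace's recorded base commit has drifted from the repo's.
// (An empty recorded base is treated as "unknown" and does not force a reset.)
func needsFullReset(lastSyncRev, stateBase, repoBase string) bool {
	return lastSyncRev == "" || (stateBase != "" && stateBase != repoBase)
}

func main() {
	fmt.Println(needsFullReset("", "b1", "b1"))   // first sync -> true
	fmt.Println(needsFullReset("r1", "b1", "b2")) // base drifted -> true
	fmt.Println(needsFullReset("r1", "b1", "b1")) // incremental -> false
}
```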


@@ -1,123 +0,0 @@
package git
import (
"bytes"
"context"
"fmt"
"os"
"os/exec"
"regexp"
"strconv"
"strings"
"time"
"bdev/internal/patch"
)
var rejHunkRe = regexp.MustCompile(`Applying patch .* with (\d+) hunks?`)
var rejFailRe = regexp.MustCompile(`(\d+) out of (\d+) hunks? FAILED`)
// Apply tries multiple strategies to apply a patch, falling back to --reject
// only as a last resort. Mirrors the Python CLI fallback chain.
func Apply(dir string, patchContent []byte, patchFile string) (*patch.ConflictInfo, error) {
// Strategy 1: Clean apply
if tryApply(dir, patchContent, "--ignore-whitespace", "--whitespace=nowarn", "-p1") == nil {
return nil, nil
}
// Strategy 2: Three-way merge
if tryApply(dir, patchContent, "--ignore-whitespace", "--whitespace=nowarn", "-p1", "--3way") == nil {
return nil, nil
}
// Strategy 3: Whitespace fix
if tryApply(dir, patchContent, "--ignore-whitespace", "--whitespace=fix", "-p1") == nil {
return nil, nil
}
// Strategy 4: Reject (last resort — partially applies, creates .rej files)
return applyReject(dir, patchContent, patchFile)
}
// tryApply attempts a git apply with the given flags. Returns nil on success.
func tryApply(dir string, patchContent []byte, flags ...string) error {
ctx, cancel := context.WithTimeout(context.Background(), 60*time.Second)
defer cancel()
args := append([]string{"apply"}, flags...)
cmd := exec.CommandContext(ctx, "git", args...)
cmd.Dir = dir
cmd.Stdin = bytes.NewReader(patchContent)
var stderr bytes.Buffer
cmd.Stderr = &stderr
return cmd.Run()
}
// applyReject applies a patch with --reject, creating .rej files for failed hunks.
func applyReject(dir string, patchContent []byte, patchFile string) (*patch.ConflictInfo, error) {
ctx, cancel := context.WithTimeout(context.Background(), 60*time.Second)
defer cancel()
cmd := exec.CommandContext(ctx, "git", "apply",
"--reject",
"--ignore-whitespace",
"--whitespace=nowarn",
"-p1",
)
cmd.Dir = dir
cmd.Stdin = bytes.NewReader(patchContent)
var stderr bytes.Buffer
cmd.Stderr = &stderr
err := cmd.Run()
if err == nil {
return nil, nil
}
stderrStr := stderr.String()
info := &patch.ConflictInfo{
PatchFile: patchFile,
Error: strings.TrimSpace(stderrStr),
}
if m := rejFailRe.FindStringSubmatch(stderrStr); len(m) == 3 {
info.HunksFailed, _ = strconv.Atoi(m[1])
info.HunksTotal, _ = strconv.Atoi(m[2])
} else if m := rejHunkRe.FindStringSubmatch(stderrStr); len(m) == 2 {
info.HunksTotal, _ = strconv.Atoi(m[1])
info.HunksFailed = info.HunksTotal
}
return info, nil
}
// ApplyPatchFile applies a patch from a file path using the full fallback chain.
func ApplyPatchFile(dir, patchPath string) (*patch.ConflictInfo, error) {
content, err := os.ReadFile(patchPath)
if err != nil {
return nil, fmt.Errorf("reading patch file %s: %w", patchPath, err)
}
info, err := Apply(dir, content, patchPath)
if info != nil {
info.PatchFile = patchPath
}
return info, err
}
// ApplyCheck tests if a patch would apply without modifying anything.
func ApplyCheck(dir string, patchContent []byte) error {
cmd := exec.Command("git", "apply", "--check", "-p1")
cmd.Dir = dir
cmd.Stdin = bytes.NewReader(patchContent)
var stderr bytes.Buffer
cmd.Stderr = &stderr
if err := cmd.Run(); err != nil {
return fmt.Errorf("patch would not apply cleanly: %s", stderr.String())
}
return nil
}


@@ -1,20 +0,0 @@
package git
import "fmt"
// CheckoutFiles resets multiple files to a specific commit.
// Batches into a single git call.
func CheckoutFiles(dir, commit string, files []string) error {
if len(files) == 0 {
return nil
}
args := []string{"checkout", commit, "--"}
args = append(args, files...)
_, err := Run(dir, args...)
if err != nil {
return fmt.Errorf("checkout %s -- [%d files]: %w", commit, len(files), err)
}
return nil
}


@@ -1,179 +0,0 @@
package git
import (
"bytes"
"context"
"fmt"
"os/exec"
"sort"
"strings"
"time"
"bdev/internal/patch"
)
// DiffNameStatus returns file paths mapped to their operation.
// Diffs BASE against the working tree (not HEAD) so uncommitted patch
// applications are visible.
func DiffNameStatus(dir, base string) (map[string]patch.FileOp, error) {
out, err := Run(dir, "diff", "--name-status", "-M", base)
if err != nil {
return nil, fmt.Errorf("diff --name-status %s: %w", base, err)
}
result := make(map[string]patch.FileOp)
for _, line := range strings.Split(out, "\n") {
line = strings.TrimSpace(line)
if line == "" {
continue
}
fields := strings.Fields(line)
if len(fields) < 2 {
continue
}
status := fields[0]
path := fields[len(fields)-1]
switch {
case status == "M":
result[path] = patch.OpModified
case status == "A":
result[path] = patch.OpAdded
case status == "D":
result[path] = patch.OpDeleted
case strings.HasPrefix(status, "R"):
result[path] = patch.OpRenamed
default:
result[path] = patch.OpModified
}
}
return result, nil
}
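The field handling above relies on `--name-status` putting the status in the first column and the (new) path in the last, so a rename record like `R100<TAB>old<TAB>new` resolves to the new path. A standalone sketch over sample output (fileOp is a simplified stand-in for patch.FileOp):

```go
package main

import (
	"fmt"
	"strings"
)

// fileOp is a simplified stand-in for patch.FileOp.
type fileOp string

// parseNameStatus mirrors DiffNameStatus's loop over
// `git diff --name-status -M` output.
func parseNameStatus(out string) map[string]fileOp {
	result := make(map[string]fileOp)
	for _, line := range strings.Split(out, "\n") {
		line = strings.TrimSpace(line)
		if line == "" {
			continue
		}
		fields := strings.Fields(line)
		if len(fields) < 2 {
			continue
		}
		// Status is the first field; the last field is the new path,
		// which is what makes rename records resolve correctly.
		status, path := fields[0], fields[len(fields)-1]
		switch {
		case status == "A":
			result[path] = "added"
		case status == "D":
			result[path] = "deleted"
		case strings.HasPrefix(status, "R"):
			result[path] = "renamed"
		default:
			result[path] = "modified"
		}
	}
	return result
}

func main() {
	sample := "M\tchrome/app.cc\nR100\told/name.cc\tnew/name.cc\n"
	fmt.Println(parseNameStatus(sample)["new/name.cc"]) // renamed
}
```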
// DiffFull returns the complete unified diff of BASE vs working tree.
func DiffFull(dir, base string) ([]byte, error) {
out, err := RunBytes(dir, "diff", "-M", "--full-index", base)
if err != nil {
return nil, fmt.Errorf("diff %s: %w", base, err)
}
return out, nil
}
// DiffFile returns the working-tree diff for a single file.
func DiffFile(dir, base, file string) ([]byte, error) {
out, err := RunBytes(dir, "diff", "-M", "--full-index", base, "--", file)
if err != nil {
return nil, fmt.Errorf("diff %s -- %s: %w", base, file, err)
}
return out, nil
}
// DiffFiles returns working-tree diffs for multiple files in one call.
func DiffFiles(dir, base string, files []string) ([]byte, error) {
args := []string{"diff", "-M", "--full-index", base, "--"}
args = append(args, files...)
out, err := RunBytes(dir, args...)
if err != nil {
return nil, fmt.Errorf("diff %s -- [%d files]: %w", base, len(files), err)
}
return out, nil
}
// DiffNoIndexFile builds a patch for an untracked file as if it were added from /dev/null.
// git diff --no-index exits with status 1 when files differ; treat that as success.
func DiffNoIndexFile(dir, file string) ([]byte, error) {
ctx, cancel := context.WithTimeout(context.Background(), 120*time.Second)
defer cancel()
cmd := exec.CommandContext(ctx, "git", "diff", "--no-index", "--full-index", "/dev/null", file)
cmd.Dir = dir
var stdout, stderr bytes.Buffer
cmd.Stdout = &stdout
cmd.Stderr = &stderr
err := cmd.Run()
if err == nil {
return stdout.Bytes(), nil
}
if exitErr, ok := err.(*exec.ExitError); ok && exitErr.ExitCode() == 1 {
return stdout.Bytes(), nil
}
if ctx.Err() == context.DeadlineExceeded {
return nil, fmt.Errorf("diff --no-index %s: timed out", file)
}
return nil, fmt.Errorf("diff --no-index %s: %w\n%s", file, err, stderr.String())
}
// UntrackedFiles returns all untracked files (exclude-standard) in the repo.
func UntrackedFiles(dir string) ([]string, error) {
out, err := Run(dir, "ls-files", "--others", "--exclude-standard")
if err != nil {
return nil, fmt.Errorf("ls-files --others: %w", err)
}
if strings.TrimSpace(out) == "" {
return nil, nil
}
var files []string
for _, line := range strings.Split(out, "\n") {
line = strings.TrimSpace(line)
if line != "" {
files = append(files, line)
}
}
sort.Strings(files)
return files, nil
}
// DiffChangedPathsBetween returns changed paths between two revisions.
// It includes old and new paths for rename/copy records.
func DiffChangedPathsBetween(dir, fromRev, toRev string, pathspec ...string) ([]string, error) {
args := []string{
"diff",
"--name-status",
"--find-renames",
fmt.Sprintf("%s..%s", fromRev, toRev),
}
if len(pathspec) > 0 {
args = append(args, "--")
args = append(args, pathspec...)
}
out, err := Run(dir, args...)
if err != nil {
return nil, fmt.Errorf("diff --name-status %s..%s: %w", fromRev, toRev, err)
}
seen := make(map[string]bool)
for _, line := range strings.Split(out, "\n") {
line = strings.TrimSpace(line)
if line == "" {
continue
}
fields := strings.Split(line, "\t")
if len(fields) < 2 {
continue
}
for _, p := range fields[1:] {
p = strings.TrimSpace(p)
if p != "" {
seen[p] = true
}
}
}
paths := make([]string, 0, len(seen))
for p := range seen {
paths = append(paths, p)
}
sort.Strings(paths)
return paths, nil
}
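Because a rename record contributes both its old and new path, the dedupe-and-sort step matters; the same line could otherwise appear twice. A sketch of that collection logic over sample `--name-status` lines:

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// changedPaths mirrors DiffChangedPathsBetween: every tab-separated field
// after the status column (old and new paths for renames/copies) is
// collected into a set, then sorted.
func changedPaths(out string) []string {
	seen := make(map[string]bool)
	for _, line := range strings.Split(out, "\n") {
		line = strings.TrimSpace(line)
		if line == "" {
			continue
		}
		fields := strings.Split(line, "\t")
		if len(fields) < 2 {
			continue
		}
		for _, p := range fields[1:] {
			if p = strings.TrimSpace(p); p != "" {
				seen[p] = true
			}
		}
	}
	paths := make([]string, 0, len(seen))
	for p := range seen {
		paths = append(paths, p)
	}
	sort.Strings(paths)
	return paths
}

func main() {
	// b.txt appears both as a rename source and as a modification,
	// but is emitted only once.
	sample := "R100\tb.txt\ta.txt\nM\tb.txt\n"
	fmt.Println(changedPaths(sample)) // [a.txt b.txt]
}
```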


@@ -1,41 +0,0 @@
package git
import (
"bytes"
"context"
"fmt"
"os/exec"
"strings"
"time"
)
const defaultTimeout = 120 * time.Second
func Run(dir string, args ...string) (string, error) {
out, err := RunBytes(dir, args...)
return strings.TrimRight(string(out), "\n"), err
}
func RunBytes(dir string, args ...string) ([]byte, error) {
return RunWithTimeout(dir, defaultTimeout, args...)
}
func RunWithTimeout(dir string, timeout time.Duration, args ...string) ([]byte, error) {
ctx, cancel := context.WithTimeout(context.Background(), timeout)
defer cancel()
cmd := exec.CommandContext(ctx, "git", args...)
cmd.Dir = dir
var stdout, stderr bytes.Buffer
cmd.Stdout = &stdout
cmd.Stderr = &stderr
if err := cmd.Run(); err != nil {
if ctx.Err() == context.DeadlineExceeded {
return nil, fmt.Errorf("git %s: timed out after %s", args[0], timeout)
}
return nil, fmt.Errorf("git %s: %w\n%s", args[0], err, stderr.String())
}
return stdout.Bytes(), nil
}


@@ -0,0 +1,387 @@
package git
import (
"bytes"
"context"
"errors"
"fmt"
"os"
"os/exec"
"path/filepath"
"strings"
)
type Result struct {
Stdout string
Stderr string
Code int
}
type FileChange struct {
Status string `json:"status"`
Path string `json:"path"`
OldPath string `json:"old_path,omitempty"`
}
func Run(ctx context.Context, dir string, stdin []byte, args ...string) (Result, error) {
command := exec.CommandContext(ctx, "git", args...)
command.Dir = dir
if stdin != nil {
command.Stdin = bytes.NewReader(stdin)
}
var stdout bytes.Buffer
var stderr bytes.Buffer
command.Stdout = &stdout
command.Stderr = &stderr
err := command.Run()
code := -1
if command.ProcessState != nil {
code = command.ProcessState.ExitCode()
}
result := Result{
Stdout: stdout.String(),
Stderr: stderr.String(),
Code: code,
}
if err == nil {
return result, nil
}
if errors.Is(err, context.DeadlineExceeded) || errors.Is(ctx.Err(), context.DeadlineExceeded) {
return result, err
}
if errors.Is(err, context.Canceled) || errors.Is(ctx.Err(), context.Canceled) {
return result, err
}
if command.ProcessState == nil {
return result, err
}
return result, nil
}
func HeadRev(ctx context.Context, dir string) (string, error) {
result, err := Run(ctx, dir, nil, "rev-parse", "HEAD")
if err != nil {
return "", err
}
if result.Code != 0 {
return "", errors.New(strings.TrimSpace(result.Stderr))
}
return strings.TrimSpace(result.Stdout), nil
}
func CurrentBranch(ctx context.Context, dir string) (string, error) {
result, err := Run(ctx, dir, nil, "branch", "--show-current")
if err != nil {
return "", err
}
if result.Code != 0 {
return "", errors.New(strings.TrimSpace(result.Stderr))
}
return strings.TrimSpace(result.Stdout), nil
}
func IsDirty(ctx context.Context, dir string) (bool, error) {
return IsDirtyPaths(ctx, dir, nil)
}
func IsDirtyPaths(ctx context.Context, dir string, pathspecs []string) (bool, error) {
args := []string{"status", "--porcelain"}
if len(pathspecs) > 0 {
args = append(args, "--")
args = append(args, pathspecs...)
}
result, err := Run(ctx, dir, nil, args...)
if err != nil {
return false, err
}
if result.Code != 0 {
return false, errors.New(strings.TrimSpace(result.Stderr))
}
return strings.TrimSpace(result.Stdout) != "", nil
}
func CommitExists(ctx context.Context, dir string, ref string) (bool, error) {
result, err := Run(ctx, dir, nil, "rev-parse", "--verify", ref+"^{commit}")
if err != nil {
return false, err
}
return result.Code == 0, nil
}
func FileExistsAtCommit(ctx context.Context, dir string, ref string, rel string) (bool, error) {
result, err := Run(ctx, dir, nil, "cat-file", "-e", fmt.Sprintf("%s:%s", ref, rel))
if err != nil {
return false, err
}
return result.Code == 0, nil
}
func ShowFile(ctx context.Context, dir string, ref string, rel string) ([]byte, error) {
result, err := Run(ctx, dir, nil, "show", fmt.Sprintf("%s:%s", ref, rel))
if err != nil {
return nil, err
}
if result.Code != 0 {
return nil, errors.New(strings.TrimSpace(result.Stderr))
}
return []byte(result.Stdout), nil
}
func CheckoutFiles(ctx context.Context, dir string, ref string, paths []string) error {
if len(paths) == 0 {
return nil
}
args := []string{"checkout", ref, "--"}
args = append(args, paths...)
result, err := Run(ctx, dir, nil, args...)
if err != nil {
return err
}
if result.Code != 0 {
return errors.New(strings.TrimSpace(result.Stderr))
}
return nil
}
func ResetPathToCommit(ctx context.Context, dir string, ref string, rel string) error {
exists, err := FileExistsAtCommit(ctx, dir, ref, rel)
if err != nil {
return err
}
target := filepath.Join(dir, filepath.FromSlash(rel))
if exists {
return CheckoutFiles(ctx, dir, ref, []string{rel})
}
return os.RemoveAll(target)
}
func DiffText(ctx context.Context, dir string, args ...string) (string, error) {
result, err := Run(ctx, dir, nil, append([]string{"diff", "--binary", "-M"}, args...)...)
if err != nil {
return "", err
}
if result.Code != 0 {
return "", errors.New(strings.TrimSpace(result.Stderr))
}
return result.Stdout, nil
}
func DiffNoIndex(ctx context.Context, dir string, path string) (string, error) {
result, err := Run(ctx, dir, nil, "diff", "--binary", "--no-index", "--", "/dev/null", path)
if err != nil {
return "", err
}
if result.Code != 0 && result.Code != 1 {
return "", errors.New(strings.TrimSpace(result.Stderr))
}
return result.Stdout, nil
}
func ListUntracked(ctx context.Context, dir string, pathspecs []string) ([]string, error) {
args := []string{"ls-files", "--others", "--exclude-standard"}
if len(pathspecs) > 0 {
args = append(args, "--")
args = append(args, pathspecs...)
}
result, err := Run(ctx, dir, nil, args...)
if err != nil {
return nil, err
}
if result.Code != 0 {
return nil, errors.New(strings.TrimSpace(result.Stderr))
}
lines := splitLines(result.Stdout)
return lines, nil
}
func DiffNameStatusBetween(ctx context.Context, dir string, from string, to string, pathspecs []string) ([]FileChange, error) {
args := []string{"diff", "--name-status", "-M", fmt.Sprintf("%s..%s", from, to)}
if len(pathspecs) > 0 {
args = append(args, "--")
args = append(args, pathspecs...)
}
return runNameStatus(ctx, dir, args...)
}
func DiffTreeNameStatus(ctx context.Context, dir string, ref string, pathspecs []string) ([]FileChange, error) {
args := []string{"diff-tree", "--no-commit-id", "--name-status", "-r", ref}
if len(pathspecs) > 0 {
args = append(args, "--")
args = append(args, pathspecs...)
}
return runNameStatus(ctx, dir, args...)
}
func RevListRange(ctx context.Context, dir string, start string, end string) ([]string, error) {
result, err := Run(ctx, dir, nil, "rev-list", "--reverse", fmt.Sprintf("%s..%s", start, end))
if err != nil {
return nil, err
}
if result.Code != 0 {
return nil, errors.New(strings.TrimSpace(result.Stderr))
}
return splitLines(result.Stdout), nil
}
func ApplyPatch(ctx context.Context, dir string, patch []byte) (string, error) {
strategies := [][]string{
{"apply", "--ignore-whitespace", "--whitespace=nowarn", "-p1"},
{"apply", "--ignore-whitespace", "--whitespace=nowarn", "-p1", "--3way"},
{"apply", "--ignore-whitespace", "--whitespace=fix", "-p1"},
{"apply", "--reject", "--ignore-whitespace", "--whitespace=nowarn", "-p1"},
}
var lastErr string
for _, args := range strategies {
result, err := Run(ctx, dir, patch, args...)
if err != nil {
return "", err
}
if result.Code == 0 {
return strings.Join(args[1:], " "), nil
}
lastErr = strings.TrimSpace(result.Stderr)
}
if lastErr == "" {
lastErr = "git apply failed"
}
return "", errors.New(lastErr)
}
func StashPush(ctx context.Context, dir string, message string, includeUntracked bool, paths []string) (string, error) {
args := []string{"stash", "push", "-m", message}
if includeUntracked {
args = append(args, "-u")
}
if len(paths) > 0 {
args = append(args, "--")
args = append(args, paths...)
}
result, err := Run(ctx, dir, nil, args...)
if err != nil {
return "", err
}
if result.Code != 0 {
return "", errors.New(strings.TrimSpace(result.Stderr))
}
if strings.Contains(result.Stdout, "No local changes to save") {
return "", nil
}
list, err := Run(ctx, dir, nil, "stash", "list", "-1", "--format=%gd")
if err != nil {
return "", err
}
if list.Code != 0 {
return "", errors.New(strings.TrimSpace(list.Stderr))
}
return strings.TrimSpace(list.Stdout), nil
}
func StashPop(ctx context.Context, dir string, stashRef string) error {
args := []string{"stash", "pop"}
if stashRef != "" {
args = append(args, stashRef)
}
result, err := Run(ctx, dir, nil, args...)
if err != nil {
return err
}
if result.Code != 0 {
return errors.New(strings.TrimSpace(result.Stderr))
}
return nil
}
func PullRebase(ctx context.Context, dir string, remote string, branch string) error {
args := []string{"pull", "--rebase"}
if remote != "" {
args = append(args, remote)
if branch != "" {
args = append(args, branch)
}
}
result, err := Run(ctx, dir, nil, args...)
if err != nil {
return err
}
if result.Code != 0 {
return errors.New(strings.TrimSpace(result.Stderr))
}
return nil
}
func AddPaths(ctx context.Context, dir string, paths []string) error {
if len(paths) == 0 {
return nil
}
args := append([]string{"add", "--"}, paths...)
result, err := Run(ctx, dir, nil, args...)
if err != nil {
return err
}
if result.Code != 0 {
return errors.New(strings.TrimSpace(result.Stderr))
}
return nil
}
func Commit(ctx context.Context, dir string, message string) error {
result, err := Run(ctx, dir, nil, "commit", "-m", message)
if err != nil {
return err
}
if result.Code != 0 {
return errors.New(strings.TrimSpace(result.Stderr))
}
return nil
}
func Push(ctx context.Context, dir string, remote string, branch string) error {
args := []string{"push"}
if remote != "" {
args = append(args, remote)
}
if branch != "" {
args = append(args, branch)
}
result, err := Run(ctx, dir, nil, args...)
if err != nil {
return err
}
if result.Code != 0 {
return errors.New(strings.TrimSpace(result.Stderr))
}
return nil
}
func runNameStatus(ctx context.Context, dir string, args ...string) ([]FileChange, error) {
result, err := Run(ctx, dir, nil, args...)
if err != nil {
return nil, err
}
if result.Code != 0 {
return nil, errors.New(strings.TrimSpace(result.Stderr))
}
var changes []FileChange
for _, line := range splitLines(result.Stdout) {
parts := strings.Split(line, "\t")
if len(parts) < 2 || parts[0] == "" {
continue
}
change := FileChange{Status: parts[0][:1], Path: parts[len(parts)-1]}
if change.Status == "R" || change.Status == "C" {
if len(parts) >= 3 {
change.OldPath = parts[1]
}
}
changes = append(changes, change)
}
return changes, nil
}
func splitLines(raw string) []string {
lines := strings.Split(strings.TrimSpace(raw), "\n")
if len(lines) == 1 && lines[0] == "" {
return nil
}
return lines
}


@@ -0,0 +1,28 @@
package git
import (
"context"
"os"
"path/filepath"
"testing"
"time"
)
func TestRunReturnsContextError(t *testing.T) {
home := t.TempDir()
t.Setenv("HOME", home)
config := []byte("[alias]\n\thold = !sleep 5\n")
if err := os.WriteFile(filepath.Join(home, ".gitconfig"), config, 0o644); err != nil {
t.Fatalf("WriteFile: %v", err)
}
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Millisecond)
defer cancel()
if _, err := Run(ctx, t.TempDir(), nil, "hold"); err == nil {
t.Fatalf("expected timeout error")
}
if ctx.Err() != context.DeadlineExceeded {
t.Fatalf("expected context deadline exceeded, got %v", ctx.Err())
}
}


@@ -1,32 +0,0 @@
package git
import (
"fmt"
"strings"
)
// ChangedFilesSince returns files changed between two revs in a subdirectory.
func ChangedFilesSince(dir, fromRev, toRev, subdir string) ([]string, error) {
args := []string{"diff", "--name-only", fromRev + ".." + toRev}
if subdir != "" {
args = append(args, "--", subdir)
}
out, err := Run(dir, args...)
if err != nil {
return nil, fmt.Errorf("diff --name-only %s..%s: %w", fromRev, toRev, err)
}
if out == "" {
return nil, nil
}
var files []string
for _, line := range strings.Split(out, "\n") {
line = strings.TrimSpace(line)
if line != "" {
files = append(files, line)
}
}
return files, nil
}


@@ -1,91 +0,0 @@
package git
import (
"fmt"
"strings"
)
func ListRemotes(dir string) ([]string, error) {
out, err := Run(dir, "remote")
if err != nil {
return nil, fmt.Errorf("listing remotes: %w", err)
}
if strings.TrimSpace(out) == "" {
return nil, nil
}
var remotes []string
for _, line := range strings.Split(out, "\n") {
line = strings.TrimSpace(line)
if line != "" {
remotes = append(remotes, line)
}
}
return remotes, nil
}
func HasRemote(dir, remote string) (bool, error) {
remotes, err := ListRemotes(dir)
if err != nil {
return false, err
}
for _, name := range remotes {
if name == remote {
return true, nil
}
}
return false, nil
}
func CurrentBranch(dir string) (branch string, detached bool, err error) {
out, runErr := Run(dir, "symbolic-ref", "--quiet", "--short", "HEAD")
if runErr != nil {
return "", true, nil
}
branch = strings.TrimSpace(out)
if branch == "" {
return "", true, nil
}
return branch, false, nil
}
func Fetch(dir, remote string) error {
_, err := Run(dir, "fetch", "--prune", remote)
if err != nil {
return fmt.Errorf("fetch %s: %w", remote, err)
}
return nil
}
func Pull(dir, remote, branch string, rebase bool) error {
args := []string{"pull"}
if rebase {
args = append(args, "--rebase")
} else {
args = append(args, "--ff-only")
}
args = append(args, remote)
if strings.TrimSpace(branch) != "" {
args = append(args, branch)
}
_, err := Run(dir, args...)
if err != nil {
return fmt.Errorf("pull %s/%s: %w", remote, branch, err)
}
return nil
}
func Push(dir, remote, branch string) error {
args := []string{"push", remote}
if strings.TrimSpace(branch) != "" {
args = append(args, "HEAD:"+branch)
}
_, err := Run(dir, args...)
if err != nil {
return fmt.Errorf("push %s/%s: %w", remote, branch, err)
}
return nil
}


@@ -1,101 +0,0 @@
package git
import (
"os"
"os/exec"
"path/filepath"
"strings"
"testing"
)
func TestRemoteSyncAndStatusHelpers(t *testing.T) {
t.Parallel()
root := t.TempDir()
remote := filepath.Join(root, "remote.git")
repoA := filepath.Join(root, "repo-a")
repoB := filepath.Join(root, "repo-b")
runGit(t, root, "init", "--bare", remote)
runGit(t, root, "clone", remote, repoA)
configRepo(t, repoA)
writeFile(t, filepath.Join(repoA, "README.md"), "one\n")
runGit(t, repoA, "add", "README.md")
runGit(t, repoA, "commit", "-m", "init")
branch := strings.TrimSpace(runGit(t, repoA, "symbolic-ref", "--short", "HEAD"))
runGit(t, repoA, "push", "-u", "origin", branch)
runGit(t, root, "clone", remote, repoB)
configRepo(t, repoB)
remotes, err := ListRemotes(repoB)
if err != nil {
t.Fatalf("ListRemotes: %v", err)
}
if len(remotes) != 1 || remotes[0] != "origin" {
t.Fatalf("unexpected remotes: %#v", remotes)
}
writeFile(t, filepath.Join(repoA, "README.md"), "two\n")
runGit(t, repoA, "commit", "-am", "update")
runGit(t, repoA, "push", "origin", branch)
targetRev := strings.TrimSpace(runGit(t, repoA, "rev-parse", "HEAD"))
if err := Fetch(repoB, "origin"); err != nil {
t.Fatalf("Fetch: %v", err)
}
curBranch, detached, err := CurrentBranch(repoB)
if err != nil {
t.Fatalf("CurrentBranch: %v", err)
}
if detached {
t.Fatalf("expected checked-out branch in clone")
}
if err := Pull(repoB, "origin", curBranch, true); err != nil {
t.Fatalf("Pull: %v", err)
}
currentRev, err := HeadRev(repoB)
if err != nil {
t.Fatalf("HeadRev: %v", err)
}
if currentRev != targetRev {
t.Fatalf("repo-b did not fast-forward to latest rev: got %s want %s", currentRev, targetRev)
}
writeFile(t, filepath.Join(repoB, "scratch.txt"), "dirty\n")
dirty, err := IsDirty(repoB)
if err != nil {
t.Fatalf("IsDirty: %v", err)
}
if !dirty {
t.Fatalf("expected repo to be dirty")
}
}
func configRepo(t *testing.T, dir string) {
t.Helper()
runGit(t, dir, "config", "user.email", "bdev-test@example.com")
runGit(t, dir, "config", "user.name", "bdev test")
}
func runGit(t *testing.T, dir string, args ...string) string {
t.Helper()
cmd := exec.Command("git", args...)
cmd.Dir = dir
out, err := cmd.CombinedOutput()
if err != nil {
t.Fatalf("git %v failed: %v\n%s", args, err, string(out))
}
return string(out)
}
func writeFile(t *testing.T, path, content string) {
t.Helper()
if err := os.MkdirAll(filepath.Dir(path), 0o755); err != nil {
t.Fatalf("mkdir: %v", err)
}
if err := os.WriteFile(path, []byte(content), 0o644); err != nil {
t.Fatalf("write: %v", err)
}
}


@@ -1,27 +0,0 @@
package git
import "fmt"
func RevParse(dir, ref string) (string, error) {
out, err := Run(dir, "rev-parse", ref)
if err != nil {
return "", fmt.Errorf("rev-parse %s: %w", ref, err)
}
return out, nil
}
func CommitExists(dir, commit string) bool {
_, err := Run(dir, "cat-file", "-e", commit+"^{commit}")
return err == nil
}
func HeadRev(dir string) (string, error) {
return RevParse(dir, "HEAD")
}
// FileExistsInCommit checks whether a file path exists in a given commit.
// Uses git cat-file -e {commit}:{path}.
func FileExistsInCommit(dir, commit, filePath string) bool {
_, err := Run(dir, "cat-file", "-e", commit+":"+filePath)
return err == nil
}


@@ -1,61 +0,0 @@
package git
import (
"fmt"
"strings"
)
func StatusPorcelain(dir string, pathspec ...string) ([]string, error) {
args := []string{"status", "--porcelain"}
if len(pathspec) > 0 {
args = append(args, "--")
args = append(args, pathspec...)
}
out, err := Run(dir, args...)
if err != nil {
return nil, fmt.Errorf("status --porcelain: %w", err)
}
if strings.TrimSpace(out) == "" {
return nil, nil
}
var lines []string
for _, line := range strings.Split(out, "\n") {
line = strings.TrimSpace(line)
if line != "" {
lines = append(lines, line)
}
}
return lines, nil
}
func IsDirty(dir string, pathspec ...string) (bool, error) {
lines, err := StatusPorcelain(dir, pathspec...)
if err != nil {
return false, err
}
return len(lines) > 0, nil
}
func Add(dir string, pathspec ...string) error {
args := []string{"add", "-A"}
if len(pathspec) > 0 {
args = append(args, "--")
args = append(args, pathspec...)
}
_, err := Run(dir, args...)
if err != nil {
return fmt.Errorf("add: %w", err)
}
return nil
}
func Commit(dir, message string) error {
_, err := Run(dir, "commit", "-m", message)
if err != nil {
return fmt.Errorf("commit: %w", err)
}
return nil
}


@@ -1,130 +0,0 @@
package log
import (
"fmt"
"os"
"path/filepath"
"strings"
"time"
"bdev/internal/patch"
)
type Logger struct {
logFile string
}
func New(brosDir string) *Logger {
return &Logger{
logFile: filepath.Join(brosDir, "logs", "activity.log"),
}
}
func (l *Logger) LogPush(base string, result *patch.PushResult) error {
var b strings.Builder
b.WriteString(divider('='))
b.WriteString(fmt.Sprintf("PUSH %s\n", time.Now().Format("2006-01-02 15:04:05")))
b.WriteString(fmt.Sprintf("Base: %s\n", base))
b.WriteString(divider('-'))
for _, f := range result.Modified {
b.WriteString(fmt.Sprintf(" M %s\n", f))
}
for _, f := range result.Added {
b.WriteString(fmt.Sprintf(" A %s\n", f))
}
for _, f := range result.Deleted {
b.WriteString(fmt.Sprintf(" D %s\n", f))
}
if len(result.Stale) > 0 {
b.WriteString("Stale removed:\n")
for _, f := range result.Stale {
b.WriteString(fmt.Sprintf(" %s\n", f))
}
}
b.WriteString(fmt.Sprintf("Summary: %d pushed (%d modified, %d added, %d deleted)",
result.Total(), len(result.Modified), len(result.Added), len(result.Deleted)))
if len(result.Stale) > 0 {
b.WriteString(fmt.Sprintf(", %d stale removed", len(result.Stale)))
}
b.WriteString("\n\n")
return l.append(b.String())
}
func (l *Logger) LogPull(base, repoRev string, result *patch.PullResult) error {
var b strings.Builder
b.WriteString(divider('='))
b.WriteString(fmt.Sprintf("PULL %s\n", time.Now().Format("2006-01-02 15:04:05")))
b.WriteString(fmt.Sprintf("Base: %s\n", base))
b.WriteString(fmt.Sprintf("Patches repo rev: %s\n", repoRev))
b.WriteString(divider('-'))
for _, f := range result.Applied {
b.WriteString(fmt.Sprintf(" + %s\n", f))
}
for _, f := range result.Deleted {
b.WriteString(fmt.Sprintf(" D %s\n", f))
}
for _, f := range result.Reverted {
b.WriteString(fmt.Sprintf(" R %s (reverted to base)\n", f))
}
for _, f := range result.LocalOnly {
b.WriteString(fmt.Sprintf(" ~ %s (local-only, kept)\n", f))
}
for _, c := range result.Conflicts {
b.WriteString(fmt.Sprintf(" x %s -> %s (hunk %d/%d failed)\n",
c.File, c.RejectFile, c.HunksFailed, c.HunksTotal))
}
if len(result.Skipped) > 0 {
b.WriteString(fmt.Sprintf(" ~ %d files skipped (already up to date)\n", len(result.Skipped)))
}
b.WriteString(fmt.Sprintf("Summary: %d applied, %d deleted, %d reverted, %d local-only, %d conflicts, %d skipped\n\n",
len(result.Applied), len(result.Deleted), len(result.Reverted), len(result.LocalOnly), len(result.Conflicts), len(result.Skipped)))
return l.append(b.String())
}
func (l *Logger) LogClone(base string, result *patch.PullResult) error {
var b strings.Builder
b.WriteString(divider('='))
b.WriteString(fmt.Sprintf("CLONE %s\n", time.Now().Format("2006-01-02 15:04:05")))
b.WriteString(fmt.Sprintf("Base: %s\n", base))
b.WriteString(divider('-'))
for _, f := range result.Applied {
b.WriteString(fmt.Sprintf(" + %s\n", f))
}
for _, c := range result.Conflicts {
b.WriteString(fmt.Sprintf(" x %s -> %s\n", c.File, c.RejectFile))
}
b.WriteString(fmt.Sprintf("Summary: %d applied, %d conflicts\n\n",
len(result.Applied), len(result.Conflicts)))
return l.append(b.String())
}
func (l *Logger) append(entry string) error {
if err := os.MkdirAll(filepath.Dir(l.logFile), 0o755); err != nil {
return err
}
f, err := os.OpenFile(l.logFile, os.O_APPEND|os.O_CREATE|os.O_WRONLY, 0o644)
if err != nil {
return err
}
defer f.Close()
_, err = f.WriteString(entry)
return err
}
func divider(ch byte) string {
return strings.Repeat(string(ch), 50) + "\n"
}


@@ -1,66 +1,65 @@
package patch
import (
"bytes"
"slices"
"strings"
)
type Delta struct {
NeedsUpdate []string // In both, but content differs
NeedsApply []string // In repo only
UpToDate []string // In both, content matches
Orphaned []string // Local only, no repo patch
Deleted []string // .deleted markers in repo
}
// Compare computes the delta between local patch set and repo patch set.
func Compare(local, repo *PatchSet) *Delta {
d := &Delta{}
for path, repoPatch := range repo.Patches {
if repoPatch.Op == OpDeleted {
d.Deleted = append(d.Deleted, path)
continue
}
localPatch, exists := local.Patches[path]
if !exists {
d.NeedsApply = append(d.NeedsApply, path)
continue
}
if patchContentEqual(localPatch.Content, repoPatch.Content) {
d.UpToDate = append(d.UpToDate, path)
} else {
d.NeedsUpdate = append(d.NeedsUpdate, path)
func Compare(repo PatchSet, local PatchSet) []Delta {
seen := map[string]bool{}
var deltas []Delta
for rel, repoPatch := range repo {
seen[rel] = true
localPatch, ok := local[rel]
switch {
case !ok:
deltas = append(deltas, Delta{Path: rel, Kind: NeedsApply, Repo: ptr(repoPatch)})
case signature(repoPatch) == signature(localPatch):
deltas = append(deltas, Delta{Path: rel, Kind: UpToDate, Repo: ptr(repoPatch), Local: ptr(localPatch)})
default:
deltas = append(deltas, Delta{Path: rel, Kind: NeedsUpdate, Repo: ptr(repoPatch), Local: ptr(localPatch)})
}
}
for path := range local.Patches {
if _, exists := repo.Patches[path]; !exists {
d.Orphaned = append(d.Orphaned, path)
}
}
return d
}
func patchContentEqual(a, b []byte) bool {
return bytes.Equal(
normalizePatch(a),
normalizePatch(b),
)
}
func normalizePatch(content []byte) []byte {
lines := strings.Split(string(content), "\n")
var normalized []string
for _, line := range lines {
// Skip index lines (they contain hashes that may differ)
if strings.HasPrefix(line, "index ") {
for rel, localPatch := range local {
if seen[rel] {
continue
}
normalized = append(normalized, strings.TrimRight(line, " \t"))
deltas = append(deltas, Delta{Path: rel, Kind: Orphaned, Local: ptr(localPatch)})
}
return []byte(strings.Join(normalized, "\n"))
slices.SortFunc(deltas, func(a, b Delta) int {
if a.Path < b.Path {
return -1
}
if a.Path > b.Path {
return 1
}
return 0
})
return deltas
}
func signature(p FilePatch) string {
switch {
case p.Op == OpDelete:
return "delete:" + NormalizeChromiumPath(p.Path)
case p.IsPureRename():
return strings.Join([]string{"rename", NormalizeChromiumPath(p.OldPath), NormalizeChromiumPath(p.Path)}, ":")
case p.Op == OpBinary && len(p.Content) == 0:
return "binary:" + NormalizeChromiumPath(p.Path)
default:
lines := strings.Split(strings.ReplaceAll(string(p.Content), "\r\n", "\n"), "\n")
normalized := make([]string, 0, len(lines))
for _, line := range lines {
if strings.HasPrefix(line, "index ") {
continue
}
normalized = append(normalized, strings.TrimRight(line, " \t"))
}
return strings.Join(append([]string{string(p.Op), NormalizeChromiumPath(p.Path), NormalizeChromiumPath(p.OldPath)}, normalized...), "\n")
}
}
func ptr(p FilePatch) *FilePatch {
copy := p
return &copy
}
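Both the old `patchContentEqual` and the new `signature` compare patches after stripping volatile `index <hash>..<hash>` lines and trailing whitespace, so content-identical patches produced at different times still match. A minimal standalone sketch of that normalization (`normalizeDiff` is an illustrative name):

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeDiff drops "index ..." lines (their blob hashes differ between
// otherwise-identical diffs) and trims trailing whitespace, so two patches
// can be compared by meaningful content only.
func normalizeDiff(content string) string {
	lines := strings.Split(strings.ReplaceAll(content, "\r\n", "\n"), "\n")
	normalized := make([]string, 0, len(lines))
	for _, line := range lines {
		if strings.HasPrefix(line, "index ") {
			continue
		}
		normalized = append(normalized, strings.TrimRight(line, " \t"))
	}
	return strings.Join(normalized, "\n")
}

func main() {
	a := "diff --git a/f b/f\nindex 111..222 100644\n@@ -1 +1 @@\n-old\n+new\n"
	b := "diff --git a/f b/f\nindex 333..444 100644\n@@ -1 +1 @@\n-old\n+new\n"
	// Same change, different blob hashes: equal after normalization.
	fmt.Println(normalizeDiff(a) == normalizeDiff(b)) // true
}
```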


@@ -1,105 +1,79 @@
package patch
import (
"bytes"
"fmt"
"regexp"
"strconv"
"strings"
)
var diffHeaderPrefix = []byte("diff --git ")
var diffHeader = regexp.MustCompile(`^diff --git a/(.*) b/(.*)$`)
// ParseUnifiedDiff parses a full `git diff` output into individual file patches.
func ParseUnifiedDiff(raw []byte) (*PatchSet, error) {
ps := NewPatchSet("")
func ParseDiffOutput(diff string) (PatchSet, error) {
patches := PatchSet{}
lines := strings.SplitAfter(diff, "\n")
var current *FilePatch
var buffer strings.Builder
chunks := splitDiffChunks(raw)
for _, chunk := range chunks {
fp, err := parseSingleDiff(chunk)
if err != nil {
return nil, fmt.Errorf("parsing diff chunk: %w", err)
flush := func() error {
if current == nil {
return nil
}
if fp != nil {
ps.Patches[fp.Path] = fp
current.Content = []byte(buffer.String())
if current.Path == "" {
return fmt.Errorf("diff patch missing target path")
}
patches[current.Path] = *current
current = nil
buffer.Reset()
return nil
}
return ps, nil
}
func splitDiffChunks(raw []byte) [][]byte {
var chunks [][]byte
lines := bytes.Split(raw, []byte("\n"))
var current [][]byte
for _, line := range lines {
if bytes.HasPrefix(line, diffHeaderPrefix) {
if len(current) > 0 {
chunks = append(chunks, bytes.Join(current, []byte("\n")))
if strings.HasPrefix(line, "diff --git ") {
if err := flush(); err != nil {
return nil, err
}
match := diffHeader.FindStringSubmatch(strings.TrimRight(line, "\n"))
if len(match) != 3 {
return nil, fmt.Errorf("invalid diff header: %s", strings.TrimSpace(line))
}
current = &FilePatch{
Path: NormalizeChromiumPath(match[2]),
Op: OpModify,
}
current = [][]byte{line}
} else if len(current) > 0 {
current = append(current, line)
}
}
if len(current) > 0 {
chunks = append(chunks, bytes.Join(current, []byte("\n")))
}
return chunks
}
func parseSingleDiff(chunk []byte) (*FilePatch, error) {
lines := strings.Split(string(chunk), "\n")
if len(lines) == 0 {
return nil, nil
}
fp := &FilePatch{
Op: OpModified,
Content: chunk,
}
// Parse the diff --git a/... b/... header
header := lines[0]
if !strings.HasPrefix(header, "diff --git ") {
return nil, fmt.Errorf("unexpected header: %s", header)
}
// Extract b/ path from the header
parts := strings.SplitN(header, " b/", 2)
if len(parts) == 2 {
fp.Path = parts[1]
}
// Scan header lines for operation type
for _, line := range lines[1:] {
if strings.HasPrefix(line, "diff --git ") || strings.HasPrefix(line, "@@") {
break
if current == nil {
continue
}
buffer.WriteString(line)
trimmed := strings.TrimRight(line, "\n")
switch {
case strings.HasPrefix(line, "new file mode"):
fp.Op = OpAdded
case strings.HasPrefix(line, "deleted file mode"):
fp.Op = OpDeleted
case strings.HasPrefix(line, "rename from "):
fp.Op = OpRenamed
fp.OldPath = strings.TrimPrefix(line, "rename from ")
case strings.HasPrefix(line, "rename to "):
fp.Path = strings.TrimPrefix(line, "rename to ")
case strings.HasPrefix(line, "similarity index "):
s := strings.TrimPrefix(line, "similarity index ")
s = strings.TrimSuffix(s, "%")
fmt.Sscanf(s, "%d", &fp.Similarity)
case strings.Contains(line, "Binary files"):
fp.Op = OpBinary
fp.IsBinary = true
case strings.HasPrefix(trimmed, "new file mode "):
current.Op = OpAdd
case strings.HasPrefix(trimmed, "deleted file mode "):
current.Op = OpDelete
case strings.HasPrefix(trimmed, "rename from "):
current.Op = OpRename
current.OldPath = NormalizeChromiumPath(strings.TrimPrefix(trimmed, "rename from "))
case strings.HasPrefix(trimmed, "copy from "):
current.Op = OpCopy
current.OldPath = NormalizeChromiumPath(strings.TrimPrefix(trimmed, "copy from "))
case strings.HasPrefix(trimmed, "similarity index "):
value := strings.TrimSuffix(strings.TrimPrefix(trimmed, "similarity index "), "%")
if similarity, err := strconv.Atoi(value); err == nil {
current.Similarity = similarity
}
case strings.HasPrefix(trimmed, "Binary files "):
current.IsBinary = true
current.Op = OpBinary
case trimmed == "GIT binary patch":
current.IsBinary = true
}
}
if fp.Path == "" {
return nil, nil
if err := flush(); err != nil {
return nil, err
}
return fp, nil
return patches, nil
}


@@ -0,0 +1,146 @@
package patch
import (
"context"
"os"
"os/exec"
"path/filepath"
"strings"
"testing"
)
func TestParseDiffOutputDetectsRenameAndDeleteSignatures(t *testing.T) {
renameDiff := `diff --git a/chrome/old.cc b/chrome/new.cc
similarity index 100%
rename from chrome/old.cc
rename to chrome/new.cc
`
deleteDiff := `diff --git a/chrome/dead.cc b/chrome/dead.cc
deleted file mode 100644
index 123..000 100644
--- a/chrome/dead.cc
+++ /dev/null
@@ -1 +0,0 @@
-gone
`
renameSet, err := ParseDiffOutput(renameDiff)
if err != nil {
t.Fatalf("ParseDiffOutput rename: %v", err)
}
deleteSet, err := ParseDiffOutput(deleteDiff)
if err != nil {
t.Fatalf("ParseDiffOutput delete: %v", err)
}
renamePatch := renameSet["chrome/new.cc"]
if !renamePatch.IsPureRename() {
t.Fatalf("expected pure rename patch")
}
if deletePatch := deleteSet["chrome/dead.cc"]; signature(deletePatch) != "delete:chrome/dead.cc" {
t.Fatalf("unexpected delete signature: %s", signature(deletePatch))
}
}
func TestWriteRepoPatchSetWritesMarkersAndReloads(t *testing.T) {
patchesDir := t.TempDir()
set := PatchSet{
"chrome/dead.cc": {
Path: "chrome/dead.cc",
Op: OpDelete,
},
"chrome/new.cc": {
Path: "chrome/new.cc",
Op: OpRename,
OldPath: "chrome/old.cc",
Similarity: 100,
Content: []byte(`diff --git a/chrome/old.cc b/chrome/new.cc
similarity index 100%
rename from chrome/old.cc
rename to chrome/new.cc
`),
},
}
if _, _, err := WriteRepoPatchSet(patchesDir, set, nil); err != nil {
t.Fatalf("WriteRepoPatchSet: %v", err)
}
if _, err := filepath.Abs(filepath.Join(patchesDir, "chrome", "dead.cc.deleted")); err != nil {
t.Fatalf("abs: %v", err)
}
loaded, err := LoadRepoPatchSet(patchesDir, nil)
if err != nil {
t.Fatalf("LoadRepoPatchSet: %v", err)
}
if loaded["chrome/dead.cc"].Op != OpDelete {
t.Fatalf("expected delete marker to round-trip")
}
if !loaded["chrome/new.cc"].IsPureRename() {
t.Fatalf("expected rename marker to round-trip")
}
}
func TestPathMatchesSkipsInternalState(t *testing.T) {
if PathMatches(".bdev/state.yaml", nil) {
t.Fatalf("expected internal state path to be ignored")
}
}
func TestBuildRangePatchSetUsesLatestBaseScopedPatch(t *testing.T) {
ctx := context.Background()
repoDir := t.TempDir()
runGit(t, repoDir, "init")
runGit(t, repoDir, "config", "user.name", "Test User")
runGit(t, repoDir, "config", "user.email", "test@example.com")
writeRepoFile(t, filepath.Join(repoDir, "chrome", "foo.txt"), "one\n")
runGit(t, repoDir, "add", "chrome/foo.txt")
runGit(t, repoDir, "commit", "-m", "base")
base := gitOutput(t, repoDir, "rev-parse", "HEAD")
writeRepoFile(t, filepath.Join(repoDir, "chrome", "foo.txt"), "two\n")
runGit(t, repoDir, "commit", "-am", "step one")
writeRepoFile(t, filepath.Join(repoDir, "chrome", "foo.txt"), "three\n")
runGit(t, repoDir, "commit", "-am", "step two")
end := gitOutput(t, repoDir, "rev-parse", "HEAD")
set, err := BuildRangePatchSet(ctx, repoDir, base, end, base, false, nil)
if err != nil {
t.Fatalf("BuildRangePatchSet: %v", err)
}
content := string(set["chrome/foo.txt"].Content)
if !strings.Contains(content, "+three") {
t.Fatalf("expected final patch content, got %q", content)
}
if strings.Contains(content, "+two") {
t.Fatalf("expected latest base-scoped patch, got %q", content)
}
}
func runGit(t *testing.T, dir string, args ...string) {
t.Helper()
cmd := exec.Command("git", args...)
cmd.Dir = dir
output, err := cmd.CombinedOutput()
if err != nil {
t.Fatalf("git %s: %v\n%s", strings.Join(args, " "), err, string(output))
}
}
func gitOutput(t *testing.T, dir string, args ...string) string {
t.Helper()
cmd := exec.Command("git", args...)
cmd.Dir = dir
output, err := cmd.CombinedOutput()
if err != nil {
t.Fatalf("git %s: %v\n%s", strings.Join(args, " "), err, string(output))
}
return strings.TrimSpace(string(output))
}
func writeRepoFile(t *testing.T, path string, body string) {
t.Helper()
if err := os.MkdirAll(filepath.Dir(path), 0o755); err != nil {
t.Fatalf("mkdir: %v", err)
}
if err := os.WriteFile(path, []byte(body), 0o644); err != nil {
t.Fatalf("write: %v", err)
}
}


@@ -1,159 +0,0 @@
package patch
import (
"context"
"os"
"path/filepath"
"runtime"
"strings"
"sync"
"golang.org/x/sync/errgroup"
)
// ReadPatchSet reads all patches from the chromium_patches/ directory.
func ReadPatchSet(patchesDir string) (*PatchSet, error) {
ps := NewPatchSet("")
// Collect file paths
var filePaths []string
err := filepath.Walk(patchesDir, func(path string, info os.FileInfo, err error) error {
if err != nil {
return nil
}
if !info.IsDir() {
filePaths = append(filePaths, path)
}
return nil
})
if err != nil {
return nil, err
}
g, _ := errgroup.WithContext(context.Background())
g.SetLimit(runtime.NumCPU())
var mu sync.Mutex
for _, path := range filePaths {
path := path
g.Go(func() error {
content, err := os.ReadFile(path)
if err != nil {
return err
}
rel, err := filepath.Rel(patchesDir, path)
if err != nil {
return err
}
fp := classifyPatchFile(rel, content)
mu.Lock()
if existing, ok := ps.Patches[fp.Path]; ok {
ps.Patches[fp.Path] = mergePatchEntry(existing, fp)
} else {
ps.Patches[fp.Path] = fp
}
mu.Unlock()
return nil
})
}
if err := g.Wait(); err != nil {
return nil, err
}
return ps, nil
}
// ReadPatchFiles returns a map of chromium paths to true for all patches in the directory.
// Lighter than ReadPatchSet — only collects paths, not content.
func ReadPatchFiles(patchesDir string) (map[string]bool, error) {
result := make(map[string]bool)
err := filepath.Walk(patchesDir, func(path string, info os.FileInfo, err error) error {
if err != nil {
return nil
}
if info.IsDir() {
return nil
}
rel, err := filepath.Rel(patchesDir, path)
if err != nil {
return nil
}
chromPath := rel
chromPath = strings.TrimSuffix(chromPath, ".deleted")
chromPath = strings.TrimSuffix(chromPath, ".binary")
chromPath = strings.TrimSuffix(chromPath, ".rename")
result[chromPath] = true
return nil
})
return result, err
}
func classifyPatchFile(rel string, content []byte) *FilePatch {
fp := &FilePatch{
Path: rel,
Content: content,
Op: OpModified,
}
switch {
case strings.HasSuffix(rel, ".deleted"):
fp.Path = strings.TrimSuffix(rel, ".deleted")
fp.Op = OpDeleted
fp.Content = nil
case strings.HasSuffix(rel, ".binary"):
fp.Path = strings.TrimSuffix(rel, ".binary")
fp.Op = OpBinary
fp.IsBinary = true
fp.Content = nil
case strings.HasSuffix(rel, ".rename"):
fp.Path = strings.TrimSuffix(rel, ".rename")
fp.Op = OpRenamed
// Parse rename_from from content
for _, line := range strings.Split(string(content), "\n") {
if strings.HasPrefix(line, "rename_from: ") {
fp.OldPath = strings.TrimPrefix(line, "rename_from: ")
}
}
fp.Content = nil
default:
// Check if content looks like a diff with "new file mode"
if strings.Contains(string(content), "new file mode") {
fp.Op = OpAdded
}
}
return fp
}
func mergePatchEntry(existing, incoming *FilePatch) *FilePatch {
switch incoming.Op {
case OpDeleted, OpBinary:
return incoming
case OpRenamed:
merged := *incoming
if len(existing.Content) > 0 {
merged.Content = existing.Content
}
return &merged
}
switch existing.Op {
case OpDeleted, OpBinary:
return existing
case OpRenamed:
merged := *existing
merged.Content = incoming.Content
return &merged
default:
return incoming
}
}


@@ -1,77 +0,0 @@
package patch
import (
"os"
"path/filepath"
"testing"
)
func TestReadPatchSetMarkerPrecedence(t *testing.T) {
t.Parallel()
dir := t.TempDir()
patchPath := filepath.Join(dir, "chrome", "browser", "foo.cc")
if err := os.MkdirAll(filepath.Dir(patchPath), 0o755); err != nil {
t.Fatalf("mkdir: %v", err)
}
if err := os.WriteFile(patchPath, []byte("diff --git a/chrome/browser/foo.cc b/chrome/browser/foo.cc\n"), 0o644); err != nil {
t.Fatalf("write patch: %v", err)
}
if err := os.WriteFile(patchPath+".deleted", []byte("deleted: chrome/browser/foo.cc\n"), 0o644); err != nil {
t.Fatalf("write marker: %v", err)
}
ps, err := ReadPatchSet(dir)
if err != nil {
t.Fatalf("ReadPatchSet: %v", err)
}
fp, ok := ps.Patches["chrome/browser/foo.cc"]
if !ok {
t.Fatalf("missing patch entry")
}
if fp.Op != OpDeleted {
t.Fatalf("expected OpDeleted, got %v", fp.Op)
}
if len(fp.Content) != 0 {
t.Fatalf("expected empty content for deleted marker")
}
}
func TestReadPatchSetRenameMergesContent(t *testing.T) {
t.Parallel()
dir := t.TempDir()
patchPath := filepath.Join(dir, "chrome", "browser", "new_name.cc")
if err := os.MkdirAll(filepath.Dir(patchPath), 0o755); err != nil {
t.Fatalf("mkdir: %v", err)
}
diff := "diff --git a/chrome/browser/old_name.cc b/chrome/browser/new_name.cc\nrename from chrome/browser/old_name.cc\nrename to chrome/browser/new_name.cc\n"
if err := os.WriteFile(patchPath, []byte(diff), 0o644); err != nil {
t.Fatalf("write patch: %v", err)
}
if err := os.WriteFile(patchPath+".rename", []byte("rename_from: chrome/browser/old_name.cc\nsimilarity: 92\n"), 0o644); err != nil {
t.Fatalf("write rename marker: %v", err)
}
ps, err := ReadPatchSet(dir)
if err != nil {
t.Fatalf("ReadPatchSet: %v", err)
}
fp, ok := ps.Patches["chrome/browser/new_name.cc"]
if !ok {
t.Fatalf("missing rename patch entry")
}
if fp.Op != OpRenamed {
t.Fatalf("expected OpRenamed, got %v", fp.Op)
}
if fp.OldPath != "chrome/browser/old_name.cc" {
t.Fatalf("unexpected old path: %q", fp.OldPath)
}
if len(fp.Content) == 0 {
t.Fatalf("expected rename patch to keep diff content")
}
}


@@ -0,0 +1,164 @@
package patch
import (
"fmt"
"os"
"path/filepath"
"strconv"
"strings"
)
func LoadRepoPatchSet(patchesDir string, filters []string) (PatchSet, error) {
set := PatchSet{}
err := filepath.WalkDir(patchesDir, func(fullPath string, d os.DirEntry, walkErr error) error {
if walkErr != nil {
return walkErr
}
if d.IsDir() {
return nil
}
relPath, err := filepath.Rel(patchesDir, fullPath)
if err != nil {
return err
}
relPath = NormalizeChromiumPath(relPath)
patchFile, err := loadPatchFile(fullPath, relPath)
if err != nil {
return err
}
if !PathMatches(patchFile.Path, filters) {
return nil
}
set[patchFile.Path] = patchFile
return nil
})
return set, err
}
func WriteRepoPatchSet(patchesDir string, set PatchSet, scope []string) ([]string, []string, error) {
existing, err := LoadRepoPatchSet(patchesDir, nil)
if err != nil && !os.IsNotExist(err) {
return nil, nil, err
}
inScope := map[string]bool{}
if len(scope) == 0 {
for rel := range existing {
inScope[rel] = true
}
for rel := range set {
inScope[rel] = true
}
} else {
for _, rel := range scope {
inScope[NormalizeChromiumPath(rel)] = true
}
}
var written []string
var deleted []string
for rel := range existing {
if !inScope[rel] || set[rel].Path != "" {
continue
}
if err := removePatchVariants(patchesDir, rel); err != nil {
return nil, nil, err
}
deleted = append(deleted, rel)
}
for rel, patchFile := range set {
if !inScope[rel] {
continue
}
if err := removePatchVariants(patchesDir, rel); err != nil {
return nil, nil, err
}
target, body := patchWriteTarget(patchesDir, patchFile)
if err := os.MkdirAll(filepath.Dir(target), 0o755); err != nil {
return nil, nil, err
}
if len(body) == 0 || body[len(body)-1] != '\n' {
body = append(body, '\n')
}
if err := os.WriteFile(target, body, 0o644); err != nil {
return nil, nil, err
}
written = append(written, rel)
}
return written, deleted, nil
}
func loadPatchFile(fullPath string, relPath string) (FilePatch, error) {
switch {
case strings.HasSuffix(relPath, ".deleted"):
return FilePatch{Path: strings.TrimSuffix(relPath, ".deleted"), Op: OpDelete}, nil
case strings.HasSuffix(relPath, ".binary"):
return FilePatch{Path: strings.TrimSuffix(relPath, ".binary"), Op: OpBinary, IsBinary: true}, nil
case strings.HasSuffix(relPath, ".rename"):
body, err := os.ReadFile(fullPath)
if err != nil {
return FilePatch{}, err
}
return parseRenameMarker(strings.TrimSuffix(relPath, ".rename"), string(body))
default:
body, err := os.ReadFile(fullPath)
if err != nil {
return FilePatch{}, err
}
set, err := ParseDiffOutput(string(body))
if err != nil {
return FilePatch{}, fmt.Errorf("parse %s: %w", relPath, err)
}
for _, patchFile := range set {
return patchFile, nil
}
return FilePatch{}, fmt.Errorf("empty patch file: %s", relPath)
}
}
func patchWriteTarget(patchesDir string, patchFile FilePatch) (string, []byte) {
switch {
case patchFile.Op == OpDelete:
return filepath.Join(patchesDir, filepath.FromSlash(patchFile.Path+".deleted")), []byte("File deleted in patch")
case patchFile.IsPureRename():
body := []byte(fmt.Sprintf("Renamed from: %s\nSimilarity: %d%%", patchFile.OldPath, patchFile.Similarity))
return filepath.Join(patchesDir, filepath.FromSlash(patchFile.Path+".rename")), body
case patchFile.Op == OpBinary && len(patchFile.Content) == 0:
return filepath.Join(patchesDir, filepath.FromSlash(patchFile.Path+".binary")), []byte("Binary file")
default:
return filepath.Join(patchesDir, filepath.FromSlash(patchFile.Path)), patchFile.Content
}
}
func removePatchVariants(patchesDir string, rel string) error {
variants := []string{
filepath.Join(patchesDir, filepath.FromSlash(rel)),
filepath.Join(patchesDir, filepath.FromSlash(rel+".deleted")),
filepath.Join(patchesDir, filepath.FromSlash(rel+".binary")),
filepath.Join(patchesDir, filepath.FromSlash(rel+".rename")),
}
for _, variant := range variants {
if err := os.Remove(variant); err != nil && !os.IsNotExist(err) {
return err
}
}
return nil
}
func parseRenameMarker(rel string, body string) (FilePatch, error) {
patchFile := FilePatch{Path: rel, Op: OpRename}
for _, line := range strings.Split(body, "\n") {
switch {
case strings.HasPrefix(line, "Renamed from: "):
patchFile.OldPath = NormalizeChromiumPath(strings.TrimPrefix(line, "Renamed from: "))
case strings.HasPrefix(line, "Similarity: "):
value := strings.TrimSuffix(strings.TrimPrefix(line, "Similarity: "), "%")
if similarity, err := strconv.Atoi(strings.TrimSpace(value)); err == nil {
patchFile.Similarity = similarity
}
}
}
if patchFile.OldPath == "" {
return FilePatch{}, fmt.Errorf("rename marker missing source path for %s", rel)
}
return patchFile, nil
}


@@ -1,75 +0,0 @@
package patch
import (
"os"
"path/filepath"
"strings"
)
// RemoveStale walks chromium_patches/ and removes patches for files
// that are NOT in the given PatchSet.
func RemoveStale(patchesDir string, current *PatchSet, dryRun bool) ([]string, error) {
var stale []string
err := filepath.Walk(patchesDir, func(path string, info os.FileInfo, err error) error {
if err != nil {
return nil
}
if info.IsDir() {
return nil
}
rel, err := filepath.Rel(patchesDir, path)
if err != nil {
return nil
}
// Normalize: strip marker suffixes to get the chromium path
chromPath := rel
chromPath = strings.TrimSuffix(chromPath, ".deleted")
chromPath = strings.TrimSuffix(chromPath, ".binary")
chromPath = strings.TrimSuffix(chromPath, ".rename")
if _, exists := current.Patches[chromPath]; !exists {
stale = append(stale, rel)
if !dryRun {
_ = os.Remove(path)
}
}
return nil
})
if err != nil {
return stale, err
}
// Clean up empty directories
if !dryRun {
cleanEmptyDirs(patchesDir)
}
return stale, nil
}
func cleanEmptyDirs(root string) {
// Walk bottom-up by collecting dirs first then removing empty ones
var dirs []string
_ = filepath.Walk(root, func(path string, info os.FileInfo, err error) error {
if err != nil {
return nil
}
if info.IsDir() && path != root {
dirs = append(dirs, path)
}
return nil
})
// Reverse order (deepest first)
for i := len(dirs) - 1; i >= 0; i-- {
entries, err := os.ReadDir(dirs[i])
if err == nil && len(entries) == 0 {
_ = os.Remove(dirs[i])
}
}
}


@@ -1,79 +1,80 @@
package patch
type FileOp int
const (
OpModified FileOp = iota
OpAdded
OpDeleted
OpRenamed
OpBinary
import (
"bytes"
"path"
"path/filepath"
"strings"
)
func (op FileOp) String() string {
switch op {
case OpModified:
return "M"
case OpAdded:
return "A"
case OpDeleted:
return "D"
case OpRenamed:
return "R"
case OpBinary:
return "B"
default:
return "?"
}
}
type FileOp string
const (
OpAdd FileOp = "ADD"
OpModify FileOp = "MODIFY"
OpDelete FileOp = "DELETE"
OpRename FileOp = "RENAME"
OpCopy FileOp = "COPY"
OpBinary FileOp = "BINARY"
)
type FilePatch struct {
Path string
Op FileOp
Content []byte
OldPath string // for renames
Similarity int // for renames
IsBinary bool
Path string `json:"path"`
Op FileOp `json:"op"`
OldPath string `json:"old_path,omitempty"`
Similarity int `json:"similarity,omitempty"`
Content []byte `json:"-"`
IsBinary bool `json:"is_binary,omitempty"`
}
type PatchSet struct {
Base string
Patches map[string]*FilePatch // keyed by chromium path
type PatchSet map[string]FilePatch
type DeltaKind string
const (
NeedsApply DeltaKind = "needs_apply"
NeedsUpdate DeltaKind = "needs_update"
UpToDate DeltaKind = "up_to_date"
Orphaned DeltaKind = "orphaned"
)
type Delta struct {
Path string `json:"path"`
Kind DeltaKind `json:"kind"`
Repo *FilePatch `json:"repo,omitempty"`
Local *FilePatch `json:"local,omitempty"`
}
func NewPatchSet(base string) *PatchSet {
return &PatchSet{
Base: base,
Patches: make(map[string]*FilePatch),
func NormalizeChromiumPath(raw string) string {
clean := filepath.ToSlash(raw)
return strings.TrimPrefix(path.Clean(clean), "./")
}
func PathMatches(rel string, filters []string) bool {
if IsInternalPath(rel) {
return false
}
if len(filters) == 0 {
return true
}
candidate := NormalizeChromiumPath(rel)
for _, filter := range filters {
scope := NormalizeChromiumPath(filter)
if candidate == scope || strings.HasPrefix(candidate, scope+"/") {
return true
}
}
return false
}
type PushResult struct {
Modified []string
Added []string
Deleted []string
Stale []string
Unchanged []string
func IsInternalPath(rel string) bool {
candidate := NormalizeChromiumPath(rel)
return candidate == ".bdev" || strings.HasPrefix(candidate, ".bdev/")
}
func (r *PushResult) Total() int {
return len(r.Modified) + len(r.Added) + len(r.Deleted)
}
type PullResult struct {
Applied []string
Skipped []string
Reverted []string
LocalOnly []string
Conflicts []ConflictInfo
Deleted []string
}
type ConflictInfo struct {
File string
RejectFile string
PatchFile string
HunksTotal int
HunksFailed int
Error string
func (p FilePatch) IsPureRename() bool {
if p.Op != OpRename {
return false
}
return !bytes.Contains(p.Content, []byte("\n@@")) && !bytes.Contains(p.Content, []byte("GIT binary patch"))
}


@@ -0,0 +1,215 @@
package patch
import (
"context"
"fmt"
"path/filepath"
"slices"
"strings"
"github.com/browseros-ai/BrowserOS/packages/browseros/tools/bdev/internal/git"
)
func BuildWorkingTreePatchSet(ctx context.Context, workspacePath string, base string, filters []string) (PatchSet, error) {
diff, err := git.DiffText(ctx, workspacePath, base)
if err != nil {
return nil, err
}
set, err := ParseDiffOutput(diff)
if err != nil {
return nil, err
}
untracked, err := git.ListUntracked(ctx, workspacePath, filters)
if err != nil {
return nil, err
}
for _, rel := range untracked {
diffText, err := git.DiffNoIndex(ctx, workspacePath, rel)
if err != nil {
return nil, err
}
untrackedSet, err := ParseDiffOutput(diffText)
if err != nil {
return nil, err
}
for patchPath, patchFile := range untrackedSet {
set[patchPath] = patchFile
}
}
return filterSet(set, filters), nil
}
func BuildCommitPatchSet(ctx context.Context, workspacePath string, ref string, base string, filters []string) (PatchSet, error) {
if base == "" {
diff, err := git.DiffText(ctx, workspacePath, ref+"^.."+ref)
if err != nil {
return nil, err
}
set, err := ParseDiffOutput(diff)
if err != nil {
return nil, err
}
return filterSet(set, filters), nil
}
changes, err := git.DiffTreeNameStatus(ctx, workspacePath, ref, filters)
if err != nil {
return nil, err
}
return buildBaseScopedSet(ctx, workspacePath, ref, base, changes)
}
func BuildRangePatchSet(ctx context.Context, workspacePath string, start string, end string, base string, squash bool, filters []string) (PatchSet, error) {
if squash {
if base == "" {
diff, err := git.DiffText(ctx, workspacePath, start+".."+end)
if err != nil {
return nil, err
}
set, err := ParseDiffOutput(diff)
if err != nil {
return nil, err
}
return filterSet(set, filters), nil
}
changes, err := git.DiffNameStatusBetween(ctx, workspacePath, start, end, filters)
if err != nil {
return nil, err
}
return buildBaseScopedSet(ctx, workspacePath, end, base, changes)
}
commits, err := git.RevListRange(ctx, workspacePath, start, end)
if err != nil {
return nil, err
}
set := PatchSet{}
seen := map[string]bool{}
for _, commit := range commits {
var current PatchSet
if base == "" {
diff, err := git.DiffText(ctx, workspacePath, commit+"^.."+commit)
if err != nil {
return nil, err
}
current, err = ParseDiffOutput(diff)
if err != nil {
return nil, err
}
} else {
changes, err := git.DiffTreeNameStatus(ctx, workspacePath, commit, filters)
if err != nil {
return nil, err
}
current, err = buildBaseScopedSet(ctx, workspacePath, commit, base, changes)
if err != nil {
return nil, err
}
}
for rel, patchFile := range filterSet(current, filters) {
if base != "" {
set[rel] = patchFile
continue
}
if seen[rel] {
continue
}
set[rel] = patchFile
seen[rel] = true
}
}
return set, nil
}
func buildBaseScopedSet(ctx context.Context, workspacePath string, ref string, base string, changes []git.FileChange) (PatchSet, error) {
set := PatchSet{}
for _, change := range changes {
rel := NormalizeChromiumPath(change.Path)
diff, err := git.DiffText(ctx, workspacePath, base, ref, "--", rel)
if err != nil {
return nil, err
}
switch {
case strings.TrimSpace(diff) != "":
patches, err := ParseDiffOutput(diff)
if err != nil {
return nil, err
}
for patchPath, patchFile := range patches {
set[patchPath] = patchFile
}
case change.Status == "D":
exists, err := git.FileExistsAtCommit(ctx, workspacePath, base, rel)
if err != nil {
return nil, err
}
if exists {
set[rel] = FilePatch{Path: rel, Op: OpDelete}
}
case change.Status == "A":
content, err := git.ShowFile(ctx, workspacePath, ref, rel)
if err != nil {
return nil, err
}
set[rel] = syntheticAddPatch(rel, content)
}
}
return set, nil
}
func filterSet(set PatchSet, filters []string) PatchSet {
filtered := PatchSet{}
for rel, patchFile := range set {
if !PathMatches(rel, filters) {
continue
}
filtered[rel] = patchFile
}
return filtered
}
func ScopeFromSet(set PatchSet) []string {
paths := make([]string, 0, len(set))
for rel := range set {
paths = append(paths, rel)
}
slices.Sort(paths)
return paths
}
func RejectPath(workspacePath string, rel string) string {
return filepath.Join(workspacePath, filepath.FromSlash(rel+".rej"))
}
func syntheticAddPatch(rel string, content []byte) FilePatch {
body := string(content)
if body != "" && body[len(body)-1] != '\n' {
body += "\n"
}
patchBody := fmt.Sprintf(
"diff --git a/%s b/%s\nnew file mode 100644\n--- /dev/null\n+++ b/%s\n@@ -0,0 +1,%d @@\n%s",
rel,
rel,
rel,
countLines(body),
prefixLines(body, "+"),
)
return FilePatch{Path: rel, Op: OpAdd, Content: []byte(patchBody)}
}
func countLines(body string) int {
if body == "" {
return 0
}
return len(strings.Split(strings.TrimSuffix(body, "\n"), "\n"))
}
func prefixLines(body string, prefix string) string {
lines := strings.Split(strings.TrimSuffix(body, "\n"), "\n")
for idx, line := range lines {
lines[idx] = prefix + line
}
if len(lines) == 0 {
return ""
}
return strings.Join(lines, "\n") + "\n"
}
