Provider: Coding Agent CLI

Shell out to claude, codex, or pi — zero config.

You already have a coding agent installed — that's how you're taking this course. Use it as your LLM provider. No API keys, no env vars, no fetch calls.

The Run Script

#!/usr/bin/env bun
import { readdir, readFile, writeFile, mkdir } from "node:fs/promises";
import { existsSync } from "node:fs";
import { join, resolve } from "node:path";

const PROJECT_ROOT = resolve(import.meta.dir, "../..");
const NOTES_DIR = join(PROJECT_ROOT, "workdir", "notes");
const DIGESTS_DIR = join(PROJECT_ROOT, "workdir", "digests");

// Gather notes
const files = existsSync(NOTES_DIR)
  ? (await readdir(NOTES_DIR)).filter(f => f.endsWith(".md"))
  : [];

if (files.length === 0) {
  console.log("[digest] No notes found in workdir/notes/");
  process.exit(0);
}

const notes = await Promise.all(
  files.map(async f => `### ${f}\n${await readFile(join(NOTES_DIR, f), "utf-8")}`)
);

const prompt = `Summarize these notes into a concise daily digest. Key points per note, then a "Focus today" section.\n\n${notes.join("\n\n---\n\n")}`;

// Call whichever agent is available
const agents = [
  { cmd: "claude", args: ["-p"] },
  { cmd: "codex", args: ["-q"] },
  { cmd: "pi", args: ["--prompt"] },
];

let result = "";
for (const agent of agents) {
  try {
    const proc = Bun.spawnSync([agent.cmd, ...agent.args, prompt], { timeout: 60_000 });
    if (proc.exitCode === 0) {
      result = new TextDecoder().decode(proc.stdout).trim();
      if (result) {
        console.log(`[digest] Using ${agent.cmd}`);
        break;
      }
    }
  } catch {
    // Binary not on PATH: spawning it throws, so move on to the next agent
  }
}

if (!result) {
  console.error("[digest] No coding agent available. Install claude, codex, or pi.");
  process.exit(1);
}

const date = new Date().toISOString().split("T")[0];
await mkdir(DIGESTS_DIR, { recursive: true });
await writeFile(join(DIGESTS_DIR, `${date}.md`), `# Daily Digest — ${date}\n\n${result}\n`);
console.log(`[digest] Written to workdir/digests/${date}.md`);

Why This Works

The coding agent handles everything — authentication, model selection, retries. You don't manage API keys. You don't pick models. You just pass a prompt and get text back.

Watch For

  • Timeout: Set timeout: 60_000 (60 seconds). LLM calls can hang.
  • Exit code / errors: An installed agent that fails returns a non-zero exit code; a missing binary makes the spawn itself throw. Handle both so the loop can fall through to the next agent.
  • PATH: The manager injects your shell's PATH into the plist, so claude/codex/pi are findable. If you installed an agent AFTER the last sync, re-sync: bun run sync.

Test It

bun run src/cli.ts kick daily-digest
bun run src/cli.ts logs daily-digest
cat workdir/digests/$(date +%Y-%m-%d).md

Companion Notes

Branch: Coding Agent as LLM Provider

The simplest path. You already have a coding agent installed — that's how you're taking this course. Use it.

Why This Path

  • Zero configuration — no API keys, no env vars
  • The agent handles auth, model selection, retries
  • It can read files and use tools, not just generate text
  • Your run script can be just a few lines of bash

The Pattern

Your run script shells out to whichever agent you have:

#!/bin/bash
# The simplest possible AI job: pipe the notes in on stdin
cat notes/*.md | claude -p "Summarize these notes into a daily digest:" > digest.md

Or with bun for more control:

#!/usr/bin/env bun
const prompt = "Summarize these notes into a daily digest:";
const files = await Array.fromAsync(new Bun.Glob("workdir/notes/*.md").scan());
const content = await Promise.all(files.map(f => Bun.file(f).text()));

const input = content.map((c, i) => `### ${files[i]}\n${c}`).join("\n---\n");

const proc = Bun.spawnSync(["claude", "-p", `${prompt}\n\n${input}`], {
  timeout: 60_000,
});

if (proc.exitCode !== 0) {
  console.error("[digest] claude failed:", new TextDecoder().decode(proc.stderr));
  process.exit(1);
}

const result = new TextDecoder().decode(proc.stdout).trim();
await Bun.write("workdir/digests/" + new Date().toISOString().split("T")[0] + ".md", result);
console.log(`[digest] Written. ${result.length} chars.`);

Detection

The companion checks:

which claude && echo "claude-code"
which codex && echo "codex"  
which pi && echo "pi"

Agent-Specific Notes

Claude Code (claude -p)

  • -p flag sends a prompt and returns the response
  • Handles Anthropic auth automatically
  • Can read files if you pass paths

Codex (codex -q)

  • -q flag for quiet/non-interactive mode
  • Uses OpenAI auth
  • Good for code generation tasks

Pi (pi --prompt)

  • --prompt flag for non-interactive
  • Multi-provider support
  • Has its own tool ecosystem

What to Watch For

  • Timeout: Set a timeout on Bun.spawnSync. LLM calls can hang. 60 seconds is reasonable.
  • Exit code: Check proc.exitCode. Non-zero means the agent errored.
  • stderr: The agent might print warnings to stderr. Your job's stderr.log captures this.
  • Context length: If you're passing lots of files, you might hit the agent's context limit. Truncate or summarize first.
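
For the context-length point, one crude guard is to cap the combined input before handing it to the agent. A sketch; truncateInput and the character budget are illustrative, not a limit of any particular agent:

```typescript
// Sketch: keep the combined prompt under a rough character budget,
// dropping whole note sections from the end rather than cutting mid-file.
function truncateInput(sections: string[], maxChars: number): string {
  const kept: string[] = [];
  let total = 0;
  for (const s of sections) {
    if (total + s.length > maxChars) break;  // everything past the budget is dropped
    kept.push(s);
    total += s.length;
  }
  return kept.join("\n---\n");
}

const input = truncateInput(["### a.md\nshort note", "### b.md\nlonger note"], 80_000);
```

Character counts only approximate tokens, but a cap keeps the failure mode predictable: later notes are dropped whole instead of the agent erroring mid-call.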

Verification

# Test manually first
claude -p "Say hello in exactly 3 words"

# Then through the job
bun run src/cli.ts kick daily-digest
bun run src/cli.ts logs daily-digest