Everything so far has been infrastructure. Now you build something that does real work — a job that calls an LLM.
## The Job: Daily Digest
You'll build a job that reads markdown files from `workdir/notes/`, sends them to an LLM, and writes a summary to `workdir/digests/`. It runs every morning.
The structure is the same as every other job:
```
system-jobs/daily-digest/
  schedule   # when: daily at 8am
  run        # what: read notes, summarize with AI, write digest
```
## The Schedule

```json
{"type": "scheduled", "calendar": {"Hour": 8, "Minute": 0}}
```

Daily at 8am. Change the hour to whatever makes sense for you.
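The `calendar` object takes more than just `Hour` and `Minute`. Assuming it maps onto launchd's `StartCalendarInterval` keys (as the schedule format suggests), you can also set `Weekday` (0 or 7 is Sunday, 1 is Monday) or `Day` for a day of the month. A Monday-morning variant, for example:

```json
{"type": "scheduled", "calendar": {"Weekday": 1, "Hour": 8, "Minute": 0}}
```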
## The Run Script Pattern
Every AI job does three things:
- **Gather** — read input (files, APIs, git log, whatever)
- **Process** — call an LLM with a prompt
- **Output** — write the result somewhere
The gather and output steps are yours to write; only the process step depends on which LLM provider you use.
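As a sketch, the three steps might look like this in a run script. All names here are illustrative rather than taken from the course code, and the process step is a stub you replace with your provider's call:

```typescript
// Sketch of a run script in the gather/process/output shape.
import { mkdir, readdir, readFile, writeFile } from "node:fs/promises";
import { join } from "node:path";

// Gather: concatenate every markdown note into one string.
async function gather(dir: string): Promise<string> {
  await mkdir(dir, { recursive: true }); // tolerate a missing notes dir
  const files = (await readdir(dir)).filter((f) => f.endsWith(".md")).sort();
  const bodies = await Promise.all(files.map((f) => readFile(join(dir, f), "utf8")));
  return bodies.join("\n\n---\n\n");
}

// Process: stub. Swap in your provider (coding agent, OpenAI, Anthropic, Ollama).
async function summarize(notes: string): Promise<string> {
  return `# Daily Digest\n\n(${notes.length} chars of notes, no LLM wired up yet)\n`;
}

// Output: write the digest to a dated file and return its path.
async function output(dir: string, digest: string): Promise<string> {
  await mkdir(dir, { recursive: true });
  const path = join(dir, `${new Date().toISOString().slice(0, 10)}.md`);
  await writeFile(path, digest);
  return path;
}

const digest = await summarize(await gather("workdir/notes"));
console.log(`wrote ${await output("workdir/digests", digest)}`);
```

Whatever provider you pick, only `summarize` changes; the gather and output code stays identical across all four paths.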
## Choose Your Provider
How do you want to call the LLM? Pick the path that matches what you have:
- You have a coding agent installed (`claude`, `codex`, `pi`) → **Coding Agent** path — zero config, the simplest option
- You have an OpenAI API key → **OpenAI** path — direct API calls with `fetch`
- You have an Anthropic API key → **Anthropic** path — Claude's Messages API
- You want to run locally → **Ollama** path — no API key, no cloud, everything on your Mac
Not sure? If you're taking this course with a coding agent, start with the Coding Agent path. It's 10 lines and needs no configuration.
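To give a taste of how short that path is, here is a minimal sketch. It assumes a `claude` CLI that takes a one-shot prompt via `-p` and prints the reply to stdout — check your agent's help output for the equivalent flag — and the fallback branch just keeps the job from crashing when no agent is on the PATH:

```typescript
// Hypothetical Coding Agent path: shell out to the agent CLI for the
// process step. Assumes `claude -p <prompt>` prints a completion.
import { execFileSync } from "node:child_process";
import { mkdirSync, readdirSync, readFileSync, writeFileSync } from "node:fs";
import { join } from "node:path";

const notesDir = "workdir/notes";
mkdirSync(notesDir, { recursive: true });
const notes = readdirSync(notesDir)
  .filter((f) => f.endsWith(".md"))
  .map((f) => readFileSync(join(notesDir, f), "utf8"))
  .join("\n\n");

const prompt = `Summarize these notes into a short markdown digest:\n\n${notes}`;

let digest: string;
try {
  digest = execFileSync("claude", ["-p", prompt], { encoding: "utf8" });
} catch {
  digest = "# Daily Digest\n\n(no coding agent available)\n"; // graceful fallback
}

mkdirSync("workdir/digests", { recursive: true });
const out = join("workdir/digests", `${new Date().toISOString().slice(0, 10)}.md`);
writeFileSync(out, digest);
console.log(`wrote ${out}`);
```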
## Add Some Notes to Summarize

```sh
mkdir -p workdir/notes

cat > workdir/notes/project-ideas.md << 'EOF'
# Project Ideas
- Build a local AI job scheduler using launchd
- Explore WatchPaths for reactive file processing
- Look into a dashboard for monitoring
EOF

cat > workdir/notes/reading.md << 'EOF'
# Reading Notes
- Inngest blog: agents need a harness, not a framework
- launchd is the simplest possible harness
- Koster: fun = pattern learning at the right pace
EOF
```
## After You Pick a Provider

Once you've built the run script using your chosen provider, test it:

```sh
bun run src/cli.ts sync
bun run src/cli.ts kick daily-digest
bun run src/cli.ts logs daily-digest
```

Check the output:

```sh
cat workdir/digests/$(date +%Y-%m-%d).md
```
You should see a digest of your notes, generated by AI, triggered by launchd.
## What You Learned
- AI jobs follow the same directory-as-job pattern as everything else
- The gather/process/output structure applies to any AI task
- Your choice of LLM provider doesn't change the job structure — only the run script's internals
- The companion can help you build with any provider