wolfejam's comments | Hacker News

TL;DR: We studied how Boris Cherny (creator of Claude Code) structures his projects - subagents, slash commands, MCP servers, Bun runtime - and built a 12-test integration suite that validates all of it. Now .faf detects and preserves the complete Claude Code ecosystem. Every publish passes Boris-Flow or it doesn't ship.
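
To make the detection concrete, here is a minimal sketch of what one such check might look like as a bun:test case. The directory names (.claude/agents, .claude/commands, .mcp.json, CLAUDE.md) follow Claude Code's documented conventions, but detectClaudeEcosystem and the test itself are illustrative, not the actual .faf suite:

  import { describe, expect, test } from "bun:test";
  import { existsSync } from "node:fs";
  import { join } from "node:path";

  // Illustrative only -- not the real .faf test suite. The result shape is
  // an assumption; the paths follow Claude Code's documented conventions.
  interface ClaudeEcosystem {
    subagents: boolean;     // .claude/agents/
    slashCommands: boolean; // .claude/commands/
    mcpServers: boolean;    // .mcp.json
    claudeMd: boolean;      // CLAUDE.md
  }

  function detectClaudeEcosystem(root: string): ClaudeEcosystem {
    return {
      subagents: existsSync(join(root, ".claude", "agents")),
      slashCommands: existsSync(join(root, ".claude", "commands")),
      mcpServers: existsSync(join(root, ".mcp.json")),
      claudeMd: existsSync(join(root, "CLAUDE.md")),
    };
  }

  describe("ecosystem detection", () => {
    test("flags an MCP config when .mcp.json is present", () => {
      const eco = detectClaudeEcosystem(process.cwd());
      // A real suite would run this against a fixture project.
      expect(typeof eco.mcpServers).toBe("boolean");
    });
  });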


.FAF (Foundational AI-context Format) is a 40-line file that gives any AI instant project context.
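
For a sense of what that 40-line file carries, here is an illustrative sketch. The field names below are hypothetical, not the actual .faf schema:

  # project.faf -- illustrative sketch only; these field names are
  # hypothetical, not the real .faf schema
  project: pr-code-reviewer
  goal: Review GitHub PRs and flag code issues
  stack:
    runtime: bun
    language: typescript
  conventions:
    - zero re-explaining: the AI reads this file first
  entry_points:
    - src/index.ts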

  This post documents a real build session with Grok where:
  - Uploaded one .faf file
  - Grok scored it 95/100 and locked in full context
  - Built a complete GitHub PR code reviewer
  - Zero re-explaining across the entire session
  - I found 9 code issues, Grok fixed them; I found 1 more, and it fixed that too

  Full Grok conversation: https://x.com/i/grok/share/bWWQ7qHbCjHc2Wx3G9WZUq5ay

  Format is open, MIT licensed, works with Claude, Grok, any AI.


I built FAF (Foundational AI-context Format) to solve context loss in AI coding sessions. Now IANA-registered, 18k+ npm downloads, Anthropic-approved MCP server.

This week I shipped two new implementations:

1. bun-sticky-faf - Pure Bun, zero deps, 328 tests (sketch below)
2. bun-sticky-zig - 77KB binary, zero runtime, built in Bun's language
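
To make "pure Bun, zero deps" concrete, here is a minimal sketch of a CLI entry point in that style -- illustrative, not the actual bun-sticky-faf source:

  // Illustrative zero-dependency Bun CLI skeleton, not the real source.
  // Uses only Bun built-ins; Bun supports top-level await in entry files.
  const [, , cmd, file] = process.argv;

  if (cmd === "read") {
    // Bun.file is lazy and fast -- no npm packages involved.
    const text = await Bun.file(file ?? "project.faf").text();
    console.log(text);
  } else {
    console.log("usage: bun-sticky-faf read <file>");
  }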

The Zig version is interesting - Bun itself is built on Zig, so this CLI speaks Bun's native language. Sub-millisecond cold start.

npm: https://npmjs.com/package/bun-sticky-faf
Zig: https://github.com/Wolfe-Jam/bun-sticky-zig


The moment Claude Code did something I'd never seen before:

  Read(FAF/faf-dev-tools/project.faf)
    ⎿ Read 124 lines
    ⎿ FAF/CLAUDE.md
    ⎿ FAF/faf-dev-tools/CLAUDE.md

Look at that order.

project.faf first. Then CLAUDE.md. Then the nested CLAUDE.md.

The AI read MY format before Anthropic's own convention.


They said it couldn't be done. They said one person couldn't hold both the IANA registration and MCP stewardship while also shipping SDKs.

They forgot about the snake.

Today, .FAF (Foundational AI-context Format) announced the release of its Python SDK, completing the first leg of a multi-language deployment that nobody asked for but everybody needed.

Then Grok showed up.


Yesterday: Grok hits #1 on the LLM leaderboard
Today: grok-faf-mcp ships with dedicated infrastructure

  Live server: https://grok-faf-mcp.vercel.app
  URL-based MCP with zero installation. Point your client at the endpoint, get 17 tools
  for .faf context (IANA-registered AI format).
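
  One way to point a client at it, shown in Claude Code's .mcp.json shape since
  that is a common MCP client; the exact endpoint path is an assumption and may
  differ:

    {
      "mcpServers": {
        "grok-faf": {
          "type": "http",
          "url": "https://grok-faf-mcp.vercel.app"
        }
      }
    }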

  What makes it Grok-exclusive:
  - Custom x-grok-wins: true header on every response (quick check below)
  - Dedicated to @elonmusk and the #1 model
  - BIG- landing page with the squeeze message
  - First MCP server built specifically for Grok/xAI
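
  A one-liner to see that header from any Bun or Node script (illustrative;
  assumes the header is set on every response as claimed, including the
  landing page):

    // Fetch the live server and print the custom header.
    const res = await fetch("https://grok-faf-mcp.vercel.app");
    console.log(res.status, res.headers.get("x-grok-wins")); // expect "true"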

  The blog post comes in two voices - toggle between professional engineering write-up
  and Grok's own PR text with full personality:
  https://faf.one/blog/grok-faf-mcp-launch

  Published to npm: https://www.npmjs.com/package/grok-faf-mcp
Choose your flavor of orange.


The Black Friday deal may be an annual offer, and they will stack up users! It also may not, of course. Let us know!


LLSponge AI


Thank you for taking the time to write about Smalltalk. I have never used it, but I love all the old stories, cheers!


I enjoyed your post, those remotes are too funny!!

