The free AI already on your Mac.
Every Mac with Apple Silicon has a built-in LLM. Apple locked it behind Siri. apfel sets it free - as a CLI tool, an OpenAI-compatible server, and an interactive chat.
$ brew install Arthur-Ficial/tap/apfel
Apple Silicon · macOS Tahoe · Apple Intelligence enabled
The AI is already installed on your Mac. Apple ships it with macOS. apfel just gives you a way to talk to it - from your terminal, from your code, from anywhere.
CLI tool, HTTP server, or interactive chat. Pick the one that fits.
Pipe-friendly and composable. Works with jq, xargs, and your shell scripts. stdin, stdout, JSON output, file attachments, proper exit codes.
Drop-in replacement at localhost:11434. Point any OpenAI SDK at it and go. Streaming, tool calling, CORS, response formats.
Multi-turn conversations with automatic context management. Five trimming strategies. System prompt support. All on your Mac.
Apple built an LLM into your Mac. apfel gives it a front door.
Starting with macOS 26 (Tahoe), every Apple Silicon Mac includes a language model as part of Apple Intelligence. Apple exposes it through the FoundationModels framework - a Swift API that gives apps access to SystemLanguageModel. All inference runs on the Neural Engine and GPU. No network calls, no cloud, no API keys. The model is just there.
Out of the box, the on-device model powers Siri, Writing Tools, and system features. There is no terminal command, no HTTP endpoint, no way to pipe text through it. The FoundationModels framework exists, but you need to write a Swift app to use it. That is what apfel does.
apfel is a Swift 6.3 binary that wraps LanguageModelSession and exposes it three ways: as a UNIX command-line tool with stdin/stdout, as an OpenAI-compatible HTTP server (built on Hummingbird), and as an interactive chat with context management.
It handles the things Apple's raw API does not: proper exit codes, JSON output, file attachments, five context trimming strategies for the small 4096-token window, real token counting via the SDK, and conversion of OpenAI tool schemas to Apple's native Transcript.ToolDefinition format.
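To make the context-management idea concrete, here is a minimal sketch of the simplest kind of trimming strategy - drop the oldest turns until the conversation fits the window. The function names and the crude characters-per-token heuristic below are illustrative assumptions, not apfel's actual implementation, which counts real tokens via Apple's SDK:

```python
# Illustrative "drop oldest" context-trimming sketch for a fixed
# token window (like the ~4096-token on-device limit).
# NOTE: estimate_tokens is a rough stand-in; apfel uses real SDK counts.

def estimate_tokens(text: str) -> int:
    """Crude heuristic: roughly 4 characters per token."""
    return max(1, len(text) // 4)

def trim_drop_oldest(messages: list[dict], budget: int = 4096) -> list[dict]:
    """Keep the system prompt, drop the oldest turns until under budget."""
    system = [m for m in messages if m["role"] == "system"]
    turns = [m for m in messages if m["role"] != "system"]
    total = sum(estimate_tokens(m["content"]) for m in system + turns)
    while turns and total > budget:
        dropped = turns.pop(0)  # oldest non-system message goes first
        total -= estimate_tokens(dropped["content"])
    return system + turns

history = [{"role": "system", "content": "Be brief."}] + [
    {"role": "user", "content": "x" * 8000} for _ in range(4)
]
trimmed = trim_drop_oldest(history, budget=4096)
print(len(trimmed))  # system prompt plus however many recent turns fit
```

Other strategies in this family (summarize old turns, keep only the last N, sliding window, and so on) trade recall for room in the same way: something has to go when the window is only 4096 tokens.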
Real commands. Real outputs. All running on Apple Silicon.
Shell scripts in the demo/ folder. Install apfel first, then grab the ones you want.
Natural language to shell command. Say what you want, get the command.
Pipe chains from plain English. awk, sed, sort, uniq - generated for you.
Narrates your Mac's system activity like a nature documentary.
Explain any command, error message, or code snippet in plain English.
What's this directory? Instant project orientation for any codebase.
Summarize recent git commits in a few sentences.
Change one URL. Keep your code.
apfel speaks the OpenAI API. Any client library, any framework, any tool that talks to OpenAI can talk to your Mac's AI instead. Just change the base URL.
from openai import OpenAI

# Just change the base_url. That's it.
client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="unused",  # no auth needed
)

resp = client.chat.completions.create(
    model="apple-foundationmodel",
    messages=[{"role": "user", "content": "What is 1+1?"}],
)

print(resp.choices[0].message.content)
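Tool calling works through the same endpoint: you pass standard OpenAI-style tool definitions and apfel converts them to Apple's native format on the server side. The payload shape below is the standard OpenAI one; the get_weather tool itself is a made-up example, not something apfel ships:

```python
import json

# Standard OpenAI tool-definition shape. The "get_weather" tool is a
# hypothetical example for illustration only.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}]

# Pass it exactly as you would to OpenAI:
#   client.chat.completions.create(
#       model="apple-foundationmodel",
#       messages=[{"role": "user", "content": "Weather in Berlin?"}],
#       tools=tools,
#   )
print(json.dumps(tools[0]["function"]["name"]))
```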
From zero to 214 stars in 10 days.
123 stars on March 31 alone. Created March 24, 2026 - first public release of Apple's on-device LLM as a command-line tool.
Star on GitHub
Data as of April 3, 2026
Two commands. Ten seconds. You're in.
$ brew install Arthur-Ficial/tap/apfel
$ apfel "Hello, Mac!"
$ git clone https://github.com/Arthur-Ficial/apfel.git
$ cd apfel && make install
Tools built on top of Apple's on-device AI.
Native macOS SwiftUI debug GUI. Chat with Apple Intelligence, inspect requests and responses, logs, speech-to-text, text-to-speech - all on-device.
AI-powered clipboard actions from the menu bar. Fix grammar, translate, explain code, summarize - one click on any selected text.
Under Heavy Development