Apple Foundation Models Framework: Learn to Use the On-Device LLM

Quick Take: Apple just dropped its long-awaited AI strategy, “Apple Intelligence.” The combo for developers is a ~3B parameter on-device model and the new “Foundation Models” framework in Swift, which introduces type-safe “Guided Generation” and “Tool Calling”, game-changers that let you build powerful, private, and efficient AI features directly into your apps.


🚀 The Crunch

Source: Apple

🎯 Why This Matters: Apple is finally shipping its own on-device foundation models, and they’re giving developers the keys with a new Swift framework. This isn’t just another API call; it’s deeply integrated AI that leverages Apple’s full vertical stack—from silicon to OS to language. The focus is on privacy, efficiency, and enabling powerful, context-aware features directly within your apps.

  • 📱 Powerful On-Device Model: A ~3B parameter model that runs directly on-device, optimized for speed and efficiency. Apple claims it outperforms larger models like Llama-3-8B.
  • ⚙️ Guided Generation: The killer feature for Swift devs. Use the @Generable macro to get guaranteed, type-safe structured output from the model, eliminating messy JSON parsing.
  • 🔧 Type-Safe Tool Calling: Extend the model’s capabilities by defining tools that conform to a Swift protocol. Guided Generation ensures the model can’t call your tools with invalid arguments.
  • Streaming Snapshots: A developer-friendly streaming API that provides partially generated Swift objects (“snapshots”) instead of raw tokens, making it trivial to update SwiftUI views in real time.

What You Can Build

  • An app that summarizes long articles or documents with a single tap
  • A feature that automatically extracts structured data (like contacts or events) from text
  • A tool that rewrites user-entered text to match a specific tone or style
  • Custom tools that let the model interact with your app’s specific services or data

⚡ Developer Tip: Dive straight into Xcode Playgrounds. With just a few lines of code, you can start prompting the on-device model and iterating on your @Generable Swift types without having to build and run your full app. It’s the fastest way to get a feel for the new framework.
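For instance, a sketch like this is about all it takes (it assumes the `LanguageModelSession` API Apple has shown in demos; exact names and signatures may shift between SDK betas):

```swift
import FoundationModels

// A minimal sketch: one prompt, one on-device response.
// LanguageModelSession and respond(to:) follow Apple's demos; exact
// names may differ between SDK betas.
let session = LanguageModelSession()
let response = try await session.respond(
    to: "Suggest three titles for an article about on-device AI."
)
print(response.content)
```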

Critical Caveats & Considerations

  • Apple Ecosystem Only: This is deeply integrated into iOS 18, iPadOS 18, and macOS Sequoia. It’s not a cross-platform solution.
  • On-Device Model is Specialized: The ~3B model is designed for common user tasks (summarization, extraction), not for general world knowledge or as a free-form chatbot.
  • No Training on User Data: Apple makes a hard promise: they do not use private personal data or user interactions to train their foundation models.
  • Instructions vs. Prompts: The framework makes a key distinction: developer-provided `instructions` are obeyed more strictly than user-provided `prompts` to mitigate injection attacks.

✅ Availability: Apple Intelligence will be available in beta as part of iOS 18, iPadOS 18, and macOS Sequoia this fall. The Foundation Models framework will be available to developers with the new OS SDKs.


🔬 The Dive

The Big Picture: The Power of Vertical Integration. Apple’s AI strategy is a masterclass in leveraging its greatest strength: complete control over the hardware and software stack. By designing models that are co-optimized with their own silicon (Neural Engine), operating systems, and developer tools (Swift), they can achieve a level of performance, privacy, and developer experience that is difficult for competitors to match with software alone.

How the Foundation Models Framework Works

  • Guided Generation: The End of JSON Parsing. This is the framework’s core innovation. You define a Swift struct or enum and annotate it with @Generable, and the framework guarantees the model’s output will be a valid instance of that type. This is achieved by injecting your type’s schema into the prompt, constraining decoding at the OS level, and training the model on the schema format. (See the first sketch after this list.)
  • Streaming Snapshots, Not Deltas: Instead of streaming raw text tokens that you have to parse and accumulate, the framework streams partially generated instances of your @Generable type. This makes updating SwiftUI views from a streaming response trivial and robust. (Second sketch below.)
  • Type-Safe Tool Calling: To create a tool, you define a type that conforms to the Tool protocol. The arguments for your tool’s `call` method must be a @Generable type, which leverages Guided Generation to ensure the model can never call your tool with malformed or invalid arguments, preventing a whole class of runtime errors. (Third sketch below.)
  • Stateful Sessions: All interactions happen within a session object, which maintains a `transcript` of the conversation for context. You provide high-level, trusted `instructions` to the session once, and it handles passing them to the model alongside user `prompts` for each turn. (Final sketch below.)
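Here is what Guided Generation looks like in practice. A minimal sketch, assuming the `LanguageModelSession` type and a `respond(to:generating:)` overload as shown in Apple’s demos; `ArticleSummary` and its fields are hypothetical, invented for illustration:

```swift
import FoundationModels

// Hypothetical @Generable type: the framework guarantees the model's
// output is a valid ArticleSummary, so there is no JSON to parse.
@Generable
struct ArticleSummary {
    @Guide(description: "A one-sentence summary of the article")
    var headline: String

    @Guide(description: "Key takeaways from the article", .count(3))
    var takeaways: [String]
}

let articleText = "Apple announced a ~3B parameter on-device model..."
let session = LanguageModelSession()
let summary = try await session.respond(
    to: "Summarize this article: \(articleText)",
    generating: ArticleSummary.self
)
print(summary.content.headline) // Already a typed Swift value.
```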
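Streaming builds directly on that: continuing from the previous sketch, the same `ArticleSummary` arrives as a series of progressively filled-in snapshots. Again a sketch of the demoed API; the exact shape of the partial values has varied between betas:

```swift
// Streaming snapshots: each element is a partially generated
// ArticleSummary whose properties are optional until the model fills
// them in. Depending on the SDK beta, the partial value may instead be
// nested under a `.content` property on the stream element.
let stream = session.streamResponse(
    to: "Summarize this article: \(articleText)",
    generating: ArticleSummary.self
)
for try await partial in stream {
    // Drive a SwiftUI view from here; no token accumulation needed.
    print(partial.headline ?? "…")
}
```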
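Tool calling follows the same pattern. The tool below is hypothetical (its name, arguments, and weather lookup are invented for illustration), and the exact `call(arguments:)` return type has differed between SDK betas:

```swift
import FoundationModels

// Hypothetical tool: because Arguments is @Generable, Guided Generation
// guarantees the model can only invoke it with a valid, typed city name.
struct CityWeatherTool: Tool {
    let name = "getCityWeather"
    let description = "Returns the current weather conditions for a city."

    @Generable
    struct Arguments {
        @Guide(description: "The name of the city to look up")
        var city: String
    }

    func call(arguments: Arguments) async throws -> String {
        // A real implementation would query your weather service here.
        "Sunny and 22°C in \(arguments.city)."
    }
}

// Register the tool with a session; the model decides when to call it.
let session = LanguageModelSession(tools: [CityWeatherTool()])
let answer = try await session.respond(to: "What's the weather in Ljubljana?")
print(answer.content)
```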
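Finally, sessions tie it all together. A sketch of the instructions-versus-prompts split, assuming the session initializer Apple has shown; the draft text is invented for illustration:

```swift
import FoundationModels

// Trusted developer instructions are supplied once, at session creation,
// and are weighted above user prompts to blunt injection attempts.
let session = LanguageModelSession(
    instructions: "You are a concise writing assistant. Keep every rewrite under 50 words."
)

let draft = "We regret to inform you that your package has been delayed..."

// Each turn lands in the session's transcript, so follow-up prompts can
// refer back to earlier ones without restating context.
let friendly = try await session.respond(to: "Rewrite in a friendly tone: \(draft)")
let shorter = try await session.respond(to: "Now make it even shorter.")
print(shorter.content)
```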

TLDR: Apple’s AI is here, and it’s all about on-device power and privacy. The new Foundation Models framework for Swift lets you build AI features with guaranteed, type-safe structured output. Stop parsing JSON, start building.

Tom Furlanis
Researcher. Narrative designer. Wannabe Developer.
Twenty years ago, Tom was coding his first web applications in PHP. Then he left it all behind to pursue studies in the humanities. Now, two decades later, empowered by his coding assistants, a degree in AI ethics, and a plethora of unrealized dreams, Tom is determined to develop his apps. Developer heaven or bust? Stay tuned to discover!