Unlocking Apple’s FoundationModels in iOS 26 Beta: A Developer’s Guide to On-Device Generative AI
Author(s): Rajeev KR
Originally published on Towards AI.
Empowering developers with on-device generative AI — fast, private, and ready for iOS 26.
Apple has stepped boldly into the generative AI landscape with the FoundationModels framework, introduced in iOS 26 Beta. This new framework exposes Apple’s proprietary large language models (LLMs) directly to developers via Swift APIs, enabling rich, privacy-first AI experiences that run entirely on-device.
If you’re a developer eager to harness Apple Intelligence’s power without compromising privacy or speed, this guide unpacks how FoundationModels works, what you can do with it today, and how to get started with practical code examples.
Why FoundationModels Matters
Before diving into code, here’s what sets FoundationModels apart:
- On-device performance — models run locally on recent Apple silicon (A17 Pro or later on iPhone; M1 or later on iPad and Mac), delivering instant results without network lag.
- Privacy-centric design — user data stays on the device unless explicitly routed through Apple’s encrypted Private Cloud Compute.
- Easy Swift integration — intuitive APIs with support for dynamic schemas, guided generation, and multi-turn interactions.
- Cross-platform reach — use the same APIs on iPhone, iPad, Mac, and visionOS.
Prerequisites
Before you begin building with FoundationModels, make sure you have the following:
- Xcode 26 Beta installed from the Apple Developer portal.
- A device or simulator running iOS 26 Beta, iPadOS 26 Beta, macOS 26 Beta, or visionOS 26 Beta.
- A Mac with Apple silicon (M1 or later) for the best experience and on-device execution.
- Your app’s deployment target set to iOS 26 or higher.
- Familiarity with Swift and Swift Concurrency (async/await).
Once you’re set up with the beta SDK, you’ll be ready to import the new FoundationModels module and begin building with Apple Intelligence.
Getting Started: FoundationModels Core Concepts
At its core, FoundationModels provides:
- SystemLanguageModel — the pre-trained Apple language model you query, accessed via SystemLanguageModel.default.
- Sessions — conversational or task-oriented contexts (LanguageModelSession) that hold state across turns.
- Guided Generation — constrained decoding that forces the model's output to match a structure you define.
- Generable types — @Generable-annotated structs the model fills in, giving you strongly typed output for reliability.
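Sessions are what make multi-turn interactions work: each response is appended to the session's transcript, so later prompts can refer back to earlier ones. As a minimal sketch against the beta API (assuming the LanguageModelSession(instructions:) initializer; names may change before release):

import FoundationModels

func multiTurnDemo() async throws {
    // Instructions set the model's role for the whole session
    let session = LanguageModelSession(
        instructions: "You are a concise writing assistant."
    )
    // Each respond(to:) call extends the session's transcript,
    // so the follow-up turn can refer to the first answer
    let first = try await session.respond(to: "Suggest a title for an article about on-device AI.")
    print(first.content)
    let followUp = try await session.respond(to: "Make it shorter.")
    print(followUp.content)
}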
Example 1: Summarization With SystemLanguageModel
Summarization is often the first step in content-focused apps. Here’s a minimal example:
import FoundationModels

@MainActor
func summarizeText(_ text: String) async throws {
    // Access the default Apple language model
    let model = SystemLanguageModel.default
    // Create a session for conversational context
    let session = LanguageModelSession(model: model)
    // Define the prompt to instruct the model
    let prompt = "Summarize this text:\n\n\(text)"
    // Get the response asynchronously
    let response = try await session.respond(to: prompt)
    // Print the concise summary
    print("Summary: \(response.content)")
}
let article = """
Apple's new FoundationModels framework in iOS 26 Beta allows developers to easily integrate advanced generative AI features that run fully on-device, preserving privacy and performance.
"""

Task {
    try await summarizeText(article)
}
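For longer inputs you may want output to appear incrementally rather than all at once. A hedged sketch, assuming the beta's streamResponse(to:) API (which yields cumulative snapshots of the response as it is generated):

import FoundationModels

func streamSummary(of text: String) async throws {
    let session = LanguageModelSession()
    // streamResponse(to:) returns an async sequence of partial results
    let stream = session.streamResponse(to: "Summarize this text:\n\n\(text)")
    for try await partial in stream {
        // Each element is a snapshot of the response so far
        print(partial)
    }
}

Streaming keeps the UI responsive during generation, which matters on-device where longer prompts take noticeably more time.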
Example 2: Tone Rewriting Using Guided Generation
FoundationModels supports guided generation with structured outputs: you define a @Generable type describing the shape you want, annotate its properties with @Guide descriptions, and the framework constrains the model so its response decodes into exactly that type.
import FoundationModels

// Define the expected output structure as a Generable type
@Generable
struct RewriteResponse {
    // @Guide describes the property to steer the model's output
    @Guide(description: "The input text rewritten in a professional tone")
    let rewritten: String
}

func rewriteText(_ input: String) async throws -> RewriteResponse {
    let session = LanguageModelSession()
    let prompt = "Rewrite the following text in a professional tone:\n\n\(input)"
    // Ask the model to generate a RewriteResponse directly
    let response = try await session.respond(to: prompt, generating: RewriteResponse.self)
    return response.content
}

@MainActor
func demoRewrite() async throws {
    let casualText = "Hey! Can you send me the report ASAP?"
    let result = try await rewriteText(casualText)
    print("Original: \(casualText)")
    print("Rewritten: \(result.rewritten)")
}

Task {
    try await demoRewrite()
}
Example 3: Extracting Keywords with Generable Structs
Extracting keywords or entities is a natural fit for @Generable structs, which constrain the model to produce strongly typed collections.
import FoundationModels

@Generable
struct KeywordList {
    @Guide(description: "Important keywords found in the text")
    let keywords: [String]
}

@MainActor
func extractKeywords(from text: String) async throws {
    let session = LanguageModelSession()
    // Prompt the model to extract keywords
    let prompt = "Extract important keywords from the following text:\n\n\(text)"
    // Guided generation returns a typed KeywordList, not free text
    let response = try await session.respond(to: prompt, generating: KeywordList.self)
    print("Extracted Keywords:")
    for keyword in response.content.keywords {
        print("- \(keyword)")
    }
}
let content = "Apple announced the iPhone 16 Pro with A19 chip at WWDC 2025 in Cupertino."

Task {
    try await extractKeywords(from: content)
}
Example 4: Semantic Search Across Documents
The current beta does not ship a dedicated semantic-search API, but the session model suggests what one could look like. The example below is conceptual: makeSemanticSession() and search(query:in:) are hypothetical names used for illustration, not part of the shipping framework.
import FoundationModels

@MainActor
func semanticSearchExample() async throws {
    let model = SystemLanguageModel.default
    // Hypothetical semantic-search session — not in the current beta API
    let session = await model.makeSemanticSession()
    let query = "What AI model powers iPhones?"
    let documents = [
        "Apple introduced Apple Intelligence in iOS 26.",
        "The iPhone 16 Pro features a powerful A19 chip.",
        "MacBook Air M4 was announced at WWDC."
    ]
    // Perform the search (hypothetical API)
    let matches = try await session.search(query: query, in: documents)
    print("Semantic Search Results:")
    for match in matches {
        print("- \"\(match.text)\" (score: \(match.score))")
    }
}

Task {
    try await semanticSearchExample()
}
Tips for Developers
- Run on supported hardware: For best performance, use devices with A17 Pro or newer Apple silicon.
- Use structured outputs: Define your own @Generable types, with @Guide property descriptions, to keep responses predictable and easy to parse.
- Manage sessions: Leverage sessions for multi-turn chats or contextual tasks to maintain state.
- Stay updated: The framework is evolving in beta; watch Apple's developer docs and WWDC sessions for updates.
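One more practical tip: Apple Intelligence can be unsupported, switched off, or still downloading on a given device, so it is worth checking availability before creating a session. A minimal sketch, assuming the beta's SystemLanguageModel.availability API and its unavailability reasons:

import FoundationModels

func checkModelAvailability() {
    let model = SystemLanguageModel.default
    switch model.availability {
    case .available:
        print("Model ready — safe to create a session.")
    case .unavailable(let reason):
        // Reasons include an ineligible device, Apple Intelligence
        // being disabled, or the model assets still downloading
        print("Model unavailable: \(reason)")
    }
}

Gating your AI features on this check lets you fall back to a non-AI code path instead of surfacing errors to the user.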
Useful Apple Developer Links
- Apple Intelligence Overview
https://developer.apple.com/apple-intelligence
Get the official overview of Apple's new generative AI system, including use cases, architecture, and supported devices.
- Foundation Models Documentation
https://developer.apple.com/documentation/foundationmodels
Browse the full API reference for Apple's FoundationModels framework.
- WWDC Session: Meet Foundation Models
https://developer.apple.com/videos/play/wwdc2024/10076
Watch this session to see live code demos and explanations of how to integrate FoundationModels into your app.
- Download Xcode 26 Beta
https://developer.apple.com/download
Required to access the iOS 26 SDK and use the FoundationModels module.
- Apple Intelligence on-device and cloud architecture whitepaper
https://www.apple.com/newsroom/pdfs/Apple-Intelligence-Security-Whitepaper.pdf
A detailed technical breakdown of Apple Intelligence's privacy and security architecture.
- Private Cloud Compute
https://www.apple.com/privacy/private-cloud-compute
Learn how Apple offloads AI tasks securely to the cloud when on-device processing isn't possible.
Final Thoughts
Apple’s FoundationModels framework is a powerful toolkit for building generative AI features that respect user privacy and deliver lightning-fast performance. Though currently in beta, it points toward a future where AI is a native part of every Apple device.
If you’re building apps that require text understanding, generation, or semantic search — now is the time to start exploring FoundationModels with iOS 26 Beta and Xcode 26 Beta.
Published via Towards AI