Apple is launching what it calls the Foundation Models framework, which the company says will let developers tap into its AI models in an offline, on-device fashion.
Onstage at WWDC 2025 on Monday, Apple VP of software engineering Craig Federighi said that the Foundation Models framework will let apps use on-device AI models created by Apple to drive experiences. These models ship as part of Apple Intelligence, the company's family of models that powers a number of iOS features and capabilities.
“For example, if you’re getting ready for an exam, an app like Kahoot can create a personalized quiz from your notes to make studying more engaging,” Federighi said. “And because it happens using on-device models, this happens without cloud API costs […] We couldn’t be more excited about how developers can build on Apple Intelligence to bring you new experiences that are smart, available when you’re offline, and that protect your privacy.”
In a blog post, Apple says that the Foundation Models framework has native support for Swift, Apple’s programming language for building apps for its various platforms. The company claims developers can access Apple Intelligence models with as few as three lines of code.
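Based on Apple's published API names, the three-line pattern the company describes looks roughly like this sketch (the prompt string is illustrative, and the exact call signatures may differ in the shipping beta):

```swift
import FoundationModels

// Minimal sketch: open a session with the on-device model and prompt it.
let session = LanguageModelSession()
let response = try await session.respond(to: "Suggest a title for a journal entry about a rainy hike.")
print(response.content)
```

Because the model runs entirely on device, a call like this involves no network request and no per-token cloud billing.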
Guided generation, tool calling, and more are all built into the Foundation Models framework, according to Apple. Automattic is already using the framework in its Day One journaling app, Apple says, while mapping app AllTrails is tapping the framework to recommend different hiking routes.
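Guided generation means the model can be asked to fill in a typed Swift value rather than return free-form text. A sketch of how that might look for a hiking-route use case like AllTrails' (the struct, field names, and prompt here are hypothetical; the `@Generable` and `@Guide` macros follow Apple's published API):

```swift
import FoundationModels

// Guided generation sketch: the model's output is constrained
// to match this Swift type instead of arbitrary prose.
@Generable
struct RouteSuggestion {
    @Guide(description: "Name of the trail")
    var name: String
    @Guide(description: "Difficulty from 1 (easy) to 5 (hard)")
    var difficulty: Int
}

let session = LanguageModelSession()
let response = try await session.respond(
    to: "Recommend a scenic hike near Lake Tahoe.",
    generating: RouteSuggestion.self
)
print(response.content.name, response.content.difficulty)
```

The appeal of this approach is that apps get structured, type-safe results they can render directly in UI, without parsing model output by hand.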
The Foundation Models framework is available for testing starting today through the Apple Developer Program, and a public beta will be available early next month.