What’s New in the iOS SDK
Learn about the key technologies and capabilities available in the iOS SDK, the toolkit used to build apps for iPhone, iPad, or iPod touch. For detailed information on API changes in the latest released versions, including each beta release, see the iOS Release Notes.
With the iOS 12 SDK, apps can take advantage of the latest advancements in ARKit, Siri, Core ML, HealthKit, CarPlay, notifications, and more.
Multiuser and Persistent AR. Use world-mapping data to bring your app’s users together in shared AR experiences, and store data from an AR session in your app so it’s easy to revisit it later.
For more information, see Creating a Multiuser AR Experience and Archiving World Map Data for Persistence or Sharing in the ARKit developer documentation.
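Persisting a session's world map can be sketched as follows. This is a minimal example, not the full archiving flow from the sample project; the `mapSaveURL` file location is an assumption your app would choose.

```swift
import ARKit

// Sketch: capture the current ARWorldMap and archive it to disk so a
// later session (or another user) can restore the same AR experience.
func saveWorldMap(from session: ARSession, to mapSaveURL: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("Can't get world map: \(error?.localizedDescription ?? "unknown error")")
            return
        }
        do {
            let data = try NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true)
            try data.write(to: mapSaveURL, options: [.atomic])
        } catch {
            print("Can't save world map: \(error.localizedDescription)")
        }
    }
}
```

To restore, unarchive the `ARWorldMap` and assign it to a configuration's `initialWorldMap` before running the session.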
Object Detection. Make nearby objects part of your AR apps. With object detection in ARKit 2, users can scan real-world objects, and your app can incorporate their position and movement into its augmented reality experience.
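Enabling detection of previously scanned objects is a short configuration step. In this sketch, the asset-catalog resource group name "ScannedObjects" is a placeholder for wherever your app stores its `.arobject` files.

```swift
import ARKit

// Sketch: configure world tracking to detect reference objects that were
// scanned earlier and bundled in an asset catalog resource group.
func makeObjectDetectionConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    if let referenceObjects = ARReferenceObject.referenceObjects(
            inGroupNamed: "ScannedObjects", bundle: nil) {
        configuration.detectionObjects = referenceObjects
    }
    return configuration
}
```

When ARKit recognizes one of the objects, it adds an `ARObjectAnchor` to the session, which you receive through the session or renderer delegate callbacks.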
Siri Shortcuts
Siri can predict shortcuts to actions that a user may want to perform with your app, and suggest those shortcuts in places such as Spotlight search, the Lock Screen, and the Siri Watch Face. Siri learns about the shortcuts available in your app through donations that your app makes to Siri. Users can also add personalized voice phrases to donated shortcuts.
You determine which actions in your app are pertinent to the user and likely to be useful again in the future. Your app tells Siri about these actions by specifying them as relevant shortcuts.
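An `NSUserActivity`-based donation can look like the sketch below. The activity type `"com.example.order-soup"`, the title, and the invocation phrase are all placeholders for your app's own action.

```swift
import Intents

// Sketch: build an activity that Siri can suggest as a shortcut.
func makeOrderActivity() -> NSUserActivity {
    let activity = NSUserActivity(activityType: "com.example.order-soup")
    activity.title = "Order Soup"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true          // new in iOS 12
    activity.suggestedInvocationPhrase = "Order my usual"
    return activity
}
```

Donate the activity by assigning it to a view controller's `userActivity` property (or calling `becomeCurrent()`) at the moment the user actually performs the action.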
Health Records
HealthKit in iOS 12 lets users share their medical history with your app. With their permission, you can personalize health experiences based on health record data such as conditions, labs, medications, vitals, and more.
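Access to health records follows the usual HealthKit authorization flow, using the new clinical record types. A minimal sketch, assuming your app has the HealthKit entitlement with the health records capability:

```swift
import HealthKit

// Sketch: request read access to a few clinical record types (iOS 12).
func requestHealthRecordsAccess(store: HKHealthStore,
                                completion: @escaping (Bool) -> Void) {
    guard let allergies = HKObjectType.clinicalType(forIdentifier: .allergyRecord),
          let medications = HKObjectType.clinicalType(forIdentifier: .medicationRecord),
          let labResults = HKObjectType.clinicalType(forIdentifier: .labResultRecord) else {
        completion(false)
        return
    }
    let readTypes: Set<HKObjectType> = [allergies, medications, labResults]
    store.requestAuthorization(toShare: nil, read: readTypes) { success, _ in
        completion(success)
    }
}
```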
Stickers from your Sticker Packs and images from iMessage apps can now appear as effects. For more information, see Adding Sticker Packs and iMessage Apps to Effects and the Messages developer documentation.
Interactive Controls in Notifications
Notification content app extensions now support user interactivity in custom views. If the content of your app’s notifications needs to prompt user interaction, add controls like buttons and switches. For more information, see Customizing the Appearance of Notifications and the UserNotificationsUI developer documentation.
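A content extension that handles taps directly might look like this sketch. Interactivity also requires setting the `UNNotificationExtensionUserInteractionEnabled` key to `YES` in the extension's Info.plist; the button and action names here are placeholders.

```swift
import UIKit
import UserNotifications
import UserNotificationsUI

// Sketch: a notification content extension whose custom view contains an
// interactive button the user can tap without launching the app.
class NotificationViewController: UIViewController, UNNotificationContentExtension {
    @IBOutlet var likeButton: UIButton!   // wired up in the extension's storyboard

    func didReceive(_ notification: UNNotification) {
        // Configure the custom view from the notification's payload.
    }

    @IBAction func likeTapped(_ sender: UIButton) {
        // Handle the interaction directly in the extension.
        sender.isSelected.toggle()
    }
}
```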
Let users access functionality in your app by double-tapping on Apple Pencil. For more information, see Pencil Interactions in the UIKit developer documentation for iOS 12.1.
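Adopting the double-tap gesture takes only a few lines. This sketch respects the user's system-wide preference for the double-tap action; `switchEraser` handling is left as a placeholder for your app's own tool logic.

```swift
import UIKit

// Sketch: respond to an Apple Pencil double tap (iOS 12.1 and later).
class CanvasViewController: UIViewController, UIPencilInteractionDelegate {
    override func viewDidLoad() {
        super.viewDidLoad()
        let interaction = UIPencilInteraction()
        interaction.delegate = self
        view.addInteraction(interaction)
    }

    func pencilInteractionDidTap(_ interaction: UIPencilInteraction) {
        switch UIPencilInteraction.preferredTapAction {
        case .switchEraser:
            break   // toggle your app's eraser tool here
        default:
            break   // honor the other system-preference cases as appropriate
        }
    }
}
```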
The new AuthenticationServices framework lets you integrate password manager apps with Password AutoFill. Your apps can also use the new ASWebAuthenticationSession class to share login session information between your website in Safari and your associated app to make app logins simpler.
For more information about integrating with password manager apps, see the AuthenticationServices developer documentation.
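A web authentication session can be started as sketched below. The auth URL and callback scheme are placeholders for your own service; keep a strong reference to the session until its completion handler runs.

```swift
import AuthenticationServices

// Sketch: share a Safari login session with your app via
// ASWebAuthenticationSession (iOS 12).
var session: ASWebAuthenticationSession?

func startLogin() {
    guard let authURL = URL(string: "https://example.com/auth") else { return }
    session = ASWebAuthenticationSession(
        url: authURL,
        callbackURLScheme: "com.example.myapp") { callbackURL, error in
            guard let callbackURL = callbackURL else { return }
            // Extract the token or authorization code from
            // callbackURL's query items.
            _ = callbackURL
    }
    session?.start()
}
```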
CarPlay for Navigation Apps
Using the new CarPlay framework, navigation apps can now display navigation information in CarPlay. For more information, see the CarPlay developer documentation.
Network Framework
The new Network framework makes it easier to create network connections to send and receive data using transport and security protocols.
Use this framework when you need direct access to protocols like TLS, TCP, and UDP for your custom application protocols. Continue to use URLSession, which is built upon this framework, for loading HTTP- and URL-based resources.
For information about the Network framework and how you can use it to replace calls to low-level socket APIs in your apps, see the Network framework developer documentation.
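Opening a connection with the framework can be sketched in a few lines. The host and port here are placeholders; error handling is reduced to a print for brevity.

```swift
import Network

// Sketch: open a TLS-over-TCP connection with the Network framework
// (iOS 12), then send a small payload once the connection is ready.
let connection = NWConnection(host: "example.com", port: 443, using: .tls)

connection.stateUpdateHandler = { state in
    if case .ready = state {
        let payload = "hello".data(using: .utf8)!
        connection.send(content: payload, completion: .contentProcessed { error in
            if let error = error {
                print("send failed: \(error)")
            }
        })
    }
}
connection.start(queue: .main)
```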
Natural Language
Use the new Natural Language framework to analyze natural language text and deduce its language-specific metadata. You can use this framework with Create ML to train and deploy custom NLP models.
For more information about how your apps can process and understand natural language text, see the Natural Language framework documentation.
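A small example of the kind of metadata the framework deduces is language identification:

```swift
import NaturalLanguage

// Sketch: identify the dominant language of a string (iOS 12).
let recognizer = NLLanguageRecognizer()
recognizer.processString("Die Würde des Menschen ist unantastbar.")
if let language = recognizer.dominantLanguage {
    print(language.rawValue)   // typically "de" for this German sample
}
```

The framework's `NLTagger` similarly tokenizes text and tags parts of speech, lemmas, and named entities.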
Deprecated APIs
Periodically, Apple adds deprecation macros to APIs to indicate that those APIs should no longer be used in active development. A deprecation is not an immediate end of life for the specified API. Instead, it begins a grace period for transitioning away from that API to newer, more modern replacements. Deprecated APIs typically remain present and usable in the system for a reasonable time past the release in which they were deprecated. However, active development on them ceases, and the APIs receive only minor changes to accommodate security patches or to fix other critical bugs. Deprecated APIs may be removed entirely from a future version of the operating system.
As a developer, eliminate deprecated APIs from your code as soon as possible. At a minimum, new code you write should never use deprecated APIs, and if your existing code uses them, update that code as soon as possible.
Deprecation of OpenGL ES
OpenGL ES is deprecated in iOS 12. Apps built using OpenGL ES continue to run, but games and graphics-intensive apps that previously used OpenGL ES should now adopt Metal.
Metal is designed from the ground up to provide the best access to the modern GPUs on iOS, macOS, and tvOS devices. Metal avoids the overhead inherent in legacy technologies and exposes the latest graphics processing functionality. Unified support for graphics and compute in Metal lets your apps efficiently utilize the latest rendering techniques. For information about developing apps and games using Metal, see the developer documentation for Metal, Metal Performance Shaders, and MetalKit.
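The first step of any Metal adoption looks roughly like this minimal sketch, which checks for device support before creating the core objects:

```swift
import MetalKit

// Sketch: minimal Metal setup — a device, a command queue, and an MTKView.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("Metal is not supported on this device")
}
guard let commandQueue = device.makeCommandQueue() else {
    fatalError("Unable to create a command queue")
}
let metalView = MTKView(frame: .zero, device: device)
metalView.clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 1)
// Assign an MTKViewDelegate to metalView.delegate to receive per-frame
// draw(in:) callbacks, where you encode rendering commands onto
// command buffers from commandQueue.
```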