iOS 14 Checklist for Developers
What you need to know to get your apps ready for iOS 14
Apple’s decision to roll out iOS 14 without prior notice took the whole developer community by surprise. Regardless, it’s that time of the year again when you’ll be shipping your app updates for the latest OS.
To kick things off, here are the big changes in iOS 14 that you should be aware of:
Introduction of Widgets and App Clips on the home screen.
Replacement of IDFA with a new AppTrackingTransparency framework for opt-in ad tracking (at the time of writing, Apple has delayed this until January 2021).
New Vision requests for contour, trajectory detection along with hand and body pose estimation.
ARKit brings a new depth API for LiDAR scanners as well as location anchors to place AR experiences in specific places.
Apple’s PencilKit framework introduces the ability to track the speed and force of strokes drawn on a PKCanvasView. There’s also the new Scribble feature, which lets UITextField recognize handwritten text using on-device machine learning.
Now, let’s dig through some of the significant updates across different Apple frameworks and changes in APIs so that you’re all set for app releases on iOS 14.
Enhanced Pasteboard API
Apple has long positioned itself as a leader in data privacy, and with the latest iOS update, it has shown that once again.
iOS 14 introduces a floating notification every time your app reads contents from the clipboard. Now, to prevent your apps from needlessly accessing the pasteboard, there’s an enhanced UIPasteboard API that lets you determine the kind of content present in the pasteboard before actually reading its value.
The detectPatterns(for:completionHandler:) and detectPatterns(for:inItemSet:completionHandler:) methods let you find certain patterns without triggering the notification.
At the same time, you can leverage the UIPasteboard.DetectionPattern struct to determine if the pasteboard contains a probableWebURL (which might be relevant to deep links) or a number.
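Here’s a minimal sketch of how that detection might look in practice (the chosen pattern set and the handling are illustrative):
import UIKit

// Check for URL-like content without reading the pasteboard’s value
// (and therefore without triggering the paste notification).
UIPasteboard.general.detectPatterns(for: [.probableWebURL, .number]) { result in
    switch result {
    case .success(let patterns) where patterns.contains(.probableWebURL):
        // Only now would you actually read the value, e.g. for a deep link.
        print("Pasteboard likely contains a URL")
    case .success:
        print("No URL-like content found")
    case .failure(let error):
        print("Pattern detection failed: \(error)")
    }
}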
Picture in Picture Mode
While iPadOS did support Picture in Picture mode earlier, iOS 14 finally brings it to the iPhone.
Using AVPictureInPictureController.isPictureInPictureSupported(), you can check whether playing videos in the background is supported before creating the controller:
AVPictureInPictureController(playerLayer: playerView.playerLayer)
If, like me, you’re adopting the PiP mode in your AVPlayer-based apps for iOS 14, you could run into a strange error: Picture in Picture not launching automatically when the app is in the background. Thankfully, this Stack Overflow page provides a solution: call AVAudioSession.sharedInstance().setActive(true) before initializing the AVPictureInPictureController.
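Putting it together, a minimal PiP setup might look like the sketch below, where playerView and pipController are assumed properties of your view controller:
import AVKit

// Activating the audio session before creating the PiP controller works
// around the background-launch issue described above.
func setUpPictureInPicture() {
    guard AVPictureInPictureController.isPictureInPictureSupported() else { return }
    do {
        try AVAudioSession.sharedInstance().setCategory(.playback)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("Audio session setup failed: \(error)")
    }
    // playerView.playerLayer is an assumed AVPlayerLayer.
    pipController = AVPictureInPictureController(playerLayer: playerView.playerLayer)
}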
Limited Photos Library Access Permission
In iOS 13 and earlier versions, allowing apps to access your photo library meant letting them access all your albums and media assets. This could easily open the door to privacy breaches, as developers could upload entire libraries to their cloud servers.
With iOS 14, Apple introduces limited photo access permission, which lets the user grant access to selected photos only or to the entire library, thereby preserving privacy. This means iOS developers have their work cut out for them.
So, there’s a new PHAccessLevel enum that lets you specify the access level as readWrite or addOnly:
let accessLevel: PHAccessLevel = .readWrite
To query the authorization status of the photo library, simply pass the access level into the following function:
let authorizationStatus = PHPhotoLibrary.authorizationStatus(for: accessLevel)
Starting in iOS 14, the authorizationStatus above can return a new limited enum case, which means only the photos selected by the user are visible to the app. To request limited photo access permission, invoke the following function:
PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
    switch status {
    case .limited:
        print("limited access granted")
    default:
        print("not implemented")
    }
}
The following piece of code presents the image selection picker UI:
PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: self)
Images selected or deselected by the user can be monitored in the photoLibraryDidChange function by conforming to the PHPhotoLibraryChangeObserver protocol and registering as an observer, as shown in the sketch below.
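Here’s a minimal sketch of such an observer, assuming you keep around a PHFetchResult from an earlier fetch:
import Photos

final class PhotoLibraryObserver: NSObject, PHPhotoLibraryChangeObserver {
    // An assumed fetch result obtained earlier, e.g. via PHAsset.fetchAssets(with:).
    var fetchResult: PHFetchResult<PHAsset>?

    override init() {
        super.init()
        PHPhotoLibrary.shared().register(self)
    }

    deinit {
        PHPhotoLibrary.shared().unregisterChangeObserver(self)
    }

    func photoLibraryDidChange(_ changeInstance: PHChange) {
        guard let fetchResult = fetchResult,
              let changes = changeInstance.changeDetails(for: fetchResult) else { return }
        // Reflect the user’s updated selection.
        self.fetchResult = changes.fetchResultAfterChanges
    }
}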
Now, to prevent automatic photo access prompts every time, set the PHPhotoLibraryPreventAutomaticLimitedAccessAlert key to true in your Info.plist file.
SwiftUI Brings New Property Wrappers, Views, Modifiers, and App Lifecycle
SwiftUI, Apple’s new declarative UI framework, was the talk of the town during WWDC 2019, and this year has been no different. In its second iteration with iOS 14, SwiftUI now includes a whole lot of new UI components, ranging from VideoPlayer to Map, Label, Link, ColorPicker, and ProgressView.
More importantly, iOS 14 introduces support for lazy loading in stacks through the new LazyVStack and LazyHStack views. This means you needn’t worry about NavigationLink loading destination views immediately.
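For instance, here’s a minimal sketch where rows are created only as they scroll into view:
import SwiftUI

struct LazyListView: View {
    var body: some View {
        ScrollView {
            // Rows are instantiated lazily as the user scrolls.
            LazyVStack {
                ForEach(1...1000, id: \.self) { index in
                    Text("Row \(index)")
                }
            }
        }
    }
}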
There are also new LazyVGrid and LazyHGrid components that help replicate UICollectionView to some extent, and a matchedGeometryEffect modifier to create amazing transitions and animations.
Besides introducing SwiftUI’s own app lifecycle through brand-new property wrappers and protocols, iOS 14 also introduces the WidgetKit framework, which lets you build beautiful, powerful widgets purely in SwiftUI.
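To give you an idea of the new lifecycle, here’s a minimal sketch of a pure SwiftUI app (MyApp and ContentView are placeholder names):
import SwiftUI

// The @main attribute marks the app’s entry point; no AppDelegate or
// SceneDelegate boilerplate is required.
@main
struct MyApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    var body: some View {
        Text("Hello, iOS 14!")
    }
}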
More Powerful CollectionView
While CollectionView didn’t debut in SwiftUI during WWDC 2020, that didn’t stop it from receiving some powerful new updates.
Here are the major changes that you can leverage in your apps targeting iOS 14:
UICollectionViewCompositionalLayout.list lets you create UITableView-like appearances in UICollectionView, thereby further boosting the ability to customize compositional layouts. I believe this strongly indicates that table views might go obsolete in the future.
The UICollectionView.CellRegistration structure brings a new way to configure UICollectionView cells. You needn’t define cell identifiers anymore, as the new struct automatically takes care of cell registration when passed into dequeueConfiguredReusableCell (see the sketch after this list).
DiffableDataSources that arrived with iOS 13 now bring SectionSnapshots as well to customize and update data on a per-section basis.
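Here’s a rough sketch of the new registration flow; the String item type and the items array are illustrative:
import UIKit

// A UITableView-like list layout in a single line:
let layout = UICollectionViewCompositionalLayout.list(using: .init(appearance: .insetGrouped))

// Create the registration once, outside the data source callback,
// so that cells are dequeued and reused correctly.
let cellRegistration = UICollectionView.CellRegistration<UICollectionViewListCell, String> { cell, indexPath, item in
    var content = cell.defaultContentConfiguration()
    content.text = item
    cell.contentConfiguration = content
}

// Inside your UICollectionViewDataSource implementation:
func collectionView(_ collectionView: UICollectionView,
                    cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    // No reuse identifiers needed: the registration handles configuration.
    collectionView.dequeueConfiguredReusableCell(using: cellRegistration,
                                                 for: indexPath,
                                                 item: items[indexPath.item])
}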
Better Privacy With CoreLocation
While iOS 13 brought deferred “Always allow” and a new “Allow once” permission, iOS 14 further tightens privacy by allowing the user to grant access to an approximate location.
This means there’s a new property of the type CLAccuracyAuthorization that has two enum cases: fullAccuracy and reducedAccuracy (the latter returns an approximate instead of the exact location).
Also, the authorizationStatus() function now stands deprecated, and you should use the locationManagerDidChangeAuthorization delegate method instead to query the location permission status, as in the sketch below.
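Here’s a minimal sketch of a delegate handling the new accuracy authorization; the purpose key is a hypothetical entry you’d define under NSLocationTemporaryUsageDescriptionDictionary in your Info.plist:
import CoreLocation

func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
    // The authorizationStatus instance property replaces the deprecated class method.
    guard manager.authorizationStatus == .authorizedWhenInUse ||
          manager.authorizationStatus == .authorizedAlways else { return }

    if manager.accuracyAuthorization == .reducedAccuracy {
        // Optionally request full accuracy for a specific task.
        // "NavigationPurpose" is a hypothetical purpose key.
        manager.requestTemporaryFullAccuracyAuthorization(withPurposeKey: "NavigationPurpose")
    }
}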
Core ML Model Encryption
Core ML is Apple’s machine learning framework that lets you initialize models, run inferences, and even do on-device training. With iOS 14, Apple bumps up Core ML with the introduction of Model Deployment. This means you can ship updates to your machine learning models on the fly without updating the apps.
There’s also an improved Core ML model viewer in Xcode that shows the underlying layers. But it’s model encryption that stands out. Machine learning models aren’t easy to build and at times contain sensitive information. Earlier, you could easily extract the .mlmodelc Core ML model files embedded in apps.
Once you encrypt models in Xcode 12, that’s no longer possible: Core ML automatically decrypts the model and loads it in your app’s memory.
For handling encrypted models, iOS 14 brings a new asynchronous Core ML model-loading function:
MyModel.load { result in
    switch result {
    case .success(let model):
        currentModel = model
    case .failure(let error):
        handleFailure(for: error)
    }
}
The model only loads once it’s been decrypted successfully. It’s worth noting that the old init() way of initializing Core ML models will be deprecated in the future.
Conclusion
While these updates are the most significant ones to get your apps up to speed, there are also other important changes such as the inclusion of sentence embedding in the Natural Language framework and support for training style transfer models using CreateML.
This sums up the major changes developers need to know for iOS 14. Thanks for reading.