You could be forgiven for thinking this is starting to feel like last year’s iOS 13 fiasco, but it’s actually not quite the same. While developers have found some of the iOS 14 tools like Xcode and the iOS simulator to be rough around the edges — which has undoubtedly been the cause of some of the rapid-fire follow-up updates we’ve seen to apps that were otherwise ready to go on day one of the iOS 14 general release — end users have found relatively few problems hiding in the new iOS 14 release.
By contrast, when Apple released iOS 13.0 last year, it already had iOS 13.1 waiting in the wings — it started its developer beta cycle a couple of weeks before Apple’s September launch event — and had even gone so far as to announce a release date for the follow-up version.
While the new iOS 14.2 developer beta strongly suggests that Apple has earmarked the 14.1 version to address a few bugs, there’s been no word on when that will be coming, nor have we yet found any major problems that would necessitate Apple rushing it out.
It’s also entirely possible that iOS 14.1 won’t be a bug fix release, per se, but merely the version that Apple will have preinstalled on the new iPhone 12 models expected to be announced next month.
On the other hand, the iOS 14.2 developer beta does offer a few pretty interesting changes, at least one of which seems geared toward the coming iPhone 12 Pro.
New Music and AirPlay Controls
Probably the most noticeable feature right out of the gate is that Apple has completely redesigned the music playback controls found on the lock screen and in Control Center.
While the actual collapsed panel looks the same as before, opening it up now displays a much larger full-screen music player view, with album artwork front-and-center.
The volume and scrubbing controls have also been made larger and more accessible, so you no longer have to fiddle with getting your finger at just the right point to use them.
Beyond the much nicer visual redesign, Apple has also split out AirPlay and remote playback controls into separate sections, which should make the whole process of determining whether you’re streaming to a remote AirPlay device or controlling it directly a bit more obvious.
The AirPlay button brings up a list of audio playback destinations, much like before, including AirPods or any other connected Bluetooth headphones, followed by AirPlay-capable speakers and TVs. If a remote AirPlay device is currently playing something else, or even if it just has another track queued up, you’ll now see that listed below the device name as well.
Below the expanded music panel in Control Center is another button to specifically control other speakers and TVs, such as a HomePod or Apple TV, which will bring up the familiar multi-device view on an entirely separate screen.
As before, toggling from here to a HomePod or Apple TV will have that device’s playback controls take over the main music panel, with the device name shown above the track name so you know which one you’re looking at.
These controls are also replicated on the lock screen, although in this case tapping the AirPlay button simply opens up a more abbreviated AirPlay panel, without the playback controls or album artwork, and the “Control other Speakers & TVs” option is displayed at the bottom of the panel rather than as a discrete button.
Despite Apple’s acquisition of music-recognition platform Shazam over two years ago, the company hasn’t done too much with the app beyond adding some behind-the-scenes integration to try and make its Apple Music playlists more intelligent and using it to offer up some free Apple Music promos.
To be fair, Apple has continued to enhance the Shazam app over that time, but it has otherwise remained a standalone app that users had to install manually — until now.
iOS 14.2 adds a button that you can place in your Control Center to trigger a new built-in Shazam feature. The button doesn’t show up by default — you’ll have to go into the Control Center section of your iOS Settings app and add it manually. Once you’ve done so, though, you can position it wherever it’s readily accessible and use it to instantly kick in Shazam’s listening mode.
Tapping on the Shazam button will begin listening for the next song; you can close Control Center and go back to whatever you were doing, since you’ll get a standard iOS notification as soon as it identifies the song.
The notification that appears will include a button to listen to the song directly on Apple Music, or you can tap on the notification to be taken to the Shazam page for that song, either in Safari or directly in the Shazam app if you have it installed.
Note that you don’t need to have the Shazam app installed to make use of this feature. It’s built into iOS 14.2.
You’ll also have to trigger it on a per-song basis, as the feature shuts off automatically after each song that Shazam recognizes, so you can’t (yet) simply leave it on to identify a whole stream of music.
While some of Apple’s rivals like Spotify will no doubt once again claim the company isn’t playing fair with this integration, it does show what Apple can do when it owns all of the pieces. Not only do third-party apps not have access to Control Center buttons, but in this case the core function of the app — music recognition — is now part of iOS itself.
We’ve seen several features arrive in recent iOS updates that seem clearly aimed at helping users cope in a time of lockdowns, mask-wearing, and social distancing. Back in the spring, Apple made it slightly easier to unlock your iPhone while wearing a mask by making the passcode entry screen come up much more quickly, and although the new watchOS 7 handwashing feature has been in the works for years, there’s no doubt that it has arrived at a timely moment.
Now Apple is adding a new “People Detection” feature to iOS 14.2 that also seems clearly aimed at helping folks maintain necessary social distancing boundaries, especially those who may have visual impairments that make it difficult for them to determine how far away other people actually are.
The new feature has been added as an option in the Magnifier accessibility tool, which is already designed to help folks with vision problems by acting as a digital magnifying glass. Tap the settings icon in the bottom left corner of the Magnifier app and you’ll now see the ability to add a new control for “People Detection.”
A new settings panel also allows you to decide whether you want measurements in feet or meters, and what type of feedback you want to let you know when people are detected. In addition to identifying people through the camera, augmented reality features are used to determine their approximate distance, and this is something that will undoubtedly get even more accurate with the new LiDAR scanner that’s supposed to be coming to the iPhone 12 Pro.
When’s It Coming?
The big question is when we’ll see an actual release of iOS 14.2. It does seem likely that Apple will be pushing out a public beta of the new version at some point, so more early adopters should get an opportunity to try it sooner, but the fact that we haven’t even seen iOS 14.1 arrive yet suggests that a full release might be a ways off.
At this point, our slightly educated guess is that iOS 14.1 will ship as the standard version on the new iPhones that arrive next month, at which point it will also be made available as an update to the rest of Apple’s iOS 14 compatible devices. iOS 14.2 would be released sometime after that, and if rumours of the iPhone 12 Pro models being delayed into November are true, it’s also entirely possible that those devices will arrive with iOS 14.2 preinstalled; the new People Detection feature, which seems like it was made to take advantage of the coming iPhone 12 Pro LiDAR Scanner, would strongly support this possibility as well.