By: Brett Keck, Mobile Practice Lead at Eleven Fifty Consulting
WWDC has come and gone, and it all happened without any new hardware announcements or a “One More Thing…”. Instead, Apple announced an almost endless number of new features across their four platforms – iOS, macOS, tvOS, and watchOS. Across these platforms there was a combination of small niche features, integrations between platforms, and major additions that together made for an exciting keynote and a packed week of sessions for developers to begin diving into everything we’ll see roll out over the next 3-4 months. Out of all these announcements, I couldn’t help but notice a few themes forming along the way.
“We were wrong”
In a very un-Apple-like way, Apple began the keynote by criticizing its own product, the Apple Watch. And it seemed fitting – even people who love their Watch (myself included) have had bad things to say about it. Rather than doubling down on what hasn’t worked, though, Apple went back to the drawing board and gave us a completely new vision with watchOS 3. Gone are glances, the side-button contact view that no one uses, and apps that take 5-20 seconds to load. In their place are the dock and instant apps. Apple also gave us some new first-party apps and included some amazing changes for wheelchair users (in classic Apple fashion, given that they have been one of the few innovators in the tech space when it comes to accessibility). But the main message Apple was giving was loud and clear – “we heard you, and we can admit when we are wrong”.
The other guys have some good ideas, and Apple wants to make them better
Not every great idea starts with Apple, and some of the better ideas from Android, Waze, WhatsApp, Evernote, and a handful of other apps and services are coming in droves to iOS 10. The home and lock screens are getting a major revamp, with card-style notifications and widgets making their way over from Android. Maps is getting better traffic information, stops along a route, and other features that some other mapping applications have given users for years. Shared notes? That’s straight out of Evernote. In one instance, Apple is even borrowing from itself, bringing the “raise to wake” feature over from the Apple Watch to the iPhone. And Messages – what isn’t it trying to bring over now? Between stickers, reactions, iMessage apps, larger emoji, and effects, Messages is bringing in some of the best (or worst, depending on how you use your messaging apps) of Line, WhatsApp, Messenger, WeChat, Hangouts, and Slack.
The key to all of these features isn’t so much that Apple is finally doing them, but how Apple is implementing them. Collaboration will be done with security in mind. Widgets won’t be a constant drain on battery life. Mapping features will combine Apple-mined data, crowdsourcing, and even developer-implemented features. While lots of ideas happen elsewhere before they come to an Apple device, history shows that Apple doesn’t just copy a feature outright; instead, it takes the best of the feature and integrates it into the other things Apple does best, whether that’s a more intuitive user experience or added features that take things to another level.
Apple sees where tech is headed, too
With the rise of the Amazon Echo, Google Photos, and advances in assistive AI, there have been a lot of questions about when Apple was going to get into the game (either more seriously, or at all) in any of these fields. Numerous articles have been written over the past few months making claims ranging from “Apple is going to be left behind” to “Apple has a secret announcement for WWDC that addresses everything”. Of course, neither came to be, and Apple instead gave us a look into where they have been headed in these fields.
Photos has gained the ability to search your photos, along with better face recognition, so it can catch up to what Google Photos has been doing for the past year. Siri is now more contextually aware, and lets developers use it (albeit in a limited fashion, though this will clearly expand over time). The new Home app – and its presence on the Watch, Apple TV, and iOS – is a step towards turning Apple devices into the home hub that the announcement of HomeKit started to promise two years ago. Add in how Siri works with HomeKit, and Apple is taking a page out of the Echo’s playbook. Yes, these other technologies still have advantages over what Apple is currently offering – in time, maturity, and features – but Apple brings something else to the table that the competition doesn’t…
With Siri and Photos, a lot of assistive AI is involved – and that cannot happen without data. In the past, iOS has relied almost entirely on data located on the device. This year, though, Apple spoke about a technology known as differential privacy. At its core, differential privacy takes information stored on your device, such as usage patterns, adds noise to it so it can’t be traced back to any individual user, and sends it off to a server to be aggregated. Apple can then use these patterns to provide more intelligent search results, photo filtering, word choices, etc. – all without sacrificing a user’s privacy. It’s a big step forward from what Apple was previously doing, and it remains to be seen how far it can close the gap that Google and Amazon keep widening by collecting very large amounts of non-anonymous user data.
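Apple hasn’t published the exact mechanism it uses, but the classic textbook way to implement this idea is the Laplace mechanism: each device adds calibrated random noise before reporting anything, and the noise averages out once enough reports are aggregated. Here’s a minimal Python sketch of that idea – every name and parameter below is illustrative, not Apple’s actual API:

```python
import math
import random

def privatize(value, epsilon=2.0, sensitivity=1.0):
    """Report `value` with Laplace noise added, so no single
    report reveals the true value for that user."""
    scale = sensitivity / epsilon  # smaller epsilon = more noise, more privacy
    # Sample Laplace(0, scale) noise via inverse transform sampling
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return value + noise

random.seed(42)  # reproducible demo

# Imagine 1,000 devices each reporting whether a user typed a given
# word today (1 = yes, 0 = no). Individually, each report is noisy
# and can't be trusted as that user's true answer.
true_values = [1] * 500 + [0] * 500
reports = [privatize(v) for v in true_values]

# Aggregated across many users, the noise cancels out and the server
# recovers a useful population estimate without trusting any one report.
estimate = sum(reports) / len(reports)
print(round(estimate, 2))  # close to the true rate of 0.5
```

The privacy/utility trade-off lives in `epsilon`: lower values add more noise per report (stronger privacy for each user) but require more reports before the aggregate becomes accurate.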
Sometimes Apple gives us the “help us help you” features. This year, those came in the form of Siri, MapKit, and Messages extensions. No longer are these closed gardens where we only get what Apple wants to give us. Instead, developers can make an app that works right within Maps (like booking a ride to a destination without ever leaving the Maps app), or that lets users make a dinner reservation, play a game, or send money to a friend right within Messages. With Siri, if you have a ride-sharing, messaging, photos, payment, VoIP, workout, or HomeKit app, you can now include voice-driven features in your app. These are the kinds of features that are 10% about what Apple has given us to work with and 90% about what developers can dream of doing with them.
What most of the general public doesn’t hear about during WWDC week is what it means to developers. Behind the scenes, Apple has added new features to Xcode (including a way for developers to easily extend Xcode itself) that will make it easier for us to build better and more innovative apps. Apple also announced a new file system that will roll out to all Apple devices over the course of 2017 and that promises to enhance both performance and security without requiring any changes by users or developers.
But this WWDC wasn’t just about existing developers; it was about future developers as well, with Apple announcing Swift Playgrounds for iOS. While Swift Playgrounds can be used by anyone for prototyping Swift code on an iPad, at its heart it’s a learning platform – with Apple providing lessons to help the next generation of developers learn Swift, and introducing ways for others to provide their own learning materials. Simple lessons and a new coding keyboard make it a great way for people of all ages to get started with Swift.
Summing it all up
Technology is moving fast – it always is – but now it’s not just moving forward, it’s moving horizontally. In the past few years we’ve seen Google continue to lead the way in assistive AI, Amazon break into the home with the Echo, and Apple become an industry leader in connected TV boxes and wearables. All of this has happened while Android and iOS continue pushing each other forward and learning from each other as they evolve. In some areas, Apple has continued to lead the charge this year. In others, Apple is playing catch-up. But what matters most is that in almost every area across four platforms, Apple moved forward in significant ways that will give users a simpler, more efficient, and safer experience.