It has been a couple of weeks since Apple's CEO, Tim Cook, took to the stage in San Jose to address the worldwide developer community with a two-hour keynote covering the latest developments across the iOS, watchOS, tvOS and macOS platforms. After spending the last fortnight digesting all of the announcements, I'm here to share my personal highlights from WWDC.
Augmented reality has been mentioned everywhere this past year but, in my opinion, aside from placing 3D objects in the real world, we have yet to see it reach its full potential. This looks set to change, however, with the announcement of ARKit 2.0.
With its updated augmented reality framework, the possibility of using this technology across a wide range of industries has increased dramatically. Apple have also introduced a new file format, USDZ, for storing and sharing AR content, so you can now, for example, present 3D content directly from Safari.
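That same AR Quick Look experience is available to native apps through the QuickLook framework. As a minimal sketch (the `robot.usdz` asset name is a placeholder), previewing a bundled USDZ model might look like this:

```swift
import UIKit
import QuickLook

// Minimal AR Quick Look sketch (iOS 12+). "robot.usdz" is a placeholder asset.
class ModelPreviewSource: NSObject, QLPreviewControllerDataSource {
    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // URL bridges to QLPreviewItem; Quick Look recognises the .usdz
        // extension and offers the AR placement UI automatically.
        return Bundle.main.url(forResource: "robot", withExtension: "usdz")! as QLPreviewItem
    }
}

// Presenting from a view controller (dataSource is held weakly,
// so keep a strong reference to the ModelPreviewSource):
// let controller = QLPreviewController()
// controller.dataSource = previewSource
// present(controller, animated: true)
```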
It’s now our responsibility, in my opinion, as developers and innovative thinkers to use the technology to enrich experiences and solve real world problems.
Another interesting announcement at the conference was Siri Shortcuts.
This gives users the ability to trigger in-app actions by voice, meaning that commonly used tasks are now just a command away. Users do, however, have to record a shortcut phrase to trigger a given task.
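Under the hood, apps make these tasks available by donating them to the system. A minimal sketch using an `NSUserActivity` donation (the activity type, titles and phrase here are all hypothetical) might look like:

```swift
import UIKit
import Intents

// Donate an activity so Siri can suggest it and the user can record
// a voice phrase for it in Settings. Identifiers and titles are hypothetical.
func donateOrderActivity(on viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.app.order-coffee")
    activity.title = "Order my usual coffee"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true          // new in iOS 12
    activity.suggestedInvocationPhrase = "Coffee time"

    // Assigning the activity to the visible view controller marks it
    // as current, which performs the donation.
    viewController.userActivity = activity
}
```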
I was particularly impressed by this announcement because, as an iPhone user, it’s one of the things that I’ve felt has been lacking for a long time.
With the announcement of Siri Shortcuts, Apple is introducing tech that reduces the gap with Google Home and Amazon’s Alexa. In a year or so from now, I imagine they’ll have gathered a lot of useful information to help train Siri.
Machine learning has been sitting at the top of my list of things to learn next. Luckily for me, Apple have now simplified the way in which you train machine learning models with the introduction of Create ML.
Create ML allows developers to create their own models using Xcode playgrounds. All you have to do is sort your data into labelled folders, write a few lines of code, wait a few minutes for your model to train, and you're done! Another particularly impressive thing is that it works alongside the new Natural Language framework to analyse natural language text and deduce language-specific metadata.
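As a rough sketch of that workflow (the folder paths are placeholders; Create ML expects one sub-folder per class label), training and exporting an image classifier in a macOS playground looks something like this:

```swift
import CreateML
import Foundation

// Train an image classifier from labelled folders in an Xcode playground.
// Paths are placeholders; each sub-folder name becomes a class label.
let trainingData = URL(fileURLWithPath: "/Users/me/Fruit/Training")
let testingData  = URL(fileURLWithPath: "/Users/me/Fruit/Testing")

let classifier = try MLImageClassifier(
    trainingData: .labeledDirectories(at: trainingData))

// Check accuracy against images the model hasn't seen.
let evaluation = classifier.evaluation(on: .labeledDirectories(at: testingData))
print("Error rate: \(evaluation.classificationError)")

// Export a Core ML model ready to drop into an Xcode project.
try classifier.write(to: URL(fileURLWithPath: "/Users/me/Fruit.mlmodel"))
```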
In my opinion, Apple is taking the time to solidify its ecosystem in ways that only Apple is capable of doing and it’s, in turn, equipping developers with much better tools for the job.
It could be said, perhaps, that the conference didn't really give the community any jaw-dropping moments. Some developers, for instance, were expecting news of a SceneKit-based rendered UI, and that didn't happen. Sorry, Lee!
We have, however, seen improvements across many areas – from AR to ML, Metal to development tools. As such, I believe that Apple are laying the foundations for much bigger things, equipping us with the tools that we, as developers, need to create forward-thinking and enriching experiences for our users in the years to come.
If you've yet to catch up on the conference announcements, here's a list summarising those that we deemed the most important: