
Reid Main


WWDC 2024: Thoughts on the Keynote

I have been building iOS apps since 2009, and in my experience we third-party developers usually go into these WWDC keynotes with a wishlist of things we'd like to see from Apple, but rarely any concrete expectations. There have been several major announcements that changed the trajectory of my career and that I never saw coming, such as the iOS 7 redesign, the Swift programming language, and Apple Silicon hardware.

But this is the first WWDC keynote I can remember where the expectations of Apple were so concrete: deliver their answer to generative AI. ChatGPT, Google Gemini, Microsoft Copilot, and other tools have completely overtaken the zeitgeist in the past 12 months, and it has been impossible to escape the doom-and-gloom articles claiming that if Apple doesn't embrace AI, they will collapse.

So after a brief skit (anything set to Mötley Crüe's Kickstart My Heart gets my approval), and the requisite circle-jerking over how great Apple TV+ is, we finally got Apple's answer.

visionOS 2

Surprisingly, Apple chose to start with visionOS by highlighting all of the "amazing" third-party apps that have been created for it. Now, I'm in Canada, so I haven't been able to use Apple Vision Pro yet, but I cannot tell you how excited I am to look at my data in SAP Analytics Cloud or do tai chi with Po from Kung Fu Panda. Also, today I learned that "spatial computing has reinvented how you view your photos," which really surprised me since the vast majority of this planet hasn't actually been able to experience spatial computing yet. /sarcasm

It wasn't obvious at the time, but this segment set the tone for the presentation to come. This keynote wasn't for developers; Apple was selling consumers and investors on product features coming this year. I'm sure automatically generating "spatial photos" from 2D images is a cool feature, but as a developer I don't care. The Mac Virtual Display improvements in resolution and size will definitely be nice to have, but there was no mention of how they benefit third-party developers.

It wasn't until 10 minutes and 20 seconds in that we got our first mention of developer enhancements. Over the next 50 seconds they name-dropped three frameworks developers would gain access to (Volumetric APIs, TabletopKit, Enterprise APIs) and then quickly moved on to their next topic. Partnering with Blackmagic Design to create cameras and software to capture and edit Apple Immersive Video is very cool, but as a developer I could not care less.

At least we finally got confirmation that Apple Vision Pro is being released outside of the US over the next month, but only in eight additional countries (China, Japan, Singapore, Australia, Canada, France, Germany, and the UK).

iOS 18

Next was iOS 18, which also focused mostly on consumer-centric features. We're finally going to be able to place app icons wherever we'd like on the Home Screen; no longer will they be forced to appear in sequential order. You'll also be allowed to tint the colour of the entire Home Screen, for some reason. Control Center is getting major customization improvements, with widgets effectively being enabled there. I will admit it looks very useful, and I am hoping to see a lot of the widgets I currently use moved to Control Center. You're also going to be able to assign these widgets to both the flashlight and camera buttons on the Lock Screen, as well as to the Action button on the iPhone 15 Pro.

Messages is being updated to allow you to use any emoji or sticker to react to messages. You'll also be able to schedule messages to be sent later. Mail is getting on-device categorization to theoretically make it easier to find and read the emails that are important to you. Topographical maps with detailed trails are coming to the Maps app. Apple Cash will let you transfer money to another iPhone user by holding your phones near one another. We even got a mention of Game Mode coming to iPhone, which will minimize background activity to maximize framerates and reduce controller latency. This segment was then capped off with a demo of the "biggest redesign ever" of the Photos app.

To be clear, I am not saying that these features are pointless or that I'm not going to enjoy them. But by the end of this iOS 18 segment, outside of widgets, I still wasn't sure what I should be excited about as a developer. This felt like the presentation Apple gives in September when showcasing the latest version of iOS that is coming out the next week, not the preview developers get three months in advance.

Audio & Home

This was a bizarre middle segment on "Audio & Home" where we got a quick look at some new AirPods functionality and a mention of an Apple TV+ feature called InSight, which is effectively X-Ray from Amazon Prime Video. This entire segment could have been cut from the keynote to save five minutes, but again Apple felt compelled to showcase more random features that don't affect developers at all.

watchOS 11

Lots of great improvements are coming to the fitness aspects of watchOS. Workouts will now contain an "effort rating" showing you how difficult the workout was, and an overall "training load" to highlight whether you're pushing yourself too hard or there is still room for improvement. This will work in tandem with a new Vitals app that will highlight any irregularities in your health metrics.

There was also mention of interactive widgets and live activities coming to watchOS so both of Apple's third-party watch developers must be thrilled.

iPadOS 18

I would bet serious cash that Apple wishes they could go back in time and not brand iPadOS as something separate from iOS. Maybe in 2019 it made sense, but nowadays the only real differentiating factors between iPhones and iPads are the screen size and Apple Pencil support. From a software perspective, effectively everything you can do on iPadOS can also be done on iOS, and this year's WWDC keynote did nothing to change that.

This entire segment could be summarized as "everything we talked about for iOS 18 is also here, plus we finally brought the Calculator app over." That's it. If I were feeling generous, I could call out the tab bar redesign as something new for iPadOS. There's also the Math Notes feature, but hilariously it is also coming to iOS 18. Apple chose to demo Math Notes on the iPad because, admittedly, using it with the Apple Pencil looks really cool. But you're going to be able to do the exact same thing on your iPhone and Mac.

And at this point it may go without saying, but there was no mention of anything that developers would get to use.

macOS 15

Codenamed Sequoia, the latest version of macOS is going to get all of the same stuff that was announced for iOS and iPadOS 18. But what I am most excited about (as a consumer, not a developer) is iPhone Mirroring. I love the idea of bringing up my iPhone as a stand-alone window on my Mac and having complete control of it. Apple is also finally breaking out all of their password management functionality into its own app, unsurprisingly named Passwords.

Apple Intelligence

We are now 65 minutes into this presentation and the words "artificial intelligence" have not been said once. But to anyone paying close attention, it is really obvious that the majority of the features discussed are enabled by machine learning.

Tim Cook finally broke the taboo as he introduced Apple's take on generative AI and large language models, which is being marketed as "Apple Intelligence".

The core of Apple's pitch is that this is AI powered by your personal data, done entirely on your device. Other AI tools are cloud-based: they have very little knowledge about who you are, yet at the same time are potentially storing information about you that you don't want stored. Apple Intelligence will catalog and understand all of the data stored on your device and aims to use it only locally, though of course there are some edge cases, which we'll dive into later.

Apple Intelligence is one of those things that, if done correctly, consumers will barely realize they are using. With such tight integration into the OS, it could become the new default way of interacting with your device. Prioritizing and summarizing your notifications could redefine how you interact with them. Generative text tools could change not just how you write on your device but also how you consume written information. If Siri is able to understand not just you correcting yourself but also follow-up commands, I may legitimately use her to actually complete tasks instead of just asking "what is the weather outside?" three different ways until she understands me. Apple even showcased their version of Google's Magic Eraser, called Clean Up, which has the potential to drastically change how iOS users take and share photos.

But these are all really big what-ifs, and based on Apple's history with these sorts of problems I am leery of it working as well as they demonstrated. The example of asking Siri "When is my Mom's flight landing?" and having Siri pull the flight number from an email and automatically check the flight status seems absolutely insane. But who knows, maybe it is finally possible?

Unfortunately, in my mind, all of this potential was overshadowed by Apple's demonstration of generative images that look as atrocious as you'd expect. It is the kind of glossy, uncanny valley crap that we all thought was garbage when it was first showcased over a year ago. But while all the online tools have improved dramatically (while also continuing to skirt the ethical implications of how they source their data) Apple seems content to generate junk. One of the presenters literally said "with the Image Playground experience and Genmoji, you can create fun and delightful images right where you need them" while showing this nightmare fuel.

Apple's nightmare fuel example of Genmoji

It's funny how Apple took so much shit for their "Crush!" ad and still decided to double down on how little they value artists and creatives. I swear I did not make this example up: another one of the tone-deaf presenters said "Image Wand can transform a rough sketch into a polished image that complements your notes and makes them more visual" and proceeded to show this:

Apple's "rough" sketch

It boggles my mind that this demonstration and wording actually got greenlit by Apple. The subtext is that you no longer need artists, but if you are one, we can now generate better (i.e. worse) images of your "rough" stuff, and no one saw any issues with this. I will bet that next year Apple showcases a feature that allows users to "easily create music" for their photo memory videos while believing musicians will actually thank them for it.

Earlier I mentioned that all of this is intended to happen on device, but Apple did admit there are times when your devices don't have the memory or computing power to complete a task. To handle this (while theoretically also protecting your privacy) Apple has created what they call Private Cloud Compute. These cloud servers run a hardened OS that omits features which are not strictly necessary, such as persistent data storage or remote shell access. Versions of this OS are also going to be made available to security researchers so they can theoretically verify Apple's claims, as well as claim bug bounties for any shortcomings. This still obviously requires putting a lot of trust in Apple, but they said all the right things, so it will be interesting to see how this actually gets leveraged by third-party developers in reality.

To cap off the keynote, Apple finally revealed their integration with ChatGPT, but did it in the most Apple way possible. If, for whatever reason, Apple cannot find the answer to a question, they will ask the user if they want their question passed to ChatGPT. You will need to give permission each and every time, so there is no way for information to be automatically uploaded. Apple isn't even paying OpenAI for this integration, so it's no surprise that Craig Federighi had a throwaway sentence ("we also intend to add support for other AI models in the future") indicating that other providers, perhaps Google Gemini, are coming.

Conclusion

I came away from this presentation with two core thoughts:

First, WWDC's keynote is no longer for developers. It has been regressing since they started prerecording them during the pandemic, but it's obvious that Apple now sees these as customer-oriented presentations. We're never going to see a reveal like the Swift programming language in these keynotes again.

Second, this is the most reactive I have ever seen Apple. Sure, "Apple Intelligence" is their take on AI, but they didn't announce anything new or unique; everything in this presentation is effectively already possible with cloud tools. Yes, by having access to all of your personal data Apple could definitely generate better results, but based on their history with this sort of thing I am not going to blindly believe them. Even judging by their own examples in this presentation, they appear to already be beaten by cloud tooling. Is their age-old marketing line of "we value your privacy" going to close that gap? We'll have to wait until September to find out.

#WWDC