Apple WWDC: AI as a Feature and Big Platform Improvements

Summary

It’s been nearly two weeks since Apple WWDC. I covered the event live on social media, but I held off writing a full report until I had time to think through the implications and test developer builds of OS 26 across most of Apple’s product categories. My conclusion: Apple can’t tout its internal AI models as class-leading for Wall Street, so it focused on AI as a feature for consumers. This buys Apple time but could backfire if it turns out that vertical integration with AI models is required for success. Liquid Glass is welcome, though it needs some small tweaks before launch. The iPad is no longer held back by its OS but by its apps, and Apple is building the core capabilities it needs for smart glasses while adding enterprise functionality to Apple Vision Pro today.

AI As a Feature: Pros and Cons

The narrative in the press — and from financial analysts — is that Apple is behind in the AI race. This perspective is predicated on a bet that the big AI model makers will see a return on their investment rather than just silicon providers like Nvidia, and that OS platform vendors like Apple need to build their own LLMs rather than buy them. Apple is behind in the race for advanced generative AI models, and it has admitted challenges just rolling out the more capable version of Siri in Apple Intelligence that it announced at WWDC last year, let alone keeping up with rivals racing ahead with AI models that generate video or remember everything you saw so you can find where you put down your car keys. At a briefing with me after the keynote, an Apple executive admitted that Apple is “leaving broad world knowledge to OpenAI” and other model vendors and focusing on language models geared towards its specific hardware constraints. To that end, Apple announced that developers can now access Apple’s on-device 3-billion-parameter LLM. This has two benefits: privacy and storage. By leaning more heavily on privacy, running its models on-device and in Private Cloud Compute, Apple is reinforcing a differentiated marketing message to consumers: Apple’s AI models are not as capable as rivals’, but they are already included on every phone. Developers already had the ability to use third-party AI models on device, but to do so they had to bundle the model itself inside the app, bloating its download size by at least 500 MB and as much as 4 GB.

Apple’s trailing position in AI research may be a problem for Apple’s stock, but not for its business model, at least not in the short term. As incredible as Google’s Gemini advances have been, they have not yet transformed how people use their phones. If its internal AI models do become a critical liability for the iPhone, Apple has the option of leaning more heavily on OpenAI or any of its competitors, including Google itself. As I noted after Google I/O, Google would likely jump at the chance to integrate Gemini into iOS even at the expense of Android exclusivity.

In the meantime, Apple is able to match some of Google’s integrated AI features like live translation of calls and identifying the content of screen captures, though Apple’s Visual Intelligence requires an extra step to get to ChatGPT compared to Google’s native integration of Gemini into Android. The other major feature updates to iOS use AI as a feature to add live translation, spam detection, and call screening. These are all areas where Google and Samsung were first — along with AI image editing, which iOS got last year — but Apple is able to close these gaps with its own tech.

The bigger area of concern may be in wearables and XR. VisionOS 26 is a huge update (see below), but rich AI-first interfaces may be the key to unlocking these new form factors. Meta’s Llama-powered assistant turns its smart glasses from sunglasses with a camera and earbuds into a genuinely useful tool. In my demo of a Samsung Project Moohan prototype, Gemini already makes navigating Android XR’s virtual environment simpler than gestures alone. As Google adds Project Astra features, AI could make Google’s glasses indispensable for anyone with visual impairments, ADHD, or even just regular forgetfulness.

Apple also needs to keep developers building apps for its platforms. Menus and multitasking in iPadOS and enterprise features in VisionOS will help, and Apple announced plenty of new APIs along with LLM support within Xcode. However, Apple did not offer any concessions to its critics — both regulators and its own developer base — on App Store structure or its revenue share model, nor did it create any other new incentives for developers. This is concerning.

Apple’s Platforms Get a Design Overhaul

While Apple did embed more AI-powered features in its phones, tablets, and computers, the main update to Apple’s software this year is the Liquid Glass design overhaul. Liquid Glass is pretty, and the consistency across devices makes Apple’s product line more cohesive. Incremental improvements are often derided by a jaded tech press, but anything that impacts a billion users who use their devices hundreds of times daily is meaningful. Where Liquid Glass’ transparency is justifiably controversial is in how it reduces contrast and adds visual clutter. Apple assured me that accessibility is not compromised: turn on high contrast mode and Liquid Glass goes away. The Liquid Glass effect dynamically changes based on dark or light images behind controls, which mitigates some issues. However, I think there needs to be a way to adjust transparency levels; high contrast mode is extreme, and there are no options for people who simply have difficulty seeing app icons or Control Center depending on what happens to be in the background. We will see how (or if) Apple adjusts the effect as the software moves from developer beta to general beta to production, but this is my only genuine issue with the new design.

Two other AI features are worth highlighting, one for the Mac, and one for Apple Watch:

  • MacOS gets a new version number and a new name – MacOS 26 “Tahoe” – and Shortcuts get some help from Apple Intelligence. Shortcuts are macros for power users, and now they are even more capable and theoretically easier to set up. Even with the almost limitless possibilities on offer in the new version, Shortcuts is still not a feature most Mac users are likely to understand or use, but it is exactly the type of feature a Mac should have, even as the iPad gets immensely more Mac-like (see below).

  • There’s an AI workout buddy for WatchOS that will delight or annoy fitness enthusiasts depending on their tolerance for enthusiastic AI chatter when they’re trying to work out. I strongly — and seriously — suggest that Apple add Workout Buddy modes for common household chores. Let’s gamify housework! Grab that laundry — you’ve got this! Mow that lawn, you’re halfway there. You’ve started your Costco run? Let’s go get that roasted chicken — it’s at the back of the store, so if you go through each aisle on your way there, you’ll close your Move Ring!

iPad is a Computer Now

The iPad was wildly popular at launch, filling a bigger tablet-sized hole in the market than even Apple anticipated. Apple continued to dominate the tablet market even as competitors copied Apple’s initial ‘it’s just a giant iPod Touch’ approach. Consumers primarily used iPads for content consumption, social media, and casual gaming even though Apple thought of the iPad as a more approachable appliance-like computing device for any computing purpose. As Apple pushed the iPad line upmarket with the iPad Air and iPad Pro, the hardware got well ahead of the OS and apps even before the iPad Pro got an Apple Silicon M4 SoC over a year ago.

With iPadOS 26, Apple is catching the software up to its ambitions. The addition of full windowing and window tiling catches Apple up not just to the best tablet multitasking UX (from OnePlus) but also to Microsoft’s Windows 11, which isn’t great to navigate with a finger but does a terrific job with window arrangement templates. Apple introduced overlapping windows to consumer computing back in 1984; who knew it would work in 2025, too? In use, iPadOS 26 multitasking is even better than it appeared in the keynote; Apple took over a decade to get here on the iPad, but it nailed it in the end. Several other trappings of modern computing are now available on iPadOS 26 as well, including an updated Files app, the ability to have folders in the dock, Exposé to find everything open, Preview for dealing with images and PDFs, and a system-level menu bar so apps can take advantage of the iPad’s enormous power with additional features in a structured way. Apps can now perform tasks in the background (ex: compiling or exporting files). External monitors are better supported. In a nod to creators, full external A/V input selection is now supported.

So the iPad is a Mac now, right? Not fully. Apple is not going to bring touchscreens to the Mac, and the walled garden iPad user experience remains distinct. I have talked to multiple Apple executives about this, and they continue to believe that there is a meaningful distinction between an OS with a direct finger-to-pixel interface versus an indirect mouse/trackpad to screen relationship. The new functionality in iPadOS 26 is entirely optional; users who want an iPad because of full screen app simplicity can continue to use their iPad exactly as they did before. You can manage your own files to an extent, but the app experience is still entirely curated: unlike Mac, you can still only download and install apps from the App Store. And there is still just one user per iPad (presumably so you buy one for each family member. Sharing your iPad is NOT caring for Apple’s bottom line). The iPad is still the device you want for handheld use cases like reading, and the Mac is where pro users can customize the OS to meet their specific needs. The iPad is also currently the only large format Apple computing platform with optional cellular connectivity, but I expect that to change on the Mac with the C1 or C2 modem in 2026 or 2027.

There is no question that the newly capable iPad will make it harder for some consumers to choose what computing platform to use. Apple would say “both,” and point to Continuity to get the most out of whatever device you are using at the moment. It remains to be seen if developers will see the iPad for the productivity and creative powerhouse it has become and add more functionality to their apps. However, even if the app situation doesn’t improve, consumers are spoiled for choice in Apple’s ecosystem. iPadOS 26 will put a lot of pressure on Google and its hardware partners to greatly improve their tablet capabilities and apps to match.

VisionOS 26 Prepares Apple for the Future

Coming into WWDC, Apple’s spatial computing platform certainly appeared stuck. The hardware hasn’t sold all that well. Unlike the iPad, which got a huge hardware refresh 14 months after its debut, Apple doesn’t seem to be on a path of rapid iteration for its headset. Unlike the Apple Watch, which shifted its software focus and overall purpose to fitness a year after its launch, VisionOS updates have been fairly minor. With that context, going from VisionOS 2.4 to version 26.0 may seem like quite the stretch, but VisionOS 26 is a massive leap forward, deserving multiple release number bumps.

VisionOS 26 improves multiple quality of life aspects of the Apple Vision Pro software experience for consumers and enterprise users today with VR and passthrough, but more importantly it sets up Apple for future see-through wearables (i.e., glasses).

On a phone, tablet, or PC, widgets are useful little displays. In a virtual environment, widgets can be so much more: windows into the real or an imagined world, shared calendars, information monitors, and controls for real-world objects. I suspect that once developers get creative, widgets will become a core construct for spatial computing and XR. Nobody is going to put on a headset just to access virtual controls for their appliances or to look out the window at a scene from the location of their next vacation, but when Apple has smart display glasses, this could be a big part of the experience.

A widget – and a virtual desktop – is only as good as its ability to stay where you put it. In VisionOS 26, Apple gives permanence to the spatial apps, widgets, and windows that you open and carefully place in ways that make sense to you; these configurations remain in place even after the headset is restarted. While only a single configuration is possible in VisionOS 26, the feature opens up the possibility of multiple saved configurations and the ability to move a configuration from one physical location (ex: your home office) to another (your living room, corporate office, coffee shop, etc.). Perhaps we’ll get this in VisionOS 27; Apple is certainly aware of the system’s limitations in this regard.

Apple is also introducing several quality-of-life improvements – folders in the Home view, unlocking an iPhone automatically from the Apple Vision Pro, and better hand tracking – but the big changes are around content, gaming, and enterprise use. VisionOS 26 supports wide-FOV (field of view) content, so anything shot on a GoPro or Insta360 can now be watched inside the headset, allowing users to share or relive experiences in 3D from a first-person perspective. For gaming, Apple is admitting that hand tracking is not enough and is supporting physical controllers, including the PlayStation VR2 Sense controllers. While this has been a top request from early adopters, game developers are unlikely to flock to the platform just yet: Apple doesn’t include the controllers in the box, and the platform’s installed base is low.

Where the Apple Vision Pro ought to be successful at its current price point is in the enterprise, where it provides a high-quality, untethered spatial computing experience that is actually less expensive than similarly capable systems from Varjo. The problem has been that the Apple Vision Pro is highly customized to a single user. That changes in VisionOS 26 with Team Device Sharing, where user data can be saved to a phone. Apple is adding new Enterprise APIs for collaborative apps, spatially aware apps, and protected content, so developers can build apps for healthcare, finance, or HR where users can only access the data they are authorized to see. Spatial browsing is being built into Safari for VisionOS 26, and while the demo shown during the WWDC keynote was consumer-oriented, this capability could be a way to deliver 3D experiences within enterprise web apps for manufacturing or logistics. Those PlayStation VR2 Sense controllers are not just for gaming; with 6 DOF tracking and multiple buttons and joysticks, they are also useful for modeling and manipulating virtual objects in enterprise workflows. Logitech Muse support is included for those who want a pen-style controller. Finally, Apple has given Personas a makeover: meetings with someone’s avatar no longer require a side trip to the uncanny valley.

In all, VisionOS 26 is a huge leap forward. Now all Apple has to do is make the hardware more comfortable, more affordable, and more face-friendly ahead of Meta, XREAL, Snap, and Samsung.

One More Thing: Farewell Intel

As anyone who had a Mac in the 1990s and 2010s knows, Apple doesn’t support backwards compatibility on old platforms for long (that’s more of a Microsoft thing, and even Microsoft moves on eventually). Apple’s transition from Intel to its own silicon is now complete, so MacOS 26 “Tahoe” will be the last version to support Intel Macs. Apple will provide security updates for another three years, but it is pushing developers and consumers alike to focus on the future. It’s hard to fault Apple here. With the move to Apple Silicon and excellent software translation layers, Apple has actually stuck with Intel longer than its historical average. There will be some pain amongst corporate Mac Pro users and budget-constrained consumers, but technology purchases are never timeless, and today’s M4 Macs are excellent values.

For Techsponential clients, a report is a springboard to personalized discussions and strategic advice. To discuss the implications of this report on your business, product, or investment strategies, contact Techsponential at avi@techsponential.com.