Making the most out of WWDC25.
- Om Chachad
- Jun 25
- 8 min read
Updated: Jun 25
This year’s WWDC brought some of the biggest changes to Apple Platforms in years, with a completely redesigned and unified interface across all platforms. The event, although shorter than last year’s WWDC, was packed with significant improvements to both the user experience and the developer experience. It was so jam-packed, in fact, that I couldn’t find the time to record a YouTube video (breaking a five-year tradition); instead, I spent that time watching sessions, attending group labs, exploring the new frameworks, and trying out the new betas.

Here’s a roundup of how I spent my WWDC and how I made the most out of this week, something I didn’t get a chance to do last year.
June 9th: Installing betas and testing.
I installed the betas on my iPhone 11 and on my M2 MacBook (on an external drive) to give the new Liquid Glass UI a shot without risking my primary setup. I planned to skip my iPad, partly because I didn’t have enough storage to upgrade past iPadOS 18.1, and partly because 18.1 was the last OS to support the tweak that let me use Stage Manager on my otherwise unsupported iPad Air 4. To my surprise, however, the iPad Air 4, along with every other iPad running iPadOS 26, now supports the new windowing system AND Stage Manager. Truly a who-would’ve-thought moment.

While I still haven’t gotten a chance to try out iPadOS 26 myself, I’ve heard nothing but great things about it. I wasn’t all that impressed with macOS being demoed on stage, but actually using it on my own laptop was a whole different story: the Mac truly comes to life with the new Liquid Glass UI. I will miss Launchpad and the lovely, detailed macOS icons, though; I guess we lose some and win some. The newly designed Xcode, along with a few other design updates, is going to take a minute to get used to, but these are changes I welcome wholeheartedly and I’m super excited about them. I’ve also been loving the tinted clear glass effect way more than I had initially imagined I would: it looks fantastic.
iOS 26, too, looks great. The way Liquid Glass reacts to your touch is something no video presentation can convey, and there are HDR effects at play throughout the system that make it that much more impressive in person. I like the refreshed icons and refreshed UI across the system, although I’m waiting for some of the key apps to actually be redesigned (and not just recompiled) for the new OS. Safari’s redesign has been such a pleasant change, though. I often remember the iOS 15 Safari UI that never ended up shipping, and I’m glad it’s making an even stronger comeback with the new release, while still maintaining the option for alternate appearances.
I also recompiled OneTap, Linkeeper, and snApp, and was surprised to see how well they automatically adapt to the new Liquid Glass UI—one of the biggest benefits of sticking to native SwiftUI components. Some places in the user interface do require a fundamental rethinking to fit the new design language, though, and that’s going to be the exciting part. Stay tuned for a fresh coat of paint on not just my apps, but also every app you use on your phone later this year.
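For the places that do need rethinking, the new SwiftUI modifier lets custom views opt into the material directly. A minimal sketch, based on the iOS 26 beta APIs (names and availability may still change before release):

```swift
import SwiftUI

// Sketch: a custom floating action bar adopting Liquid Glass.
// Standard components (toolbars, tab bars, etc.) pick this up
// automatically when recompiled; custom chrome opts in explicitly.
struct FloatingActionBar: View {
    var body: some View {
        HStack(spacing: 16) {
            Button("Save", systemImage: "square.and.arrow.down") { }
            Button("Share", systemImage: "square.and.arrow.up") { }
        }
        .padding()
        // .interactive() makes the glass react to touch,
        // matching the system's fluid behavior.
        .glassEffect(.regular.interactive(), in: .capsule)
    }
}
```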
Making sure I write “26” after each version number has been quite a change of habit, though. I remember coming back from my “power off week” break to find everyone on Twitter talking about the rumored “macOS 26”. I almost thought everyone’s fingers had slipped from 16 to 26 (I’m not kidding when I say this was my train of thought 🤣). Anyway, it’s probably a net positive change at the end of the day, but it sure does feel weird to have OS 26 being introduced at WWDC25, and jumping from visionOS 3 to visionOS 26 is perhaps the worst of all.

Also, I had almost forgotten about this: OneTap, an app I’ve built completely myself for a freelance client, was featured on the app wall at WWDC. This was a huge surprise to our team. 🤯
June 10th: Binge-watching Sessions
One of the biggest changes to WWDC this year was the release of all sessions at once on the second day of dub dub. In previous years, we had to wait a few days for access to videos about the topics we were interested in, but having everything available at once lets developers focus on the areas they’re truly excited about, without the wait. I’ve been binge-watching a lot of these sessions, and I still have plenty more to go, but the ones on the philosophy of the new Liquid Glass design were quite interesting. I’ve already cleared a huge number of them from my watch list, but there’s still so much more to explore, from Swift Concurrency to MLX.
June 11th: Foundation Models
Foundation Models were a big thing this year. They’re surprisingly capable and easy to use. The Foundation Models framework (FMF for short) is really well built and allows for a multitude of use cases. I’ve been thoroughly impressed with how easy Apple has made it to implement tool calling and structured outputs, while also giving us access to a relatively performant on-device LLM that can be universally accessed by all your apps. I was pleasantly surprised by the Transcript feature provided by the FMF and the response streaming support that comes for free; so much so that I couldn’t resist building a chat app.
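To give a flavor of why a chat app was so tempting to build, here is a minimal sketch of a streaming session against the first-beta FMF API (the framework is in beta, so exact names and signatures may shift between seeds):

```swift
import FoundationModels

// Sketch: a tiny on-device chat loop with the Foundation Models
// framework. The session keeps a running Transcript for you.
@MainActor
func chat() async throws {
    let session = LanguageModelSession(
        instructions: "You are a friendly, concise assistant."
    )

    // Streamed responses arrive as progressively longer snapshots,
    // which makes a live "typing" effect in a chat UI easy to build.
    let stream = session.streamResponse(to: "Say hi in five words.")
    for try await partial in stream {
        print(partial)   // render the latest snapshot in your UI
    }

    // The full prompt/response history, maintained by the framework.
    print(session.transcript)
}
```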
This is a fun little project I created for now. I want to add more stuff here to make it a nicer experience, and I might open source it very soon. I’m calling it “Pineapple” ;)
I’ve been trying to come up with different use cases for the FMF, such as a personalized onboarding experience. The fact that it’s restricted to Apple Intelligence-compatible devices is unfortunate, but you must check out Rudrank Riyam’s FoundryKit: a truly impressive FMF-like wrapper for MLX that can mitigate this issue.
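Because the model only exists on Apple Intelligence-capable hardware, apps need to check availability before showing any FMF-powered UI. A rough sketch against the beta API (the unavailability reasons are paraphrased from the first seed and may change):

```swift
import FoundationModels

// Sketch: gate AI features on model availability, and fall back
// gracefully on devices without Apple Intelligence.
let model = SystemLanguageModel.default

switch model.availability {
case .available:
    print("Foundation Models ready to use")
case .unavailable(.deviceNotEligible):
    // This is where an MLX-based fallback (e.g. via FoundryKit)
    // could step in on unsupported hardware.
    print("Device can't run the on-device model")
case .unavailable(.appleIntelligenceNotEnabled):
    print("Ask the user to turn on Apple Intelligence")
case .unavailable(.modelNotReady):
    print("Model assets still downloading; try again later")
case .unavailable(let reason):
    print("Unavailable: \(reason)")
}
```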
June 12th: visionOS x HomeKit, Apple Merch.
After a Spatial Call on June 11th with Phil Traut—a fellow Vision Pro enthusiast—I couldn’t resist installing the visionOS 26 beta on my Vision Pro. I had been avoiding installing betas for a while because there is no official support available in India since the Vision Pro isn’t sold here—but the features were too enticing to ignore.
Personas got a HUGE boost in visionOS 26, and they’re frankly even more impressive than they already were. The detail in the facial hair, and the fact that it syncs in real time with my physical lip movements, is a huge improvement. I tried them out on a Spatial Group FaceTime call, and it was nothing short of mind-blowing.
Widgets look really pretty on visionOS, and I wish these had been here from day 1. They also support wall and object occlusion, so they stay in the room you put them in, and you can’t see them once you’re no longer in the room. You can also now lock apps to your walls, and visionOS will relaunch them across Vision Pro reboots, which unlocks so many new possibilities. I was able to bring to life an idea I’d had for the longest time: a HomeKit switch for my AC that lets me control it simply by looking at it. I created a super simple MVP and posted it on Twitter, and it looks like people really liked it:
I want to shout out Sarang Borude—whose posts about visionOS and HomeKit inspired me to create this for my AC.
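For the curious, the core of an MVP like this can be surprisingly small. A rough, hypothetical sketch (not the actual code from my MVP): a SwiftUI control pinned near the appliance that writes the accessory’s power characteristic through HomeKit’s async APIs.

```swift
import SwiftUI
import HomeKit

// Sketch: a visionOS window locked to the wall next to the AC,
// whose toggle flips the accessory's power-state characteristic.
struct ACSwitchView: View {
    // The AC's power-state characteristic, resolved elsewhere
    // from the HMHomeManager / accessory hierarchy.
    let power: HMCharacteristic

    @State private var isOn = false

    var body: some View {
        Toggle("AC", isOn: $isOn)
            .toggleStyle(.button)
            .onChange(of: isOn) { _, newValue in
                Task {
                    // HomeKit's async overlay writes the new value.
                    try? await power.writeValue(newValue)
                }
            }
    }
}
```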
While I didn’t get the chance to be at Apple Park myself, I had someone bring these for me all the way from the Cupertino Apple Park Visitors Center right in time for WWDC week. It was super nice to finally own an Apple T-shirt. Also, I had no idea these cards were a thing.
June 13th: App Intents & Siri Shortcuts
After watching Ayaka Nonaka’s talk on Shortcuts and App Intents, I was rather impressed with the capabilities of the “Use Model” action in the Siri Shortcuts app and the ability to tap into Apple’s Private Cloud Compute model via Shortcuts. The action supports structured output (and INPUT!!) even inside Shortcuts, as well as the ability to pass your own App Entities through these models and have them manipulate or filter your entities based on natural language input. Truly a gem of a hidden feature. And yes, structured output and all these benefits carry over even when you use ChatGPT as the provider, which makes it so much more powerful.
Krish Shah and I hopped on a fun WWDC discussion call, which lasted way longer than we expected after we decided we wanted to try pushing the Shortcuts App and the “Use Model” action to its limits. We created a fun little natural language-based search shortcut which passes in all of the user’s notes into the “Use Model” action and finds the ones relevant to what the user describes with natural language. While we’re trying to find ways to make this even more powerful, we’ve discovered some interesting things:
The Notes action works best for fetching up to 400 notes at once.
You can pass all the notes into the Use Model action and ask it to filter them out, and it’ll return a [filtered] set of notes that link to your existing notes.
The notes that are passed into the Use Model action don’t contain the whole note body—just the title and the summary. (The summary is the two-line preview you see in the Notes app sidebar.)
For large queries like this, selecting the ChatGPT model works the best.
I hope Apple’s claims about OpenAI’s service being private are true—because otherwise I’ve hastily given them unrestricted access to all of my notes. 🙈
June 14th: Group Labs, Binging & Tinkering.
I spent my nights throughout the WWDC week (not just the last day) attending Group Labs. These were new this year, and I found out about them thanks to Ashok Prabhu’s tweet. They were a great way to ask questions and get answers directly from the Apple engineers who built these tools and frameworks. I really enjoyed some of the sessions, especially the ML Frameworks one, which gave us so much insight into the inner workings of the FMF, including its [then unknown] context window of 4096 tokens. Some of the sessions got a little too technical for me, but that’s okay; there was so much to learn!
In case you missed the sessions, @samhenrigold on Twitter posted some really nice AI-summarized recaps and transcripts of some of them. I’d recommend checking those out; if you have questions or doubts about any of these topics, chances are your question was already answered:
I spent the last ‘dub dub day’ watching a few more sessions and poking around in Xcode 26 to find ways to bring my ideas to life, or to enhance my existing apps with what’s new. I’m super excited to finally release the updates later this year.
Your annual reminder to file feedback.
If you’re on the betas, make sure you’re filing feedback. I’ve already reported a few bugs here and there! There’s no better time than now to write feedback for the bugs you’ve seen; each day that passes lowers the chances of your feedback being noticed, because of the sheer volume of reports that accumulates.
Parting words.
I probably missed writing about some of the things I did during this year’s WWDC week; I’m writing this blog post while on vacation at 11:50 PM, haha. I wish I could’ve attended the watch party in Mumbai, but at least I joined in on the live stream for the session that day. At the end of the day, I feel like I’ve made the most out of WWDC this year, as much as I can while attending online anyway, making up for my laziness last year, haha.
There's definitely loads more to explore and learn, and I’m going to keep tinkering and exploring these announcements over the coming weeks and months.
I’m also super excited about attending the WWDC recap in Bangalore and meeting the local developer community on July 4th! So, see you at the Bangalore Apple Developer Centre in a week :D
Until then, maybe you could leave one of my apps 6 out of 5 stars. 🌟😉