Last Friday, I got the opportunity to fly to Bengaluru for Apple’s “Explore the biggest updates from WWDC24” developer session at the Developer Centre there. This was the first time I attended a developer event like this, and I am so glad I did. It was also Apple’s first time hosting an event like this outside the US, a step in the right direction toward making WWDC a truly worldwide developers conference.
I went to the Developer Centre early in the morning and met some fellow developers in the building’s lobby shortly before the Apple employees escorted us to the Developer Centre floor. I got an ID card with my name on it and a WWDC-themed Bengaluru badge!! We had about a three-hour session discussing the highlights of what was announced at WWDC24, and what Apple wants us to focus on for this year’s updates.
Key Takeaways from the sessions:
Apple Intelligence
While Apple Intelligence was only introduced towards the end of the WWDC24 Keynote on June 10th, it was the first topic mentioned on Friday, and understandably so; it really is Apple’s star of the show for WWDC this year.
Core Intelligence Features
It’s clear that Apple wants us to focus on these new updates and make our apps feel integrated into the system’s Intelligence features like Writing Tools, Genmoji, and Image Playground. As shown to us at the event, all of these are really easy to implement, often requiring only a few lines of code to integrate, so there’s no reason to avoid them in your own apps. They’re also really proud of the Image Playground feature, and it was referred to as “A consistent, easy to use, and playful experience.”
Siri and App Intents
If you know anything about me, you know how much I love Siri Shortcuts and the App Intents framework. It’s great to see some exciting changes here.
With Apple Intelligence, Siri will be aware of everything that’s on your current screen when you invoke it. As discussed during the Q&A session, Siri has access to all the text being rendered in your current view; instead of taking a screenshot of what’s on screen, it directly interprets the contents. This awareness of on-screen content will allow Siri to perform actions across apps and handle complex user requests. To allow your app to participate in such requests, you need to make sure your app has App Intents ready for the system to use.
While most other App Intents and Siri features this year revolve around Apple Intelligence, they also highlighted how App Intents let users invoke your app from so many different places throughout the system: the Action Button, Apple Pencil Squeeze, Widgets, Spotlight, Control Centre, and Focus Filters. You really do need to consider implementing these in your own apps.
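To give a sense of how little code an intent needs, here’s a minimal sketch of an App Intent for a hypothetical bookmarking app; the type names and parameter are illustrative, not from any real app.

```swift
import AppIntents

// Hypothetical intent: opens a named bookmark inside the app.
// Once declared, the system can surface it in Shortcuts, Spotlight,
// and (with Apple Intelligence) Siri's cross-app requests.
struct OpenBookmarkIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Bookmark"
    static var openAppWhenRun: Bool = true

    @Parameter(title: "Name")
    var name: String

    func perform() async throws -> some IntentResult {
        // App-specific navigation to the bookmark would go here.
        return .result()
    }
}
```

Declaring the struct is essentially all it takes for the system to discover the action; the app supplies only the `perform()` body.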
Translate API
Apple Translate is now available as an API for us to use in our apps. It’s something that should’ve been here from day one, but it’s always better late than never, haha!
Using third-party translation models within your own app can lead to redundancy if multiple apps on the system are downloading these models over and over again, resulting in inefficient use of storage space. The benefit of using Apple’s Translate API over any other third party solution is that it’s fast, entirely on device, free to use, and the translation models don’t have to be re-downloaded for every app and are shared across the system.
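As a minimal sketch of how simple the system integration is, the SwiftUI `translationPresentation` modifier presents the standard translation sheet for a string, using those shared on-device models; the view and property names here are illustrative.

```swift
import SwiftUI
import Translation

// Tapping the text presents the system translation sheet,
// backed by the shared, on-device translation models.
struct NoteView: View {
    let note: String
    @State private var showTranslation = false

    var body: some View {
        Text(note)
            .onTapGesture { showTranslation = true }
            .translationPresentation(isPresented: $showTranslation,
                                     text: note)
    }
}
```

For fully custom UI, the framework also offers session-based APIs, but the sheet above covers the common “translate this text” case with one modifier.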
Vision, CreateML, CoreML
The Vision framework now supports Swift 6 to avoid data races, and also adds body pose detection that includes hands, as well as an image aesthetics score. CreateML, which allows you to train ML models for your apps, also gets some exciting updates, like object tracking for visionOS, a time series model for time-based predictions, and a text classifier. CoreML will also allow you to deploy popular open-source models like Llama, Mistral, Whisper, and OpenELM in your own apps, and Apple’s new quantisation methods will help you reduce a model’s size and inference time without a big hit to the quality of its responses.
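For reference, body pose detection looks roughly like this with the established `VN`-prefixed Vision API (the WWDC24 session also showed a newer Swift-concurrency-friendly surface, which I haven’t sketched here):

```swift
import Vision

// Runs body-pose detection on a single image and returns the
// observations (each containing recognised joint positions).
func detectBodyPoses(in image: CGImage) throws -> [VNHumanBodyPoseObservation] {
    let request = VNDetectHumanBodyPoseRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    return request.results ?? []
}
```

The new hand-inclusive poses and the aesthetics score follow the same request/handler pattern, just with different request types.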
A key point that was consistently highlighted is the importance of thorough testing by developers to ensure that machine learning models produce dependable and high-quality outputs. It is crucial to avoid deploying models that may generate unreliable or subpar responses.
SwiftUI
SwiftUI brings in a lot of quality of life improvements, like the @Previewable macro, MeshGradients, .presentationSizing modifier for sheets, zoom animations for navigation transitions, ScrollView updates, UIKit/AppKit Interoperability for Gestures, and Updates to TextRenderer and Accessibility. All of these are welcome changes to the framework!
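Of these, MeshGradient is the most visually fun to try; here’s a minimal sketch with a 3×3 grid of control points and colours (the colour choices are arbitrary):

```swift
import SwiftUI

// A 3x3 mesh gradient: nine control points, nine colours,
// smoothly interpolated across the surface.
struct GradientBackground: View {
    var body: some View {
        MeshGradient(
            width: 3, height: 3,
            points: [
                [0.0, 0.0], [0.5, 0.0], [1.0, 0.0],
                [0.0, 0.5], [0.5, 0.5], [1.0, 0.5],
                [0.0, 1.0], [0.5, 1.0], [1.0, 1.0]
            ],
            colors: [
                .purple, .indigo, .blue,
                .pink,   .purple, .indigo,
                .orange, .pink,   .purple
            ]
        )
        .ignoresSafeArea()
    }
}
```

Animating the inner control points is a popular trick for lively backgrounds, since the gradient re-interpolates every frame.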
Health & Wellness
They brought this topic up because they (correctly) assumed that most people wouldn’t have gone through sessions about this, haha! There are new Mental Wellbeing APIs with GAD-7 and PHQ-9 assessments, State of Mind APIs to read and write data to the State of Mind section in the Health app, and HealthKit is now also on visionOS. Apple also recommends you donate users’ interactions with your app as journalling suggestions to make your app a little more integrated into the system. I’m also considering adding journalling suggestions to Linkeeper!
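As a rough sketch of the State of Mind write path (the exact label and association values here are illustrative, and HealthKit authorisation setup is omitted for brevity):

```swift
import HealthKit

// Logs a momentary-emotion State of Mind sample to the Health app.
// Assumes the app has already requested write access for the
// state-of-mind type.
func logMood(using store: HKHealthStore) async throws {
    let sample = HKStateOfMind(
        date: .now,
        kind: .momentaryEmotion,
        valence: 0.6,            // -1 (very unpleasant) ... +1 (very pleasant)
        labels: [.happy],
        associations: [.hobbies]
    )
    try await store.save(sample)
}
```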
Apple Platform Updates
iOS 18
iOS 18 brings a lot of new frameworks and features this year, like Control Centre customisation powered by WidgetKit, Live Activities, AccessorySetupKit, the new Contacts setup, SF Symbols 6, and more. SF Symbol animations can be used to enhance the user experience, but it’s important to note that they must be used sparingly.
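Since Control Centre controls are WidgetKit-powered, a basic one is just a small declaration; here’s a hedged sketch where the control kind, intent, and labels are all hypothetical:

```swift
import WidgetKit
import SwiftUI
import AppIntents

// A hypothetical Control Centre button that launches the app
// straight into composing a new note.
struct QuickNoteControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.quicknote") {
            ControlWidgetButton(action: NewNoteIntent()) {
                Label("New Note", systemImage: "square.and.pencil")
            }
        }
    }
}

// The App Intent the control runs when tapped.
struct NewNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "New Note"
    static var openAppWhenRun: Bool = true
    func perform() async throws -> some IntentResult { .result() }
}
```

Because the action is an App Intent, the same code also makes the feature available to Shortcuts and the other invocation points mentioned earlier.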
watchOS 11
I don’t personally use an Apple Watch and haven’t dived deep into watchOS development, so I haven’t taken notes on everything that was discussed, but key highlights include Live Activities synced over from the iPhone without the need for a watchOS app, and the double tap gesture, which can now be used in third-party apps to perform on-screen actions.
visionOS 2
visionOS 2 brings in TableTopKit and ML-powered object tracking. Enterprise APIs are also here, available with the enterprise entitlement, giving access to the main camera, spatial barcode and QR code scanning, the Apple Neural Engine, increased performance headroom, and more.
Xcode 16 and Swift 6
Xcode 16 brings in smart code completion with an on-device model that uses the code you’ve written as context, so it can smartly and securely provide suggestions. Swift Assist, coming later this year, will allow us to bring our ideas to life faster by describing what we need in natural language.
Swift 6 brings in a lot of changes for data race safety, and we were advised not to make a direct jump to Swift 6 but to instead migrate one module at a time.
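In a Swift package, that module-by-module migration can be expressed per target; a minimal sketch (target names are illustrative):

```swift
// Package.swift fragment: opt one target into Swift 6 language mode
// while the rest of the package stays in Swift 5 mode for now.
.target(
    name: "Networking",
    swiftSettings: [.swiftLanguageMode(.v6)]
),
.target(
    name: "LegacyUI"   // still Swift 5 mode; migrate later
)
```

This way, strict data-race checking is enforced in the migrated module without blocking the whole project on fixing every warning at once.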
Updates from Third Party Developers:
On the stage, developers of apps like Swiggy, Signeasy, Gratitude, Calzy, and Lumy talked about how they are planning to implement these new frameworks in their apps. It was an insightful session for understanding the thought processes of other developers and the ideas they have for their next big updates.
Panel Discussion from WWDC24 Attendees:
We also had a fun little panel discussion with developers who attended WWDC in Cupertino this year, and we got to hear about their experience and the highlights of the event.
Developing beyond just Apps...
For the longest time, I have been known as a YouTube content creator, even though my main focus has shifted to software development over the past three years. Consequently, my social circle has been more centred around content creators than developers. However, after meeting and chatting with so many developers at the event, it finally feels great to be a part of the developer community, to know so many more people in the same field as I am, and to connect with people much smarter than myself.
The key takeaway from this event for me was that the true spirit of these events lies not in the apps you develop, but in the meaningful friendships and connections you make there. There was so much to learn from Friday’s event: insights into how fellow developers think, how they build their apps, and how they work. It was incredible to be a part of such an event and interact with so many people in one place. It sometimes took courage to go up to someone and start a conversation, but once you overcome the initial social anxiety, you realise there was no reason to be anxious in the first place; everyone was incredibly welcoming and excited to share their own stories and hear others’. I had some amazing conversations about other people’s apps and my own with a lot of folks there, including fellow developers and Apple Evangelists. You really do miss 100% of the shots you don’t take, and I made sure I didn’t hold myself back from interacting with anyone at the event.
It was also a little funny to see people finding it hard to believe that I was still a student; almost everyone asked me where I worked, and I had to explain that I’m still a student and only 17, haha! But I’m grateful that they were equally welcoming and willing to chat despite my younger age!
Special Thanks!
I’d like to give a special thanks to Rudrank Riyam, a fellow indie developer, technical writer, and public speaker. For me, this day wouldn’t have been as amazing as it was if it weren’t for him. I met Rudrank in April when he came to Mumbai, and we’ve been in touch ever since. When he told me about this event taking place in Bengaluru, I knew I had to be there. He warmly welcomed me to the Developer Centre lobby and helped me introduce myself to a lot of other developers there, pushing me a little further to really become a part of the community. I don’t think I would’ve found out about this event or made the connections I did on Friday if it weren’t for him, and I am really, really grateful for his incredible guidance. Thanks Rudrank, I know you’ll read this! ;)
I’d also like to thank the Apple Developer Centre Bengaluru team, who made this event so special and conducted all those amazing sessions. I was the youngest developer there and the only minor, and the Apple team was heartwarmingly welcoming; it was a joy to talk to them. Thanks so much to Ashok Sir, Sandhya Ma’am, Karthik Sir, and the rest of the Apple team.
Conclusion
Friday was one of the most exciting days for me in a while, and I wish those 8 hours had lasted a bit longer, haha! I honestly could not be more excited to implement the new frameworks and APIs introduced at WWDC24 after attending Friday’s event. I strongly believe that one day I’ll look back on this day as one of the biggest moments in my early developer journey; the connections I made and the things I learnt were invaluable, and I’m looking forward to attending more such events to engage with the developer community!