How Apple Intelligence can take over the world (or just the Apple ecosystem)

If you didn’t notice, Apple Intelligence is here. But it would be hard not to notice: Apple is marketing its new collection of AI features everywhere it possibly can, from television ads to its website to every single product announcement it’s made in the last two months. (Each video announcement Apple released last week, for the iMac, Mac mini, and MacBook Pro, featured an original, extended Apple Intelligence segment.)

But to say that Apple has gone all-in on Apple Intelligence wouldn’t be quite true. Yes, the Big Three are covered: iPhone, iPad, and Mac. But Apple makes many more devices than just those three! This year, understandably, the company is going to be focused on getting as many AI features up and running on the Big Three as it can. But sometime soon, probably next year, Apple is going to need to roll out a strategy regarding everything else in its product line-up.

How’s it going to manage that?

Apple TV

The current Apple TV is powered by the A15 Bionic chip and 3 or 4GB of RAM. That won’t do at all, but it’s not unreasonable to imagine that a future Apple TV could be upgraded to use either the A17 Pro chip found in the iPad mini or the A18 chip found in the iPhone 16 and 16 Plus. And while getting the RAM up to 8GB will add cost, Apple seems to have finally accepted that in the Apple Intelligence era, all our devices will need more memory.

So it seems fairly easy for Apple to build an Apple Intelligence-capable Apple TV box. The question is… what does that mean?

Clearly, an improved Siri will be a winner on Apple TV. Siri in iOS 18 is still a work in progress, with an improved understanding of verbal commands but not much better in terms of results. Still, Apple says that those improvements are coming between now and the middle of next year. Every Apple TV ships with a remote with a Siri button; it would be gratifying to be able to more easily navigate the thing with a faster, more responsive, and more intelligent Siri.

One of Siri’s forthcoming new features will be the ability to see what’s on-screen and act upon it, which might be a great way to let you ask questions like “What movies has that guy in the hat been in?” and actually get a good answer. And, of course, if Siri’s forthcoming ability to control individual apps were to be applied to the Apple TV, it might be easier to command apps to open and play the right episode of the show you’re watching.
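Apple’s published mechanism for letting Siri drive apps is the App Intents framework, which is already available on tvOS. As a rough illustration only (the intent, its parameters, and the dialog below are hypothetical stand-ins, not an actual Apple or third-party API), a streaming app might expose something like this for a smarter Siri to call:

```swift
import AppIntents

// Hypothetical sketch: the kind of App Intent a tvOS streaming app could expose
// so Siri can act on "play the next episode of <show>" directly. Nothing here
// is a real app's intent; the names and behavior are illustrative only.
struct PlayEpisodeIntent: AppIntent {
    static var title: LocalizedStringResource = "Play Episode"

    @Parameter(title: "Show")
    var showName: String

    @Parameter(title: "Episode Number")
    var episodeNumber: Int

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would resolve the episode in its catalog and hand it to its
        // player here; this sketch just confirms the request back to the user.
        return .result(dialog: "Playing episode \(episodeNumber) of \(showName).")
    }
}
```

Siri would fill in the parameters from the spoken request; the app only has to describe what it can do and carry out the action.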

Apple TV also comes with a version of the Photos app, which hasn’t really been updated in the past year. But if it were to gain some of the search and video-generation powers of the Photos app on the Big Three, you could generate video slideshows from your photo libraries right from your sofa by talking into the remote. Or, at the very least, see instant results to photo search queries.

I’m not a huge fan of notifications on the Apple TV, but I do use them occasionally. I think it would be cool for Apple Intelligence to be aware of alerts happening on your other devices and use the new Reduce Interruptions Focus to decide whether they’re worth showing to you while you’re watching TV.

Finally, the Apple TV hardware is also a hub at the center of Apple’s smart home strategy. I could see a souped-up Apple TV using Apple Intelligence to be a better coordinator of the devices in your home. Maybe we could build entire home automations via voice command? It’s worth considering.
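To ground that speculation a bit: HomeKit already gives apps the hooks a smarter coordinator would need, such as finding a named scene and running it. Here’s a minimal sketch using today’s HomeKit APIs; the scene name, and the idea that an Apple Intelligence layer would pick it based on a spoken request, are assumptions on my part:

```swift
import HomeKit

// Sketch of the plumbing a smarter hub could sit on top of: HomeKit already
// lets an app find a named scene (action set) and execute it. The hypothetical
// Apple Intelligence layer would be the part that maps a request like
// "wind the house down for the night" onto the right scene.
final class SceneRunner: NSObject, HMHomeManagerDelegate {
    private let manager = HMHomeManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    // Called once HomeKit has loaded the homes this device can see.
    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        runScene(named: "Good Night") // assumed scene name, for illustration
    }

    private func runScene(named name: String) {
        guard let home = manager.homes.first,
              let scene = home.actionSets.first(where: { $0.name == name }) else {
            print("No scene named \(name) found.")
            return
        }
        home.executeActionSet(scene) { error in
            print(error.map { "Failed: \($0.localizedDescription)" } ?? "Ran \(name).")
        }
    }
}
```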

Apple Watch

It’s harder to imagine that the Apple Watch will get the horsepower to run the full version of Apple Intelligence anytime soon. It just requires too much processor power and memory, and the little Apple Watch and its tiny battery aren’t going to be capable of that for a while.

But that’s fine. We should want the Apple Watch and watchOS to become better clients for Apple Intelligence running on the iPhone connected to the watch. Last year, Apple upgraded the Apple Watch to process Siri commands on the device, which was a huge boost. The next step is to update watchOS to determine whether a request would be better served by Apple Intelligence and then pass it off to a properly smart device, whether that’s the iPhone or even an Apple server.

What’s the use case for Apple Intelligence on Apple Watch? Really, I think it’s all Siri-based, namely the ability to get better responses back on your wrist. But I’m also interested in the idea that as our iPhones get smarter, the Apple Watch could tap into the power of individual iPhone apps without having to run those apps itself. Imagine asking your Apple Watch to grab some information from an iPhone app, or even orchestrate two apps together, to bring you a result. If you could leave your phone in your pocket, or even leave it behind at home, and have that still work, that would be pretty awesome.

HomePods and beyond

As with the Apple TV, I can imagine Apple building a new generation of HomePods that are properly equipped for Apple Intelligence, as well as additional HomePod-like products, such as the rumored HomePod with a screen. Again, these are products that are driven by Siri, so making Siri much more intelligent and giving it the power to summarize information found on the web would make things a whole lot better. A device that knows it’s just a speaker with no display, yet can reply with an answer that requires summarizing and interpreting web content (including images), would be a big upgrade over the fairly dumb HomePods of today.

Apple Intelligence also needs to be able to build good Apple Music playlists on the fly, right? Not just on the HomePod, but everywhere. It should be smart enough that I can use my voice to rearrange playlists, remove items from the queue, and more.
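For what it’s worth, most of the plumbing for this already exists in MusicKit; what’s missing is the intelligence that turns a spoken request into the right query. A minimal sketch, assuming a hypothetical query string handed down from such a layer:

```swift
import MusicKit

// Sketch of what a voice-built playlist could compile down to with today's
// MusicKit: search the catalog, then queue the results. The query string is
// a stand-in for whatever an Apple Intelligence layer would derive from a
// spoken request; authorization and error handling are kept minimal.
func queueSongs(matching query: String) async throws {
    // Ask the user for Apple Music access before touching the catalog.
    guard await MusicAuthorization.request() == .authorized else { return }

    var request = MusicCatalogSearchRequest(term: query, types: [Song.self])
    request.limit = 10
    let response = try await request.response()

    // Replace the current queue with the search results and start playback.
    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(for: response.songs)
    try await player.play()
}
```

Rearranging a playlist or pruning the queue by voice would be the same idea with different MusicKit calls; the hard part is the natural-language layer on top, not the playback plumbing underneath.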

And as with these other devices, HomePods need to become better stewards of my entire device ecosystem. If I’ve got an important alert on an iPhone, my devices should be able to understand that the iPhone’s not with me and that I’m listening to something in the kitchen, and let me know that the important alert has come in. I know, I know: keeping data processing on my own devices is part of Apple Intelligence’s appeal. But once the processing happens, I’d like my devices to talk to each other and do the right thing to get important information to me, wherever I am.

AirPods and the future

Finally, AirPods! They’re tiny, I know, and unlikely to be anything but a recipient of Siri conversations from other devices. And yet… some reports suggest Apple might be building cameras into future AirPods models. One possibility is that AirPods could become the eyes for an iPhone that’s otherwise tucked away in your pocket, feeding images to Apple Intelligence for a version of the new Visual Intelligence feature premiering with iOS 18.2. Even better, if Apple made a pair of glasses with cameras and built-in AirPods in the style of Meta’s Ray-Ban specs, your iPhone could also use those as its eyes and ears when it’s in your pocket.

The future of Apple Intelligence is mostly unwritten. There’s a lot more for Apple to do just with the Big Three product lines. But it can’t leave its smaller devices and accessories behind. They will be important tools to feed data to Apple Intelligence and extend the intelligence of Apple’s platforms to the rest of our lives. I hope we’ll get a first sense of that future sometime next year after Apple’s built a stronger foundation with all of its forthcoming iOS 18 and macOS 15 releases.
