In the “free idea for Apple” spirit of my SwiftScript post from a while back, I want to share a vision for the kind of AI-powered product Apple could incorporate into macOS and iOS: Spotlight Bionic.
Spotlight was pretty revolutionary when it was first introduced back in Mac OS X Tiger. With really good indexing technology, Spotlight made it possible to do full-text search across basically your entire Mac, and it was fast.
Apple has augmented Spotlight with extra functionality over the years, but fundamentally the core Spotlight feature is still pretty rudimentary. You can search your entire Mac for things, but to find anything you have to search for pretty much the exact text that's in the files in question.
Throw a sophisticated LLM-based AI into the mix, though, and that changes the game substantially. You’re still ultimately querying the same dataset (i.e. the data on your Mac), but an LLM lets you query the data like an actual human being.
One of the early ways we could explain the magic of computers was stuff like full-text search. You could be editing a big text file, trying to remember where you used the word “blue,” and because computers are fast, you could find the word almost instantly. Originally this was very inflexible (you needed to search for the exact string), but it was still a big leap forward. As computers got faster, search got more flexible too, allowing you to make “fuzzier” queries that may not match the text exactly but are close enough that the computer can find the match anyway.
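The exact-versus-fuzzy distinction is easy to see with Python's standard library (a toy sketch; the word list is made up):

```python
import difflib

words = ["sky", "blue", "ocean", "bluish"]

# Exact search: the query must be the precise string.
assert "blue" in words
assert "blu" not in words  # a near-miss finds nothing

# Fuzzy search: close-enough strings still hit,
# ranked by how similar they are to the query.
matches = difflib.get_close_matches("blu", words, n=2, cutoff=0.6)
# "blu" is close to both "blue" and "bluish"
```

Even this fuzziness is still string similarity, though; the computer has no idea what “blue” *means*, which is exactly the gap an LLM closes.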
AI brings us to a point where we can do that same kind of searching, but without the “you need to search for the exact thing” restriction. And that’s revolutionary.
That means, for instance, you could ask Spotlight Bionic for all of the emails from your mom where she sent a recipe, even if your mom didn’t specifically ever use the word “recipe” in these messages.
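Under the hood, this kind of semantic search typically works by embedding both the query and the documents as vectors and ranking by similarity. A minimal sketch, with hand-made toy vectors standing in for a real learned embedding model (the numbers and email subjects are invented for illustration):

```python
import math

def cosine(a, b):
    """Cosine similarity: how closely two embedding vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy embeddings; a real system would compute these with an ML model.
emails = {
    "Mom: try this casserole tonight":   [0.9, 0.1, 0.0],
    "Mom: flight lands at 6pm":          [0.0, 0.2, 0.9],
    "Mom: grandma's pie, step by step":  [0.8, 0.3, 0.1],
}
# Embedding of the query "emails from mom with a recipe".
query = [0.95, 0.15, 0.05]

# Rank emails by semantic closeness to the query.
ranked = sorted(emails, key=lambda k: cosine(query, emails[k]), reverse=True)
```

Note that the two cooking emails rank on top even though neither one contains the word “recipe,” which is the whole trick.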
We already have machine learning powered image search in the Photos app, but Spotlight Bionic could supercharge it, letting you ask “what’s that Keynote presentation I made that had the Simpsons meme in it?” (As if I would have just one…)
Traditionally we’ve worked to overcome computers’ rigidity on search by attaching metadata to files, or by putting files in folders with a strong organizational structure.
But that strategy has limits. Maybe as a high school student you keep all your American Lit book reports in a folder of that title, but if you forgot to one time and you don’t remember what you named the file, you might not be able to find it unless you had something concrete to go off of. There’s nothing wrong with that discipline, but AI can help you find stuff without needing it. Maybe it could even help you see relationships between different files on your computer! Imagine how handy it might be if you’re writing an essay on banking regulations, and you know you’ve got some prior work where you wrote about similar topics, and Spotlight can just show you those documents.
AI search gains even more capabilities if it’s trained to understand the relationships that things on your computer have with one another. Maybe you want to search for recipes from your mom, but only the ones where you replied to the email and said you wanted to make that recipe. In theory, a robust enough email app could give you a querying language that could find specific emails, like ones from your mother, and then show you a list of your replies from that result set. But it would be a complicated query, and without AI you’d still need to query for words you know the emails contain, like “recipe.” If you give the AI the ability to see those relationships and map them to human language, then this becomes something a computer can easily automate.
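The structured half of that query is something a computer can already do mechanically; it's the fuzzy half that needs the AI. A sketch of the mechanical part, with an invented in-memory `Email` type and example messages:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Email:
    id: int
    sender: str
    body: str
    in_reply_to: Optional[int] = None  # id of the message this replies to

inbox = [
    Email(1, "mom@example.com", "Here's how I make the casserole..."),
    Email(2, "me@example.com", "I'm definitely making this!", in_reply_to=1),
    Email(3, "mom@example.com", "Flight lands at 6pm."),
]

# The structured half: messages from Mom that I replied to.
from_mom = [e for e in inbox if e.sender == "mom@example.com"]
replied_ids = {e.in_reply_to for e in inbox if e.in_reply_to is not None}
candidates = [e for e in from_mom if e.id in replied_ids]

# The fuzzy half is where the LLM comes in: for each candidate, judge
# "is this a recipe, and does my reply say I want to make it?"
```

The point is that the AI doesn't have to do everything itself; it translates the human question into this kind of structured filtering, then applies judgment where exact matching can't.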
Apple’s really well-positioned to offer this, too. Apple Silicon Macs have hardware specialized for machine learning, so all of this can be indexed on your own Mac instead of in the cloud. That’s a win for privacy, and it’s a win for Apple because they aren’t paying for all the server infrastructure that cloud-side indexing would require.
Thinking of this kind of AI as a search capability is still just scratching the surface of what it unlocks. If you generalize this AI to “a general-purpose way to translate human language into something a computer can execute” (which is essentially what it does when you make a search query; it’s just executing a search), you can start to talk to your computer like a human and have it perform complex tasks for you. Siri goes from a tool that can set timers and give you sports scores to one that can take those recipes from your mom and build a Pages cookbook that you could then edit and send off to be printed as a gift for family members.
At Microsoft’s Build conference this week, while showing off features like this, Satya Nadella said that with the help of AI, every user could become a power user. That’s really the thing that excites and energizes me about adding AI to existing products. If we can make AI good enough (and this is still a big “if”; AI demos today are super rough), we will be living in a world where my specific skill of being a computer power user might become somewhat obsolete, or radically evolve. I’ve always been able to do cool and useful things with computers because I took the extra time to learn how to communicate to my computer what I need, and the computer rewards me with productivity. But soon we might be able to cut out that middleman, and people will just be able to ask their computers and devices for what they need, with the computers doing the heavy lifting of figuring that out and translating it to computer-speak.