Earlier this week, Apple kicked off its annual WWDC with a speedrun keynote, half dedicated to artificial intelligence.
Boy, what a disaster.
In 20 years, I’ve never seen Apple look so strategically lost as it did this week.
There are many ways to interpret what has been said during the event and after, each one readily supporting our own biases.
Below you’ll read a perspective on the strategy that I haven’t read anywhere else (yet).
Who could be interested in such a perspective?
Probably not Apple end users. To them, whether Apple Intelligence sucks or not won’t make much of a difference. They’ll just stop using it.
But Apple shareholders, Apple competitors, and app developers might want to keep reading.
Before we dive in, though, I must disclose my bias on Apple, so you have the right context to frame the perspective. You can skip this part, but I think it’s a critical lens to judge the analysis that will follow.
What I think about Apple
I started my career in the Microsoft ecosystem over 25 years ago. I also used Windows during my teenage years, and that’s the platform I learned to hack best before I entered the professional cybersecurity field.
Over the years, I grew so frustrated with Windows that, as soon as I shifted my career from cybersecurity to virtualization, I switched to macOS for my desktop computer.
I’ve been on macOS for 20 years now, and I own a dozen Apple products, from desktop computers to tablets.
For a brief period of three and a half years, when I joined Gartner, I was forced to use Windows again. I hated every minute of it.
As soon as I left Gartner, I returned to macOS and I didn’t abandon it even during my 10 years in Red Hat.
In the last few months, because of all my research on Stable Diffusion, I’ve once again been pushed to embrace Windows. Since then, with great sadness, I’ve noticed how little Windows has improved over the last 20 years, and how much worse it has become.
Horribly worse.
As a result, I keep my macOS on my laptop and use Windows on my AI workstation only when I have no other choice.
Here are the three reasons why I deeply admire Apple and remain a loyal customer.
First: Apple is an undisputed leader in tasteful design.
Being cursed with some aesthetic sensibility, tasteful design matters immensely to me.
It’s not just about how things look. It’s also about what’s NOT in there.
Second: Apple is an undisputed leader in quality.
The company is not perfect, and its capability to deliver the highest quality is not consistent across all products and services. Nonetheless, I can find more high-quality materials and polish in Apple products than in any other brand I’ve ever used.
Third: Apple is an undisputed leader in removing friction from the user experience.
A core belief of mine, one that has remained constant for the past 25 years, is that companies win substantial market share when they remove friction from their products and services. And I dedicated a decade at Red Hat to advocating for this principle.
To me, the premium Apple asks for its products is the cost of developing tasteful design, relentlessly removing friction, and full dedication to the highest quality.
Given that these three things are top priorities in my life, the premium is fully justified.
If you have different priorities in life, of course, your perception of Apple will be very different.
I also admire the pillars of Apple’s innovation strategy:
- Be a true pioneer in R&D, but deliver only when a product can meet a high standard of quality (for an Apple 1.0 product).
- Enter the market after others, but deliver a much better product/service and UX.
- Relentlessly polish the product/service over the years, incrementally, until it’s perfect.
- Own the entire technology stack and process, so you can optimize and innovate at the integration level, too. Own your destiny.
It takes a huge dose of courage and extreme discipline to stay consistent with these pillars for decades. And I like people who are courageous and deeply focused.
I don’t think Apple is perfect, and I can cite multiple examples where I think the company is disappointing. But given that no company is perfect, all things equal, I’d prefer an Apple product over any other brand in almost every situation.
Now that my position on Apple is fully disclosed, let’s talk about the context in which Apple Intelligence arrived and then, finally, we’ll talk about Apple Intelligence and Non-Apple Intelligence.
Apple Intelligence’s Context
The crucial context we need to keep in mind when we look at what Apple announced at the WWDC can be articulated in three parts:
- Apple’s efforts on Siri.
- Apple’s efforts on Shortcuts.
- Apple’s efforts to position itself as an AI player.
Without this context, we are looking at Apple Intelligence in a vacuum and, blind to the failures of the past, we lack the tools to critically evaluate the potential success of Apple’s strategy.
Let’s start with Siri.
Apple acquired the company that built Siri in April 2010.
Siri was added to iOS in October 2011.
In over 14 years, Siri has not evolved at all, and most people across the globe use Siri as a vocal alarm clock. Barely anything else.
So, when Apple proudly tells us that they serve 1.5 billion Siri requests per day, you can’t help but think that those are, most likely, 1.5 billion requests per day to set a timer or an alarm clock.
I’m not the only one to feel this way:
14 years of R&D for a vocal alarm clock that never made any tangible progress.
This is a massive failure to deliver, even by the standards of Apple’s decades-long commitment to R&D.
Every company deserves a second chance, and every company can learn from its mistakes. But when you fail for 14 years in a row, you have to provide unequivocal evidence that you have learned from your mistakes and that this time is going to be different.
Apple shareholders and app developers should be very cautious in giving Apple Intelligence the benefit of the doubt after 14 years of Siri.
But there’s more to this.
Apple’s position is especially precarious because, in April 2018, it hired Google’s head of search and artificial intelligence, John Giannandrea, to lead its AI strategy.
Not a random Google high-profile researcher. The head of both search and AI. Arguably, the biggest deal in the AI space at that time.
In the last two years, it has finally become obvious that Google might be exceptional at AI research, but it’s terrible when it comes to delivering AI products.
However, to figure this out, we didn’t have to wait for the unmitigated disasters of Google Bard and Gemini. Giannandrea joined Apple 6 years ago and, since then, his leadership has made absolutely no difference in terms of Siri’s capabilities and performance.
In fact, if anything, Siri got worse.
Every single day, including this very week, I find myself shouting at Siri up to three times, in an empty room, a foot away from my iPhone, before it finally activates and sets a timer for my meal.
Whatever Google’s former AI chief did at Apple, his organization managed to make Siri worse at the only use case it was useful for.
This particular failure became overwhelmingly evident a few weeks ago, when OpenAI introduced GPT-4o with an exceptionally expressive voice eerily reminiscent of Scarlett Johansson’s:
Meanwhile, Siri still answers your questions with three links to a web page and a shortcut to Google results.
The gap is embarrassingly evident.
Now.
Over the years, Apple did employ certain types of AI across various aspects of iOS, but always in a very tactical way, and almost always within the resource constraints of on-device computation.
- Photos can recognize faces and an increasing number of objects.
- Apple Keyboard can autocomplete words (but it still lags awfully behind Gboard, years after the latter hit the market).
- Notes can now autocomplete (very, very few short) sentences.
- AirPods can isolate background noise in a supernatural way.
and so on.
All these things are disjointed efforts to use machine learning and other forms of AI in an opportunistic way, wherever there’s a chance to improve the user experience.
But Apple has never developed a plan to deploy AI strategically.
Despite all the R&D and the top Google hiring, the company didn’t discover the building block for the ultimate AI assistant we saw in the movie “Her”. The one thing that could transform the user experience across all Apple products and services.
It took Google Research and, subsequently, OpenAI to show the world that AI assistants like Siri could be so much more than glorified alarm clocks.
And without them, Apple would still be sleeping on the job.
We’ll see why developing an overarching AI strategy is so important in a dedicated section further down.
For now, suffice it to say this: Apple seems to have finally moved more because of external pressure than because of a realization of what might come next.
Another key piece of the context necessary to assess Apple Intelligence is about Apple’s efforts on Shortcuts.
Shortcuts is a (mostly) drag-and-drop tool that allows low-tech and no-tech users to automate tasks on their Apple devices without resorting to actual programming.
Any non-technical user could create a Shortcut that, for example, sends a message to a contact when he/she arrives at a specific location.
It’s a lovely technology with enormous potential that I (obviously) used for years.
Why are we talking about Shortcuts?
During the WWDC, Apple promised that Apple Intelligence would be capable of performing complex actions on the devices with a single command.
You can see it mentioned in this segment of the keynote at 1:09:59:
Now.
To perform the actions that Apple’s AI models decide to perform, those models need a way to execute them. And, today, the way to automatically execute actions on Apple’s operating systems is Shortcuts.
Shortcuts is a technology that comes from a company called Workflow, which Apple acquired in March 2017.
In September 2018, Apple released Shortcuts as an app for iOS 12. A year later, Shortcuts became a stock app in iOS 13.
In the subsequent years, Apple has done very little to improve Shortcuts. A long list of stock apps and operating system features has remained unautomatable for years.
App developers, initially enthusiastic about the technology, have seen so little commitment on Apple’s side that they have made little to no effort to support Shortcuts in their apps.
The app evolved at such a glacial pace, and Apple demonstrated so little interest in it, that they didn’t release a version for macOS until September 2021.
Astonishingly, Apple did not integrate Siri with Shortcuts until September 2022, as part of iOS 16.
So, on one side, the automation engine that Siri should have relied on since day one has seen almost no progress in 7 years. On the other side, that critical integration only happened less than 2 years ago.
That doesn’t seem like a cogent AI strategy developed years in advance and delivered the Apple way.
More importantly, Apple’s disdain for Shortcuts over the years suggests that the company simply doesn’t have the mindset to support the foundation technologies that turn a large language model (LLM) into a large action model (LAM).
Even if Apple Intelligence ends up not using Shortcuts to execute actions, mindsets don’t change overnight.
The current track record should tell Apple shareholders and app developers to be extremely skeptical.
The third piece of context we need to discuss is Apple’s effort to position itself as an AI player.
By now, it should start to become clear that the AI strategy articulated by Apple during the WWDC doesn’t look at all like one developed years in advance.
Apple implied that its long-in-the-making strategy has guided the development of the Apple Silicon chips. You can hear the allusion in this segment of the keynote at 1:11:56:
There are plenty of reasons to be skeptical about this claim.
Much more likely, Apple developed the M series chips because it wanted to enter the gaming market and, probably, develop a console that could rival the next cycle of PlayStation and Xbox.
The characteristics that make chips extremely good for gaming are, incidentally, many of the same ones that make them extremely good for generative AI.
But while Apple has given plenty of hints about its ambitions for the gaming industry over the years, the company offered zero evidence of any intention to dominate the AI space.
Quite the opposite.
There’s plenty of proof that Apple was not serious about AI until very, very recently.
In the last two years, since generative AI exploded with the advent of ChatGPT and Stable Diffusion, Apple has barely made a move to facilitate the use of generative AI software on its hardware.
Apple computers have offered unparalleled performance since the advent of the M1 chip, and yet they are the worst machines on the market for running generative AI workloads.
AI applications run 3x to 10x slower (sometimes even more) on an Apple machine than on a Windows or Linux machine with an NVIDIA GPU.
It’s not for the lack of hardware capabilities. It’s because, for two years, Apple has done absolutely nothing to help software written for NVIDIA machines (where you use an AI development framework called CUDA) run faster on Apple machines.
And because Apple has done absolutely nothing to entice the AI community to write AI software for its own AI development framework (called Metal Performance Shaders or MPS).
And because, in the last few months, Apple has made things even more confusing by releasing a second AI development framework called MLX.
Absurdly, the researchers behind MLX expect that the AI community rewrites every single piece of software written for NVIDIA GPUs so that it can run faster on Apple hardware.
This includes recompiling every model that the AI community produces. It’s an unreasonable request, given that the AI community releases dozens of new AI models per month.
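To make the point concrete, here is a minimal sketch, assuming nothing more than a stock PyTorch install, of how most community AI code picks its hardware today: CUDA is the first-class target, and Apple’s MPS backend is, at best, a fallback. The device-selection logic below is a generic illustration, not taken from any specific project.

```python
import torch

def pick_device() -> torch.device:
    """Pick the fastest available backend, in the order the community optimizes for."""
    if torch.cuda.is_available():           # NVIDIA GPUs: the primary, best-tuned target
        return torch.device("cuda")
    if torch.backends.mps.is_available():   # Apple Silicon via Metal Performance Shaders
        return torch.device("mps")          # works, but some operations may still fall back to the CPU
    return torch.device("cpu")

device = pick_device()
x = torch.randn(1024, 1024, device=device)
y = x @ x  # the same matrix multiplication, but the kernels behind it are far better optimized for CUDA
print(f"Running on: {device}")
```

Nothing in this snippet is MLX: adopting that framework would mean rewriting the model and inference code, not just swapping a device string, which is exactly the effort the AI community has little incentive to make.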
To add insult to injury, the MLX repository is not even under Apple’s official GitHub account, but on an account called ml-explore.
The AI community considers it a non-sanctioned experiment that is not part of Apple’s official AI strategy.
The bottom line is that Apple does not seem to care about supporting AI applications on its hardware, and this attitude has created a perception among practitioners (including app developers) that Apple hardware sucks at AI.
Which is true.
As things stand today, Apple machines are the last place in the world where you’d want to run an AI workload.
I know from first-hand experience, as I used Apple hardware for generative AI work over the last two years, and I had to move to a Windows machine with an NVIDIA GPU out of desperation.
This is not the posture of a company with a well-articulated AI strategy developed years in advance.
Apple Intelligence
What you have read so far is the crucial context to keep in mind as we review the WWDC keynote.
A context that doesn’t fit the short memory and incentives of the tech press and AI influencers. But a context that Apple shareholders and app developers should be keenly aware of.
The tech press and influencers only care about impressions and clicks. There’s no incentive to investigate or to keep a skeptical attitude, which are core skills of a shareholder, a good analyst, or a market observer who simply pays attention.
This is why there’s a stark contrast between the overenthusiastic reactions you have read all week on social media, and this perspective.
But even without the context we spelled out so far, there’s enough in the WWDC keynote to be extremely skeptical about Apple Intelligence.
The most important thing to pay attention to is that Apple Intelligence is split in two.
On one side, Apple Intelligence is about Apple continuing to tactically deploy AI across various aspects of its operating systems, as it has done in the past, but with a renewed focus on generative AI.
We’ll talk about this in a moment.
On the other side, Apple Intelligence is about Apple acting as a broker for third-party AI providers.
We’ll talk about that in the next section: Non-Apple Intelligence.
The first type of Apple Intelligence mentioned during the WWDC is the one that Apple will tactically sprinkle across its operating systems and apps, in line with what it has done in the last few years.
The simplest requests will be executed locally, on the devices that have the appropriate computational resources, and the most complex requests will be executed on specially-designed Apple servers.
Those simplest requests are taken care of by a 3B-parameter model trained by Apple.
The most complex requests are taken care of by a model potentially based on Vicuna 13B, a model released by the AI community and, in turn, based on Meta’s Llama 2, which is constrained by a commercial license that only covers services with fewer than 700M monthly active users.
This second model could be Ferret and its variant Ferret UI, but there’s no confirmation at this time.
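To visualize what such a split looks like in practice, here is a purely illustrative sketch of the on-device/server routing pattern described above. The function names and the complexity heuristic are hypothetical and have nothing to do with Apple’s actual implementation.

```python
def estimate_complexity(prompt: str) -> int:
    """Crude proxy for request complexity: longer prompts need more context and compute."""
    return len(prompt.split())

def run_on_device(prompt: str) -> str:
    # Placeholder for a small local model (on the order of 3B parameters).
    return f"[local model] {prompt[:40]}..."

def run_on_server(prompt: str) -> str:
    # Placeholder for a larger model hosted on remote, purpose-built servers.
    return f"[server model] {prompt[:40]}..."

def handle_request(prompt: str, on_device_limit: int = 50) -> str:
    """Route simple requests to the local model and complex ones to the server."""
    if estimate_complexity(prompt) <= on_device_limit:
        return run_on_device(prompt)
    return run_on_server(prompt)

print(handle_request("Set a timer for ten minutes"))
print(handle_request("Summarize this very long document " * 30))
```

Whatever the real routing looks like, it is invisible to the user: which model answers a given request is decided by the system, not by the person asking.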
The point here is that, despite years of R&D, a top ex-Google hire, and its almost-infinite resources, Apple is depending on research and tech developed by Google and Meta.
Which is the opposite of being in control of the technology stack and process. The opposite of the Apple way.
The most concerning part of the WWDC keynote is that the company doesn’t seem even remotely close to delivering the Siri enhancements its speakers are talking about.
In the segment below (at 1:18:50), pay attention to how many times, deliberately, the speaker uses the expression “Siri will” without ever showing anything.
The key expression in this clip is: “Over the course of the next year”.
To further validate this feeling, just a day after the keynote, we got confirmation that, by the launch of iOS 18 this fall, Apple Intelligence will only be available as a beta feature.
Somebody even suggested that Apple Intelligence might only be accessible via a waiting list.
Is this Apple or one of the dozens of scrappy AI startups that inundate social media every week?
It’s true that Apple has become more open to beta releases over the years but, in this case, it sounds like we might not even see a public release of Apple Intelligence before spring 2025 or even WWDC 2025.
This timeline seems at odds with a massive advertising campaign Apple launched after the WWDC to promote Apple Intelligence.
This ad appears on Instagram non-stop:
Nothing extraordinary about it per se, but why promote Apple Intelligence so heavily when its official release to the general public seems almost a year away? What other reason if not to raise awareness that the company is finally doing something in the space?
Vision Pro was announced early, at a stage where the hardware was less than optimal, to give app developers ample time to create content, which is critical for that platform.
Are we seeing the same approach here, or is Apple just acting under the pressure of showing something, anything, after 14 years of Siri failures?
During the keynote, at 1:39:21, Apple promised developers access to Apple Intelligence via SiriKit and the App Intents framework:
The key expression here is: “Starting with these categories.”
The risk here is that SiriKit and App Intents will go the way of Shortcuts, with Apple showing little commitment to expanding their reach over the years and developers eventually giving up on them.
How Apple invested in Shortcuts is deeply relevant here.
The second most concerning part of the WWDC keynote is that everything we have seen in terms of AI integration with the OS, at least on macOS, is already delivered today by third-party apps like MacGPT, BetterTouchTool, or even Grammarly.
You can see an example at 1:08:30:
Long-time readers of Synthetic Work certainly remember my endorsement of MacGPT and other AI-focused apps by Jordi Bruin. He was one of the first, if not the first, to introduce Apple Intelligence-like capabilities in macOS, almost two years before Apple.
And he asks for just a one-off $25.73 USD (tax included!) for the privilege.
All of this is to say that, whatever Apple is doing, it’s not the game-changer they are making it out to be.
Over time, Apple might decide to integrate Apple Intelligence into iOS in ways that Jordi and other developers will never be able to match. But, as of today, nothing we’ve seen is impressive.
The third most concerning part of the WWDC keynote is that Apple has not even tried to leverage its resources to push forward the state-of-the-art in generative AI already delivered by third parties.
For example, the use of text2image diffusion models to generate custom emoji (called Genmoji) and on-the-fly images in Messages conversations (AI Bitmojis) is subpar compared to what Midjourney, Stable Diffusion, or even OpenAI’s DALL-E 3 can do.
You can see a rather unimpressive example at 1:29:11:
The lack of effort in this area, knowing what’s possible today, is another suggestion that Apple has rushed a plan to embed AI in as many places as possible without a cogent strategy on how all these pieces will fit together.
The argument that some of these efforts are limited due to the decision to perform the AI inference on-device is a weak one.
Apple has never compromised on quality and now, all of a sudden, it’s late to the AI game with a service that is significantly worse than what’s already available on the market.
And all in the name of privacy, knowing full well that on-device inference has a non-negligible impact on the battery life of mobile devices.
If there’s a strategy behind this first portion of Apple Intelligence, it seems to be: “Convenience”.
As in: “Look, we give you the same AI things that have been around for two years, but with the convenience of them being available to you as part of the OS.”
Of course, only a microscopic fraction of the billion-plus users with Apple devices bothers to use Midjourney or install MacGPT. So, these features will be a welcome addition for most users, especially if they come free of charge.
But AI-generated Bitmojis are not enough to justify the jump in Apple’s market capitalization we’ve seen this week.
At least for Apple shareholders, the short-term question ultimately boils down to: “Are more people going to buy 1,000 USD iPhones because they can generate custom Bitmojis and summarize paragraphs in Notes?”
The ultimate irony is that Apple thought it would be impressive to show Siri, now superpowered by generative AI, being used to…ask for the weather!
See by yourself at 1:17:04:
The company will have to do much more to counter the implications of GPT-5 and beyond. And this leads us to the second half of Apple Intelligence. Keep reading.
Non-Apple Intelligence
Apple recognized generative AI as a potentially transformative technology but, unlike every technology of the past, this is not one it can fully control.
Apple can’t afford the reputational damage that Google doesn’t seem to care to avoid.
At the same time, Apple can’t afford to ignore generative AI.
The Apple way would be to spend years doing R&D and deliver generative AI when it’s ready. But the years of AI R&D have already been wasted on Siri.
And so, now, uncharacteristically, you have Tim Cook telling The Washington Post:
TWP: What’s your confidence that Apple Intelligence will not hallucinate?
Cook: It’s not 100%. But I think we have done everything that we know to do, including thinking very deeply about the readiness of the technology in the areas that we’re using it in.
So I am confident it will be very high quality.
But I’d say in all honesty that’s short of 100%. I would never claim that it’s 100%.
This reaffirms the fact that, when it comes to generative AI, Apple doesn’t seem in control, and it doesn’t seem confident it will ever be in control.
And if you are not in control, perhaps the best thing you can do is to be a broker for third-party AI providers who are in control, ready to cut loose whichever one ends up damaging your reputation.
That’s where OpenAI comes in.
People want to use ChatGPT on their Apple devices, and they want to see it capable of controlling the OS and the apps in a seamless way. So, the partnership in itself is good for the Apple user base.
The problem is how the partnership is presented and how Apple will move from there.
Apple proudly announced that OpenAI access will be free, and people across the globe amplified the message as if it were a big deal.
But OpenAI has already made GPT-4o available for free to everyone on May 13, 2024.
The only difference is that, when you use it via Apple, you don’t have to sign up for an OpenAI account.
And, just like OpenAI’s own offer, the usage of GPT-4o on Apple devices will be limited for free users.
If Apple users want to overcome the restrictions associated with their free account, they’ll need to pay the same subscription fee they currently pay to access ChatGPT Plus, and connect their OpenAI account to Apple.
Where’s the difference? Where’s the big deal?
Once again, much of Apple Intelligence’s substance is about something already available on the market.
But there’s much more to consider.
Apparently, neither OpenAI nor Apple paid the other for this partnership.
OpenAI got phenomenal distribution across 2.2 billion devices and the most prestigious endorsement it could get in the IT industry at the moment.
Apple finally got itself the AI assistant it failed to develop for 14 years and, very likely, its usual 30% cut on every OpenAI paid subscription created through its operating systems.
It would be a fine win-win deal between the two best players in their respective fields.
But no.
Apple seems intent on playing the AI broker role with Google, too, and probably others.
In an interview after the WWDC keynote, Craig Federighi, Senior Vice President of Software Engineering, said:
We think ultimately people are going to have a preference perhaps for certain models that they want to use, maybe one that’s great for creative writing or one that they prefer for coding. And so we want to enable users ultimately to bring a model of their choice.
And so we may look forward to doing integrations with different models like Google Gemini in the future. I mean, nothing to announce right now, but that’s our direction.
That’s truly at odds with Apple’s legendary and uncompromising commitment to quality.
If you have read Synthetic Work long enough (or have followed my facepalm updates on X or LinkedIn), you should be aware of the long, long list of embarrassments that have plagued Google’s AI product releases in the last two years.
Why would Apple signal an upcoming partnership with Google when Google showed unforgivable incompetence in AI product development?
Perhaps, Apple doesn’t want to get into a dependency relationship with OpenAI like the one they developed with Google for search. So brokering AI from multiple providers is a way to keep some leverage.
If so, it’s concerning that Apple is seeking leverage by partnering with a company that performs so poorly in terms of AI product development instead of betting on its own R&D capabilities.
And so you see, the same theme comes up again and again, no matter what angle you look at Apple Intelligence from: Apple doesn’t seem in control this time.
In the short term, the most concerning part of Apple’s brokering role is not the leak of sensitive information to third-party AI providers, as some have vocally suggested.
It’s the awkward implementation.
As Apple showed during the WWDC keynote, every request to OpenAI (and future partners) will require explicit user permission.
You can see it in the segment starting at 1:36:21:
That’s the opposite of the Apple way: adding friction to the user experience to the point of making it unbearable.
How many times can users be asked to approve an interaction with the AI before they simply stop using it?
The implementation looks even more awkward when compared to the increasingly seamless experience offered by the OpenAI app on iOS.
ChatGPT for iOS recently gained the capability to listen for user requests in the background without keeping the app open. Which is fantastic and works flawlessly.
It’s day and night compared to the experience Apple showed during the WWDC keynote.
This is an extremely important point to consider.
As long as Apple gatekeeps the integration with the OS, its implementation of an AI assistant will remain superior to any third-party alternative out there.
For the same reasons, though, if Apple botches the implementation of its AI assistant, it will ruin the entire category of products for years to come: it won’t do a good job with its own AI assistant, and it won’t allow third-party AI assistants a fair chance to do better.
Here, one might speculate about how long it will take for Sam Altman to start fantasizing about an Android phone (or a resurrection of Windows Phone) with unprecedented integration with OpenAI models.
He probably started thinking about this long ago.
To follow up on the Apple shareholder question we asked earlier: “Are more people going to buy 1,000 USD OpenAI phones because they can have an AI assistant like in the movie Her?”
So, What’s Apple’s AI Strategy?
If you watch this interview of Tim Cook with Marques Brownlee, it’s hard to come away confident there’s one. Apple is always tight-lipped about its plans, but here Cook seems unprepared:
The message seems to be: “People were expecting us to do something with AI, so we did something.”
And that, if correct, is a dramatic departure from the pillars of Apple’s innovation strategy I listed at the beginning of this editorial.
But let’s assume that it’s not correct and that, as often happens, we are underestimating Apple’s cleverness and foresight.
Let’s assume that Apple has a clear AI strategy developed years in advance.
Let’s assume that Apple has very deliberately chosen, as the best path forward, to be a broker for third-party AI providers, as we described in the previous section.
The problem is that such a strategy doesn’t seem adequate to dominate a market where a truly human-level intelligent AI assistant might permanently alter the user interaction with their smartphones.
Let me explain.
Just like brokering third-party search engines in Safari is not a search strategy, brokering third-party AI assistants is not an AI strategy.
The moat of a brokering approach doesn’t lie in the technology itself, but in the distribution reach that Apple commands.
Yet, at least two other companies have similar distribution reach on Apple platforms: Google and Facebook. And we have seen both building the foundations for their AI assistants over the last couple of years.
Short of blocking user access to either ecosystem, Apple’s strategy as an AI broker has plenty of competition.
To understand that competition better, we have to imagine a future where AI assistants do actions on our behalf and our need for direct interaction with apps becomes less and less frequent and pressing.
If we can truly transition from an app-oriented interaction to a task-oriented interaction, then the apps don’t have to reside on the smartphone anymore. They can be anywhere.
And if they can be anywhere, they can live in Google or Meta datacenters, or somewhere else, and be accessed directly and transparently by Google or Meta AI assistants.
In this scenario, WhatsApp (powered by Llama) or Google Assistant (powered by Gemini) becomes the gateway to access apps, and smartphones slowly turn into dumb pipes.
That’s the scenario I’d expect Apple shareholders to be concerned about and the one that Apple’s AI strategy should be designed to counter.
Of course, an optimistic Apple shareholder might expect this to be a stopgap measure until Apple can develop its own state-of-the-art AI assistant.
In that scenario, Apple could count on both the distribution reach and the technology integration to allow its AI assistant access to places that Google and Facebook AI assistants cannot reach.
But:
- Apple already failed to do so for 14 years with Siri. There’s no sustainable argument for why this time would be different.
- Apple never developed its own search engine to replace the ones it’s brokering at a very high cost every year. There’s no sustainable argument for why it would act differently when it comes to AI.
In other words, historically, the more Apple deviates from its core competencies, the more it behaves in ways that seem counterintuitive.
No company is good at everything. A company’s DNA is defined by the people it has deliberately hired over decades, and it cannot be changed overnight, even with the support of top leadership.
Sometimes, a certain type of mindset and attitude is so deeply ingrained in a company, because of the characteristics of its workforce, that it’s impossible to change it.
So, perhaps, Apple tried to develop a search engine and failed because it didn’t hire/attract/retain the right talent for the job. And, perhaps, it tried to develop a state-of-the-art AI assistant and failed for the same reason.
Perhaps, these things are not within Apple’s core competencies, and what we saw this week is the best plan it could come up with in their absence.
Given this doubt, I’d look at the long-term future with much apprehension. An apprehension I never had about Apple before.
Will Apple Intelligence Help Us in the Workplace?
As a final question: will Apple Intelligence enable synthetic work?
We already know that we cannot count on Apple for office productivity.
It’s clear that the company has no interest in entering that market in a serious way and it’s happy to leave the opportunity to Microsoft and Google.
But generative AI is not just about productivity apps like word processors and spreadsheets.
If you are following the evolution of Adobe Creative Cloud, you know that generative AI is transforming Photoshop and Premiere, changing the way the creative industry works. And the creative industry is a key market for Apple.
So, a more precise question perhaps is: can we count on Apple Intelligence to boost our productivity in creative tasks?
What we’ve seen this week suggests that Apple isn’t interested in this either.
Sure. They might sprinkle some Apple Intelligence into Final Cut Pro and Logic Pro, but nothing we have heard so far suggests an overarching strategy like, for example, an omni-capable AI assistant living at the OS level that can help us with creative tasks whether we use Photoshop, Pixelmator Pro, DaVinci Resolve, or Premiere.
We only saw multi-million-dollar AI models being used to ask Siri about tomorrow’s weather.
At the end of the day, after all of the things we have said, it seems clear that the one company that has a chance to win it all with AI is not Apple, and not Google, but Microsoft.
And I bet that, in Redmond, the remorse for killing the Windows Phone is eating the leadership alive.
If Microsoft could ever stop focusing on short-term gains and start focusing on building a long-term legacy, it could see a new golden age.