Microsoft’s Copilot+ PC is a big deal in several respects.
1. It was a bold decision to use ARM-based chips, departing from Intel. However, this won’t remain an ARM-only story for long, because Intel is releasing its Lunar Lake AI chips in a few months and plans to ship 40 million of them this year (source in comments). This arms race with ARM is good for the industry and for consumers.
2. Microsoft understands that the future of AI is a hybrid model and that many AI applications should move to the edge. Otherwise, we will run out of resources.
3. Microsoft recognizes the benefits of edge computing. Although Microsoft is outperforming the other cloud giants, a cloud-only approach is not sustainable. #thefutureisEdge
4. For reasons of privacy, security, productivity, efficiency, convenience, speed, connectivity, and cost, users want local AI, and Microsoft understands this very well. A key factor is trust.
5. Microsoft is bolstering its position as the leader in AI agents. Trust plays an important role in this dynamic. This is Satya Nadella’s vision: “I think we’re in the very early stages of understanding how our relationship with AI agents should be shaped by us primarily because that’s the only way to build trust. It’s not somebody else’s decision, a vendor’s decision; it’s a personal and maybe a spiritual decision of how we want to interact. That’s why I don’t believe there would be just one AI agent, because I may have multiple agents that I want to delegate different tasks to.”
6. Personal computers should be secure, AI-first, hyper-personalized, and privacy-first. They will become a modality for interacting with AI agents. As Satya says: “The future of the computer is a computer that understands me versus a computer that we have to understand.”
7. Microsoft is incurring billions of dollars in losses from the cost of running Copilot’s inference in the cloud. That’s a big bottleneck. With Copilot+ PCs, that cost shifts to the user in the form of hardware AND electricity. Even if Microsoft gave the hardware away for free, it would still come out ahead.
8. By pushing inference to the edge, onto the AI PC, Microsoft can reserve the cloud for training, freeing up the resources needed for large-scale multimodal training on the path to AGI (Artificial General Intelligence). This may be the most important point.
Well played, Satya and the Microsoft team, Vivek Pradeep! It seems all the stars are aligned with Microsoft’s vision. Still, do not underestimate Intel: its AI hardware is very well positioned.
My prediction: Microsoft and OpenAI will have AGI by the end of 2025. I humbly ask for mercy and for them to slow down! #globalpauseonAGI