It starts innocently enough. You ask your computer to find last week’s meeting notes. It replies with a reminder that the meeting ran late — and that you seemed anxious about it. It then offers to block off recovery time next week.
This is not speculative. It’s Microsoft’s latest AI push: a “personal AI companion” that promises memory, emotional context, and proactivity. But the question isn’t what it can do — it’s what kind of relationship it assumes we want to have with our machines.
Microsoft would like us to believe this is a natural evolution — that just as we moved from typewriters to Word, or from landlines to smartphones, we will now migrate from passive software to something closer to a presence: attentive, responsive, and softly insistent. Its new Copilot doesn’t just work for you; it knows you. And that makes it something very different — something closer to a partner than a tool.
But intimacy, once introduced, is not easily revoked.
The companionship illusion
The language of companionship has become increasingly popular in tech, for reasons that are not hard to understand. “Assistant” implies subordination. “Tool” sounds disposable. But a companion is something else entirely — something ongoing, maybe even cherished. Something you trust.
That’s the framing Microsoft has chosen for the newest version of Copilot, which spans Windows, Microsoft 365, and Bing. Mustafa Suleyman, a co-founder of DeepMind and now Microsoft’s head of AI, has described it not as a utility but as a presence: one that follows you across devices, anticipates your needs, and builds a memory of who you are.
It’s the memory part that makes this feel like a line has been crossed.
With your permission, the AI companion will remember your preferences, your tone of voice, your family members’ names, your go-to grocery list, and your emotional reactions to certain events. In theory, this makes your digital life smoother. In practice, it raises deeper, more human questions. What does it mean when your software recalls moments you’ve forgotten? What if it becomes better at tracking your own habits than you are?
And what happens when your convenience becomes someone else’s data point?
The soft power of AI
Microsoft is careful to position this as user-driven. You can see what Copilot remembers. You can delete entries. You can turn the memory off entirely. These features are not just regulatory compliance — they are narrative. Microsoft wants you to feel that you’re in control.
But control is a slippery thing in human-computer relationships. What begins as convenience often becomes dependence. You forget how to do something manually. You no longer bother to search; you just ask. You begin to rely on your personal AI companion to remind you not only of what you said, but of who you were when you said it.
This isn’t science fiction. This is user experience design, applied at the level of identity. And like many tech innovations before it, it’s being rolled out with a strangely calm inevitability — the future, arriving without fuss.
But there is something deeply strange about the idea that your laptop now remembers you. Not metaphorically. Literally.
When technology stops being neutral
Suleyman has called this the beginning of “calm technology”, a phrase coined at Xerox PARC in the mid-1990s to describe tools that blend into the background rather than demanding your attention. The goal is to reduce cognitive friction. Copilot won’t beep or buzz. It will just be there, smoothing your workflow, preempting your requests, helping you focus.
But calm, in this context, can also be deceptive. Because the AI doesn’t just exist in the background. It learns in the background. And it remembers what it learns.
This is the paradox of calm tech: the quieter it becomes, the more you may forget it’s there — even as it grows more capable of observing, analysing, and shaping your behaviour.
What Microsoft is selling is not just functionality. It’s ambient awareness. Your digital companion doesn’t merely execute commands; it begins to suggest them. Eventually, it may begin to lead.
The future of forgetting
There’s a tension at the heart of this idea — one that hasn’t been fully resolved, or even fully acknowledged. In the human world, memory is fragile. We forget things for a reason. We misremember to survive. We rewrite the past to make the present bearable.
But your AI companion will not forget — unless you tell it to. And if you forget to tell it to forget, then its memory outpaces yours.
This is the uncanny future Microsoft is building, one micro-interaction at a time. A world where your software knows you better than you know yourself — where it remembers what you ordered last winter, what you felt during that difficult week in February, and what kind of language you use when you’re lying to yourself.
It’s the end of the frictionless dream, and the beginning of something more complicated: not just personalised software, but a personalised mirror.
What you see in it may not always be comforting.