We write like an AI even when we’re not using AI to write.

blog@dws.team
December 16, 2025

Already, there’s a marked change in writing style, influenced by AI and autocomplete.

I use Apple Notes to write the initial versions of my posts. Mainly because I can then write on the go, or as I am now, early in the morning, in my bed, when most of my more commonsensical fellow humans are still fast asleep.

Many hate autocomplete, but I always have it on. In fact, I spend more time looking at the little strip suggesting the next word than at what I'm actually writing.

How does autocomplete work and what does it have to do with AI?

Autocomplete is a real-time collaboration with AI, blending on-device efficiency and cloud-scale intelligence to reshape how we write, even in apps like Apple Notes.

It’s powered by an LLM, well, minus the first “L”, for Large. It’s a lightweight version of a language model.

Because you need the response to be near instantaneous, the core model runs on-device. Trained on generalised text corpora but fine-tuned to your personal vocabulary and style, this local model combines privacy and speed. It analyses context, including semantic context, to suggest the next word or phrase.
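To make the idea concrete, here is a toy sketch of "fine-tuned to your personal vocabulary": a bigram predictor that counts which word follows which in your own notes and suggests the most frequent continuations. This is an illustration of the statistical principle only, not Apple's actual model, which is far more sophisticated.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus: str) -> dict:
    """Count which word follows which in the user's own writing."""
    words = corpus.lower().split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def suggest(model: dict, prev_word: str, k: int = 3) -> list:
    """Return the k most frequent next words seen after prev_word."""
    return [w for w, _ in model[prev_word.lower()].most_common(k)]

# A few (made-up) lines of personal notes stand in for the training data.
notes = "i write in the morning i write in bed i write on the go"
model = train_bigrams(notes)
print(suggest(model, "write"))  # continuations of "write", most frequent first
```

Even at this tiny scale you can see why the suggestions feel personal: the model can only ever propose words you have already used, weighted by how you habitually use them.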

But the heavy lifting happens in the cloud: pre-trained models like those behind LLMs have ingested petabytes of text, learning statistical patterns, semantic relationships, and stylistic nuances. These cloud models periodically update your local version, refining its predictions with broader linguistic trends.

As I write, my partial sentence gets encoded into a vector space, where the model calculates the probability of the next token.
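That last step, turning the model's raw scores for each candidate token into probabilities, is typically done with a softmax. The candidate words and scores below are invented for illustration; a real model scores a vocabulary of tens of thousands of tokens.

```python
import math

def softmax(logits):
    """Convert raw scores into probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical candidates to continue "As I write, my partial sentence ..."
candidates = ["gets", "is", "will", "banana"]
logits = [3.1, 2.4, 1.0, -2.0]  # higher score = better fit to the context

probs = softmax(logits)
best = candidates[probs.index(max(probs))]
print(best)  # "gets": the highest-scoring continuation wins
```

The point is that nothing is ever "known"; the keyboard is always ranking every possible next word and showing you the top of the list.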

That makes it more than autocomplete. It’s actively suggesting how my sentence should continue, and how it should end.

Critics call it homogenising. I call it augmenting. I call it another step on the path set out by one of my heroes, Douglas Engelbart, whose project was aptly named Augmenting Human Intellect.

But doesn’t autocomplete, and worse, having AI write your stories, make everything look alike, and humans more stupid? 

Who needs an internet where more than 50% is written by AI, some without any human intervention?

Generic phrases are accepted 20–40% of the time in emails, says Google; and according to Stanford, heavy use can shrink vocabulary diversity by 12–18%. Some 35–45% of web content now involves AI (The Atlantic, 2025), and 3–5% is fully machine-written.

Studies show a 9% drop in unaided writing skills for those who over-rely on it (Nature, 2025). That’s people who’ve been using AI so much they’re losing the capacity to write by themselves. With a pencil and a notebook.

Readers can spot AI sludge, too: engagement plummets 40% when they do (Pew, 2025).

These studies mirror what many consider pure common sense: to keep a faculty, we need to train it, and if we hand the work to AI, we grow lazy, and dumber.

But aren’t we overemphasising individuality and underemphasising the power of the collective?

I write for many reasons but among them is that I want to be understood. And so I try to write in concise sentences, in which I state what I want to say as clearly as I can. 

And I also try to create a compelling story line; for that, I take the famous lectures given by Kurt Vonnegut to heart. Even if you don’t know them, you’ll most certainly know the fairy tale of Cinderella. He calls that a “rise-fall-rise” story.

Vonnegut’s lecture on the "shapes of stories" shows that the most enduring and endearing tales follow universal templates.

The fact that these stories work, and work across the ages, tells us that we have more in common than what separates us.

Recently I read that humans mastered fire far earlier than was thought. Which for me conjured up the image of people around the fire, sharing stories. Four hundred thousand years later, we’re still doing it. Only now we have AI to help.