Cognitive Atrophy
From excess stimulation to the numbness of hyperconvenience.
Between notifications, infinite feeds, and ready-made answers, our minds have grown accustomed to never stopping. Thinking became scrolling. Remembering became searching. Creating became copying. Artificial intelligence, while promising relief from mental overload, has also sedated us with convenience. Gradually, we delegate to the machine not just tasks, but the very act of thinking. This is how cognitive atrophy is born: a silent numbness in which technological expansion coexists with the emptying of attention and critical thought.
What the Data Says
The national survey by Talk Inc (2025), conducted with over 1,200 people from all regions of Brazil, reveals the central paradox of our cognitive times. When asked about the effects of delegating tasks to AI, perceptions were divided between risks and opportunities:
- 62% associate AI use with mental atrophy and cognitive laziness.
- 53% are concerned about growing dependence on technology to think and decide.
On the other hand, many also perceive potential gains:
- 54% believe AI can increase intelligence.
- 56% see a boost to creativity.
- 62% highlight productivity gains.
This dilemma illustrates the symbolic dispute between two coexisting possibilities:
- The expanded mind, which uses technology to catalyze imagination, focus, and learning;
- And the atrophied mind, which outsources decisions and settles into algorithmic convenience.
(Source: Talk.Inc, “IA na Vida Real”, 2025)
Dimensions of Impact
The promise of artificial intelligence is one of expansion — of time, productivity, and the mind. But beneath the shine of this acceleration narrative, something silently contracts. Cognitive atrophy is not sudden loss, but gradual erosion of depth. The brain, shaped by millennia of stimulus scarcity, now finds itself flooded by a deluge of signals (notifications, alerts, feeds) that fragment attention, dissolve pause, and replace thought with reflex.
The dilemma between expansion and atrophy
If the idea of atrophy alarms you, know that it is not a recent one. Neurologists like Manfred Spitzer call it digital dementia: the weakening of basic mental functions caused by overexposure to screens and digital flows (Spitzer, Digital Dementia, 2012). What once required effort (remembering, calculating, imagining) has become an outsourced task. A simple test: how many phone numbers do you still know by heart? How many routes could you retrace without GPS?
Nicholas Carr, in The Shallows (2010), describes the brain of the distraction era: unable to sustain concentration or prolonged contemplation, conditioned to jump between links and stimuli. Attention, our primary cognitive resource, has been converted into market raw material. The result is a state of continuous alertness, where mental rest becomes a luxury. Constant stimuli keep the brain in defensive mode, like a muscle unable to relax. This cognitive fatigue translates into anxiety, irritability, and loss of long-term memory. The mind, without intervals, loses the natural rhythm of thought, the gap between impulse and idea.
Digital amnesia and prompt-based thinking (Doomprompting)
If the first dimension is dispersion, the second is delegation. In 2011, Betsy Sparrow and colleagues identified the Google effect: we remember less of the information itself and more of where to find it (Science, 2011). Memory externalization became a habit, and with it, the outsourcing of reasoning. In AI-mediated environments, the act of thinking tends to convert into performance, a flow of prompts and responses that simulates cognition without effort. The sensation of productivity replaces the real work of thought.
The term “doomprompting” was coined to describe this phenomenon: users trapped in cycles of infinite refinement of prompts and AI outputs, confusing interface manipulation with intellectual creation. Recent reports from CIO (2025) describe doomprompting as an “interface addiction”: hours spent on adjustments and repetitions, without real cognitive progress.
Cognitive atrophy here stems not just from lack of use, but from misguided use. The authorial voice disappears, real reflection time shrinks, and what remains is the performance of thought, not thought itself.
Permanent distraction and attentional fatigue
While we delegate memory, we also lose presence. Attention is now a battlefield: each touch on the phone is a micro-deviation from reality. Sherry Turkle, in Alone Together (2011), showed how hyperconnection creates affective isolation; we are always connected, but rarely present. Johann Hari, in Stolen Focus (2022), details the architecture of distraction: multiple open tabs, interrupted tasks, and the illusion of multitasking that reduces cognitive performance by up to 40%.
This permanent distraction has emotional and collective costs. The continuous fragmentation of attention reduces empathy, patience, and tolerance for ambiguity. The cognitively tired subject seeks immediate gratification, turning to infinite scrolling as an anesthetic. In this cycle, mental rest disappears, and boredom, that fertile space of imagination, is replaced by saturation.
Recent studies reinforce the severity of the situation: a review of nearly 100 studies (2000–2025) concludes that task-switching compromises long-term memory consolidation and increases residual cognitive load, reducing performance on subsequent tasks by up to 30%.
The Chinese study “Exploring cognitive presence patterns in GenAI-integrated six-hat thinking technique scaffolded discussion: an epistemic network analysis” shows that excessive use of generative AI can lead to cognitive atrophy when users become dependent on automatic responses and reduce independent reasoning. The same study shows that, when used with structure and guidance, AI can expand critical thinking, stimulating comparison, questioning, and synthesis between ideas. The challenge, therefore, is to transform AI from a cognitive crutch into a reflective mirror, strengthening the autonomy of human thought.
Neural plasticity in dispute
The human brain adapts, but what happens when the environment is designed to capture attention and reward impulses? Catherine Malabou, in What Should We Do With Our Brain? (2008), calls this “destructive plasticity”: when neural malleability ceases to be emancipatory and begins to serve the system. We adapt to the rhythm of the machine until we confuse flexibility with submission.
This is perhaps the most subtle and dangerous dimension of cognitive atrophy: the point at which the brain reconfigures itself to survive in the digital environment, but loses the capacity to imagine outside of it. Plasticity, which should make us creators of new forms of thought, is captured by algorithms that anticipate desires and simplify choices. Byung-Chul Han, in The Burnout Society (2010), warns: self-exploitation and excess positivity dissolve the space of the other and, with it, the possibility of transformation.
The risk is not only cognitive, but civilizational: by confusing efficiency with intelligence, we adapt to a form of thinking that no longer requires thinking. The challenge of the AI era is not to learn with it, but not to unlearn what makes us human: pause, friction, silence, and doubt.
Experts Weigh In
“With this, you realize, yes, that there is an attention deficit, that screen time, in general, is very high. I’ve had patients where I had to do screen weaning who spent 14 hours a day on screens. So, the risk is very high. And people don’t realize it; they are absorbed by that content and don’t notice.” — Christiane Valle, psychologist
“When you stop writing an email, the brain understands that it’s no longer relevant. So we dismantle that circuit. But it’s not just the email. You stopped remembering phone numbers. You stopped looking at routes to get somewhere. You stopped writing texts. You stopped doing research.” — Ana Carolina Souza, neuroscientist
“I’m very afraid of copy-paste… the last filter is always, always, always mine.” — Lúcia Leão, researcher and artist, cyberculture specialist
“Our brain… gets used very quickly to conveniences… before we would take a few minutes to think and create… today in seconds we already think ‘ChatGPT’.” — Yael, participant in the AI in Real Life research
“This convenience that we are led to love […] atrophies our brain plasticity and homogenizes our perception of the world […]. We see a super fascination — it’s fascinating to have a tool that will solve my life, nobody wants to think, everyone wants a button.” — Paula Martini, founder of Internet das Pessoas
“…from the moment you delegate [thinking] to a machine to do it for you […] we stop thinking, having critical thinking.” — Camilo Barros, futures designer
Critical Synthesis
Cognitive atrophy is not the opposite of mental expansion, but its inevitable mirror. The same environment that offers tools to expand thought, creativity, and memory is also what weakens them when used without intention. AI accelerates both the power and the numbness of the mind: it can free time for deep thought or trap us in cycles of convenience and dependence. As the survey of over 1,200 Brazilians reveals, 62% see a risk of mental laziness while 56% believe AI stimulates creativity, a portrait of a brain in dispute between reflex and reflection. The dilemma is not choosing between the expanded and the atrophied mind, but understanding that both evolve in parallel: expansion without critical thought becomes automatism, and critical thought without expansion becomes paralysis.
Overcoming cognitive atrophy does not mean rejecting technology, but rather relearning to think with it, not through it. We need to strengthen cognitive reserve, the invisible musculature of focus, curiosity, and imagination, through practices that restore body and rhythm to thought: starting tasks without AI, cultivating pauses and fertile boredom, training sustained attention, and creating spaces of silence away from screens. Mitigation is structural, not merely individual: schools, companies, and governments must create ecosystems that reward reflection time, creative error, and productive doubt. It is in this balance between human and machine, between remedy and poison, that cognitive sovereignty is rebuilt, transforming AI from a crutch into a critical mirror of the mind.