Authored by Charles Hugh Smith via OfTwoMinds blog,
Given that AI is fundamentally incapable of performing the tasks required for authentic innovation, we’re de-learning how to innovate.
That AI is turning those who use it into dummies is not only self-evident, it is also irrefutable: see ChatGPT May Be Eroding Critical Thinking Skills, According to a New MIT Study.
“Of the three groups, ChatGPT users had the lowest brain engagement and ‘consistently underperformed at neural, linguistic, and behavioral levels.’ Over the course of several months, ChatGPT users got lazier with each subsequent essay, often resorting to copy-and-paste by the end of the study.
“The task was executed, and you could say that it was efficient and convenient,” Kosmyna says. “But as we show in the paper, you basically didn’t integrate any of it into your memory networks.”
AI breaks the connection between learning and completing an academic task. With AI, students can check the box (task completed, paper written and submitted) without learning anything.
And by learning we don't mean remembering a factoid; we mean learning how to learn and learning how to think. As Substack writer maalvika explains in her viral essay "compression culture is making you stupid and uninteresting," digital technologies have compressed our attention spans via what I would term "rewarding distraction," so we can no longer read anything longer than a few sentences without wanting a summary, a highlights video or a sound-bite.
In other words, very few people will actually read the MIT paper: TL/DR. Here's the précis: Your Brain on ChatGPT (mit.edu).
Here’s the full paper.
To understand the context, and indeed the ultimate point of the research, we must start by understanding the structure of learning and thinking, which is a complex set of processes. Cognitive Load Theory (CLT) is a framework that parses out some of these processes.
Cognitive Load Theory (CLT), developed by John Sweller, provides a framework for understanding the mental effort required during learning and problem-solving. It identifies three categories of cognitive load: intrinsic cognitive load (ICL), which is tied to the complexity of the material being learned and the learner's prior knowledge; extraneous cognitive load (ECL), which refers to the mental effort imposed by the way information is presented; and germane cognitive load (GCL), which is the mental effort dedicated to constructing and automating the schemas that support learning.
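CLT is a qualitative framework, but its core claim is often summarized additively: learning suffers when the three loads together exceed working-memory capacity, and learning fails to happen when germane load (the schema-building effort) is offloaded entirely. A minimal sketch, with hypothetical numeric units and a hypothetical capacity threshold chosen purely for illustration:

```python
# Illustrative sketch of CLT's additive-load summary. The units and the
# capacity threshold are hypothetical, not drawn from Sweller's work.
WORKING_MEMORY_CAPACITY = 10  # hypothetical units

def total_load(intrinsic: float, extraneous: float, germane: float) -> float:
    """Sum the three CLT load categories (ICL + ECL + GCL)."""
    return intrinsic + extraneous + germane

def overloaded(intrinsic: float, extraneous: float, germane: float) -> bool:
    """Learning suffers when combined load exceeds working-memory capacity."""
    return total_load(intrinsic, extraneous, germane) > WORKING_MEMORY_CAPACITY

# Offloading germane load to AI keeps total load comfortably low, but
# germane load is precisely the effort that builds durable schemas:
# zero germane load means the task gets done without any learning.
print(overloaded(intrinsic=4, extraneous=3, germane=2))  # within capacity
print(overloaded(intrinsic=6, extraneous=4, germane=3))  # over capacity
```

The point of the sketch is the asymmetry: AI tools reduce extraneous load (a good thing), but they also eliminate germane load, which is the only category that produces learning.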
Checking the box “task completed” teaches us nothing. Actual learning and thinking require doing all the cognitive work that AI claims to do for us: reading the source materials, following the links between these sources, finding wormholes between various universes of knowledge, and thinking through claims and assumptions as an independent critical thinker.
When AI slaps together a bunch of claims and assumptions as authoritative, we don't gain even a superficial knowledge; we learn nothing. AI summarizes, but without any ability to weed out questionable claims and assumptions, because it has no tacit knowledge of contexts.
So AI spews out material without any actual cognitive value, and the student pastes it into a paper without learning any cognitive skills. This cognitive debt can never be "paid back," for the deficit lasts a lifetime.
Even AI’s vaunted ability to summarize robs us of the need to develop core cognitive abilities. As this researcher explains, “drudgery” is how we learn and learn to think deeply as opposed to a superficial grasp of material to pass an exam.
“Unfortunately, this innovation stifles innovation. When humans do the drudgery of literature search, citation validation, and due research diligence — the things OpenAI claims for Deep Research — they serendipitously see things they weren’t looking for. They build on the ideas of others that they hadn’t considered before and are inspired to form altogether new ideas. They also learn cognitive skills including the ability to filter information efficiently and recognize discrepancies in meaning.
I have seen in my field of systems analysis where decades of researchers have cited information that was incorrect — and expanded it into its own self-perpetuating world view. Critical thinking leads the researcher to not accept the work that others took as foundational and to spot the error. Tools such as Deep Research are incapable of spotting the core truth and so will perpetuate misdirection in research. That’s the opposite of good innovation.”
In summary: given that AI is fundamentally incapable of performing the tasks required for authentic innovation, we’re de-learning how to innovate. What we’re “learning” is to substitute a superficially clever simulation of innovation for authentic innovation, and in doing so, we’re losing the core cognitive skills needed to innovate.
In following the easy, convenient path of AI’s simulations of innovation, we are indeed “carefully falling into the cliff.” But since this is all TL/DR, and there’s no summary, highlights video or sound-bite, we don’t even see it.
So here’s the TL/DR “dummies” summary of AI: AI is turning us into dummies.
* * *
Check out my new book Ultra-Processed Life and my updated Books and Films.
Tyler Durden
Fri, 08/01/2025 – 11:00