Is AI eroding our critical thinking?

In a series of experiments described in the journal Science in 2011, a trio of researchers found evidence to support a sneaking suspicion bubbling up in the minds of many Google aficionados: Frequent users of internet search engines didn’t retain information gleaned through online sleuthing as well as individuals who learned the same information offline. As the scientists posited, it was almost as if people and their computers were becoming “interconnected systems,” with basic memory functions delegated to search engines.

Over the ensuing years, this “Google effect” has taken on a new moniker: “digital amnesia.” It’s the tendency to forget information that can be found readily online via search engines. After all, why would the brain waste resources storing information available at the click of a button? Instead, people better recall how to access the information.

Google is now the most trafficked website in the world, and the internet itself is the definitive repository of human knowledge. If the brains of Googlers and their invaluable search engine were interconnected back then, they’ve essentially merged now. The myriad effects are up for speculation.

Now, artificial intelligence is here to supplement — or even supplant — human cognition, potentially redefining and broadening digital amnesia. From digital assistants to ChatGPT to Google Search’s AI Overview, AI is planning our days, doing our work, and answering our questions. In essence, it is thinking for us. Will this new relationship reshape how we think and behave? Research is underway, with much of the focus on critical thinking.

AI is increasingly thinking for us

Critical thinking is the ability to analyze, evaluate, and synthesize information to make reasoned decisions. People who rate highly in scientific measures of critical thinking get better grades in school, are more adept at their jobs, and are less susceptible to manipulation.

Critical thinking can seem nebulous and insignificant, but that impression couldn’t be further from reality. For example, when you compare different loan terms for a house or scrutinize car insurance rates, you’re thinking critically. When you dig into the meaning of a poem, book, or work of art, you’re thinking critically. When you plan out your busy day to be optimally time efficient, you’re thinking critically.

Here’s the thing: AI can do all of those tasks for you and more, quite proficiently. And its use is increasingly mainstream. Since launching in 2022, ChatGPT has rocketed to become the ninth most trafficked website and the fourth most popular app on iPhone. The AI tool has 300 million weekly active users and 123.5 million daily active users as of early January.

People frequently utilize AI to craft emails, plan travel, get financial advice, summarize texts, and prepare for job interviews. According to a 2024 Pew Research poll, roughly half of Americans said they use AI at least several times a week, while a recent survey found that virtually all Americans use products that contain AI (even though two-thirds don’t realize it).

Companies offering AI assistants sell them as productivity boosters. They argue that tasking the assistants with humdrum mental chores and queries frees up users’ time and cognitive resources, which they can then devote to more creative and innovative pursuits. The idea makes intuitive sense and has backing in the scientific literature: according to cognitive load theory, the human cognitive system has limited capacity, so reducing cognitive load can enhance learning and performance.

Atrophied critical thinking

Habitually offloading cognitive tasks to AI could backfire, however. As AI has grown more commonplace in everyday life, psychologists theorize that it reduces users’ engagement in deep, reflective thinking, causing their critical thinking skills to atrophy over time.

Professor Dr. Michael Gerlich, Head of the Center for Strategic Corporate Foresight and Sustainability at the Swiss Business School, is one of the researchers studying this risk.

“If individuals use the cognitive resources freed up by AI for innovative tasks, the promise holds,” he told Big Think. “However, my research and related studies suggest that many users channel these resources into passive consumption, driven by AI-enhanced content curation. This trend aligns with findings on digital dependence, where the convenience of AI fosters a feedback loop that prioritizes entertainment over critical engagement.”

In other words, when AI frees up users’ cognitive resources, they typically don’t use their extra time and brain power to problem-solve or create. Rather, they tune out by watching Netflix or perusing social media — content served up by AI algorithms.

In his most recent study, published January 3rd in the journal Societies, Gerlich surveyed 666 participants in the UK about their use of AI tools and measured their critical thinking skills with widely used, scientifically validated assessments.

Gerlich found a very strong negative correlation between subjects’ use of AI tools and their critical thinking skills. The higher their usage, the lower their skills. Younger participants tended to be more dependent on AI tools compared to older participants. Education was associated with greater critical thinking skills and attenuated AI’s negative effect.

Many participants suspected that AI was hampering their ability to think critically.

“I find myself using AI tools for almost everything — whether it’s finding a restaurant or making a quick decision at work,” one said. “It saves time, but I do wonder if I’m losing my ability to think things through as thoroughly as I used to.”

“I rely so much on AI that I don’t think I’d know how to solve certain problems without it,” another worried.

Gerlich’s findings suggest participants were correct to be concerned.

“As individuals increasingly offload cognitive tasks to AI tools, their ability to critically evaluate information, discern biases, and engage in reflective reasoning diminishes,” he wrote. “This relationship underscores the dual-edged nature of AI technology: while it enhances efficiency and convenience, it inadvertently fosters dependence, which can compromise critical thinking skills over time.”

The correct use of AI

Despite the striking results, Gerlich made it clear that this concerning correlation is not destiny, for a couple of reasons. First, the finding requires further research to be fully validated.

“One potential direction is to investigate the longitudinal effects of AI tool usage on critical thinking skills over time,” he recommended. “This could involve tracking individuals’ cognitive development and AI tool usage patterns over several years to comprehensively understand the long-term impacts.”

Second, he believes that how we use AI makes a huge difference. AI is fundamentally a tool, and tools can be used correctly and incorrectly.

“These results are linked with the wrong use of AI,” he told Big Think. “In my opinion, the correct use of AI can help to increase critical thinking skills.”

So how can we use AI correctly?

“AI tools like the popular large language models can be used for critical discussions and not only as an instrument that replaces one’s own work or thinking,” Gerlich said.

Generative AI is fantastic for brainstorming, showing prompters choices and ideas they might not have considered. AI can also foster critical thinking when users hone the questions they ask to achieve a desired outcome. Getting an AI image generator to produce something that matches what you’re imagining, for example, forces you to be clear and descriptive.

Teaching AI in schools

It’s vital for teachers to educate students about the proper use of AI assistants. Students should learn how to assess the veracity of AI responses, evaluate generative AI’s writing, and iteratively sculpt the outputs.

And, of course, students still need to learn the skills that AI can now perform for them, such as content analysis, writing, mathematics, and logic. Otherwise, we risk turning AI into a “black box,” Gerlich says.

“This ‘black box’ problem can reduce critical engagement and accountability, as individuals may blindly trust AI recommendations without questioning or evaluating them.”

“Educational systems should emphasize active learning, promoting exercises like argument analysis and problem-based learning,” he added. “In professional settings, creating environments where independent decision-making is valued can help maintain these skills. Engaging in reflective activities, such as journaling or debating, can also foster deeper cognitive engagement.”

A path to stagnation?

Public intellectual and author Yuval Noah Harari has opined that humans could become more and more cognitively idle with increasing AI automation, leading to programmed thinking and societal stagnation.

This dystopia isn’t implausible, Gerlich told Big Think, “particularly if AI adoption continues without parallel efforts to safeguard cognitive engagement.”

“However, this trajectory is not inevitable,” he countered. “Interventions, such as embedding critical thinking exercises in education and supporting ethical AI design that encourages human involvement, can counteract these trends.”

Gerlich said that integrating AI into daily life requires balance. Its benefits are too great to ignore, but we can’t unthinkingly embrace them.

“Ultimately, the choice rests with each individual: whether to take the convenient route of allowing AI to handle our critical thinking, or to preserve this essential cognitive process for ourselves.”

Still, he does harbor some pessimism.

“It is likely that a portion of society will opt for the path of least resistance.”