Gen Z reports early cognitive decline. Here’s what to know about the brain rot epidemic—and what to do about it

America post Staff



“Challenges with memory and thinking have emerged as a leading health issue reported by U.S. adults,” says Adam de Havenon, associate professor of neurology at the Yale School of Medicine.

A 2025 Yale study, authored by de Havenon, found an alarming increase in self-reported cognitive disability, particularly among adults ages 18 to 34. The rate among that younger cohort nearly doubled over a decade—from 5.1% in 2013 to 9.7% in 2023—driving most of the overall increase.

By comparison, the rate among adults overall increased more modestly from 5.3% to 7.4% over the same period. The study tracked 4.5 million adults over 10 years.

Is there a youth dementia epidemic?

While the findings are a cause for concern, they do not necessarily suggest an emerging dementia epidemic. “This isn’t a diagnosis of dementia or even of cognitive impairment,” de Havenon explained. “It’s a subjective report of people saying they’re having serious difficulty concentrating, remembering, or making decisions. With dementia, there’s a structural brain disease and a specific pathology that’s injuring the brain and leading to cognitive impairment.” 

That said, the Yale study notes that these findings should be investigated further, “as growing cognitive problems among the population can pose future healthcare and workplace consequences.”

Because participants in the Yale study have not had their brains scanned, there’s no way of knowing yet whether they display the structural brain changes associated with dementia. Further research would be needed to determine if there is a link between early self-reported cognitive decline and those structural changes. But if such a link were established, the economic stakes would be significant: a study published in Frontiers in Neurology notes that dementia cost the global economy $1.3 trillion in 2019. That’s what makes research into treating dementia—from behavioral interventions to anti-inflammatory nasal sprays—so important.

The Yale study also found a connection to socioeconomic factors among the participants, which demonstrates that the difficulties “may be becoming more widespread, especially among younger adults, and that social and structural factors likely play a key role.”

Is technology to blame?

While de Havenon’s study relied on subjective self-reporting, other research supports his findings. Earlier this year, neuroscientist Jared Cooney Horvath provided written testimony before the U.S. Senate Committee on Commerce, Science, and Transportation, noting that “over the past two decades, the cognitive development of children across much of the developed world has stalled and, in many domains, reversed.”

Horvath blamed federal policy that “continues to incentivize large-scale digital adoption without demanding independent efficacy evidence, privacy protections, and developmental safeguards,” which “risks compounding long-term educational and workforce harm.”

For two decades, state governments have invested in providing students with laptops and tablets, digitizing classroom functions, and making Gen Z a beta test for a digital-first generation. The result? Despite having unprecedented access to information from an early age, Gen Z has become the first generation to score lower on standardized tests than previous generations.

Undoing decades’ worth of damage

Horvath says the fix is not about “rejecting technology,” but is instead “a question of aligning educational tools with how human learning actually works. Evidence indicates that indiscriminate digital expansion has weakened learning environments rather than strengthened them.”

In 2026, fully rejecting technology is largely unrealistic, but scientists are increasingly exploring how to undo the psychological and cognitive damage.

Inc. has previously reported on a large study that followed more than 400 adults over a 14-day period as they used an app called Freedom, which essentially turns smartphones into dumb phones. Functionally, the app blocks internet access and removes browsing and social media apps, but still allows for calls and texts.

The results were striking. By cutting constant digital stimulation—reducing daily screen time to under three hours—participants “showed measurable improvements in sustained attention, mental health, and overall well-being. The gains in focus were particularly notable—equivalent, the researchers said, to reversing about a decade of age-related cognitive decline,” Inc. wrote.

—Victoria Salves, Editorial Fellow

This article originally appeared on Fast Company’s sister website, Inc.com. 

Inc. is the voice of the American entrepreneur. We inspire, inform, and document the most fascinating people in business: the risk-takers, the innovators, and the ultra-driven go-getters that represent the most dynamic force in the American economy.
