Is the next generation getting smarter or dumber?
The digital revolution, from the neuroscience point of view
What is the effect of the digital era on young brains? After all, the current generation has spent their lives scrolling screens — and many of my neuroscience colleagues have grim predictions. They feel certain this technology will ruin the brains of the next generation.
But I suspect it won’t. If I were a betting man, I’d place money that the next generations will be smarter, on average, than we are.
Before I explain why, let me first clarify why I'm a little skeptical of the dismal predictions. First, for anyone who wants to insist that the digital world is dumbing kids down, there's a problem: it's inordinately difficult to prove this scientifically. Why? Because you can't get a proper control group. Let's imagine you found some difference in test scores between today's students and students from a previous generation. You can't necessarily attribute it to TikTok or video games, because countless confounding variables could also explain it: changes in sugar consumption, marijuana legalization, a shifting political climate, changes in air quality, shifting cultural norms, differences in educational practices, reduced secondhand smoke, and so much more. So even if test scores differ, you can't say what explains the difference.
Fine, you might say: just find a group of kids who've grown up in the same generation but without digital exposure. But such a group is essentially impossible to find unless you draw from a rigorously isolated community (such as the Amish) or from children living in dire poverty. And that reintroduces the same problem: dozens of other differences between the groups that have nothing to do with screens.
Thus, with any apparent difference we observe in today’s youth, it’s nearly impossible to say how much, if any, is due to digital technology and how much stems from hundreds of other changes.
That's the first issue that makes me a little skeptical of strong claims of decline. The second is that every generation in history (reaching back at least to the ancient Greeks) likes to complain about the degradation of the next generation. Consider this sentiment, published in 1790 by the Reverend Enos Hitchcock:
The free access which many young people have to romances, novels, and plays has poisoned the mind and corrupted the morals of many a promising youth; and prevented others from improving their minds in useful knowledge. Parents take care to feed their children with wholesome diet; and yet how unconcerned about the provision for the mind, whether they are furnished with salutary food, or with trash, chaff, or poison?
Or consider this description from the 7th Earl of Shaftesbury, in his speech to the House of Commons in 1843:
...a fearful multitude of untutored savages... [boys] with dogs at their heels and other evidence of dissolute habits...[girls who] drive coal-carts, ride astride upon horses, drink, swear, fight, smoke, whistle, and care for nobody...the morals of children are tenfold worse than formerly.
And beyond the generalized concerns about diminishing aptitude and morals, accusatory fingers always point to the new technologies. Here's an article from the Brooklyn Eagle lamenting the effect of the telephone on social life. Of note: this article was written just over 114 years ago.
Or take this as an example: when the printing press was invented in the 1440s, many thinkers lamented that the technology would make the next generation slow-witted and vacuous. Why? Because students would no longer have to know the answers: they could simply reach for a book on the shelf, where the answer was sitting right there. Books were the original Google.
But the dissemination of knowledge in book form made society better. So with the table now set, let me turn to why I think the next generation might just grow up to be smarter than we are.
Broader exposure
The printing press was beneficial, of course, because it enlarged any person's intellectual diet. More than ever before, a child could learn about ideas from hundreds of years earlier, ideas their parents had perhaps never heard of. They could learn what was known about history, astronomy, chemistry, clouds, war, trees, diseases, animals, and anything else that some curious soul before them had committed to paper (or papyrus).
In this way, a child could more easily springboard off all the intellectual labor that had come before.
But then, almost 550 years after the printing press, things suddenly got even better. Instead of relying only on books, kids could explore videos, interactive tutorials, virtual labs, expert talks from around the world, and forums that put them in contact with anyone, anywhere, who shares their passions. For a child growing up with the internet, the entirety of humankind's knowledge is available on a portable rectangle in their pocket.
Why does this matter? Because we are now exposed to a much broader set of ideas.
Take music composition as an example. If you had grown up 200 years ago, you would have been exposed only to your own culture's narrow fencelines of music. There was a certain way things had to be done. But composers today hear music from around the planet, from cultures their predecessors never encountered, and they get to mash up those ideas to make something new. In The Runaway Species, my colleague Anthony Brandt and I argue that creativity emerges from basic cognitive operations applied to your storehouse of knowledge: bending, breaking, and blending.
Students today have a larger diet of ideas than ever before in history, and this gives them more raw materials to bend, break, and blend.
And it's even better than mere exposure: most of the knowledge is queryable and instant. Why is that important? Brain plasticity.
They can pose questions (to Google or ChatGPT) and get answers instantly.
This immediacy satisfies curiosity. And from a brain plasticity point of view, this matters: When a child is curious about something, the right cocktail of neurotransmitters is present to make the information stick. With the internet, people now get lots of just-in-time information, while everyone in previous generations got mostly just-in-case information.
New opportunities
Technology rewrites our educational opportunities. Take something like virtual reality: a child can shrink down to the size of an atom to witness the world at that level, before scaling up to the size of the solar system to intuit planetary dynamics.
When my son decided to learn the piano, he demonstrated his approach with mixed reality: wearing VR goggles, he could see through to our physical piano in front of him… and he could also see colored bars emerging from the digital distance, telling him which keys to hit and when. In this way, within moments he was getting the hang of new pieces of music.

The second you witness a new method like this, you recognize the inefficiency of what you were reared on: deciphering circles on and between horizontal lines while endlessly reciting to yourself Every Good Boy Does Fine. (In case you're tempted to say "but these goggles just tell you which notes to hit," remember that the job of a musical staff is no different. Becoming a good player is all about the practice, the memorization, and the injection of feeling into the playing.) We were stuck with the paper method for centuries; only once we see new methods do we recognize how much better they can be.
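For the technically inclined, the core of that falling-bar mechanic is simple enough to sketch. In this toy Python snippet (the note data and speed are made up for illustration; I have no access to the app's actual internals), a bar's distance from the keyboard encodes how far in the future its note begins, so the bar arrives at the physical key exactly when the note should be played.

    # Toy model of the mixed-reality bars: a bar's distance from the keys
    # is proportional to how far in the future its note starts.
    NOTES = [("C4", 0.0), ("E4", 0.5), ("G4", 1.0)]  # (key, start time in seconds)
    SPEED = 0.3  # meters of virtual distance per second of lead time

    def bar_positions(now: float) -> list:
        """Distance of each upcoming bar from the keys at time `now`."""
        return [(key, (start - now) * SPEED)
                for key, start in NOTES if start >= now]

    # At t = 0.25 s, the E4 bar is 0.075 m away and G4 is 0.225 m away;
    # each bar reaches the keyboard at the moment its note is due.
    print(bar_positions(0.25))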
Or take AI. On a recent episode of my Inner Cosmos podcast, I touched on an exciting opportunity for education: teaching students that their opinions are not sacred truths, but starting points for deeper thinking. With a debating AI, a student chooses a hot-button issue (abortion, gun control, immigration) and debates it with a friendly but firm AI partner. The student is graded on how well she defends her position using logic and evidence, and also on how skillfully she articulates the opposing argument (a practice known as steel-manning). Then the roles are reversed: the student argues the other side and is graded again. No teacher has the time (or patience) to do this with each student in the class; AI is perfect for it. This is one of the most beautiful prospects for AI in education: not just the teaching of facts, but teaching students how to think, at scale.
From Inner Cosmos Episode 95: What's the future of education in an AI world?
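For the curious, here is a minimal sketch of how such a debate-and-grade loop might be wired up. Everything in it is illustrative: ask_llm is a hypothetical stand-in for whatever chat-model API one prefers, and the prompts are my own guesses at the idea, not the actual system discussed in the episode.

    # Illustrative sketch of an AI debate partner: the student argues one
    # side, is graded, then switches sides and is graded again.

    def ask_llm(prompt: str) -> str:
        """Hypothetical stand-in; wire this to a real chat-model API."""
        raise NotImplementedError

    def run_debate(topic: str, side: str, exchanges: int = 3) -> list:
        """Alternate student arguments with AI rebuttals; return the transcript."""
        transcript = []
        for _ in range(exchanges):
            point = input(f"Your argument ({side} side): ")
            transcript.append(("student", point))
            rebuttal = ask_llm(
                f"You are a friendly but firm debate partner on {topic!r}. "
                f"The student, arguing the {side} side, said: {point!r}. "
                "Give one concise counter-argument."
            )
            print("AI:", rebuttal)
            transcript.append(("ai", rebuttal))
        return transcript

    def grade(topic: str, side: str, transcript: list) -> str:
        """Ask the model to score logic/evidence and steel-manning."""
        text = "\n".join(f"{who}: {line}" for who, line in transcript)
        return ask_llm(
            f"Grade this student's defense of the {side} side of {topic!r}: "
            "score 1-10 for logic and evidence, and 1-10 for how well they "
            "steel-manned the opposing view, with brief justifications.\n" + text
        )

    for side in ("pro", "con"):  # the role reversal
        print(grade("gun control", side, run_debate("gun control", side)))

Grading both the defense and the steel-manned opposition is the part no teacher has time to repeat for every student in the room; a machine doesn't mind.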
Where does this lead?
I talk to young people all the time who say something unexpectedly insightful, and I'll ask, "How did you know that?" Often they'll say: "Oh, I saw it in a TED talk." What an extraordinary opportunity that represents. A TED talk gives us a world-class expert delivering the most well-honed talk of their life in 15 minutes. Compare that to my generation's upbringing, where our exposure to knowledge depended entirely on the luck of the draw: whichever homeroom teacher we happened to get, who often had little or no expertise in the subject at hand.
This is not to say there aren't downsides to being raised in a digital world. Among the most concerning are declining reading rates (especially for complex, sustained material), shifts in educational priorities (e.g., an increased focus on standardized testing), and ever-present screens that compete for attention and crowd out spontaneous conversation and cognitive depth.
But it’s important to temper our critiques with a recognition of a common cognitive trap: retrospective romanticization. We often look back on the past with a selective eye, imagining children were more intellectually curious, neighbors were more articulate, and the world was more wholesome. We paint a picture of an idealized childhood filled with imaginative play and earnest self-improvement.
The truth is less flattering. My own generation, like all those before it, wasted enormous swaths of childhood because there was often little to do. And despite what we might now claim, those hours were not spent working through algebraic proofs or wrestling with moral philosophy. They were filled with sitcom reruns, repetitive board games, and aimless play.
Yes, there are real benefits to climbing trees and digging forts: physical activity, creative exploration, the joy of completing a task. But we can't pretend we were miniature scholars polishing our King's English or contemplating the mysteries of the cosmos. We were mostly frittering away time. Which is why it's worth questioning whether our current digital era is worse, or simply different. This era trades one set of distractions for another, but it also offers new tools, broader literacies, and untold forms of creativity that our nostalgic minds are slow to credit.
If history is any guide, we are only continuing to consume a richer diet, and that gives us ever more raw materials with which to creatively explore the adjacent possible.