A New Renaissance?
Information Revolutions and Epistemic Chaos
Modernity feels spent. For decades we’ve been running on a cultural operating system built around buying things and staring at screens, with a vague sense that nothing quite connects. We built a world that prizes speed over depth, convenience over meaning. But something is shifting. I think we’re heading into a period that future historians might compare to the Renaissance, driven by three forces: AI automating the mundane, the collapse of scientific materialism, and a growing hunger for the sacred.
Echoes of a calamitous past
To see where this might be going, look at how the last Renaissance started. The cultural explosion of the 15th century didn’t emerge from peace. It was born from systemic collapse.
The Black Death killed roughly a third of Europe between 1347 and 1351. The feudal order, already creaking, couldn’t survive that kind of demographic shock. Labour became scarce, wages rose, and suddenly peasants had bargaining power they’d never had before. The old certainties - the Church’s authority, the fixed social hierarchy, the idea that your lot in life was God’s will - all of it started to crack. The printing press finished the job. When Gutenberg’s Bible appeared around 1455, information escaped the monasteries for good.
The parallels to now aren’t exact, but they’re hard to ignore. The pandemic didn’t kill a third of us, but it broke something psychological. The relentless modern grind just stopped. People had time to think, and a lot of them didn’t like what they found. Meanwhile, the internet and generative AI have done to legacy institutions what the printing press did to the medieval Church: stripped them of their monopoly on truth. We’re living through the resulting chaos. The old world is dying. The new one hasn’t arrived yet.
When institutions stop providing coherent narratives, people go looking for their own. That’s where we are.
The demise of the “dead machine”
For the last century or so, the dominant intellectual framework in the West has been scientific materialism: the universe is a sterile machine of “matter in motion,” consciousness is a biological accident, and meaning is something we invented to cope. If that’s true, then the superficiality of modern consumer culture makes a kind of grim sense. Why bother with depth if we’re just meat that learned to worry?
But this framework is breaking apart. Not because people are abandoning science. Because of science.
Modern physics keeps bumping into things that don’t fit a purely mechanistic picture. The fine-tuning problem is a good example: the fundamental constants of the universe are calibrated within absurdly narrow tolerances. Change the gravitational constant by one part in 10^60 and stars can’t form. The cosmological constant is balanced to about 120 decimal places. You can wave this away with multiverse theories, but that’s essentially saying “there must be infinitely many other universes we can’t observe” to avoid the simpler implication that the calibration is intentional. It’s a strange kind of parsimony.
In biology, the situation is similar. DNA doesn’t just carry information; it operates like a programming language with error correction, compression, and modular architecture. The chemical properties of the molecule don’t explain the code any more than ink explains a novel. Information, in our uniform experience, comes from minds. That’s not a proof of God. But it’s a data point that materialists have never satisfactorily explained away.
Science isn’t the enemy of the sacred. It might be the route back to it.
The role of artificial intelligence
AI sits in an odd position in all this. It can mass-produce exactly the kind of shallow, disposable content we’re supposedly trying to escape. Scroll through any social media feed and you’ll see what I mean. But it also has the potential to free people from work that deadens the spirit.
The original Renaissance was partly a story about tools. The printing press changed what was possible. So did improvements in lenses, pigments, and navigation instruments. New tools don’t just make old tasks faster; they open up tasks that weren’t conceivable before. AI could do something similar. If it takes over the logistical, administrative, and repetitive labour that fills most people’s working lives, it forces a question: what do you actually want to do with your time?
There’s also something philosophically interesting about building artificial minds. We write code to create systems that process information, learn, and generate novel outputs. Then we look at biology and realise that DNA was doing this billions of years before we showed up. The similarity is hard to dismiss. It makes technology feel less like a departure from the natural world and more like a clumsy echo of it.
I don’t know if that’s comforting or unsettling. Probably both.
Toward a new enlightenment
If a New Renaissance is the spark, we’ll need something like a New Enlightenment to follow it: the hard work of actually rebuilding institutions and systems around different values.
The deconstructive philosophies of the 20th century were good at tearing things down. Nietzsche killed God, the postmodernists killed grand narratives, and the New Atheists tried to kill whatever was left. But deconstruction without reconstruction just leaves rubble. At some point you have to start building.
What that looks like, I’m genuinely not sure. Some kind of synthesis between scientific rigour and what you might call natural theology, maybe. A rethinking of how we structure work, since the 40-hour week was designed for factory production, not knowledge work or creative labour. And almost certainly a shift away from measuring everything in GDP, though I suspect that one will take the longest, because GDP is easy to count and meaning isn’t.
The realisation that we might exist in an intentional universe, that consciousness might be fundamental rather than accidental, is not something technology can experience for us. And the antidote to shallow modernity isn’t better gadgets. It’s the harder, slower work of paying attention, building real communities, and treating scientific discovery not as a weapon against meaning but as a way of uncovering it.
Whether we’ll actually manage any of this, I have no idea. History doesn’t repeat on command. But the ingredients are sitting there, and ignoring them feels like a worse bet than trying.