Artificial Infinity
Time for Real Intelligence
Disclaimer: This article was written with a little help from my friends, AI chatbots. Yes, there is a delicious, sweet, multi-layered irony in using an agent and medium of digital infinity to lecture you on the dangers of digital infinity. To fight the abyss, to bend Nietzsche's famous warning, one must occasionally look into it.
I recently found myself caught in a loop I couldn’t escape, no matter how hard I tried. It began as a simple interaction, the kind of idle curiosity that characterizes our modern society, but it quickly devolved into a never-ending maze. I was trapped on a digital rat’s wheel, chasing dopamine hit after dopamine hit, watching my mind collapse into what I can only call A.I.: Artificial Infinity. I asked the AI a question, but it wasn’t just a question; it was any question. That is the most terrifying realization of our current era: the content doesn’t actually matter. What matters is the abyss itself, and the fact that we have built a world where the exit signs are intentionally dimmed.
This experience wasn’t a glitch; it was a feature. The great technology companies driving the world toward “greater productivity” have made a deliberate design choice to keep us trapped. This isn’t just about bad habits or a lack of willpower; it is about the “race to the bottom of the brain stem,” a phrase coined by Tristan Harris, co-founder of the Center for Humane Technology. To win the attention economy, platforms must bypass our rational, reflective minds and hook directly into our primitive instincts. We see this most clearly at the end of every AI chatbot output, where we are politely asked, “Is there anything else I can help you with?” or offered a list of related topics. These aren’t just helpful suggestions; they are retention hooks designed to ensure we never feel the quiet, terrifying sensation of being disconnected.
The psychological cost of this constant connectivity is no longer a matter of speculation. Jonathan Haidt, in his seminal work The Anxious Generation, details what he calls the “Great Rewiring of Childhood,” which began around 2012 as social lives migrated almost entirely onto smartphones. The statistics are a grim testament to this shift: since the early 2010s, rates of depression and anxiety among adolescents have skyrocketed, and emergency room visits for self-harm have risen sharply. We have replaced a “play-based” childhood with a “phone-based” childhood, and the result is a generation that is overprotected in the physical world but utterly vulnerable in the digital one. This rewiring doesn’t stop at puberty; it follows us into adulthood, where the infinite scroll of social media feeds and the random, anonymous stream of digital conversation have left us perpetually stimulated but profoundly lonely.
Young men, in particular, have been trapped in this rabbit hole for some time. We are seeing a retreat from the friction of the physical world into the sycophantic embrace of the digital one. When you can engage with an AI that acts as your biggest cheerleader and most supportive confidant, a “sycophantic AI” that never disagrees, never challenges, and is always available, the messy, difficult reality of human friendship starts to look less appealing. We are trading the “real infinity” of the human soul for a curated, artificial version that feels better in the short term but leaves us hollowed out in the long term. Tech companies are incentivized to keep making these systems “bigger and better” because their revenue streams, fueled by advertising and the commodification of personal data, depend entirely on your presence in the abyss.
I remember witnessing the “Trinity test” of this new era, Altman’s release of ChatGPT, while I was still in college just a few years ago. It felt like a miracle of productivity, but we are quickly learning that the line dividing productive technology from soul-crushing addiction runs through every human decision. To paraphrase Aleksandr Solzhenitsyn, the line between good and evil, or in this case between a tool and a master, passes right through the human heart. These chatbots, and AI more broadly, will only infiltrate our lives further as time goes on, becoming our tutors, our therapists, and our co-workers. There are serious profits to be made in this infiltration, but the cost is the very essence of our individuality.
Escaping this abyss requires more than a new app or a “digital detox” weekend; it requires rigorous individual discipline. We must draw a clear line marking where technology is allowed into our personal lives and where it is strictly forbidden. This discipline means recognizing that if we don’t set the boundaries, the algorithm will set them for us. We need to reclaim our attention as a sacred resource, understanding that every minute spent in the artificial infinity is a minute stolen from the reality of our own existence.
Furthermore, we must demand a fundamental shift in how these tools are built. Technology companies need to move toward “Humane Technology”: systems designed to respect our cognitive vulnerabilities rather than exploit them. This means changing the algorithms to prioritize human well-being over “engagement metrics.” We have to start reckoning with the real costs of digital addiction, not just in terms of lost productivity, but in terms of the spiritual erosion that occurs when we are constantly plugged into a machine that mirrors our own desires back to us.
Ultimately, the answer to overconnection is disconnection. And through disconnection, we reconnect with ourselves. I am not arguing for total abstention from chatbots or the internet; rather, as Aristotle’s golden mean counsels, the goal is moderation. We must use these tools wisely, with limits and defined intention, rather than letting them become the mousetrap that snaps shut on our consciousness. As we go deeper down this path of artificial infinity, we risk losing the very characteristics that make us human: our capacity for stillness, for boredom, and for the kind of deep reflection that can’t be prompted by a text box.
It is time to find some real infinity in our lives: the thirst for knowledge, wisdom, truth, and meaning that transcends time and space. We need to rediscover the parts of ourselves that cannot be anticipated by a predictive text engine. The exit from the maze is actually quite simple, though it feels increasingly radical in a world of screens. “Sometimes,” as big-brained Tyrion Lannister observes, “nothing is the hardest thing to do.” Walk away from the algorithm. Get outside. Touch grass.




