Introduction
The headline made the rounds a few weeks ago: Alexandr Wang, the billionaire founder of Scale AI, says he’s waiting for Neuralink brain chips before having children. He imagines a future where his kids will enter the world already enhanced—their cognitive abilities optimized, their potential supercharged by artificial intelligence. For Wang, the choice is not just about parenting; it’s about staying relevant in a world where human limitations are no longer inevitable.
It’s easy to laugh this off as Silicon Valley hubris. But beneath it sits a deeper fantasy, one we have all, in small ways, started to buy into: the fantasy of a frictionless life. A life where AI smooths the rough edges, eliminates the hard parts, makes everything from work to love to thinking itself a little easier, a little more optimized.
But for the French philosopher Jean-Paul Sartre, it is precisely this friction, this incompleteness, that gives shape to human life. We are defined not by what we are but by what we are not: by the restless gap between what is and what could be.
As Sartre argues in Being and Nothingness, human existence is defined by lack—by what it is not. To be conscious, to be a self, is to live in tension with incompleteness, always reaching beyond what is toward what could be. Without that struggle, without that negation, we cease to be subjects and risk collapsing into mere things. For Sartre, to be human is to carry the burden of choice—to wrestle with doubt, meaning, and the weight of our own freedom. We are not born with a fixed nature or preloaded software; we become who we are through what we do, and doing is always a struggle.
What happens, then, when the struggle disappears? When AI offers us a world without resistance, without failure, without the restless effort that makes us feel alive? If to be human is to be defined by that lack, by the gap between what we are and what we might become, then without it, without that striving, what is left of us?
This is not a Luddite panic or a call to reject technology. It’s a deeper, more unsettling question: Are we building tools to ease human life—or to erase the very conditions that make us human?
The Promise of Frictionless Living
The small promises of an easier life already surround us. AI drafts our emails, curates our social media feeds, and completes our sentences. Recommendation algorithms spare us the burden of indecision; navigation apps save us from the ache of getting lost. Even our emotions are quietly optimized—with AI companions to comfort us, meditation apps to regulate us, and algorithmic therapies to soothe our anxieties.
None of these tools is inherently bad. On the contrary, they often work: they help, they streamline, they patch over the awkward, the inefficient, the difficult. But they also bring with them a quieter shift, one we rarely notice.
Wang’s fantasy—of skipping the slow, uncertain process of growing into ourselves—isn’t just about parenting. It reflects a deeper shift in how we see difficulty itself. Struggle becomes a bug to fix. Slowness becomes inefficiency. Resistance becomes something to smooth away, not live through.
Sartre calls this bad faith—the self-deception that lets us believe we are not free, that we can offload responsibility for our lives onto something or someone else. Meaning, for him, is not handed down from above; it is made, wrestled with, chosen. But how do we choose meaning when every choice has been pre-sorted, pre-smoothed, pre-recommended?
The existential danger of AI is not that it becomes our master. It’s that it becomes our cushion. A layer of softness between us and the world, until we no longer remember how to wrestle with it.
What We Lose When We Stop Struggling
It’s tempting to see this as progress. After all, isn’t the point of civilization to reduce suffering, to make life more bearable? Why shouldn’t we use the best tools available to lessen pain, uncertainty, and toil?
But not all struggle is suffering. Some kinds of struggle are the texture of being alive.
A writer wrestling with a blank page is not just producing words—they are confronting their own limits, their own thoughts, their own voice. A couple working through conflict is not just fixing a problem—they are learning who they are, what they want, how to love. A person sitting with grief is not simply in pain—they are shaping their capacity for memory, attachment, and care.
When AI steps in to “help,” it does more than lighten the load—it risks removing the very work that makes us human. Sartre understood human existence not as a finished essence, but as a restless, self-making project—what he called the for-itself, defined not by what it is, but by what it lacks.
We are, in Sartre’s words, a “useless passion,” haunted by an impossible striving: the wish to become a self-caused, complete being, a kind of god, a state we can never attain without ceasing to be human. This gap, this lack of being, is not an accidental hardship; it is what constitutes us as human.
And this lack shows up in different forms. Anguish is the experience of our radical freedom: the terrifying realization that nothing in our past or nature determines what we will choose next.
Nausea is the taste of our sheer contingency, the sense that we exist without necessity, without justification, simply there.
And crucially, Sartre points to what he calls the coefficient of adversity—the resistance offered by the world itself. We encounter freedom only in a world that pushes back. Without friction, without resistance, without difficulty, freedom becomes empty, weightless, meaningless.
In short, we need to struggle—not because suffering is noble, but because without it, we risk collapsing into “things”: optimized, efficient, frictionless, but no longer fully alive. This is the real danger of AI—not that it will take over, but that it will remove the conditions of becoming, of doing, of experiencing, of meaning-making.
Just like Wang, too impatient to let his kids grow into themselves, we risk losing faith in the value of the slow, difficult work of becoming human.
Without friction, a ship ceases to sail; it becomes mere driftwood.
The Ethical Stakes: More Than Just Individual Choice
It’s easy to think this is just a matter of personal preference. Let people use the tools they want; let some opt into ease and others into effort. But the spread of AI raises ethical questions that go far beyond individual lifestyle.
First, there’s the issue of equity. If only some people—the wealthy, the connected, the early adopters—have access to AI-enhanced lives, what happens to those left outside? We risk creating a world where human capacity is split: between the optimized and the unoptimized, the enhanced and the merely human. A new class divide, not just of wealth or power, but of experience itself.
Second, there’s the question of agency. The more we delegate choices to machines—what to watch, who to date, what career to pursue, how to parent—the more we lose the practice of decision-making itself. And freedom, Sartre insists, is not just the right to choose, but the work of choosing. Without that work, we drift into passivity, into the illusion that life is something happening to us, rather than something we are making.
Sartre argues that freedom is not an abstract possession; it becomes real only when it engages a world that pushes back, that same coefficient of adversity. A world without resistance is one of hollowed-out freedom: action without weight, choice without consequence.
And finally, there’s the danger of cultural forgetting. Across history, struggle has been the ground of solidarity, art, resistance, and love. It’s where people meet each other as equals, as co-strugglers, as meaning-makers. What happens to a culture that increasingly designs hardship out of its systems? What happens when friction is no longer a shared experience, but an avoidable glitch?
Let It Be Hard
To preserve struggle is not to romanticize suffering, but to preserve the conditions of becoming. Sartre called human life a useless passion—a restless, unfinished striving with no final fulfillment. And it is in this unfinishable project, this uneasy movement, that we find not perfection, but humanity.
It’s easy to imagine the future as a place of pure optimization—where the sharp edges are sanded down, the difficult choices outsourced, the struggles we once saw as formative now framed as unnecessary pain.
And maybe, on the surface, that sounds like progress. But what if the cost of that smooth future is not just the loss of hardship, but the loss of everything that makes us human?
Sartre teaches that we are not born with a ready-made self. We become through action, through resistance, through the uneasy, often painful process of shaping meaning out of a meaningless world. Freedom, in his view, is not comfort; it’s a burden. It’s the constant demand to choose, to err, to take responsibility.
AI does not have to be the enemy. But its deepest risk is not domination—it’s sedation. Not that it will enslave us, but that it will soften us out of existence—dissolve the conditions where freedom, growth, and authenticity can even arise.
If we want to face the age of AI with any ethical clarity, we need to stop asking how much friction we can erase—and start asking how much friction we must preserve. We need to remember that struggle is not something to be designed away, but something to be lived through. It is, perhaps, the last thing we cannot afford to automate.
Childhood, for instance, is hard, and Wang’s fantasy skips past that. It wouldn’t just enhance his kids; it would strip them of the chance to struggle, to become.
Let it be hard.