Your value is not about utility
Self-improvement of the transhumanist sort requires that we adopt an entirely functional understanding of who and what we are: All of our abilities can be improved upon and all of our parts are replaceable. Upgradable.
The quirks that make us human are interpreted, instead, as faults that impede our productivity and progress. Embracing those flaws, as humanists tend to do, is judged by the transhumanists as a form of nostalgia, and a dangerously romantic misinterpretation of our savage past as a purer state of being. Nature and biology are not mysteries to embrace but limits to transcend.

This transhumanist mindset is, in fact, taking hold. We can see it in the way we bring digital technologies closer and closer. The screen is encroaching on the eye, from TVs to computer monitors to phone screens to smartwatches to VR goggles to tiny LEDs that project images onto the retina to neural implants that communicate directly with the optic nerve.
With each leap in human–machine intimacy, resolution increases, and our utility value is improved along some measurable metric. This is the mindset encouraged by wristbands that count our heartbeats and footsteps under the pretense of improved health or life extension. Health, happiness, and humanity itself are all reducible to data points and subject to optimization.
We are all just numbers: the quantified self.
Like a music recording that can be reduced to code and stored in a file, the quantified human can also be reduced to bits, replicated infinitely, uploaded to the cloud, or installed in a robot. But only the metrics we choose to follow are recorded and translated. Those we don’t value, or don’t even know about, are discarded in the new model.
Improved longevity, supercognition, or military prowess all sound promising until we consider what is being left behind, whose values are being expressed (and whose aren’t), and how these choices change the larger systems of which we are a part. Just as life-saving antibiotics also strengthen bacteria and weaken the collective immune system, or as steroids improve short-term performance at the expense of long-term health, there are trade-offs. We—or the companies selling the improvements—are actively picking which features of humanity to enhance and which to suppress or ignore. In amplifying an individual’s brainpower, we may inadvertently disable some of their ability to connect with others or achieve organismic resonance. What of racial diversity, gender fluidity, sexual orientation, or body type? The human traits that are not favored by the market will surely be abandoned.
Could that happen to a civilization as supposedly enlightened as our own? Our track record suggests it will.
The internet’s tremendous social and intellectual potential was surrendered to short-term market priorities, turning a human-centered medium into a platform for manipulation, surveillance, and extraction. The more we see the human being as a technology to be enhanced, the greater the danger of applying this same market ethos to people, and extending our utility value at the expense of others. Life extension becomes the last-ditch attempt of the market to increase our available timeline as consumers—and consumers willing to spend anything for that extra few years of longevity.
Sure, many of us would accept a digital implant if it really worked as advertised. Who wouldn’t want some painless enhancements, free of side effects? Or the choice of when or whether to die? Besides, participation in the ever-changing economy requires some acquiescence to technology, from streetcars and eyeglasses to elevators and computers.
But our technological investments come with strings attached. The deals are not honest, and they never have been. Companies change their user agreements. Or they sell the printers at a loss and then overcharge us for the ink cartridges. The technologies we are currently bringing into our lives turn us into always-on customers—more like subscribers than purchasers, never really owning or fully controlling what we’ve bought. Operating system upgrades render our hardware obsolete, forcing us to buy new equipment. How long until that chip in my brain, and the neurons that grew around it, are obsolete?
It’s not that wanting to improve ourselves, even with seemingly invasive technology, is so wrong. It’s that we humans should be making active choices about what it is we want to do to ourselves, rather than letting the machines, or the markets propelling them, decide for us.