The previous post took us to the dystopian neo-noir and cult film Blade Runner:
It basically asks: What makes us human? – and, over the years, the Futures Forum blog has looked at how AI (‘Replicants’?) interacts with us:
Douglas Rushkoff is an American media theorist who has written extensively on these issues:
In a recent interview he made the point that we need to celebrate our humanity – and yet both the most powerful and the loudest voices seem to be very much against humanity:
“Stop seeing humanity as the problem and start seeing humanity as the solution!”
On the one hand, the billionaires complain that we’re too irrational and that “there aren’t enough algorithms to make us as predictable as we should be” when it comes to being able to sell us stuff.
On the other hand, environmentalists claim humans have “screwed up the whole planet”: that they’re a cancer destroying everything…
This blog has looked at the fears of eco-catastrophe – and many of these fears have been voiced by environmentalists who do not necessarily have a positive view of human beings:
What Douglas Rushkoff does is to point out that those at the opposite end of the spectrum – the mega-rich – also have their deep-seated fears and a corresponding negative view of humanity, as he discusses in this podcast:
Douglas Rushkoff: “Survival of the Richest”
Five wealthy investors asked Douglas how to survive environmental collapse. But what they really wanted to know was how to transcend the human world they look down upon.
This week’s Playback gets into the psyche of some big-money overlords — the ones who can’t make it to Mars with Elon, anyway.
In his wildly popular story “Survival of the Richest,” researcher Douglas Rushkoff opens by describing an invitation he received last year to give a keynote speech at a deluxe private resort. Despite his misgivings about offering investment advice to incurious rich people, he went: the speaker’s fee was roughly half his annual professor’s salary.
But instead of the usual audience of wealthy retirees, he was greeted by “five super-wealthy guys — yes, all men — from the upper echelon of the hedge fund world.” On the face of it, they wanted Douglas’s advice on how to escape environmental collapse. But soon they began asking questions like “Is Google really building Ray Kurzweil a home for his brain?” and “How do I maintain authority over my security force after the event?” (“The Event,” meaning “environmental collapse, social unrest, nuclear explosion, unstoppable virus, or Mr. Robot hack that takes everything down.”)
Douglas realized that these one-percenters just shy of the .01 percent really sought an escape from — and reliable protection from — human beings. To these billionaires, regular humans are the enemy: inferior, particularly in their unpredictability and insubordination, to robots and machines. So naturally, Douglas’s advice to take a humanist approach to apocalypse fell on deaf ears. These investors don’t want to invest in community and environment; they want to invest in themselves — in their own power and domination. This raises the question: Will the apocalypse happen to them, or have they already started it for all of us?
Listen to Douglas read his argument (3:00) — plus recordings from a technologist who learned that “fair” products are almost impossible to make — and then chat with host Manoush Zomorodi (16:20). The two get into Douglas’s self-described positioning as “the technology world’s old country doctor,” why preparation is the same as prevention as far as apocalypse goes, how many tech-evangelist billionaires don’t actually know history, and digital technology’s relationship with individualism.
Here’s the original article in full:
And here’s some comment:
Rushkoff has a new book out:
Though created by humans, our technologies, markets, and institutions often contain an antihuman agenda. Douglas Rushkoff, digital theorist and host of the NPR-One podcast Team Human, reveals the dynamics of this antihuman machinery and invites us to remake these aspects of society in ways that foster our humanity.