
“Stop seeing humanity as the problem and start seeing human beings as the solution!”

  • by JW

The previous post took us to the dystopian neo-noir and cult film Blade Runner:

2019: Blade Runner: “The future has arrived: it’s just not evenly distributed yet.”


The film essentially asks: What makes us human? – and, over the years, the Futures Forum blog has looked at how AI (‘Replicants’?) interacts with us:

Futures Forum: Artificial Intelligence: how can technology help humanity?

Futures Forum: Power-relations and control > “Who will own the future?” > on Artificial Intelligence, Universal Basic Income and the potential threats from automation


Douglas Rushkoff is an American media theorist who has written extensively on these issues:

“Team Human vs. Team AI,” Strategy+Business, February 5, 2019.


In a recent interview he made the point that we need to celebrate our humanity – and yet both the most powerful and the loudest voices seem to be very much against humanity:

“Stop seeing humanity as the problem and start seeing humanity as the solution!”

On the one hand, the billionaires complain that we’re too irrational and that “there aren’t enough algorithms to make us as predictable as we should be” when it comes to being able to sell us stuff.

On the other hand, environmentalists claim humans have “screwed up the whole planet”: that they’re a cancer destroying everything…

Douglas Rushkoff: “Survival of the Richest”


This blog has looked at the fears of eco-catastrophe – and many of these fears have been voiced by environmentalists who do not necessarily have a positive view of human beings:

Eco-anxiety: between doom and denial

Media hype and mental health


What Douglas Rushkoff does is to point out that those at the opposite end of the spectrum – the mega-rich – also have their deep-seated fears and a corresponding negative view of humanity, as he discusses in this podcast:


Douglas Rushkoff: “Survival of the Richest”

Five wealthy investors asked Douglas how to survive environmental collapse. But what they really wanted to know was how to transcend the human world they look down upon.

This week’s Playback gets into the psyche of some big-money overlords — the ones who can’t make it to Mars with Elon, anyway.

In his wildly popular story “Survival of the Richest,” researcher Douglas Rushkoff starts off writing about an invitation he received last year to give a keynote speech at a deluxe private resort. Despite his misgivings about offering investment advice to incurious rich people, he went: The speaker’s fee was roughly half his annual professor’s salary.

But instead of the usual audience of wealthy retirees, he was greeted by “five super-wealthy guys — yes, all men — from the upper echelon of the hedge fund world.” On the face of it, they wanted Douglas’s advice on how to escape environmental collapse. But soon they began asking questions like “Is Google really building Ray Kurzweil a home for his brain?” and “How do I maintain authority over my security force after the Event?” (“The Event,” meaning “environmental collapse, social unrest, nuclear explosion, unstoppable virus, or Mr. Robot hack that takes everything down.”)

Douglas realized that these one-percenters – just shy of the .01 percent – really sought an escape from, and reliable protection against, human beings. To these billionaires, regular humans are the enemy: inferior to robots and machines, particularly in their unpredictability and insubordination. So naturally, Douglas’s advice to focus on a humanist approach to apocalypse fell on deaf ears. These investors don’t want to invest in community and environment; they want to invest in themselves – in their own power and domination. This raises the question: Will the apocalypse happen to them, or have they already started it for all of us?

Listen to Douglas read his argument (3:00) – plus recordings from a technologist who learned that “fair” products are almost impossible to make – and then chat with host Manoush Zomorodi (16:20). The two get into Douglas’s self-described positioning as “the technology world’s old country doctor,” why preparation is the same as prevention as far as apocalypse goes, why many tech-evangelist billionaires don’t actually know history, and digital technology’s relationship with individualism.

Douglas Rushkoff: “Survival of the Richest”


Here’s the original article in full:

Survival of the Richest | The wealthy are plotting to leave us behind

How tech’s richest plan to save themselves after the apocalypse | Guardian


And here’s some comment:

Survival of the Richest:

We asked psychologists why so many rich people think the apocalypse is coming


Rushkoff has a new book out:

Though created by humans, our technologies, markets, and institutions often contain an antihuman agenda. Douglas Rushkoff, digital theorist and host of the NPR-One podcast Team Human, reveals the dynamics of this antihuman machinery and invites us to remake these aspects of society in ways that foster our humanity.

How to be “Team Human” in the digital future | Douglas Rushkoff | YouTube