The ethics of technology is not a competition. But if aliens happened to descend upon our planet right this moment, Arrival-style, demanding to speak with our top tech ethicist, Douglas Rushkoff would be a reasonable option.
Rushkoff — a prolific writer, broadcaster, and filmmaker once named by MIT as “one of the world’s ten leading intellectuals” — recently published a new book, Team Human, that is certainly a strong contender for tech ethics ‘book of the year’ thus far. Team Human is both an intellectual history of the technologies (including social technologies) of the past millennium or two and an effective rallying cry for humanity, at a time when many of us have rightly become far too cynical to stomach most rallying cries on most topics.
If nothing else, you’ll see below that Rushkoff wins, hands down, the competition for most Biblical references in one of my TechCrunch interviews thus far. He ends our conversation, however, echoing Felix Adler, the late 19th-century founder of the Ethical Culture movement — Adler, like me, was essentially secular clergy — who famously said, “the place where people meet to seek the highest is holy ground.”
I don’t know if readers of this piece will have a transcendent experience reading it, secular or otherwise, but if you want to spend meaningful time with one of the world’s greatest living thinkers on technology and ethics, please proceed below.
Reading time for this article is 24 minutes (6,050 words)
Greg Epstein: I loved Team Human and I’m excited for TechCrunch readers to learn about it. First, how would you summarize the argument?
Douglas Rushkoff: I see [the book] less as an argument than as an experience. I’m from this old-fashioned author community that thinks of books less as being about whatever data or information might be in them and more about what happens to you. A book is almost more like a poem or a piece of art, or a movie that takes you through an experience. The experience I’m trying to convey is a celebration of being human — to reacquaint people with their essential human dignity.
But really, the book is arguing that we too easily reverse the figure and ground between us and our tools, or us and our institutions. Then we end up trying to conform to them rather than having them serve us. This time out, it might be particularly dangerous, since we’re empowering technologies with the ability to search out and leverage human exploits. These are powerful tools. It’s not just some advertising agency trying something and then retooling every quarter. It’s algorithms trying things and retooling in real time to activate our brainstem and thwart our higher processes.