Technocracy and the prophet of a dystopic future

The futurist philosopher Yuval Noah Harari worries about a lot.

He worries that Silicon Valley is undermining democracy and ushering in a dystopian hellscape in which voting is obsolete. (https://www.nytimes.com/2018/11/09/business/yuval-noah-harari-silicon-valley.html)

He worries that by creating powerful influence machines to control billions of minds, the big tech companies are destroying the idea of a sovereign individual with free will.

He worries that because the technological revolution’s work requires so few laborers, Silicon Valley is creating a tiny ruling class and a teeming, furious “useless class.”

But lately, Mr. Harari is anxious about something much more personal: If this is his harrowing warning, then why do Silicon Valley C.E.O.s love him so?

“One possibility is that my message is not threatening to them, and so they embrace it?” a puzzled Mr. Harari said one afternoon in October. “For me, that’s more worrying. Maybe I’m missing something?”

When Mr. Harari toured the Bay Area this fall to promote his latest book, the reception was incongruously joyful. Reed Hastings, the chief executive of Netflix, threw him a dinner party. The leaders of X, Alphabet’s secretive research division, invited Mr. Harari over. Bill Gates reviewed the book (“Fascinating” and “such a stimulating writer”) in The New York Times.

“I’m interested in how Silicon Valley can be so infatuated with Yuval, which they are — it’s insane he’s so popular, they’re all inviting him to campus — yet what Yuval is saying undermines the premise of the advertising- and engagement-based model of their products,” said Tristan Harris, Google’s former in-house design ethicist and the co-founder of the Center for Humane Technology.

Part of the reason might be that Silicon Valley, at a certain level, is not optimistic on the future of democracy. The more of a mess Washington becomes, the more interested the tech world is in creating something else, and it might not look like elected representation. Rank-and-file coders have long been wary of regulation and curious about alternative forms of government. A separatist streak runs through the place: Venture capitalists periodically call for California to secede or shatter, or for the creation of corporate nation-states. And this summer, Mark Zuckerberg, who has recommended Mr. Harari to his book club, acknowledged a fixation with the autocrat Caesar Augustus. “Basically,” Mr. Zuckerberg told The New Yorker, “through a really harsh approach, he established 200 years of world peace.”

Mr. Harari, thinking about all this, puts it this way: “Utopia and dystopia depends on your values.”

Mr. Harari, who has a Ph.D. from Oxford, is a 42-year-old Israeli philosopher and a history professor at Hebrew University of Jerusalem. The story of his current fame begins in 2011, when he published a book of notable ambition: to survey the whole of human existence. “Sapiens: A Brief History of Humankind,” first released in Hebrew, did not break new ground in terms of historical research. Nor did its premise — that humans are animals and our dominance is an accident — seem a likely commercial hit. But the casual tone and smooth way Mr. Harari tied together existing knowledge across fields made it a deeply pleasing read, even as the tome ended on the notion that the process of human evolution might be over. Translated into English in 2014, the book went on to sell more than eight million copies and made Mr. Harari a celebrity intellectual.

He followed up with “Homo Deus: A Brief History of Tomorrow,” which outlined his vision of what comes after human evolution. In it, he describes Dataism, a new faith based around the power of algorithms. Mr. Harari’s future is one in which big data is worshiped, artificial intelligence surpasses human intelligence, and some humans develop Godlike abilities.

Now, he has written a book about the present and how it could lead to that future: “21 Lessons for the 21st Century.” It is meant to be read as a series of warnings. His recent TED Talk was called “Why fascism is so tempting — and how your data could power it.”

His prophecies might have made him a Cassandra in Silicon Valley, or at the very least an unwelcome presence. Instead, he has had to reconcile himself to the locals’ strange delight. “If you make people start thinking far more deeply and seriously about these issues,” he told me, sounding weary, “some of the things they will think about might not be what you want them to think about.”

Mr. Harari agreed to let me tag along for a few days on his travels through the Valley, and one afternoon in September, I waited for him outside X’s offices, in Mountain View, while he spoke to the Alphabet employees inside. After a while, he emerged: a shy, thin, bespectacled man with a dusting of dark hair. Mr. Harari has a sort of owlish demeanor, in that he looks wise and also does not move his body very much, even while glancing to the side. His face is not particularly expressive, with the exception of one rogue eyebrow. When you catch his eye, there is a wary look — like he wants to know if you, too, understand exactly how bad the world is about to get.

At the Alphabet talk, Mr. Harari had been accompanied by his publisher. They said that the younger employees had expressed concern about whether their work was contributing to a less free society, while the executives generally thought their impact was positive.

Some workers had tried to predict how well humans would adapt to large technological change based on how they have responded to small shifts, like a new version of Gmail. Mr. Harari told them to think more starkly: If there isn’t a major policy intervention, most humans probably will not adapt at all.

It made him sad, he told me, to see people build things that destroy their own societies, but he works every day to maintain an academic distance and remind himself that humans are just animals. “Part of it is really coming from seeing humans as apes, that this is how they behave,” he said, adding, “They’re chimpanzees. They’re sapiens. This is what they do.”

He was slouching a little. Socializing exhausts him.

As we boarded the black gull-wing Tesla Mr. Harari had rented for his visit, he brought up Aldous Huxley. Generations have been horrified by his novel “Brave New World,” which depicts a regime of emotion control and painless consumption. Readers who encounter the book today, Mr. Harari said, often think it sounds great. “Everything is so nice, and in that way it is an intellectually disturbing book because you’re really hard-pressed to explain what’s wrong with it,” he said. “And you do get today a vision coming out of some people in Silicon Valley which goes in that direction.”

An Alphabet media relations manager later reached out to Mr. Harari’s team to tell him to tell me that the visit to X was not allowed to be part of this story. The request confused and then amused Mr. Harari. It is interesting, he said, that unlike politicians, tech companies do not need a free press, since they already control the means of message distribution.

He said he had resigned himself to tech executives’ global reign, pointing out how much worse the politicians are. “I’ve met a number of these high-tech giants, and generally they’re good people,” he said. “They’re not Attila the Hun. In the lottery of human leaders, you could get far worse.”

Some of his tech fans, he thinks, come to him out of anxiety. “Some may be very frightened of the impact of what they are doing,” Mr. Harari said.

Still, their enthusiastic embrace of his work makes him uncomfortable. “It’s just a rule of thumb in history that if you are so much coddled by the elites it must mean that you don’t want to frighten them,” Mr. Harari said. “They can absorb you. You can become the intellectual entertainment.”

C.E.O. testimonials to Mr. Harari’s acumen are indeed not hard to come by. “I’m drawn to Yuval for his clarity of thought,” Jack Dorsey, the head of Twitter and Square, wrote in an email, going on to praise a particular chapter on meditation.

And Mr. Hastings wrote: “Yuval’s the anti-Silicon Valley persona — he doesn’t carry a phone and he spends a lot of time contemplating while off the grid. We see in him who we wish we were.” He added, “His thinking on A.I. and biotech in his new book pushes our understanding of the dramas to unfold.”

At the dinner Mr. Hastings co-hosted, academics and industry leaders debated the dangers of data collection, and to what degree longevity therapies will extend the human life span. (Mr. Harari has written that the ruling class will vastly outlive the useless.) “That evening was small, but could be magnified to symbolize his impact in the heart of Silicon Valley,” said Dr. Fei-Fei Li, an artificial intelligence expert who pushed internally at Google to keep secret the company’s efforts to process military drone footage for the Pentagon. “His book has that ability to bring these people together at a table, and that is his contribution.”

A few nights earlier, Mr. Harari spoke to a sold-out theater of 3,500 in San Francisco. One ticket-holder walking in, an older man, told me it was brave and honest for Mr. Harari to use the term “useless class.”

The author was paired for discussion with the prolific intellectual Sam Harris, who strode onstage in a gray suit and well-starched white button-down. Mr. Harari was less at ease, in a loose suit that crumpled around him, his hands clasped in his lap as he sat deep in his chair. But as he spoke about meditation — Mr. Harari spends two hours each day and two months each year in silence — he became commanding. In a region where self-optimization is paramount and meditation is a competitive sport, Mr. Harari’s devotion confers hero status.

He told the audience that free will is an illusion, and that human rights are just a story we tell ourselves. Political parties, he said, might not make sense anymore. He went on to argue that the liberal world order has relied on fictions like “the customer is always right” and “follow your heart,” and that these ideas no longer work in the age of artificial intelligence, when hearts can be manipulated at scale.

Everyone in Silicon Valley is focused on building the future, Mr. Harari continued, while most of the world’s people are not even needed enough to be exploited. “Now you increasingly feel that there are all these elites that just don’t need me,” he said. “And it’s much worse to be irrelevant than to be exploited.”

The useless class he describes is uniquely vulnerable. “If a century ago you mounted a revolution against exploitation, you knew that when bad comes to worse, they can’t shoot all of us because they need us,” he said, citing army service and factory work.

Now it is becoming less clear why the ruling elite would not just kill the new useless class. “You’re totally expendable,” he told the audience.

This, Mr. Harari told me later, is why Silicon Valley is so excited about the concept of universal basic income, or stipends paid to people regardless of whether they work. The message is: “We don’t need you. But we are nice, so we’ll take care of you.”

On Sept. 14, he published an essay in The Guardian assailing another old trope — that “the voter knows best.”

“If humans are hackable animals, and if our choices and opinions don’t reflect our free will, what should the point of politics be?” he wrote. “How do you live when you realize … that your heart might be a government agent, that your amygdala might be working for Putin, and that the next thought that emerges in your mind might well be the result of some algorithm that knows you better than you know yourself? These are the most interesting questions humanity now faces.”
