Homo Deus: A Brief History of Tomorrow
February 21, 2017
Homo Deus is an almost overwhelmingly erudite journey into humanity's future, and one worth taking.
In his first book, Sapiens: A Brief History of Humankind, Yuval Noah Harari documented the rise of Homo sapiens: how we outlasted and overcame other species of humans and came to rule the Earth. He also began an exploration of what comes next for humanity. In his new book, Homo Deus: A Brief History of Tomorrow, he examines this future in more depth and detail.
Human life on Earth has hitherto been determined by three primary concerns: famine, plague, and war. And while there is still too much of each in the world, these forces no longer preoccupy most of us. If you only pay attention to the nightly news, the world might seem like it's filled with death, destruction, and disaster, with carnage on every street corner. But we have, in fact, mostly gotten these forces under control. And when they do occur, we think of them as political failures rather than natural phenomena:
[W]hen famine, plague or war break out of our control, we feel that somebody must have screwed up, we set up a commission of inquiry, and promise ourselves that next time we'll do better. And it actually works. Such calamities indeed happen less and less often. For the first time in history, more people die today from eating too much than from eating too little; more people die from old age than from infectious diseases; and more people commit suicide than are killed by soldiers, terrorists and criminals combined. In the early twenty-first century, the average human is far more likely to die from bingeing at McDonald's than from drought, Ebola or an Al-Qaeda attack.
To put it more succinctly:
In 2012 about 56 million people died throughout the world; 620,000 of them died due to human violence (war killed 120,000 people, and crime killed another 500,000). In contrast, 800,000 committed suicide, and 1.5 million died of diabetes. Sugar is now more dangerous than gunpowder.
There is a "New Peace" spreading over the world, and it is a peace that powerful governments and corporations rely upon. That is in part because, in a knowledge economy, wars just aren't as profitable for governments and big business as is peace.
Hence, as knowledge became the most important economic resource, the profitability of war declined and wars became increasingly restricted to those parts of the world—such as the Middle East and Central Africa—where the economies are still old-fashioned material-based economies.
Harari himself is a professor in Jerusalem, so he knows this New Peace does not extend to all corners of the Earth. But whereas it once seemed inevitable that territorial wars would break out between neighbors every generation or so, such wars now seem almost inconceivable in most parts of the world. So where do we turn our problem-solving attention now that humanity's three greatest scourges have been largely brought to heel? Some of our largest companies and brightest minds have turned to the project of solving death rather than building devices for visiting it upon people—as they did so well in the last century. Now that we have solved basic upkeep, many are looking for an upgrade.
Now, I think of natural death as a feature of humanity, not a bug. But not everyone sees it that way, especially in Silicon Valley, including Google, which launched a company called Calico in 2013 whose stated aim is to "solve death." It all hinges on the idea that "the life sciences have come to see organisms," including humans, "as biochemical algorithms." Harari's journey through that mental exercise and the (albeit slim) chance that humans can cure death is fascinating in its own right, but the idea of organisms as algorithms is widely accepted and has broad implications for all of our life systems.
You may not agree with the idea that organisms are algorithms, and that giraffes, tomatoes and human beings are just different methods for processing data. But you should know that this is current scientific dogma, and it is changing the world beyond recognition.
Not only individual organisms are seen as data-processing systems, but also entire societies such as beehives, bacteria colonies, forests and human cities. Economists increasingly interpret the economy as a data processing system. Laypeople believe that the economy consists of peasants growing wheat, workers manufacturing clothes, and customers buying bread and underpants. Yet experts see the economy as a mechanism for gathering data about desires and abilities, and turning this data into decisions.
In this view, capitalism and communism, democracy and dictatorship, are just competing data-processing systems. And it was the unique conditions of our recent history, in which individuals held immense economic and military value, that caused systems that valued the individual to become ascendant. It was not God's will or superior morality that caused democracy and capitalism to flourish; they flourished because of a particular set of circumstances on the ground. But what happens when individuals lose that power to automation in both the workplace and warfare? What happens when a growing number of wealthy individuals and corporations decide that "equality is out—immortality is in"? What happens when intelligence decouples from consciousness as artificial intelligence advances? What happens when average humans become militarily and economically useless as algorithms begin to overtake our utility? What happens when algorithms and apps not only do our jobs better than us, but know us better than we know ourselves? Think of facial recognition software or the Fitbit on your wrist. It's likely just the beginning.
Some people use these apps without thinking too deeply about it, but for others this is already an ideology, if not a religion. The Quantified Self movement argues that the self is nothing but mathematical patterns. These patterns are so complex that the human mind has no chance of understanding them. So if you wish to obey the old adage and know thyself, you should not waste your time on philosophy, meditation or psychoanalysis, but rather you should systematically collect biometric data and allow algorithms to analyze them for you and tell you who you are and what you should do. The movement's motto is 'Self-knowledge through numbers'.
What happens, then, when wearable devices increasingly become implanted devices, when the "Internet of things" includes our very selves? What happens when inequality advances to a degree that there are not only economic classes, but biological castes, when there exists a "small and privileged elite of upgraded humans"? What happens when we lose an interest in our own authenticity, and give in to the easy life of automation and biochemical alteration to make our desires align with whatever is most productive and painless? What happens when people become "just another designer product"? What happens when we worship data over all else? What happens when free will is simply no longer practical?
The new technologies of the twenty-first century may thus reverse the humanist revolution, stripping humans of their authority, and empowering non-human algorithms instead.
These are not rhetorical questions. If we're not careful, we may be living in a golden age of individual freedom: a window between an era when we were beholden to concerns of basic subsistence and survival and a future in which we become so preoccupied with our own psychological blueprint that we relinquish control of our lives to networks of algorithmic caretakers that make all our decisions for us. Being flooded with too much information is just as debilitating as having it withheld entirely. And if we trade our inner struggles for instant gratification, and our doubts for better data and a new Big Brother to help us manage it all, we may end up worse off for it:
When we mix the practical ability to engineer our minds with our ignorance of the mental spectrum and with the narrow interests of governments, armies and corporations, we get a recipe for trouble. We may successfully upgrade our bodies and our brains, while losing our minds in the process. Indeed, techno-humanism may end up downgrading humans.
The Matrix looks more and more real all the time. We are doing more than contributing to the flow of data; we are being merged into it.
What I've touched on here doesn't do justice to the amazing breadth and scholarship of Harari's new book. Homo Deus is chilling at times. It will hopefully shake you. But it also prompts a profound questioning of our existence on Earth that can enrich your life and alter what you find important and appreciate within it. Do you have a worldview? Prepare for it to be challenged. Harari doesn't question the future so much as he lays out our current trajectory and asks us to question it—prompting our politicians, economists, and especially scientists to question their emerging data-centric view of the world. He leaves us with three fundamental questions at the end, one of which is: "What's more valuable—intelligence or consciousness?" His book raises both.
Homo Deus is one of the most challenging and rewarding books I've ever read. It is an almost overwhelmingly erudite journey, and one well worth taking.
We have 20 copies available.