Harari: Owning your data (and your mind) will beat the techno-totalitarians. Us: How local, and how mindful, must this be?
Yuval Noah Harari has rightly come to global fame for his clear-sightedness about human development - past, present and future. His core point is that our human capacity for creating powerful collective stories, together with our curiosity about the world, has driven us to the point where the fate of the planet is entirely in our hands.
Harari is also admirable for the way he drives his readers to consider their own ethical responsibility for the exercise of this immense power. We need better stories, guiding better sciences, or we're in planetary trouble - not just from our devices, but from our own troubled hearts and minds, and our unaddressed vices (as Pat Kane's review of Homo Deus notes).
This Atlantic magazine excerpt from his new blockbuster, 21 Lessons for the 21st Century, develops some of Harari's earlier ideas from Homo Deus and Sapiens. In particular, he focuses here on the possibility that, by combining big data with machine intelligence, we may have invented machines that know us better than we know ourselves.
Not only does this threaten our liberal idea of individualism, but it could also be the ultimate tool for totalitarians. They will increasingly be able to measure our responses and bio-indicators as we move through our streets, and deal with dissidence or revolt (or even mild mental non-conformity) before it can remotely arise. (China's Social Credit System is the usual bogeyman here - but Harari identifies many Western examples too.)
However, what we're interested in here are Harari's suggestions as to how we might stave off such a future. One idea is reminiscent of a D.A. post earlier this week: Tomas Bjorkman's hope that a new wave of "contemplative spaces" can evolve human consciousness to a higher level, making us capable of meeting current complexities. Harari writes:
We need to place a much higher priority on understanding how the human mind works—particularly how our own wisdom and compassion can be cultivated. If we invest too much in AI and too little in developing the human mind, the very sophisticated artificial intelligence of computers might serve only to empower the natural stupidity of humans, and to nurture our worst (but also, perhaps, most powerful) impulses, among them greed and hatred. To avoid such an outcome, for every dollar and every minute we invest in improving AI, we would be wise to invest a dollar and a minute in exploring and developing human consciousness.
We can only concur - adult psychological development, as an input into the strengthening of citizenship, has been a major theme of A/UK since its beginning.
Harari's second point is also really interesting:
The race to accumulate data is already on, and is currently headed by giants such as Google and Facebook and, in China, Baidu and Tencent. So far, many of these companies have acted as “attention merchants”—they capture our attention by providing us with free information, services, and entertainment, and then they resell our attention to advertisers. Yet their true business isn’t merely selling ads. Rather, by capturing our attention they manage to accumulate immense amounts of data about us, which are worth more than any advertising revenue. We aren’t their customers—we are their product.
Ordinary people will find it very difficult to resist this process. At present, many of us are happy to give away our most valuable asset—our personal data—in exchange for free email services and funny cat videos. But if, later on, ordinary people decide to try to block the flow of data, they are likely to have trouble doing so, especially as they may have come to rely on the network to help them make decisions, and even for their health and physical survival.
Nationalization of data by governments could offer one solution; it would certainly curb the power of big corporations. But history suggests that we are not necessarily better off in the hands of overmighty governments. So we had better call upon our scientists, our philosophers, our lawyers, and even our poets to turn their attention to this big question: How do you regulate the ownership of data?
...If you dislike the idea of living in a digital dictatorship or some similarly degraded form of society—then the most important contribution you can make is to find ways to prevent too much data from being concentrated in too few hands, and also find ways to keep distributed data processing more efficient than centralized data processing. These will not be easy tasks. But achieving them may be the best safeguard of democracy.
What are the new levels of governance, democracy and control that live in the space between ourselves, our networks and our localities, and the macro-structures of state and corporation? This question is also one we have been exploring and testing in A/UK from the very start. We've often related it to land, or energy, or culture. But we're just beginning to relate it to data, and to the structures that will help us "take back control" of that.
For a clue as to our investigations, check out the discussions of Indra Adnan (A/UK co-initiator) around the Open Coop event earlier this year. See particularly this blog, which connects our interest in a more feminised politics with the operations of blockchain technology, particularly the Holochain platform.
Holochain has an explicit agenda to re-decentralise the internet, returning the processing and holding of information to individuals and communities. This is precisely about turning away from the vast, centralised data-processing corporations that Harari is so plainly warning us about in his recent work. But we are as interested as Harari in how mindfulness and "wokeness" can guide structural reform and innovation.
We'd love Yuval to engage with the work of the Alternative UK - we'll be reaching out in the next few weeks.