When technology is designed with empathy in mind: Exo-Gloves and Deep Empathy
In our recent event on The Politics of Immersive Technology (VR and AR), we found that many technologists wanted to respond to the charge that their devices reduce our powers of empathy by isolating us in a private, narcissistic world. Our speakers stressed that high-tech could be a generator of empathy - for example, VR and AR putting the player in the position of marginalised or oppressed members of society.
The "empathy" agenda seems to be spreading beyond the virtual into other tech areas. Atlas of the Future recently posted about the Exo-Glove Poly (see video below, from Tokyo). The device was developed (says the Atlas) "through a unique and inspiring cooperation of students with disabled persons. The team hopes that more people with disability will be able to live a better independent life, but most of all, the goal of the SNU Biorobotics Lab is to foster and educate innovative and empathetic researchers to become agents of change for the future. Cho plans to commercialise the product by the end of 2017. You can get more information here."
Artificial intelligence is often portrayed as the enemy of empathy between sentient beings - replacing human cognitive skills with automated ones. Or (worse) simulating emotion - as in the responses of "helper" AIs like Apple's Siri or Amazon's Alexa - and thereby mining our interactions for the benefit of advertisers and states.
But like any tool at the service of human imagination, AI doesn't have to be used that way. MIT Media Lab's Deep Empathy project deploys its learning machines to give people in the comfortable West a sense of what a conflict like Syria's would be like - if it were inflicted on their familiar cities and surroundings. See the transformation of the image below, and try it out further here:
From the Deep Empathy blurb:
Around the world, 50 million children have migrated across borders or been forcibly displaced within their own countries. In Syria alone, the brutal six-year-old war has affected more than 13.5 million people and 80% of the country's children—8.4 million young lives shattered by violence and fear. Hundreds of thousands of people have been displaced and their homes destroyed.
Can you conceptualize these numbers? People provoke a response that statistics can't. And technologists—through tools like AI—have opportunities to help people see things differently.
Deep Empathy utilizes deep learning to learn the characteristics of Syrian neighborhoods affected by conflict, and then simulates how cities around the world would look in the midst of a similar conflict. Can this approach -- familiar in a range of artistic applications -- help us to see recognizable elements of our lives through the lens of those experiencing vastly different circumstances, theoretically a world away? And by helping an AI learn empathy, can this AI teach us to care?
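The blurb doesn't spell out Deep Empathy's exact pipeline, but the transformation it describes - learning the visual characteristics of one set of images and applying them to another - is closely related to neural style transfer. The core of that technique can be sketched in a few lines: a "style" loss compares texture statistics (Gram matrices of CNN feature maps) between the generated image and the style source, while a "content" loss keeps the generated image's layout recognisable. The toy feature maps below stand in for activations from a pretrained network such as VGG; this is an illustrative sketch of the general technique, not the project's actual code.

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, height*width) feature map: captures
    which feature channels co-occur, i.e. the image's texture or 'style'."""
    c, n = features.shape
    return features @ features.T / (c * n)

def style_loss(gen_feats, style_feats):
    """Mean squared difference between Gram matrices: how far the generated
    image's texture statistics are from the style image's (e.g. a
    conflict-damaged neighbourhood)."""
    return np.mean((gram_matrix(gen_feats) - gram_matrix(style_feats)) ** 2)

def content_loss(gen_feats, content_feats):
    """Mean squared difference between raw feature maps: preserves the
    recognisable layout of the original (content) image - your own city."""
    return np.mean((gen_feats - content_feats) ** 2)

# Toy feature maps; in practice these come from a pretrained CNN.
rng = np.random.default_rng(0)
content = rng.standard_normal((8, 16))  # 8 channels, 16 spatial positions
style = rng.standard_normal((8, 16))
gen = content.copy()                    # initialise output at the content image

# A perfect copy of the content has zero content loss but non-zero style
# loss; optimisation then nudges `gen` to lower the style loss while
# keeping the content loss small - blending the familiar with the foreign.
print(content_loss(gen, content))  # 0.0
print(style_loss(gen, style) > 0)  # True
```

In a full system, the two losses are weighted and minimised together by gradient descent on the pixels of `gen` (or by a feed-forward network trained to do the same), which is what produces the transformed cityscapes the project shows.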