The Designers' Oath: James Williams wants an ethics that respects human attention, in the digital age


One of the axioms that rattles around A/UK is "whatever you pay attention to, grows". It's a statement from the wisdom traditions, particularly those which say that observing how our minds operate, and frame reality, is the most important work a human can embark on ("mindfulness" is the general term used).

But we've recently come upon a number of studies and books which remind us that the power we have to direct our attention is something mighty forces are battling to subvert, or at least influence.

For a new politics, this fact is incredibly important. The angst about post-truth and fake news could be the beginning of a realisation that we should be aware and active about how our shared reality is made... Or it could be the source of a corrosive cynicism (even nihilism) about the world, leading people to give up on their agency and citizenship. And we know what these kinds of vacuums usher in. 

So to be self-conscious about what we attend to, and who seeks our attention, is a vital factor in our coming political life. This is the lesson from the new book by James Williams, Stand Out of Our Light (amazingly, his publisher provides a free download copy here - we also have a link here). An excerpt from a review in the Financial Times:

The internet has helped turn the amateur art of persuasion into the professional science of manipulation that now threatens our “inward domain of consciousness”, as John Stuart Mill described the frontline of liberty. “The liberation of human attention may be the defining moral and political struggle of our time,” Williams writes. “We therefore have an obligation to rewire this system of intelligent, adversarial persuasion before it rewires us.”

Williams recalls his days working as a strategist for Google with some affection, admiring the company’s ambition to organise the world’s information and make it universally accessible and useful. But one day he had an epiphany after being distracted by the technologies that were supposed to be making him more productive.

“It felt like something disintegrating, decohering: as though the floor was crumbling under my feet,” he writes. His realisation was that a frightening misalignment had emerged between the goals we set ourselves and the goals our technologies direct us towards. The metrics that drove the tech industry — number of views, time on site, number of clicks, total conversions — seemed petty and perverse.

Yet these are the attention-grabbing mechanisms that we casually submit ourselves to in our digital lives. As Williams describes in this LSE blog interview, he began to realise the implications of his trade:

Rather than view lengthy scrolls down Facebook or Twitter feeds as benign detours from our central engagements with technology, he argues that these processes of "attentional capture and exploitation" have come to define them, jeopardising our ability to achieve our intentions and goals: "There was more technology in my life than ever before, but it felt harder than ever to do the things I wanted to do".

Williams - who is now studying philosophy at Oxford - strongly suggests that technologists, engineers and designers come up with their own version of the Hippocratic oath ("do no harm", the implicit axiom for all doctors and healers). He calls it "the Designers' Oath":

As someone who shapes the lives of others, I promise to:

  • Care genuinely about their success;
  • Understand their intentions, goals, and values as completely as possible;
  • Align my projects and actions with their intentions, goals, and values;
  • Respect their dignity, attention, and freedom, and never use their own weaknesses against them;
  • Measure the full effect of my projects on their lives, and not just those effects that are important to me;
  • Communicate clearly, honestly, and frequently my intentions and methods; and
  • Promote their ability to direct their own lives by encouraging reflection on their own values, goals, and intentions.

In terms of what we are beginning to understand about the attentional strategies deployed on us in the age of Trump, Brexit, Putin and Cambridge Analytica, these are lofty goals. But we are all for the creative classes' growing awareness of their own moral agenda - becoming "soulitarians", so to speak - and will watch the take-up of Williams' ethics with interest. 

Williams is part of a rising wave of digital ethicists and even moralists. We have blogged here on François Chollet and his urging of AI designers to respect human agency. We have also profiled Perspectiva's Attention project (current version here), and would point your attention to Tristan Harris's Center for Humane Technology.

And here's an extract from Williams' book in the Guardian's Long Read section.