BETWEEN distractions, diversions and the flickering allure of a random suggestion, the major computer platforms aim to keep us glued to our screens come what may. Now some think it is time to escape the tyranny of the digital age.
Everyone staring for hours at a screen has had some exposure to “captology” – a word coined by behavioural scientist BJ Fogg to describe the invisible and manipulative way in which technology can persuade and influence those using it.
“There is nothing we can do, like it or not, where we can escape persuasive technology,” the Stanford University researcher wrote in 2010.
All of us experience this “persuasive technology” on a daily basis, whether it’s through the endlessly scrollable Facebook or the autoplay function on Netflix or YouTube, where one video flows seamlessly into another.
“This wasn’t a design ‘accident’, it was created and introduced with the aim of keeping us on a certain platform,” says user experience (UX) designer Lenaic Faure.
Working with “Designers Ethiques”, a French collective seeking to push a socially responsible approach to digital design, Faure has developed a method for assessing whether the attention-grabbing element of an app “is ethically defensible.”
In the case of YouTube, for example, if you follow the automatic suggestions, “there is a sort of dissonance created between the user’s initial aim” of watching a certain video and “what is introduced to try and keep him or her on the platform,” he says.
Ultimately, the aim is to expose users to partner advertisements and to better understand their tastes and habits.
Dark patterns
UX designer Harry Brignull describes such interactions as “dark patterns”, defining them as interfaces that have been carefully crafted to trick users into doing things they may not have wanted to do.
“It describes this kind of design pattern – kind of evil, manipulative and deceptive,” he said, adding that the aim was to “make you do what the developers want you to do.”
One example is the newly introduced EU data protection rules, which require websites to obtain users’ consent before collecting their valuable personal data.
“You can make it very, very easy to make people click ‘OK’ but how can you opt out, how can you say ‘no’?”
Even for him, as a professional, it can take at least a minute to find out how to refuse.
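To make that asymmetry concrete, here is a minimal, purely illustrative sketch in TypeScript. It models a hypothetical consent banner in which accepting takes a single click while refusing is routed through several screens; the labels and step counts are invented for illustration and are not taken from any real site.

```typescript
// Illustrative model of the consent-flow asymmetry Brignull describes.
// All labels and click counts below are hypothetical.

type ConsentStep = { label: string; clicksRequired: number };

// Accepting is one prominent click...
const acceptPath: ConsentStep[] = [{ label: "OK", clicksRequired: 1 }];

// ...while refusing is buried behind nested menus.
const refusePath: ConsentStep[] = [
  { label: "Manage options", clicksRequired: 1 },
  { label: "Open 'legitimate interest' tab", clicksRequired: 1 },
  { label: "Object to each purpose individually", clicksRequired: 8 },
  { label: "Confirm choices", clicksRequired: 1 },
];

// Total interaction cost of a path, measured in clicks.
const clicks = (path: ConsentStep[]): number =>
  path.reduce((total, step) => total + step.clicksRequired, 0);

console.log(`Clicks to accept: ${clicks(acceptPath)}`); // 1
console.log(`Clicks to refuse: ${clicks(refusePath)}`); // 11
```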
In today’s digital world, attention is one of the most valuable resources there is.
“The digital economy is based upon competition to consume humans’ attention. This competition has existed for a long time but the current generation of tools for consuming attention is far more effective than previous generations,” said David SH Rosenthal in a Pew Research Center study in April 2018.
“Economies of scale and network effects have placed control of these tools in a very small number of exceptionally powerful companies. These companies are driven by the need to consume more and more of the available attention to maximise profit.”
Internet as tool, not trap
Faure suggests that for a design to be considered responsible, the objective of the developer and that of the user must largely align, amounting to the straightforward delivery of the information the user came for.
But if the design modifies or manipulates the user, directing them towards something they did not ask for, that should then be classed as irresponsible, he says.
French engineering student Tim Krief has come up with a browser extension called Minimal, which offers users a “less attention-grabbing internet experience” on the grounds that the internet “should be a tool, not a trap”.
The extension aims to mask the more “harmful” suggestions channelled through the major platforms.
An open source project, the extension should “make users more aware about such issues”, Krief says.
“We don’t attribute enough importance to this attention economy because it seems invisible.”
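To give a sense of how such an extension can work in practice, here is a short, hypothetical TypeScript content-script sketch in the spirit of Minimal. The CSS selectors and logic are assumptions made for illustration; they are not taken from Krief’s actual open source code.

```typescript
// content-script.ts -- a sketch of how an extension *could* mask
// attention-grabbing suggestion modules. Selectors are illustrative
// guesses for YouTube's recommendation areas and may change at any time.

const DISTRACTING_SELECTORS = [
  "ytd-watch-next-secondary-results-renderer", // "Up next" sidebar (assumed)
  "ytd-rich-grid-renderer",                    // home-page grid (assumed)
];

function hideSuggestions(root: ParentNode): void {
  for (const selector of DISTRACTING_SELECTORS) {
    root.querySelectorAll<HTMLElement>(selector).forEach((el) => {
      el.style.display = "none"; // mask rather than delete, so it stays reversible
    });
  }
}

// Hide anything already on the page, then keep watching: platforms render
// content dynamically, so new suggestion blocks appear long after load.
hideSuggestions(document);
new MutationObserver(() => hideSuggestions(document)).observe(document.body, {
  childList: true,
  subtree: true,
});
```

Hiding elements with CSS rather than removing them keeps the change easy to undo, in keeping with the idea of the internet as a tool the user controls rather than a trap.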
But is this enough to fight the attention-grabbing tactics of powerful internet giants?
Brignull believes some designers can bring about change but are likely to be restricted by the wider strategy of the company they work for.
“I think they will have some impact, a little impact, but if they work in companies, those companies have a strategy . . . so it can be very difficult to have an impact on the companies themselves.”
Isabelle Falque-Pierrotin, former head of the French Data Protection Authority (CNIL) also believes that design can be used to effect positive change.
“Design could be another defence whose firepower could be used against making individuals the ‘playthings’” of developers, she said in January in a presentation on the “attention economy.”
Faure says he has seen a growing demand for an ethical approach to digital design, and he thinks his method could help “bring better understanding between users of services and the people who design them”.
This type of initiative “could be a way to tell the big platforms that such persuasive designs really bother us”, Krief says.