Between distractions, diversions and the flickering allure of a random suggestion, the major computer platforms aim to keep us glued to our screens come what may. Now some think it is time to escape the tyranny of the digital age. Everyone staring for hours at a screen has had some exposure to "captology" - a word coined by behavioural scientist BJ Fogg to describe the invisible and manipulative way in which technology can persuade and influence those using it.
"There is nothing we can do, like it or not, where we can escape persuasive technology," this Standford University researcher wrote in 2010. All of us experience this "persuasive technology" on a daily basis, whether it's through the endlessly-scrollable Facebook or the autoplay function on Netflix or YouTube, where one video flows seamlessly into another.
"This wasn't a design 'accident', it was created and introduced with the aim of keeping us on a certain platform," says user experience (UX) designer Lenaic Faure. Working with "Designers Ethiques", a French collective seeking to push a socially responsible approach to digital design, Faure has developed a method for assessing whether the attention-grabbing element of an app "is ethically defensible." In the case of YouTube, for example, if you follow the automatic suggestions, "there is a sort of dissonance created between the user's initial aim" of watching a certain video and "what is introduced to try and keep him or her on the platform," he says.
Ultimately the aim is to expose the user to partner advertisements and to better understand their tastes and habits. UX designer Harry Brignull describes such interactions as "dark patterns", defining them as interfaces carefully crafted to trick users into doing things they may not have wanted to do.