How should ethics be considered when practitioners design technologies for human interaction?
Complex socio-technical systems, including online commerce, self-driving cars, cyber-physical systems such as smart grids, and algorithmic decision-making systems used for credit applications, insurance eligibility and employment screening, have wide-ranging impacts on society. Practitioners from a diverse range of disciplines, including user experience (UX) designers, software developers, product managers and data scientists, play key roles in determining how ethical considerations shape the design of these pervasive systems.
“Dark patterns” of user experience design
A team of graduate and undergraduate researchers led by Colin Gray, assistant professor of computer graphics technology and director of Purdue’s UX Pedagogy and Practice (UXP2) Lab, is studying practitioners’ experiences and attitudes around ethics. Their aim is to develop methods and approaches that increase practitioners’ everyday ethical awareness. They will also develop educational materials to help future designers and technologists become more effective “everyday ethicists.”
Their research began in 2017 as a study of “dark patterns,” the name given to UX design strategies in which designers use their knowledge of human behavior and the desires of end users to intentionally manipulate, steer or “nudge” users into actions that are not in their best interest.
“It’s clear in the engineering and technology literature that there is an ethical responsibility inherent in design,” Gray said. “When building a bridge, for example, engineers know they must make it strong enough not to fall down and hurt people. But when you’re not designing a physical thing, it’s a much murkier space. How far does ethical responsibility go when you’re designing a user experience which has the capability of harm?”
People often face dark patterns in today’s online world, Gray said. Websites use deceptive headlines or images as “clickbait” to keep you at their sites longer and entice you to buy products, download software, or sign up for services, memberships, or subscriptions you may not want. Social media encourages you to “like,” “follow,” or invite more people so you’ll spend more time using each app, exposing you to even more embedded advertisements.
The research team identified five strategies that designers use to embed dark patterns in designed technologies:
- Nagging: Redirection of expected functionality that persists beyond a single interaction
- Obstruction: Making a process more difficult than it needs to be, with the intent of dissuading certain action(s)
- Sneaking: Attempting to hide, disguise or delay the divulging of information that is relevant to the user
- Interface Interference: Manipulation of the user interface to prioritize certain actions over others
- Forced Action: Requiring the user to perform a certain action to access (or continue to access) certain functionality
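As a purely illustrative sketch (not an example from the researchers' study), the “Sneaking” strategy above can be expressed as a hypothetical checkout flow that withholds a fee until the final confirmation step. All names and prices here are invented for illustration:

```python
# Hypothetical sketch of the "Sneaking" dark pattern: relevant cost
# information (a service fee) is delayed until the last possible step.
# These function names and values are illustrative, not from the study.

def advertised_total(item_prices):
    """Total shown on the product page and in the cart: no fee divulged."""
    return sum(item_prices)

def final_total(item_prices, hidden_service_fee=4.99):
    """Total divulged only on the final confirmation screen."""
    return sum(item_prices) + hidden_service_fee

cart = [19.99, 5.00]
print(advertised_total(cart))  # what the user was led to expect
print(final_total(cart))       # what the user is actually charged
```

The gap between the two totals is the information the design has hidden, disguised, or delayed, which is exactly what makes the pattern manipulative rather than merely terse.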
“As soon as you mention these strategies to people,” said Gray, “their eyes light up and they say, ‘I’ve experienced that!’ We experience these forms of manipulation constantly as we engage with digital systems.”
Designers observing designers
To characterize ethically centered design methods, tools and approaches, the research team wanted to examine the phenomenon of dark patterns from several different perspectives. They compiled examples found in the real world. They observed expert practitioners to understand the tensions in their daily work as they balance ethical concerns with design outcomes. They studied student designers in controlled lab settings. And they also surveyed users’ everyday perspectives on ethical concerns in digital technologies.
Shruthi Chivukula, a doctoral candidate and graduate research assistant in the UXP2 lab, led studies of student designers and practitioners. She started by observing designers at Purdue to build sensitivity and alignment with other undergraduate researchers before expanding to off-campus locations. Chivukula conducted each observation with two or three other researchers, cross-checking their observations, recording multiple viewpoints, and developing and refining ways to monitor ethical decision-making.
“We tried to focus our observations on their ethical decisions, looking at several factors,” explained Chivukula. “What kind of decisions do they make on a daily basis? Do they have arguments? What do the key stakeholders think?
“It was important that all of us observing were also designers ourselves, via coursework or experience. It was easy for us to take notes, debrief later and gain an overall perspective,” she added.
The research team also monitored Reddit, a popular website with more than one million user-created discussion communities, focusing on a specific “subreddit” in which users post examples of exploitative designs that they believe to be “exactly what not to do as a designer.” The researchers conducted a content analysis of posts and comments, examining the examples attached and noting why post authors felt each example represented exploitative or overtly manipulative design practices.
“Because they are members of the general public and not necessarily designers,” Chivukula said, “it was helpful to analyze their perspectives and to see how they talk about ethical concerns.”
Putting practitioners in the driver’s seat
Gray’s research team is now in the second year of a three-year National Science Foundation grant to identify and develop methods that increase practitioners’ ethical awareness and action. Through two current studies, they’re seeking to identify and describe ethical considerations that a range of technology and design practitioners face. The team is identifying the identity, disciplinary, ecological and societal factors that shape, constrain or enable ethical design from the practitioner perspective. The study includes interviews, focus groups and surveys of practitioners at a wide range of levels, Gray said, including company founders and executives, senior scientists, software engineers and product designers. They are also analyzing online interactions among practitioners that relate to ethics and values.
“One goal is to characterize how ethically aware people are,” said Gray. “There’s always an ethical balance in the design of anything. Are they aware of those tensions, or do they feel that they simply ‘do their jobs’?”
Through a third set of studies that include co-design workshops with practitioners and researchers, the researchers will encourage the creation, evaluation, refinement and dissemination of methods that increase the ethical capacity of socio-technical practitioners. Recent events in the world, including Black Lives Matter protests, the pandemic, and growing awareness of technology ethics all provide fresh opportunities to spread awareness of ethics in technology practice and everyday life.
“Technologies are always political, serving some people’s ends more than others,” Gray said. “We want to help everyone involved in the design of technology become aware of the need to find a balance. We’re trying to put designers and technologists in the driver’s seat, with tools to support their own work.”
Additional information
- Purdue UX Pedagogy and Practice Lab
- Improving Everyday Ethics (Purdue UX Pedagogy and Practice Lab)
- Improving Everyday Ethics in Socio-technical Practice (National Science Foundation)
- NSF funds research into “dark patterns” of user experience design
- The dark side of UX Design (Purdue UX Pedagogy and Practice Lab)
- Dark Patterns, Pragmatist Ethics, and User Experience (National Science Foundation)
- Colin Gray
- Shruthi Chivukula