
Announcing Selected Abstracts: The 'Unpacking Deceptive Design' Research Series

The first round of the call for abstracts saw insightful entries from researchers across regions. We are happy to list here the selected authors, along with the themes they will explore for this series.



1. Countering Deceptive Design: A Sociological Perspective

Nilanjan Raghunath, Singapore University of Technology and Design


Abstract:

Deceptive design is vastly prevalent in the digital age, particularly in online products and services. It persuasively nudges consumers into choices whose consequences they are partly or fully unaware of, owing to their lack of knowledge or their routine dependency on the product or service. Examples include signing up for subscriptions for what consumers believe is a free or necessary product or service, consenting to user agreements that violate user privacy, surreptitious personal data collection practices, and the infliction of digital, emotional and financial harm through misleading advertising. Deceptive design is often disguised as part of routine marketing practice by a legitimate institution, with wording that is legally accurate but not ethically appropriate, and is therefore hard to detect. This blog will draw on sociological theories by Goffman and Simmel to analyze the social control of lying and deception prompted by the social legitimacy of powerful organizations. It will provide innovative ideas on why people are willing to accept the terms and conditions of deceptive design and what can be done to detect and reduce its negative effects on consumers. A sociological understanding will add a deep and rich layer to the current understanding of deceptive design problems and solutions by making sense of human and institutional interactions and behavior, which is paramount to understanding how organizations can be more ethical in their marketing practices and how consumers can be more conscious of their choices.


2. Identifying Anticompetitive Harms in Deceptive Design Discourse

Isha Suri, Senior Researcher at the Centre for Internet and Society (CIS)


Abstract:

Digital markets have compelled us to reimagine the way humans navigate their lives. While ostensibly they enable efficient economic organisation by reducing information asymmetries, they also pose complex challenges for consumers and enforcement authorities alike. The use of ‘deceptive design’ or ‘dark patterns’ to manipulate users into making choices they would not otherwise have made, and that may cause them harm, is one such problem. Although businesses exploiting human weaknesses is neither novel nor unique to digital markets, the scale at which digital markets enable platforms to reach millions of consumers through targeted advertising compounds the risks significantly. Moreover, evidence suggests that platforms adopt dark patterns that enable self-preferencing and forced data sharing, and that create barriers to switching. In addition to foreclosing rivals, deceptive design practices are also employed to compel users to act against their self-interest through tactics such as drip pricing, complex language, information overload, and scarcity claims. This further distorts the competitive process by shifting the incentive to compete on attributes such as price and quality towards exploiting consumer shortcomings. This piece explores and maps deceptive design practices across different types of digital platforms and the anticompetitive harms arising from these practices. Platforms will be classified based on the taxonomy provided by Cennamo (2019) into (i) multisided transaction platforms (e.g. online retail), (ii) complementary innovation platforms (e.g. operating systems), and (iii) information platforms (e.g. search). In addition, the blog will explore policy interventions that could help address the anticompetitive harms posed by deceptive design practices through cross-jurisdictional analyses.


3. Dark patterns in Indian fintech apps – what are they and how can we solve them?

Monami Dasgupta, Rajashree Gopalakrishnan, Vinith Kurian, D91 Labs


Abstract:

The Indian fintech industry is on a high-growth trajectory. From basic bank account opening and payments to loan repayments, fintech has innovated for a range of use cases over the years and positively impacted the finances and businesses of many sections of society. However, one of the perils of fintech products is that they sometimes use dark patterns in the form of obfuscated consent framing, hidden costs, expensive and convoluted surrender clauses, misleading games, and bundled products that may be unsuitable and harmful for the user. Fintech companies deploy such methods to compete for market share, meet aggressive sales targets, and create a loyal customer base by making it difficult for users to leave the app.


Dark patterns or deceptive designs are “user interfaces that make it difficult for users to express their actual preferences or that manipulate users into taking actions that do not comport with their preferences or expectations.” These are strategies designed and crafted carefully to trick users into making decisions they would not otherwise make. Pre-selected cookie consents, anchoring an expensive EMI option, and bundling insurance with an airline ticket are all examples of dark patterns. These elements are not designed by mistake but with nefarious intentions, and do not have the user’s best interest in mind. In our study, we aim to identify and analyse the dark patterns used by some fintech apps in India and the consequent harms they can lead to. We also aim to propose design recommendations and best practices for ethical design for fintech apps in India.


4. Deceptive design in voice interfaces in India

Saumyaa Naidu and Shweta Mohandas, Centre for Internet and Society (CIS)


Abstract:

Voice interfaces (VIs) have become increasingly prevalent in India today, integrated into smartphones and available through voice assistants. Voice technologies have immense potential to enable access for people who are limited by digital text-only platforms. While several voice-based products are being developed in India, the application of the technology is still limited by barriers in accessibility and multilingual support. In continuation of our research on the landscape of VIs (voice assistants, voice bots, and other voice-based chat applications) in India, we plan to examine the use of deceptive design in internet-based voice products in the country. The article will examine the notice and consent architecture and the conversation design of selected VIs to highlight instances of deceptive design, and evaluate their impact on privacy, accessibility, and language support. Drawing upon the available literature on deceptive design, we understand it to mean practices built into user interfaces that manipulate people using online platforms into making choices or taking actions they did not intend to take. We will also focus on the impact of these manipulative design practices, in the context of VIs, on people with disabilities and on those with limited digital literacy and experience with technology. We plan to assess the design process followed in the conceptualisation and development of VIs in order to understand the challenges specific to designing voice-based products. Based on this, we will present design guidelines and regulatory recommendations to tackle deceptive design practices in VIs.


5. Tackling dark patterns as a regulator in the UK (tentative theme)

Caroline Sinders, Design Researcher and Artist, Information Commissioner's Office (ICO), UK.



Our second call for abstracts is now open! Read more on how to submit here.




