Faculty of law blogs / UNIVERSITY OF OXFORD

Looking at Dark Patterns in Light of the Competition Law in India

Author(s)

Renu Gupta
Advocate practising commercial litigation and arbitration in Delhi and Co-Founding Partner of the law firm Olive Law, India
Akshat Bhushan
Student at Hidayatullah National Law University, Raipur, India


The advent of big tech has redefined the fundamental rules of the market. Most of these tech companies offer free services to customers, and these freebies raise several questions. Tech companies often indulge in the practice of dark patterns, which can broadly be characterised as user interfaces that subtly trick users into taking decisions on the platform that adversely affect their own interests. Generally, these patterns trick the user into either paying more money or parting with more data than they ordinarily would. This post focuses on the latter, ie those dark patterns through which enterprises use non-price elements to abuse their dominance in the market. A number of jurisdictions have dealt with this issue through data protection laws. However, India does not yet have a comprehensive data protection framework. We therefore try to situate the hazard of dark patterns within the existing antitrust framework.

Relevant legal provisions for regulating dark patterns

The Competition Commission of India is well within its power under the Competition Act, 2002 to prohibit tech enterprises from adopting, or to penalise them for adopting, means to gain an advantage over competitors and to prevent new entrants into the market. Section 4 of the Competition Act, 2002 prohibits an enterprise or a group from abusing its dominant position in the market. The Competition Commission of India is empowered to conduct an inquiry into such matters according to the procedures set out in section 26 and to pass orders after inquiry under section 27 of the Act.

Dark patterns that do not have direct monetary impact on the user

The traditional understanding has been that antitrust regulators intervene when a dominant enterprise restricts output, increases prices, or indulges in predatory pricing to drive out competitors. Pricing was thus seen as an essential parameter of competition. In 2016, the Competition Commission of India refused to deal with the WhatsApp privacy update issue because it took the view that matters relating to data and privacy did not fall within its jurisdiction. However, the Competition Commission of India came down heavily on Google Inc. in the 2018 case of Bharat Matrimony v Google Inc. Google Inc. was found to be a dominant enterprise in the Indian market for online web search. It was found guilty of abusing its dominance by displaying its own commercial flight services above those of third-party entities in its search results. Google argued that users availed its online search services for free and, therefore, there was no cause of action under section 19 of the Competition Act, 2002. The Competition Commission of India rejected this contention and held that users were receiving these services in return for data, which Google sold to advertisers. This case marks a shift from the Commission’s 2016 approach of non-interference in non-price matters related to data.

Facebook and Google have been using the dark patterns of pre-selected checkboxes and confirmshaming. They often preselect the privacy-intrusive option or, by default, disable privacy-protecting options, thereby undermining the privacy of their users. Users usually do not pay attention to these options due to cognitive inertia and, therefore, fail to opt in to the more private option or opt out of the privacy-threatening one. These boxes should either be left empty, so that users read all the options and make an informed choice, or the options guaranteeing the most privacy should be preselected.

The practice of confirmshaming induces the user not to select the privacy-protecting option by phrasing it in a way that scares or shames the user, thereby dissuading him or her from selecting it. For example, when Facebook users disable facial recognition technology, they are warned that Facebook will not be able to notify them if somebody impersonates them using their photos. This warning, however, does not give the whole picture: it tells users only what they will lose by disabling the option, while saying nothing about the dangers of enabling facial recognition, such as the matching of their photos with those of others or their use for targeted advertising. Consequently, the user refrains from disabling the facial recognition technology.

Such methods are used by social media entities to obtain the consent of the user and to choose for him or her the most privacy-intrusive option. This enables such companies to access more data, which can be sold to advertisers, thereby increasing their influence in digital marketplaces. These practices are not only unfair and exploitative, but also produce an exclusionary effect. They create a huge barrier to entry for a new enterprise, as it cannot be expected to compete with already established enterprises that have access to huge volumes of big data. The Competition Commission of India has, in a number of cases, observed that data is an important non-price parameter of competition in the digital marketplace. Accordingly, it should consider investigating these tech enterprises for abuse of dominance in the market by undermining the data privacy of their users.

Coercion as a component of abuse of dominance

The Nobel laureate in economics, Daniel Kahneman, has pointed out that our thinking operates through two systems: System 1 and System 2. System 1 is responsible for fast, instinctive and spontaneous reactions. Dark patterns such as streaks in Snapchat messaging stimulate the release of dopamine, which causes the user to become addicted to the service. The user then spends more time on the platform, allowing the tech enterprise to extract more data. The user would not ordinarily have behaved in that way but for the dark patterns. An illusion of consent is thereby created, but in reality the tech platforms are exploiting cognitive bias and making the user do things that are altogether prejudicial to the user’s interests. This is not free will; it is a form of coercion.

Coercion has been considered an exploitative practice and is an important element of abuse of market dominance under competition law. In 2020, the Competition Commission of India ordered an investigation into WhatsApp’s then-proposed privacy update, which forced users either to share personal data not required for availing its messaging and call services or to opt out of WhatsApp altogether. The Commission found a prima facie case of abuse of dominance, as WhatsApp was a dominant player in the relevant market and users would be forced to accept these terms. In another instance in 2020, the Commission found that Google had abused its dominance by allowing only Google Pay for payments on its Play Store, resulting in the denial of market access to other payment platforms. These cases make it clear that the Commission has consistently treated coercion as an important component in identifying abuse of dominance.

Conclusion

For dark patterns that do not have a direct monetary impact on the user, the tricks deployed to induce the user into sharing data should be considered a form of coercive anti-competitive activity that unduly enhances the market influence of these platforms.

