The Competition and Markets Authority (CMA) has recently been focusing on the adverse impacts of Online Choice Architecture (OCA) on competition and consumers. Choice architecture describes the environment in which people make decisions and the way alternatives are presented to them. In online settings, it is the environment in which users act, including the display and positioning of options and the design of interfaces. OCA issues are not limited to consumer protection; they overlap with other areas, including online safety and data protection. The Information Commissioner’s Office (ICO) has long argued for a design-centred approach to data privacy, which ensures that businesses take privacy and data protection concerns into account from the very beginning of developing a system, service, product or process, and throughout its lifecycle.

The ICO and CMA issued a joint paper on August 9, 2023, aimed at tackling harmful online design. The paper underscores how OCA practices, including those that involve the gathering and use of personal information, can, in the regulators’ view, harm both consumers and the competitive environment. We distil the key points from the paper below.

What does the joint paper say?

The ICO and CMA collaborated to provide an overview of how online design choices can lead to data protection, consumer and competition harms, and of the laws that may be infringed by harmful OCA practices. The paper highlights a non-exhaustive list of such practices, including “harmful nudging”, “confirmshaming”, “biased framing”, “bundled consent” and “default settings”. The key concerns are set out below.

Harmful or dark nudges: a design approach that pushes users into making inadvertent or ill-considered decisions. The example given is a website that provides an ‘accept all’ cookie button but does not provide a ‘reject all’ button that is just as easy to use.

ICO’s concerns:

– Users must be able to refuse non-essential cookies with the same ease as they can accept them, without having to take any additional steps.

– Such practice is likely to infringe the fairness, lawfulness (due to uninformed consent) and transparency principles of the UK GDPR.

– PECR is also likely to be infringed where a cookie banner that incorporates dark nudges is used to obtain consent for non-essential cookies.

CMA’s concerns:

– Because these techniques can encourage users to provide more personal information than they would otherwise choose to as part of receiving services, access to this personal information may confer a competitive advantage on certain large platforms and inhibit entry and expansion by smaller businesses.

– Where these techniques make certain options easier to choose than others and discourage more conscious deliberation of choices, this can result in ill-considered or inadvertent decisions that may decrease users’ welfare or may not align with their preferences.
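To make the point concrete, the “equal ease” expectation can be expressed as a simple design invariant: accepting and rejecting non-essential cookies each take exactly one action. The sketch below is our own illustration; the type and function names are hypothetical, not drawn from the joint paper.

```typescript
// A minimal sketch of the "equal ease" expectation: refusing all
// non-essential cookies takes exactly one action, just like accepting
// them. All names here are illustrative, not from the joint paper.
type OptionalCategory = "analytics" | "advertising" | "functional";

interface ConsentState {
  essential: true; // strictly necessary cookies need no consent
  optional: Record<OptionalCategory, boolean>;
}

// One click grants every optional category.
function acceptAll(): ConsentState {
  return {
    essential: true,
    optional: { analytics: true, advertising: true, functional: true },
  };
}

// One click refuses every optional category -- the same single step
// as acceptAll, so refusal is no harder than acceptance.
function rejectAll(): ConsentState {
  return {
    essential: true,
    optional: { analytics: false, advertising: false, functional: false },
  };
}
```

A banner built on this model gives both buttons the same prominence and the same one-step effect, avoiding the dark-nudge pattern described above.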
Confirmshaming: the practice of pressuring or shaming someone into doing something by making them feel guilty or embarrassed for not doing it.

ICO’s concerns:

– Using language ploys to put pressure on users to make a biased choice is likely to infringe the “fairness” principle of the UK GDPR.

– Consent obtained in this manner is also uninformed or invalid, leading to infringement of the “lawfulness” principle.

CMA’s concerns:

Similar to the concerns with harmful nudges, confirmshaming could nudge users towards sharing more personal data than they otherwise would when receiving services. In certain markets, access to such data may confer a competitive advantage on existing incumbents and inhibit entry by smaller challenger businesses.
Biased framing: using positive framing to emphasise the potential benefits of making a choice whilst minimising or ignoring the potential negative impacts.

ICO’s concerns:

– Not giving equal weight to the risks and benefits of a decision about personal data processing makes it harder for users to properly assess the information and make an informed choice. This can lead to infringement of both the “fairness” and “transparency” principles of the UK GDPR.

– Consent obtained using biased framing that is not fair or transparent (because it deceives or misleads and is not open and honest) is also likely to be invalid, on the basis that it is not fully informed, leading to infringement of the “lawfulness” requirement of the UK GDPR.
CMA’s concerns:

The misuse of biased (positive or negative) framing can undermine users’ ability to process and assess information independently, and therefore adversely affect their decision-making. If it is misleading, it may breach consumer protection law.

Bundled consent: asking the user to consent to the use of their personal information for multiple separate purposes or processing activities via a single consent option.

ICO’s concerns:

Consent for separate processing activities must be “specific” under the UK GDPR. The UK GDPR is also clear that consent should not be bundled up as a condition of service unless it is necessary for that service. Bundled consent is therefore highly likely to be invalid, increasing the risk of infringing the “lawfulness” requirements.

CMA’s concerns:

Competition concerns may arise where certain businesses use such practices to bundle consent for data sharing across all of their first-party services, leading to greater extraction of user data. Where businesses with substantial market power provide multiple services and can bundle them, they may leverage their existing market position to enter related markets and increase barriers for rivals in those markets.
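To illustrate the “specific” consent requirement, a granular consent record might be modelled along the following lines. This is a hypothetical sketch of our own (the purpose names are illustrative): each processing purpose is a separate opt-in, and nothing the user did not explicitly grant is treated as agreed.

```typescript
// A minimal sketch of unbundled, "specific" consent: each processing
// purpose is a separate opt-in, and no single bundled option covers
// them all. Purpose names are illustrative.
type Purpose = "service_emails" | "marketing" | "third_party_sharing";

const ALL_PURPOSES: Purpose[] = ["service_emails", "marketing", "third_party_sharing"];

// Records consent only for the purposes the user explicitly selected;
// every other purpose stays refused.
function recordConsent(granted: Purpose[]): Record<Purpose, boolean> {
  const record = {} as Record<Purpose, boolean>;
  for (const purpose of ALL_PURPOSES) {
    record[purpose] = granted.includes(purpose);
  }
  return record;
}
```

Under this model, ticking one box never grants another purpose, which is the opposite of the bundled single-option design the regulators criticise.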
Default settings: a predefined choice that the user must take active steps to change.

ICO’s concerns:

The UK GDPR requires a “data protection by design and default” approach to the processing of personal data. It is also unlikely that consent obtained via default settings (i.e., assuming consent because the individual has not changed their settings from the default, without confirming that choice) will be valid, because individuals must take a positive action to indicate their consent. This could lead to infringement of the “lawfulness” principle where consent is relied on as a basis for processing. Relying on unchanged default settings as consent to set non-essential cookies would also infringe PECR.

CMA’s concerns:

The use of defaults can lead users to make choices about their personal data that may not be in their best interests, for example, sharing more data than they would like to when receiving services or inadvertently enrolling in auto-renewing subscription plans.

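The “positive action” point about default settings can be illustrated with a simple model (again a hypothetical sketch, with names of our own): consent-based options default to off, and only an explicit opt-in counts as valid consent.

```typescript
// A minimal sketch of consent-respecting defaults: an optional setting
// starts "off", and only an explicit user action can turn it on.
// An unchanged default is never counted as consent. Names are illustrative.
interface OptionalSetting {
  enabled: boolean;
  userConfirmed: boolean; // true only after a positive action by the user
}

function defaultSetting(): OptionalSetting {
  return { enabled: false, userConfirmed: false };
}

// The only path to an enabled setting is an explicit opt-in.
function optIn(): OptionalSetting {
  return { enabled: true, userConfirmed: true };
}

// Valid consent exists only where the user took a positive action.
function hasValidConsent(setting: OptionalSetting): boolean {
  return setting.enabled && setting.userConfirmed;
}
```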
Looking ahead

The paper’s publication follows significant CMA policy and enforcement activity on OCA practices. For example, in March 2023 the CMA launched its “Rip Off Tip Off” campaign, which encourages consumers to report deceitful online sales tactics, and in April 2022 it published a research paper on OCA accompanied by an evidence review. The ICO, for its part, has consistently advocated a privacy-by-design approach to prevent data protection infringements being built into technology. The joint paper urges online operators to re-evaluate their existing OCA practices to ensure that users are empowered to make informed decisions about the processing of their personal data, and to protect consumer welfare. It also provides a road map for businesses to use OCA responsibly: put users at the centre of design, test design decisions before release, and (of course) comply with relevant data protection, consumer and competition law.

Author

Vin leads our London Data Privacy practice and is a member of our Global Privacy & Security Leadership team. He brings over 22 years’ experience in this specialist area, advising clients across data-rich sectors including retail, financial services/fintech, life sciences, healthcare, proptech and technology platforms.

Author

Paul is head of cybersecurity in the UK and a key member of our wider data protection team. For 15 years, Paul has guided clients through all types of major data security incidents as well as complex technology and data disputes. Paul pioneered an award-winning data breach and dark web scanning tool which was the first product of its kind in the legal market.

Author

Chiemeka is a privacy specialist in Baker McKenzie's Intellectual Property & Technology Practice Group, based in the firm's London office. He is a Nigerian-qualified lawyer who focuses on data protection, privacy and technology transactions.

Author

Amaka Uzoukwu is a Trainee Solicitor in Baker McKenzie's London office.