Commentary
11.04.2025

A real ban on dark patterns: why co-regulation is not enough

This blog post was originally written as an op-ed, a graded paper for Professor Daniela Stockmann's course, Tech Companies & the Public Interest, in Spring 2024.

Imagine your morning: you grab your coffee and deliberately ignore the pack of cigarettes on the table, because the horrifying image of a punctured lung on it has made you lose all desire to smoke. Next, you pick up your phone and open Instagram, planning to spend just five minutes. Some 20 minutes later, you look up and realise you're late for work. The endlessly entertaining cat videos, the memes, and even the world news, however tragic, sucked you in and wouldn't let go. Again! It does not need to be this way. Just as you left the ciggies to one side because of the gruesome warning picture, you could limit your app time. Instead of giving you a fleeting reminder that your five-minute limit is up, the phone could show you an image of the potential changes in your brain caused by social media addiction. Why isn't this happening?

Well, the EU is big on co-regulation. According to Daniela Stockmann, Professor of Digital Governance at the Hertie School, co-regulation is a governmental strategy that establishes a non-state regulatory regime involving multiple stakeholders: industry, advertisers, consumer organisations, and public interest groups. In other words, Big Tech is sitting at the table when its own regulation is being negotiated. Talk about a conflict of interest! Yet the recent Digital Services Act (DSA) enshrines a co-regulatory approach. Tech companies are caught between protecting their business model, which means making us spend ever more time on platforms designed to sell advertising slots matched to our interests and habits, and complying with the DSA and the values it promotes.

As an in-house technology consultant for German ministries, I constantly face the question of how to deal with Big Tech, and from several different angles. In a recent research paper, I looked more closely at how the "time well spent" paradigm works on Instagram. Among other things, Instagram now offers you the option to limit your daily viewing time and to set regular breaks. Since I wrote the paper, Instagram has adjusted the time reminders you can choose and has partially removed dark patterns from them. Dark patterns are, put briefly and plainly, manipulative design tricks that steer users into choices against their own interests, in this case making them spend more time on the app. If the list of suggested break reminders goes, say, 30 minutes, 45 minutes, 10 minutes, 5 minutes, you are more likely than not to choose 30. Not putting them in ascending order is a deliberate design choice intended to make you spend more time: a dark pattern. Such use of dark patterns needs to be completely banned, and the ban strictly enforced.

In fact, dark patterns are already banned in law: Article 25 of the DSA, fleshed out by Recital 67, is wholly concerned with this issue. Yet we can still find dark patterns in the apps we use every day. Big Tech is trying to bend the law by reducing dark patterns without abolishing them, as exemplified by Instagram's recent switch to ascending time limits. This change does not go far enough. You can dismiss the reminder in a split second and stay on the app indefinitely without further reminders, and even that assumes you have taken the trouble to set limits for yourself in the first place. Because of their business model, companies will never stop using this manipulative technique off their own bat. The implementation and interpretation of the ban on dark patterns need to be stricter.

Make no mistake: banning dark patterns is about more than individual consumers' right to make their own decisions. It goes to the heart of our liberal democracy. Researchers have shown that mental health is crucial for political participation, and negative mental health outcomes are closely related to the overuse of apps. Participation is vital if liberal democracy is to represent all interests, reinforce trust and create community, as researchers like Bullock and others at the Robert Bosch Stiftung have underlined.

And wouldn't it be nice to enjoy your morning coffee forming your own arguments about how to participate in our liberal democracy, instead of blaming yourself for getting lost in an app deliberately designed to addict you? Sure, I could still opt out of Instagram altogether, or of any other online platform for that matter. But it is not just Instagram; here it stands in for all very large online platforms (VLOPs). These platforms form an important part of our public sphere, shaping how we relate to each other and exchange ideas, goods and services. With this great power should come great responsibility. They should not be allowed to exploit their co-regulatory powers by aggressively trying to make us addicts in the pursuit of ever greater profit. Platform design should put the consumer at its heart. Regulation should be applied in our favour. Not theirs.


Photo by Claudio Schwarz on Unsplash