#hertielove
11.05.2020

How should the EU regulate platforms as part of the Digital Services Act?

Daniela Stockmann discusses student contributions to the current EU policy dialogue in her spring 2020 project course. 

Students in Daniela Stockmann’s spring 2020 course, Social media lab: How should the EU regulate platforms as part of the Digital Services Act?, acted as researchers during the public consultation phase on internet regulation for the European Commission’s Digital Services Act. The students wrote op-eds and developed a website, www.digitalservicesact.eu, to participate actively in the public debate. The goal was to provide expertise on how platforms deal with hate speech and disinformation, to identify key approaches for tackling those issues, and to discuss their advantages and disadvantages. Such a systematic discussion of approaches towards platform regulation is currently absent in the public and policy debate, says Stockmann. In this interview, she talks about the approach, goals and lessons of the course.

What policy issue were you working on in this class?

Social media have recently been criticised for spreading false information and hate speech, polarising people, promoting populist parties, and threatening peace and stability. In an effort to manage the disruptive elements of social media, a consensus is emerging among EU policymakers that social media need to be regulated. The President of the European Commission, Ursula von der Leyen, committed to a Digital Services Act in her Political Guidelines for the newly elected Commission that took up its work in fall 2019. According to the Directorate-General for Communications Networks, Content and Technology (DG Connect), which is in charge of creating a digital single market in Europe, the Digital Services Act (DSA) is going to redraft the e-commerce directive, with implications for content moderation on social media platforms. Before and during the drafting of the DSA, a public consultation phase will take place, starting in spring 2020.

This provides an excellent opportunity to gain first-hand experience in participating in public and policy debates regarding platform regulation and to develop strategies to improve the quality of policies aimed at producing positive effects for society. One of the greatest difficulties in developing policies in this area is a knowledge gap between platforms on the one hand and society and policymakers on the other, regarding what platforms currently do to address hate speech and disinformation circulating on their services. By acting as researchers during the public consultation phase, the students in this class aim to open space for policy discussion by

1) providing expertise regarding what we currently know and do not know about platforms’ efforts regarding hate speech and disinformation; 

2) identifying key approaches aimed at tackling those issues and discussing their advantages and disadvantages. Systematic discussion of approaches towards platform regulation is currently absent in the public and policy debate.

How did the students contribute to developing ideas for this?

We have developed a website, www.digitalservicesact.eu, on which we publicise information about the public consultation phase of the Digital Services Act related to platform regulation. Students contributed to the website in multiple ways throughout the course, starting with input on the web design and developing its content. We had a professional illustrator come in and present different ideas for visualising the problems the Digital Services Act aims to address. The whole group provided feedback and voted on which illustration was most suitable for the website.

Course assignments were designed to build up students’ expertise on the policy discussion regarding how to tackle online disinformation and hate speech; students also wrote op-eds in which they developed their own opinions on the various policy approaches to these problems. We teamed up with the Hertie School’s Governance Post to provide feedback to students and to give four op-eds the opportunity to be published first in The Governance Post and then on our website. As a final assignment, students will work on three policy papers addressing content moderation, transparency, and business models as three potential avenues EU policymakers could take to tackle online disinformation and hate speech. These policy papers will be published on our website and distributed to stakeholders we have identified while following developments in Brussels, and with whom we have established contact. DG Connect, the Directorate-General of the European Commission responsible for developing a digital single market, has been encouraging us to send our policy papers, and they have been thrilled to get expert opinions in this area that are not funded by big tech.

What were some takeaways from the course?

The European Commission under the Presidency of von der Leyen is currently looking for fresh ideas. The Commission is going to take major decisions regarding Europe’s digital development which will fundamentally shape digital transformation. It is a good time to think creatively about how to shape regulation of social media platforms.

The predominant approach towards addressing harmful content is content moderation, which was invented when companies like Facebook, Google, and Twitter self-regulated their content. As European policymakers move towards co-regulation, opportunities have opened up to improve the transparency and accountability of social media companies. A reform of business models may be a promising solution that tackles the reasons why hate speech and disinformation tend to go viral more easily than high-quality information. Predominant approaches seem to cure the symptoms without addressing the true causes of the problem.

How would you like the results of this class to move forward?

I would be thrilled if the ideas we developed in op-eds and policy papers were taken up by policymakers and journalists to advance the policy discussion about how to deal with what is called harmful online content. The coronavirus pandemic has highlighted the importance of tackling these problems, but we currently lack solid public discourse. As the Digital Services Act will be drafted and go to parliament by the end of 2020, I hope to continue working with students in the fall, developing further ideas and participating in the European Commission’s public consultation.


More about Daniela Stockmann

  • Daniela Stockmann, Professor of Digital Governance | Director, Centre for Digital Governance