Social media has recently been criticised for spreading false information and hate speech, polarising people, promoting populist parties, and threatening peace and stability. In an effort to manage these disruptive elements, a consensus is emerging among EU policy-makers that social media needs to be regulated. The German President-elect of the European Commission, Ursula von der Leyen, has committed to a Digital Services Act in her Political Guidelines for the 2019 European Commission. According to the Directorate-General for Communications Networks, Content and Technology (DG Connect), which is responsible for creating a digital single market in Europe, the Digital Services Act (DSA) will revise the e-Commerce Directive, with implications for content moderation on social media platforms. A public consultation phase will begin in the spring of 2020, before and during the drafting of the DSA. This provides an excellent opportunity to gain first-hand experience of public and policy debates on platform regulation, and to develop strategies for improving the quality of policies intended to produce positive effects for society. One of the greatest difficulties in developing policies in this area is the knowledge gap between platforms on the one hand, and society and policy-makers on the other, regarding what platforms currently do to address the hate speech and disinformation circulating on their services.
Acting as researchers during the public consultation phase, we aim to open space for policy discussion by
1) providing expertise on what we currently know and don't know about how platforms deal with hate speech and disinformation;
2) identifying key approaches to tackling those issues and discussing their advantages and disadvantages. Such a systematic discussion of approaches to platform regulation is currently absent from the public and policy debate.