A recent French lawsuit alleges that Instagram discriminates against feminist content while giving sexism a free pass.
Trigger warning: This article features sexual themes and deals with sexual assault as well as discrimination.
Here’s why this represents an exciting challenge to the commercial logic of privatised social media governance.
‘What can we do to make men stop raping?’ This question went viral on French Instagram in January 2021 – until Instagram brought the conversation to an abrupt end by deleting all the posts involved. The only explanation given: ‘algorithmic error’. For 14 French feminist and activist influencers, this was the last straw. On 9 March they filed a lawsuit against Facebook (which owns Instagram), insisting that its content moderation practices are unfair and demanding to see what happens behind the scenes.
The activists claim that their posts are regularly deleted or ‘shadowbanned’ (hidden from followers’ feeds and search results), even when they don’t violate any of Instagram’s stated policies. The January incident was just the latest example: one claimant, Jüne Plã, author and owner of the feminist and LGBTQ+-inclusive sex advice account @jouissance.club, has had her account deleted three times. The activists argue that this represents a glaring double standard: feminist content is deleted without explanation, while Instagram takes little action when users report sexism or harassment.
Though widespread anecdotal evidence makes this claim highly plausible, Facebook and Instagram’s moderation processes are effectively a black box. Professional content creators play constant guessing games about what ‘the algorithm’ wants to see and what might get them blocked. This secrecy means the French activists can’t prove they’re being discriminated against. That’s why their case asks that Facebook disclose its moderation procedures and outcomes to an independent expert who will analyse the extent to which these alleged practices are taking place and how many users are being affected.
By exposing and challenging Instagram’s double standards, this lawsuit initiates a much-needed debate about how private companies set the terms of online communication. Jennifer Cobbe, an expert on social media law, has suggested that content moderation inserts commercial priorities into all user interactions. Corporate objectives – like protecting ‘brand safety’ by ensuring adverts don’t appear alongside anything controversial – determine the structural conditions for public debate and private conversations. As highlighted in a Médiapart article endorsed by many of the French activists in February, this consistently makes online spaces more exclusionary, discriminatory and oppressive.
Online speech around the world is subject to arbitrary rules that reflect Facebook’s corporate interests. For example, it is notoriously conservative about nudity and sexual content, a stance disproportionately influenced by US social norms. Yet these policies also seem to be applied in a highly inconsistent way which favours conventional gender roles and beauty standards. A 2020 AlgorithmWatch study found that Instagram influencers were more widely promoted when they wore revealing outfits. This, however, isn’t the case for everyone: similarly suggestive, yet non-explicit pictures by users who are queer, people of colour, fat or otherwise challenging normative beauty standards are regularly deleted without explanation.
These moralistic policies have real-world consequences. Content creators who earn a living from social media face the constant economic risk of being cut off. In a notorious recent example, which the French activists also criticised in their Médiapart article, Instagram abruptly decided in December 2020 to clamp down on all content even remotely associated with sex work. As well as silencing voices which are already marginalised in public debate, sex workers argue that this exacerbates their economic precarity during the pandemic.
So far, state regulation of content moderation has typically focused on ensuring illegal content gets deleted as quickly as possible. This approach does nothing to challenge how private corporations dictate the terms of online conversations, and ignores the possibility that stricter, more ‘effective’ moderation will exacerbate inequalities of representation.
In this context, the French case is exciting because it represents a more fundamental challenge to the hyper-commercial logic of social media governance. By suing Instagram for discrimination, the activists are rejecting the premise that online content should be organised based on what is most profitable – a paradigm which digital justice scholar Safiya Noble argues inherently silences and devalues marginalised groups by catering to the interests of those in power (the majority). Queer feminist content will probably never make as much money for Facebook as content which reflects mainstream values and beauty norms. That doesn’t mean Facebook should get away with treating it differently.
The case joins a series of recent initiatives in which users are challenging arbitrary and unaccountable platform governance. For example, several court cases in Germany have developed the principle that large platforms must uphold their users’ constitutional rights to equality and free speech. Most recently, a court applied this principle to rule that Facebook cannot moderate content in an arbitrary or unreasonable way. The French case takes this one step further by focusing on collective interests rather than just individual rights. The activists are not only seeking to have particular posts restored, but to reveal and challenge systematic discrimination.
This case also puts in the spotlight the immense economic power social media platforms exercise over their content creators: platforms make money by exploiting users’ labour while dictating its terms. The activists’ claim echoes a recent Dutch case in which union-backed Uber drivers were partially successful in demanding access to Uber’s driver tracking data, which they planned to pool in a data trust to facilitate collective bargaining. Social media creators in many countries are also unionising, whether independently or by joining traditional showbusiness unions.
These steps towards collective organisation give us a glimpse of alternative ways of running social media, with governance structures that represent users, not just advertisers and shareholders. The French activists are making it known that systemic inequality of representation is not ‘algorithmic error’, but the intended outcome of commercially driven social media. If their discrimination case succeeds, it could be a springboard for more sustained collective action to bring progressive and inclusive values into the online public sphere.