Shadow banning on social networks: What the DSA is changing for users and platforms

Shadow banning is an opaque practice whereby a platform reduces the visibility of a user's content without informing them, thereby diminishing that user's reach and engagement. The aim is to curb the distribution of content that may violate the platform's rules without taking any overt action against the user: it is silent censorship. This hitherto unregulated practice is now addressed by a new European regulation that will change a number of things for platforms as well as for users.

Shadow Banning: A Hitherto Unregulated Practice

Shadow banning is used by digital platforms as a way to moderate content without resorting to explicit censorship measures, such as deleting content or suspending accounts. While platforms have publicly denied practicing shadow banning, they have acknowledged that their algorithms influence which content is shown first. In 2018, Mark Zuckerberg suggested that techniques for reducing visibility would help platforms manage misinformation without becoming "arbiters of truth": preferring silent censorship to public censorship (1). This practice, while controversial, has remained largely unregulated, leaving platforms free to apply it at their discretion and affected users with little recourse.

An example of shadow banning on LinkedIn: a user who sends too many messages to people outside their network, tags too many people in their posts, relays a lot of false information, or whom the algorithm suspects of automating their actions will most likely notice a drop in the reach and engagement of their posts.
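
To make the mechanism concrete, here is a minimal, purely hypothetical sketch of how such a demotion heuristic could work. LinkedIn's actual ranking system is not public: every signal name, threshold, and weight below is invented for illustration only.

```python
from dataclasses import dataclass

# Purely hypothetical sketch of a visibility-reduction heuristic.
# Real platform ranking systems are not public; these signals and
# thresholds are invented for illustration.

@dataclass
class UserSignals:
    messages_to_non_connections: int  # unsolicited outreach volume
    tags_per_post: float              # average people tagged per post
    flagged_misinformation: int       # posts flagged as false information
    automation_score: float           # 0.0 (human) .. 1.0 (likely bot)

def visibility_multiplier(s: UserSignals) -> float:
    """Return a factor applied to a post's reach (1.0 = no demotion)."""
    factor = 1.0
    if s.messages_to_non_connections > 50:
        factor *= 0.7                         # aggressive cold outreach
    if s.tags_per_post > 10:
        factor *= 0.8                         # tag spam
    if s.flagged_misinformation > 0:
        factor *= 0.5 ** s.flagged_misinformation  # repeated misinformation
    if s.automation_score > 0.8:
        factor *= 0.3                         # suspected automation
    return factor

# A user matching several signals sees their reach shrink silently:
print(visibility_multiplier(UserSignals(120, 12.0, 1, 0.9)))
# roughly 0.08: the post reaches about 8% of its normal audience
```

The point is not the specific numbers but the pattern: the penalties compound quietly, and the user is never notified that any of them applied.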

The DSA: A New Framework for Shadow Banning

The European "Digital Services Act" (DSA) regulation, which has applied to the largest platforms since August 2023 and will apply to all other platforms from February 17, 2024, marks a significant turning point in the regulation of digital services. It introduces transparency and accountability obligations for platforms with respect to content moderation. When it comes to shadow banning, the key DSA articles to consider are:

  • Article 14: this article requires platforms to set out in their terms and conditions all of their moderation rules (including algorithmic ones) in clear and unambiguous language. This means that the practice of shadow banning must be made public and explained to users.
  • Article 17: this article requires platforms to notify users of moderation decisions taken against them through a statement of reasons, setting out in particular the facts and circumstances on which the decision is based, as well as how the user can challenge it (an illustrative example follows this list).
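
To illustrate, here is a sketch of what an Article 17 statement of reasons could contain for a visibility restriction. The DSA prescribes the information to be provided, not a data format; the field names and the example facts below are hypothetical, but each field maps to an element listed in Article 17(3).

```python
import json

# Hypothetical shape of an Article 17 "statement of reasons" for a
# visibility restriction. Field names and example facts are invented;
# each field corresponds to an element required by Article 17(3).

statement_of_reasons = {
    # Art. 17(1): the decision restricts visibility rather than removing content
    "restriction_type": "visibility_restriction",
    # Art. 17(3)(b): the facts and circumstances relied on
    "facts_and_circumstances": (
        "Between 2024-03-01 and 2024-03-07, 40 identical messages were "
        "sent to users outside your network."
    ),
    # Art. 17(3)(c): whether automated means were used to detect or decide
    "automated_means_used": True,
    # Art. 17(3)(e): the contractual ground (terms and conditions) relied on
    "ground": {
        "type": "terms_and_conditions",
        "reference": "Community Policies, section on spam (hypothetical)",
    },
    # Art. 17(3)(f): the redress possibilities available to the user
    "redress": [
        "internal complaint-handling system",
        "out-of-court dispute settlement",
        "judicial redress",
    ],
}

print(json.dumps(statement_of_reasons, indent=2))
```

Whatever format a platform chooses, the statement must be clear and specific enough for the user to understand the decision and exercise the redress options listed.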

What the DSA Is Changing for Platforms and Users

The DSA imposes an obligation on digital platforms to be transparent about their moderation practices, including shadow banning, and to justify them. This means that:

  • For platforms: content moderation policies and terms and conditions must be reviewed to ensure that they comply with the DSA's requirements. This includes setting up clear mechanisms to notify users of moderation actions and to handle any resulting disputes.
  • For users: there is now a legal framework for understanding and challenging the moderation decisions made against them. The information underlying those decisions, as well as the means of contesting them, must appear in each platform's terms and conditions.

How to React to Shadow Banning as a User or Platform

  • Users: if you suspect that you are a victim of shadow banning, consult the platform's terms and conditions and ask for an explanation under Article 17 of the DSA. If the response is unsatisfactory, consider challenging the decision made against you.
  • Platforms: update your terms and conditions to ensure that they comply with the DSA, including clear and unambiguous information about your content moderation methods. Establish transparent, accessible processes for notifying users of moderation decisions and for handling challenges, and remember that every moderation decision must be justifiable.

Conclusion

The Digital Services Act represents a significant change in the regulation of digital platforms, offering a legal framework to challenge the practice of shadow banning. For both users and platforms, it is crucial to understand these new legal obligations and to adapt to a more transparent and responsible digital environment.


(1) Mark Zuckerberg on shadow banning, Recode interview, 2018.