The way social media platforms tackle disinformation has been under intense scrutiny during the COVID-19 pandemic. Last week, the European Commission released its assessment of how the self-regulatory system to tackle disinformation - a Code of Practice agreed to by Facebook, Google, Twitter, Mozilla, Microsoft, TikTok and various advertising bodies - performed in its first 12 months of operation.
The report shows that the biggest platforms have taken many positive and unprecedented steps to reduce the volume and impact of disinformation circulating online, from removing hundreds of thousands of misleading adverts to prioritising information from trustworthy sources.
However, the overriding message of the report is that self-regulation can only take you so far. The report highlights the shortcomings of any voluntary regime: the "regulatory asymmetry" between those who choose to comply and those who don't, the absence of independent oversight to monitor compliance, and the lack of effective remedies where breaches occur.
With the EU’s Digital Services Act on the horizon, as well as new regimes in other Member States, this looks to be another step along the road from self-regulation to binding legislation on online harms.
“The Code of Practice has shown that online platforms and the advertising sector can do a lot to counter disinformation when they are put under public scrutiny. But platforms need to be more accountable and responsible; they need to become more transparent. The time has come to go beyond self-regulatory measures. Europe is best placed to lead the way and propose instruments for more resilient and fair democracy in an increasingly digital world.”