Members of the European Parliament are calling for regulations to be placed on large social media platforms in order to defend free speech and protect democracy, according to an announcement published by the Parliament.
According to the announcement, all Members of Parliament supported implementing regulations against “the vast power of social media platforms and their worrying impact on politics and freedom of speech.”
A large majority in Parliament supported implementing regulations in part because no such rules yet exist. In their absence, social media platforms have censored content and accounts without being transparent about their operations or motives.
As a result, Parliament has called on the Commission to address these issues in the Digital Services Act and the Digital Markets Act, as well as in the Democracy Action Plan.
“Citing various decisions taken by the platforms to censor content or accounts, a large majority in the chamber highlighted the lack of clear rules governing such decisions and the lack of transparency of big tech practices. They urged the Commission to address the issue in the Digital Services Act and the Digital Markets Act, and as part of the Democracy Action Plan,” the announcement reads.
Parliament also placed a strong focus on legal certainty around content removal, aiming to ensure that decisions to censor content and accounts lie with democratically elected authorities rather than private companies, in order to protect freedom of speech and guard against undue censorship.
Other topics of focus included:
- The need to fight disinformation to protect democracy and EU values.
- Algorithm transparency.
- The need to limit or ban microtargeting and profiling practices to fundamentally alter the business models of tech giants.
- The systemic risks and societal and economic harm that major social media platforms can cause or make worse.
- The problems caused by the emergence of tech monopolies and their impact on media pluralism, as well as pluralism in public discourse.
- The need for rules to govern both the online and offline spheres, rejecting the false dichotomy between the two.
The issue of regulating big tech companies and major social media platforms was first brought forward by the Parliament in October 2020. At that time, the Parliament stressed that enforcement of these regulations should lie with public law enforcement and decisions regarding enforcement should ultimately lie with an independent judiciary and not with a private commercial entity.
The 2019 EU elections were protected from disinformation through the EU Action Plan and the European Commission’s Code of Practice on Disinformation. The Commission has confirmed, however, that when implementing the Democracy Action Plan, self-regulatory measures need to be replaced with a “mix of mandatory and co-regulation measures” to appropriately protect users’ fundamental rights and regulate content moderation.
It is still unclear how social media platforms and tech giants will react to the regulations that are likely to be placed on them, or exactly what the regulations will entail.