New Regulations for Major Online Platforms in Europe
Introduction
In a groundbreaking initiative, the European Union (EU) is imposing stricter regulations on major internet platforms such as Google, Facebook, TikTok, and Instagram. Each of these platforms has over 45 million monthly active users in the EU, and the EU's goal is to bolster user safety, particularly for younger demographics. This includes implementing clearer advertising labels, banning ads targeted at children, and limiting the use of sensitive data for advertising purposes.
The EU Commission's Directive
Recently, the EU Commission unveiled a list of 19 very large online platforms and search engines, which includes not only American giants like Google, Facebook, Twitter, and Amazon but also the Chinese video-sharing platform TikTok. Because of their extensive user bases, these platforms are now recognized as bearing significant societal responsibilities, leading to the introduction of stricter regulations.
New Regulations Targeting Critical Issues
Addressing Hate Speech and Misinformation
The digital landscape has long been challenged by hate speech, misinformation, and disinformation. Critics, such as data activist Max Schrems, have consistently highlighted the shortcomings in data protection and the transparency of platform operations. The new regulations require these platforms to proactively tackle these issues, compelling them to evaluate and mitigate risks associated with illegal content, gender-based violence, the protection of minors, mental health concerns, and the potential effects on freedom of expression and democracy.
Annual Risk Assessments and Transparency
Platforms are now obligated to prepare an annual risk assessment report, which will be scrutinized by the European Center for Algorithmic Transparency (ECAT). The outcomes of these assessments will be made public, enhancing transparency and accountability in the digital space.
Advertising Clarity
A significant change involves the transparency of advertising. Social media platforms must now clearly identify advertisements, revealing the sources of funding. Users will better understand why they encounter specific content, as platforms will disclose how their algorithms function and the criteria used for content selection.
The Digital Services Act (DSA)
These new regulations form part of the EU's Digital Services Act (DSA), which serves as a fundamental legal framework for online services, social media platforms, and the broader digital environment. Proposed in 2020 and in force since November 16, 2022, the DSA aims to ensure user protection, promote transparency in digital services, and enforce greater accountability among internet giants. Prohibitions that apply offline are now extended to the online realm, addressing issues such as insults, incitement, and the distribution of prohibited content.
Conclusion and Compliance Deadline
Major platforms like Facebook, Google, and Amazon must comply with these regulations by August 25, 2023, marking a significant advancement in the oversight of online environments within the EU.
Final Thoughts: These regulations represent an essential step toward fostering a safer and more transparent digital ecosystem. By holding major platforms accountable, the EU is tackling pressing concerns like misinformation and advocating for a responsible digital landscape.