The European Union is advancing new regulations to protect children from the addictive design of social media platforms such as TikTok and those operated by Meta. The initiative, announced by European Commission President Ursula von der Leyen, aims to address the growing risks that social media use poses to young users.
Von der Leyen pointed to mounting concerns over sleep deprivation, anxiety, and cyberbullying, arguing that the question should be whether social media should have access to young people, rather than the other way around.
Key Features of the Digital Fairness Act
The forthcoming Digital Fairness Act (DFA) is expected to include several important measures:
- Banning Manipulative Practices: The Act will prohibit features that manipulate users, such as endless scrolling and autoplay.
- Regulating Influencer Marketing: It will also address misleading practices in influencer marketing targeted at children.
- Minimum Age Requirements: The Commission may propose a minimum age for platform access to further safeguard young users.
Strengthening Existing Regulations
The new rules would build on the EU's existing Digital Services Act (DSA), which requires large platforms to take greater responsibility for tackling illegal and harmful content. The Commission is currently investigating several platforms, including TikTok and Meta, over their compliance with those obligations.
Broader Implications for Social Media
The EU's stance reflects a wider global trend toward stricter regulation of social media, with a number of countries discussing or enacting legislation aimed at protecting minors online. This movement is part of a broader push to make digital environments safer for younger audiences.