Renew Europe pushes EU to tackle “addictive design” and protect young people’s mental health
The paper proposes an EU-wide age verification mechanism built on the European Digital Identity Wallet
The Renew Europe group in the European Parliament is calling on the Commission to deploy its full legislative arsenal – including the Digital Services Act (DSA), the GDPR, and the AI Act – to reduce the harmful impact of social media platforms on young people’s mental health, according to a position paper seen by Euractiv.
The initiative comes amid mounting scientific evidence that intensive social media use affects the developing brains of adolescents, particularly areas linked to emotion regulation and impulse control. In 2022, 96% of adolescents in the European Union used social media daily, with 37% spending over three hours online, according to Commission data.
The group’s push for regulatory measures also coincides with a growing consensus in Brussels on the need to address concerns about excessive social media use among young people.
“Our children are growing up under the influence of algorithms that monetise attention and manipulate emotion,” MEP Veronika Ostrihoňová told Euractiv. “This is a kitchen table issue for millions of families that cannot be ignored at an EU level.”
Just last week, EU digital ministers signed the Jutland Declaration, a statement emphasising the “exceptional” need to safeguard children in the digital space. The declaration was light on detail, but it is another sign that EU officials are starting to take concerns about social media use seriously.
European Commission President Ursula von der Leyen called the issue a top priority in her recent State of the Union address, eyeing the possibility of EU-wide rules to curb children’s use of social media services, which she characterised as profit-seeking and harmfully addictive.
Now, Renew is pushing in the same direction. The group wants EU rules to tackle addictive algorithms by clarifying how existing provisions on dark patterns – deceptive design techniques used to influence user behaviour – interact with one another.
Renew also called for mandatory child-safe defaults, including the automatic deactivation of continuous video playback, bans on screenshots of minors’ content, notification limits during nighttime hours, and the removal of filters linked to risks to body image or self-esteem.
“We need to recognise social media addiction as a public health issue, and demand accountability from the platforms that profit from our children’s vulnerability,” Ostrihoňová said.
Von der Leyen struck a similar note in her State of the Union address, arguing that it is “parents, not algorithms, who should be raising our children.”
Age verification and digital identity
Renew’s new policy paper proposes several measures, including the introduction of an EU-wide age verification mechanism built on the European Digital Identity Wallet and the principle of privacy by design. It also suggests the Commission explore a standardised minimum age or tiered age limits for social media use.
Last September, several political groups in the European Parliament had already pushed for stronger age controls during a debate in the Internal Market Committee.
S&D MEP Christel Schaldemose – rapporteur on the protection of minors online – advocated for the introduction of age verification mechanisms and a possible EU-wide minimum age for access to social media.
Her EPP colleague Pablo Arias Echeverría also called for reliable and non-intrusive age checks, and supported restricting access to social media and video-sharing platforms for under-16s unless they obtain parental consent.
Renew, meanwhile, insists the Commission must regularly update its guidelines on the protection of minors under Article 28 of the DSA, and that Member States must equip national Digital Services Coordinators with adequate resources for enforcement.
Further measures could come under the Digital Fairness Act, which is under consultation until 24 October. The planned act aims to strengthen consumer protection and to address addictive digital design and misleading influencer marketing – practices that disproportionately target and affect young people.
(bms, cs, cm)