
# Strict Age Verification App Trials: EU Nations Assess Tougher Age Control Measures


In a bid to protect minors online, the European Union (EU) is pushing for stricter age restrictions on social media platforms and pornographic websites. This initiative is primarily driven through updated regulations under the Digital Services Act (DSA) and the development of a dedicated age verification app.

## Regulatory Framework

The DSA now treats age verification as a risk-mitigation measure and allows member states to set their own minimum age requirements for access to certain online services or content. New guidelines under the DSA require platforms to assess potential harms to minors and to implement age-assurance measures, categorized into low, medium, and high risk tiers.

Three main methods of age assurance are recognized: self-declaration, age estimation, and age verification. The guidelines emphasize that age-assurance techniques must be proportionate, data-minimizing, and auditable to protect user privacy.

## Age Verification App

The EU has developed a prototype of a white-label age verification app, scheduled for beta release in summer 2025. This app allows users to verify their age securely and privately, typically by scanning a government-issued ID or using biometric authentication. The app then issues an age-verified token or certificate that platforms can accept without needing to store or process sensitive personal data.

Social media platforms and pornographic websites can integrate this app into their user onboarding or content access workflows, ensuring that only users above the required minimum age can access age-restricted content or services.
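The token flow described above can be illustrated with a minimal sketch. None of the names or formats below come from the EU's actual app; they are hypothetical, and a symmetric HMAC signature stands in for whatever cryptographic scheme the real system uses. The point is that the platform only ever sees a signed over-18 claim, never the ID document or birth date.

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical shared signing key; a real deployment would use asymmetric
# signatures so platforms cannot mint their own tokens.
SECRET = b"demo-signing-key"

def issue_age_token(is_over_18: bool) -> str:
    """Issued by the verification app after checking an ID document.
    The token carries only the claim and a timestamp, no personal data."""
    payload = json.dumps({"over18": is_over_18, "iat": int(time.time())}).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig

def platform_accepts(token: str) -> bool:
    """The platform validates the signature and the claim; it never
    stores or processes the underlying identity document."""
    body, sig = token.rsplit(".", 1)
    payload = base64.urlsafe_b64decode(body.encode())
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # token was tampered with or forged
    return json.loads(payload)["over18"] is True
```

In this sketch, tampering with a single character of the token invalidates the signature, and a token for an under-18 user verifies but is rejected by the claim check.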

## Challenges and Considerations

The DSA's approach to minimum age requirements may lead to a fragmented regulatory landscape across the EU. There is strong advocacy to ensure that new age verification measures do not compromise privacy, free expression, or exclude users without official documentation. Recommendations include strong privacy settings, transparency, and user control as complementary safeguards.

The EU Commission is currently investigating several platforms, including TikTok, Meta (formerly Facebook), and the porn providers YouPorn, Stripchat, XVideos, and XNXX, for suspected shortcomings in child and youth protection. If the allegations against these platforms are confirmed, they could face substantial fines from the Commission.

The EU Commission's app, designed to block children's access to pornographic websites and certain social networks, is currently in a test phase in five countries: France, Spain, Italy, Greece, and Denmark. Each country can develop its own app based on the software provided by the Commission.

In summary, the EU is advancing new legal and technical frameworks to enforce stricter age restrictions on social media platforms and pornographic websites, primarily through updated regulations under the DSA and the development of a dedicated age verification app. The Commission also recommends deactivating potentially addictive features, such as "streaks," for children and young people.

  1. EU policy now includes the updated regulations under the Digital Services Act (DSA), which mandate stricter age restrictions on social media platforms and pornographic websites, along with the development of a dedicated age verification app.
  2. Politics and technology intertwine in the EU's regulatory framework, as demonstrated by the stricter age-assurance measures required for accessing online content or services, supported by the forthcoming age verification app aimed at protecting minors.
