Table of Contents
- Are EU GDPR changes making kids less safe online? A practical guide to Safer Internet Day 2026 and child data protection
- Safer Internet Day: Safety is also a political job
- Why GDPR details matter for real-world safety
- The credibility gap: Teaching safety while lowering safeguards
- What to do now: Practical guidance aligned with policy reality
Are EU GDPR changes making kids less safe online? A practical guide to Safer Internet Day 2026 and child data protection
You want children and young people to use the internet safely. Awareness campaigns help, but they cannot carry the whole load. A safer internet also depends on lawmakers setting clear, enforceable limits on how companies collect and use personal data.
Safer Internet Day: Safety is also a political job
Safer Internet Day 2026 takes place on February 10 under the theme “Together for a better internet.” The day highlights a simple goal: help children and young people use digital media responsibly. That work matters in schools and families, but it also belongs in parliaments and ministries.
Bettina Gayk, the State Commissioner for Data Protection and Freedom of Information in North Rhine-Westphalia, argues that it is not enough to tell young people to “be careful.” Education reduces risk, yet it cannot fix a system that rewards data extraction. If platforms can track users widely, children face pressure they cannot realistically manage alone. A stable legal framework must define where data use ends—especially when minors are involved.
Why GDPR details matter for real-world safety
Gayk points to a resolution by Germany’s data protection supervisory authorities that aims to anchor stronger protections for children and young people directly in the GDPR. The policy direction is straightforward: make youth protection a built-in rule, not a voluntary feature.
Her main concern is a set of recent EU-level proposals to amend the GDPR in ways that would reduce protections for internet users. One example is the idea of exempting pseudonymized data from the GDPR under certain conditions.
Pseudonymized data is data that does not directly name a person, but can still be tied back to them with additional information. A basic example: a company replaces “Aisha Ahmad” with “User 19482,” but still holds a separate key that can reconnect the number to the name. Under today’s approach, that data generally still counts as personal data, because it can still relate to an identifiable person.
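To make the re-identification mechanics concrete, here is a minimal Python sketch of that example. The function names and key-table layout are illustrative assumptions, not drawn from any real system:

```python
import secrets

# Key table: maps each pseudonym back to the real name.
# Whoever holds this table can reverse the pseudonymization.
key_table: dict[str, str] = {}

def pseudonymize(name: str) -> str:
    """Replace a name with a random user ID, remembering the mapping."""
    pseudonym = f"User {secrets.randbelow(100_000):05d}"
    key_table[pseudonym] = name
    return pseudonym

def reidentify(pseudonym: str) -> str:
    """Undo the pseudonymization using the separately stored key."""
    return key_table[pseudonym]

record = {"user": pseudonymize("Aisha Ahmad"), "pages_visited": 37}
print(record)                      # e.g. {'user': 'User 19482', ...}
print(reidentify(record["user"]))  # -> 'Aisha Ahmad'
```

As long as that key table exists somewhere, the record still relates to an identifiable person, which is exactly why the current rules treat it as personal data.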
If the rules shift so that pseudonymized data is not consistently treated as personal data, companies may gain room to process more user data with fewer legal constraints. In practice, that can mean more tracking and profiling, including for advertising, without the same level of justification or safeguards. Gayk’s warning is structural: weaken the definition of personal data, and you weaken one of the core pillars that makes individual data rights enforceable.
The credibility gap: Teaching safety while lowering safeguards
Safer Internet Day reaches millions of people across roughly 190 countries and territories. It speaks to EU and national decision-makers, businesses, civil society, educators, parents, and young people. That broad audience only makes sense if the message is consistent: reduce harm, reduce exposure, reduce manipulation.
Gayk calls it problematic when policymakers promote youth awareness on the one hand, yet on the other advance legislative proposals that could expand data processing in ways that increase online risks. Her logic is plain: you cannot tell children to avoid danger while you redesign the street to allow faster traffic.
What to do now: Practical guidance aligned with policy reality
Even with strong laws, media literacy remains essential. When laws are weak or slow to adapt, it matters even more, because children must then recognize risks in real time.
For children and young people:
- Treat “free” apps as paid for with data; ask “What does this app want from me?”
- Limit tracking where possible: use privacy settings, restrict app permissions, and decline unnecessary cookies
- Watch for persuasion design (endless scroll, streaks, “people you may know”) and take deliberate breaks
For parents and educators:
- Build routines: device-free time blocks, shared rules, regular privacy checkups
- Use age-appropriate conversations about ads, tracking, and why recommendations appear
- Prefer services that minimize data and offer clear controls
For policymakers and regulators:
- Keep pseudonymized data under the GDPR’s full protections wherever it can still be linked back to a person
- Require child-focused defaults: data minimization, limited profiling, and high-privacy settings out of the box (see the sketch after this list)
- Enforce transparency that a teenager can understand, not just language a lawyer can parse
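To show what “high-privacy settings by default” can mean in practice, here is a hypothetical Python sketch of a settings object for a minor’s account. Every field name and value is an illustrative assumption, not a regulatory requirement; the point is that the protective value is the default, and loosening it would require a deliberate, recorded choice:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MinorAccountSettings:
    """Hypothetical privacy-by-default settings for a minor's account.

    Every field defaults to its most protective value; loosening any of
    them would require an explicit, logged opt-in elsewhere in the app.
    """
    personalized_ads: bool = False      # no ad profiling unless opted in
    behavioral_tracking: bool = False   # no cross-site tracking by default
    public_profile: bool = False        # profile is private by default
    location_sharing: bool = False      # location data is not collected
    data_retention_days: int = 30       # keep usage data only briefly

# A new account starts with every safeguard switched on.
settings = MinorAccountSettings()
print(settings)
```

The design point is that safety is the zero-effort state: a child who never opens the settings screen still gets the most protective configuration.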