
Why Are Dangerous TikTok Challenges Killing Teens? A Parent’s Worst Nightmare Explained

How Do Social Media Algorithms Push Kids Toward Fatal Stunts? The Shocking Truth About Digital Manipulation

When a mom loses her child to a dangerous internet trend, something is very wrong. Norma Nazario learned this the hard way. Her 15-year-old son Zackery died while “subway surfing” on a moving train in New York City.

What’s subway surfing? Kids climb on top of subway cars while they’re moving. They do it for likes and views on social media. It’s deadly. Since 2002, 16 people have died trying this stunt in New York alone.

But here’s what makes this story different. Norma is fighting back. She’s suing the big tech companies – Meta (which owns Instagram) and ByteDance (which owns TikTok).

The Real Problem: How Apps Target Young Minds

These platforms don’t just show random videos. They use smart computer programs called algorithms. These programs study what you watch. Then they show you more of the same thing.

A judge named Paul Goetz made an important decision. He said Norma can try to prove that these companies “goaded” her son into subway surfing. This means they pushed him toward danger on purpose.

Here’s how it works (a simple sketch of this loop follows the list):

  1. The app sees you’re a teenager
  2. It notices you watch risky videos
  3. It shows you more dangerous content
  4. You get hooked on the excitement
  5. You try the stunts yourself
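
To make that loop concrete, here is a tiny, made-up sketch in Python. It is not how Instagram or TikTok actually work under the hood (none of the companies publish their ranking code); the video names, the scores, and the recommend and watched functions are all invented, purely to show how lingering on one risky video can tilt everything the app serves next.

```python
import random

# Hypothetical sketch only: real ranking systems are far more complex and are
# not public. The point is the feedback loop described above: whatever you
# linger on gets weighted more heavily in what the app shows you next.

videos = ["dance clip", "study tips", "subway stunt", "prank", "skate trick"]
interest = {v: 1.0 for v in videos}   # the app's running guess at what you like

def recommend():
    # pick the next video in proportion to the current interest scores
    return random.choices(videos, weights=[interest[v] for v in videos])[0]

def watched(video, seconds):
    # longer watch time means a bigger boost for that kind of content next time
    interest[video] += seconds / 10

# a teen lingers on one risky video...
watched("subway stunt", 60)

# ...and the feed now leans heavily toward more of the same
print(sorted(interest.items(), key=lambda kv: -kv[1]))
```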

Research shows this actually happens. One study found that TikTok’s algorithm served four times as much harmful content after just five days. That’s scary fast.

What Happened to Zackery

On February 20, 2023, Zackery and his girlfriend climbed on top of a subway train crossing the Williamsburg Bridge. A low beam hit Zackery. He fell between the cars. The train ran him over.

His mom found subway surfing videos all over his social media accounts. The platforms had been feeding him this content. They knew he was just 15 years old.

The lawsuit says something chilling. These apps even encouraged Zackery to buy gear for subway surfing – like ski masks and gloves. They turned a deadly activity into a shopping list.

Why This Court Case Matters

For years, social media companies have hidden behind a law called Section 230 of the Communications Decency Act. This law says they can’t be blamed for what users post. It’s like saying a phone company isn’t responsible for bad phone calls.

But Judge Goetz saw through this defense. He said these companies do more than just host content. Their algorithms actively target kids based on age. They’re not neutral platforms anymore. They’re pushing specific content to specific people.

The judge wrote: “It is plausible that the social media defendants’ role exceeded that of neutral assistance in promoting content.” In simple terms, they’re not just innocent bystanders.

The Science Behind Digital Addiction

Young brains are still growing. They’re extra sensitive to rewards and excitement. Social media companies know this. They design their apps to be addictive.

Here’s what happens in a teen’s brain:

  1. Notifications trigger dopamine (feel-good chemicals)
  2. Endless scrolling trains continuous partial attention (never fully focused, never fully off)
  3. Algorithms learn what makes you click
  4. Content gets more extreme to keep you hooked (see the sketch after this list)
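
Step 4 is the hardest part for most parents to picture, so here is another small, invented sketch. Again, this is not real platform code; the intensity numbers and the next_intensity function are assumptions, made up only to show how a loop that rewards engagement keeps pushing content a little further each time.

```python
# Hypothetical sketch only; no platform publishes code like this. It just
# illustrates step 4 above: a loop that optimizes for engagement keeps
# nudging content intensity upward for as long as the viewer keeps responding.

def next_intensity(current, engaged, step=0.1):
    """Ratchet intensity up when the viewer engages, ease off slowly when not."""
    if engaged:
        return min(1.0, current + step)      # more extreme next session
    return max(0.0, current - step / 2)      # back off only gradually

intensity = 0.2                              # start with fairly mild content
for session in range(1, 11):
    intensity = next_intensity(intensity, engaged=True)
    print(f"session {session}: content intensity {intensity:.1f}")
```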

Research shows a direct link between social media time and depression. For every extra hour on these platforms, depression risk goes up by 13%.
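
That 13% figure is easier to grasp with a quick back-of-the-envelope calculation. The article does not say whether the risk compounds with each extra hour or simply adds up, so the snippet below works out both readings for three extra hours a day, purely as an illustration of the arithmetic.

```python
# Illustration of the arithmetic only. Whether the 13% per extra hour
# compounds or simply adds up depends on the underlying study, which the
# article does not name, so both readings are shown.

per_hour = 0.13
extra_hours = 3

compounded = (1 + per_hour) ** extra_hours        # about 1.44
linear = 1 + per_hour * extra_hours               # 1.39

print(f"3 extra hours, compounding: about {compounded:.2f}x the baseline risk")
print(f"3 extra hours, adding up:   about {linear:.2f}x the baseline risk")
```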

Other Dangerous Trends That Worry Parents

Subway surfing isn’t the only deadly challenge. Social media has spawned many dangerous trends:

  • Car surfing (standing on moving cars)
  • Door kicking challenges
  • Urban exploration in dangerous places

A report called “Dared by the Algorithm” showed how easily kids find these videos. Researchers created a fake account for a 14-year-old boy. Within days, the platforms were showing dangerous challenge videos.

What Police Are Doing

New York police now use drones to catch subway surfers. Most kids they find are around 15 years old. That’s the same age Zackery was when he died.

The subway authority (MTA) says subway surfing incidents more than doubled around the time of Zackery’s death. They blame social media for the increase.

The Money Behind the Madness

These platforms make billions from advertising. In 2022, US children generated $11 billion in ad revenue for major social media companies. Kids are profitable. The longer they stay on the app, the more money companies make.

One expert called artificial intelligence in social media “fentanyl” added to “digital heroin”. These platforms are designed to be addictive. They target kids because young users are more valuable long-term.

What Parents Can Do Right Now

Don’t wait for laws to change. Here’s what you can do today:

Check your kid’s phone

  • Look at their social media feeds
  • See what videos they’re watching
  • Notice if content is getting more extreme

Talk openly about algorithms

  • Explain how apps manipulate them
  • Teach them to recognize targeted content
  • Help them understand they’re being sold to

Set real limits

  • Use parental controls
  • Create phone-free zones
  • Model healthy tech habits yourself

The Fight for Change

Norma’s lawsuit is moving forward. This could change everything. If she wins, it might force social media companies to protect kids better.

Other laws are being considered too. The Kids Online Safety Act would require platforms to act in children’s best interests. It would let kids opt out of recommendation algorithms.

Some countries are taking stronger action. Australia banned social media for teens under 16. The US is considering warning labels on social platforms.

Why This Matters for Every Family

You might think this won’t happen to your family. But consider this: 96% of teens use social media. These platforms are getting smarter at manipulation. They know exactly which buttons to push.

Zackery wasn’t a “bad kid.” He was a normal teenager who got caught in a digital trap. His mom found videos on his accounts that he never searched for. The algorithm brought them to him.

The companies knew he was 15. They knew the content was dangerous. They showed it to him anyway. And they profited when he watched.

This isn’t about censorship or taking away fun. It’s about protecting kids from predatory business practices. When algorithms target children with deadly content, that’s not free speech. That’s corporate negligence.

Norma Nazario lost her son. Now she’s fighting so other parents won’t have to. Her lawsuit could be the first step toward making the internet safer for everyone’s children.

The question isn’t whether social media affects kids. The question is whether we’ll let tech companies continue putting profits over young lives.