Molly Russell’s death may spark social media reforms for US kids

In a global first, a London coroner has blamed social media for the suicide of a teenage girl — potentially opening the legal floodgates against industry heavyweights like Instagram, Facebook, Snapchat and TikTok.

But the tragic case of Molly Russell could also point the way toward life-saving legislative reforms. That is, if Big Tech doesn’t pull the plug.

Molly was 14 years old when she took her own life in 2017, after secretly delving into “the bleakest of worlds” on Instagram and Pinterest, her father Ian Russell told North London Coroner’s Court on Sept. 21.

Without the family’s knowledge, Molly — once “full of love and bubbling with excitement for what should have lay ahead in her life” — was “pushed into a rabbit hole of depressive content,” her father said, as the two sites’ artificial-intelligence algorithms directed a continual stream of dark and hopeless content to her feed and in-box.

Proprietary algorithms keep social media users hooked, and lock their attention to their screens, by feeding them more of what the programs predict they want.

Once Molly engaged with posts related to depression and self-harm, she was bombarded with relentlessly grim content that took a brutal toll on her psyche.


Molly’s father, Ian Russell, said his daughter was once “full of love and bubbling with excitement.” The coroner has concluded that the 14-year-old died after suffering from “negative effects of online content.”
PA Images via Getty Images

“Everyone is better off without me … ” Molly tweeted in July 2017, four months before she died, on a Twitter account she hid from her family. “I don’t fit in this world.”

Testified her father: “Ending her life seemed to her like a solution — while to us her life seemed very normal.”

Even after Molly killed herself, one social media platform reportedly sent her a personalized email pointing her to suicide-themed messages, like an image of a girl’s cut thigh captioned “I can’t tell you how many times I wish I was dead.”

Her father, monitoring his daughter’s email account on the family computer after her death, was “shocked” to see such subject lines as “10 depression pins you might like” piling up in her in-box.


Ian Russell said that, even after his daughter’s death, one social media platform’s algorithms sent her a personalized email pointing her to suicide-themed messages.
PA Images via Getty Images

Child psychiatrist Dr. Navin Venugopal, who reviewed Molly’s accounts for the court, called the material “very disturbing, distressing.”

“I was not able to sleep well for a few weeks” after evaluating the content, Venugopal said. “It would certainly affect her and made her feel more hopeless.”

Officials from Pinterest and Meta, the company that owns Instagram and Facebook, insisted on the witness stand that the material Molly accessed was benign.

But coroner Andrew Walker found that the teen “died from an act of self-harm whilst suffering from depression and the negative effects of online content.”

“The platforms operated in such a way, using algorithms, as to result in binge periods of images provided without Molly requesting them,” Walker wrote on Sept. 30. “It is likely that the material viewed by Molly … contributed to her death in a more than minimal way.”


Ian and Janet (far left, with one of their daughters, right) Russell’s landmark case in the UK, which holds social media to blame for their daughter Molly’s death, may have a major impact in the US.
PA Images via Getty Images

Activists in the US — where suicides in the 12-to-16 age group increased by 146% between 2009 and 2019 — saw the ruling as a breakthrough.

“It is a huge development,” attorney Matthew P. Bergman of the Seattle-based Social Media Victims Law Center told The Post. Bergman has filed suit against the social-media giants on behalf of seven American families who lost their kids to internet-related suicide, with dozens more cases in the works.

“It’s the first time that a social media company has been adjudicated to be a cause of a child’s suicide,” Bergman said.

Tammy Rodriguez, a Connecticut mom who has sued Meta and Snap over the suicide of her 11-year-old daughter Selena, called the British decision “wonderful news.”


According to attorney Matthew Bergman, “technologies that would remove 80% of the risk of [algorithms] already exist.” But, he said, social-media brands fear curbing user engagement.
NY Post photo composite

According to the lawsuit, Selena died in 2021 after her extreme addiction to Snapchat and Instagram led to severe sleep deprivation as she sought to keep up with round-the-clock alerts. She spiraled into depression, eating disorders and sexual exploitation before taking her own life.

“Not that anything could bring the beautiful Molly back,” Rodriguez told The Post, “but holding social media companies accountable will save children in the future.”

“We do this for Molly and Selena and every other beautiful girl who deserved better in this world,” Selena’s sister Destiny, 22, said of the family’s legal battle.

Frances Haugen, the Facebook whistleblower who leaked thousands of internal documents in 2021 and exposed the company’s addictive algorithms, predicted that the coroner’s finding will be “the first of many.”


Bergman has filed suit against the social-media giants on behalf of seven American families who lost children to suicide.

“A court has recognized that algorithms are dangerous and that engagement-based ranking and its bias pushing users towards ever more extreme content can cost children their lives,” Haugen told The Post.

Teens are uniquely susceptible to the addictive lure of social media — a fact that Meta’s own research, detailed in Haugen’s trove of documents, revealed.

“It’s just a simple question of neurology,” Bergman claimed. “The dopamine response that an adolescent gets upon receiving a ‘like’ from Instagram or Facebook is four times greater than the dopamine response an adult gets.”

Shocking or psychologically discordant content — like the dark materials that Pinterest’s and Instagram’s algorithms allegedly pushed into Molly’s feeds — amps up the dopamine hit even more, heightening the urge to keep scrolling, Bergman said, citing Haugen’s testimony.


Tammy Rodriguez (left), a Connecticut mom who has sued Meta and Snap over the suicide of her 11-year-old daughter Selena (right), called the British decision “wonderful news.”

“These algorithms are very sophisticated artificial intelligence products, designed by social psychologists and computer scientists to addict our kids,” Bergman claimed.

What’s worse, teens and pre-teens are prone to poor decision-making due to their brains’ still-developing executive function skills.

“I mean, that’s what teenagers do — they make bad decisions,” Bergman said. “We all did at that age. But in the past, bad teen decisions didn’t stay online in perpetuity.”

Today, social media immortalizes and amplifies kids’ inevitable mistakes, opening the door to bullying and blackmail, as well as anxiety and depression.


Rodriguez (center, with daughters Destiny, left, and Selena) told The Post: “Holding social media companies accountable will save children.”
Courtesy of the Rodriguez Family

“What happened to Molly Russell was neither a coincidence nor an accident,” Bergman claimed. “It was a direct and foreseeable consequence of an algorithmic recommendation system designed to place user engagement over and above user safety.

“It’s profits over people,” he alleged.

And the social media behemoths have the power to stop much of the damage.

“What is most distressing is that technologies that would remove 80% of the risk of these products already exist, and could be implemented in a matter of weeks,” Bergman claimed. “And these companies have decided, ‘Well, if we implement that we’ll lose user engagement, so we won’t do it.’”


According to mom Tammy Rodriguez’s lawsuit, Selena committed suicide after her addiction to Instagram led to severe sleep deprivation.

While eliminating the algorithms for kids could quickly cut down on addictive behaviors, age and identity verification may also immediately reduce online sexual abuse.

“There is nothing in any of the platforms to ensure that people are the appropriate age and to ensure that they are who they say they are,” Bergman noted. “But this technology is off-the-shelf — dating apps like Tinder use it all the time.

“If technology is good enough for folks who want to hook up, good Lord, we should be providing it to our kids,” he said.

In the face of corporate inaction, state legislatures are teeing up a patchwork of laws aimed at making the internet safer for kids and teens.


Russell was reportedly “shocked” to see such subject lines as “10 depression pins you might like” piling up in his daughter’s in-box.
PA Images via Getty Images

California’s Age-Appropriate Design Code Act, signed into law by Gov. Gavin Newsom last month, will impose tight data privacy settings on the accounts of social media users under age 18 and require age verification for access. The measure, regarded as the strictest of its kind in the US, won’t take effect until 2024.

“The bill has a lot of promise,” Bergman said.

Other states are contemplating similar laws. A New York bill, introduced last month by state Sen. Andrew Gounardes (D-Brooklyn), would require tech companies to establish a fast-access helpline for use when a child’s data is compromised — essentially, a 911 for digital crimes.

“We’re not trying to shut down social media,” Gounardes said. “We’re just trying to put in place smart, thoughtful and important guardrails.”

None of the state laws in the pipeline crack down on the potentially damaging, yet immensely profitable, algorithms that the UK coroner found could do the greatest harm.


GOP Sen. Marsha Blackburn of Tennessee says Big Tech’s unwillingness to change has prompted her to take action.
Getty Images

“If Instagram had done something as simple as let Molly reset her own algorithm without losing her friends or old posts, she might be alive today,” Haugen said. “She shouldn’t have had to choose between her past and her future.”

But a bipartisan bill that’s stalled in the US Senate could do just that.

“Big Tech’s unwillingness to change has prompted us to take action,” said GOP Sen. Marsha Blackburn of Tennessee, who wrote the Kids Online Safety Act with Connecticut Democratic Sen. Richard Blumenthal.

Introduced by the ideological rivals in February in the wake of Haugen’s searing congressional testimony, the bill would let minors and their parents disable social media algorithms altogether. It would also require parental warnings when children access harmful material.


Frances Haugen, the Facebook whistleblower who leaked thousands of internal documents in 2021 and exposed the company’s addictive algorithms, told The Post that the coroner’s finding in Molly’s death will be “the first of many.”
Getty Images for Unfinished Live

“Sen. Blumenthal and I have heard countless stories of the physical and emotional damage caused by social media, and Molly Russell’s story is utterly heartbreaking,” Blackburn said. “The Kids Online Safety Act will require social media platforms to make safety — not profit — the default.”

Despite support from both sides of the aisle and unanimous approval in the Senate Commerce Committee, Majority Leader Chuck Schumer has not brought the bill to the floor for debate.

“It is awaiting a full vote before the Senate, whenever Leader Schumer decides protecting children online is a priority,” snarked a Senate aide.

“I think it’s good that people on the right and on the left are recognizing that there are product design changes … that are necessary to keep kids safe,” Haugen said.

“We need to stop accepting that kids die from social media and act.”

If you are struggling with suicidal thoughts or are experiencing a mental health crisis and live in New York City, you can call 1-888-NYC-WELL for free and confidential crisis counseling. If you live outside the five boroughs, you can dial the 24/7 National Suicide Prevention hotline at 988 or go to SuicidePreventionLifeline.org.