Social Media Giants Blamed for British Teenage Suicides
“She had so much to offer.”
Ian Russell is speaking of his 14-year-old daughter Molly, the youngest of three sisters, who committed suicide in 2017, leaving a note that read, “I am sorry. I did this because of me.”
After Molly’s suicide, her parents examined the teenager’s social media use and discovered she had been interacting with other teenage users caught in the grip of depression who were suicidal and self-harming. The users were, in effect, grooming themselves and goading one another toward drastic action.
“I have no doubt that Instagram helped kill my daughter,” Molly’s father told the BBC in an explosive interview that drew a public apology from U.S. social media giant Facebook, owner of the photo-sharing site Instagram, as well as a promise to do more to tackle suicide and self-harming posts.
“We’re going to look at this from top to bottom and change everything we’re doing, if necessary, to get this right,” Nick Clegg, a former British deputy prime minister and now Facebook’s head of global affairs, said in a statement.
The promise, though, has done little to tamp down criticism.
In the past eight years, the suicide rate among British teenagers has nearly doubled. Last year around 200 schoolchildren killed themselves. Tech giants do not bear all of the responsibility for the deaths, their critics say, but they abet them by not doing enough to stop them.
Amid growing public uproar, the British government has said it will unveil groundbreaking legislation next month designed to impose a legal duty of care on such firms.
“Social media companies clearly need to do more to ensure they are not promoting harmful content to vulnerable people,” said a government official.
The British plan to order social media providers to protect users against material that promotes suicide methods and self-harm will be watched closely by policymakers in other European countries, who are also grappling with how to cope with the malign consequences of social media use.
Germany is cracking down on what Facebook does with users’ personal data. In France, the government is “embedding” regulators inside social media companies to investigate how they combat online hate speech.
Since January, Facebook, in particular, has been targeted for criticism in the United States. The company operates a unique suite of apps, but U.S. critics say the social media giant is too casual about its social responsibilities.
U.S. lawmakers accuse Facebook of doing too little to stop Russian meddling in the 2016 presidential race, and, along with YouTube and Twitter, it has been attacked for being slow to take down jihadist posts and videos.
Laying the groundwork for the British measure, the country’s chief medical officer will announce this week that Facebook, Instagram, Snapchat and WhatsApp figure as important links in a dangerous chain leading from self-harm to suicide.
Sally Davies will urge parents to be more alert and to limit, as well as monitor, their children’s screen time.
The legislation is likely to be based on recommendations from a British parliamentary committee that wrapped up an inquiry last week and concluded that social media use is disrupting young users’ sleep patterns, distorting their body image and leaving them exposed to bullying, grooming and sexting.
The panel said that self-regulation will no longer suffice.
“We must see an independent, statutory regulator established as soon as possible, one which has the full support of the government to take strong and effective actions against companies who do not comply,” the committee said.
Clegg said some of the criticism is overwrought. In a television interview last week, he said the company had “saved the lives” of thousands of potentially suicidal users by flagging them to authorities.
Recent academic studies, including one by psychologists at Oxford University, suggest that social media use has no major adverse impact on mental health. The Oxford study concluded that “wearing glasses has more negative effect on adolescent mental health.”
But the academic studies are not assuaging critics, and some lawmakers cast doubt on their overall accuracy, saying they do not look closely enough at teenage girls, who seem the most vulnerable.
“Worryingly, social media companies — who have a clear responsibility toward particularly young users — seem to be in no rush to share vital data with academics that could help tackle the very real harms our young people face in the virtual world,” said lawmaker Norman Lamb.
More than 30 British families have complained that social media giants blocked or hindered their access to social media data after their children’s suicides. A requirement that firms share data that can help identify and protect at-risk teenagers will likely be among the new legal measures the government unveils, officials said.