A ‘Prevention of Future Deaths’ report following the U.K. coroner’s inquest into the suicide of British schoolgirl Molly Russell, who killed herself almost five years ago after viewing content on social media sites that promoted self-harm, has recommended the government look at requiring age verification at sign-up to social platforms, to ensure content is separated appropriately for adults and children.
The inquest into Russell’s death heard she binge-consumed content about suicide and depression on sites including Instagram and Pinterest, some of which was algorithmically curated for her based on the platforms’ tracking of her viewing habits, before taking her own life, aged 14.
Coroner Andrew Walker concluded last month that “negative effects of online content” were a factor in her death, adding that such content “shouldn’t have been available for a child to see”.
His ‘Prevention of Future Deaths’ report, which was made public today after being sent to a number of social media firms and to the government, also recommends that lawmakers consider setting up an independent regulatory body to monitor online platform content, paying special attention to children’s access to harmful content and to content-shaping elements like algorithmic curation and advertising.
Additionally, the coroner’s report recommends that the government review provisions for parental controls on social media platforms accessed by kids and consider powers that would give caregivers access to the content their children have viewed.
“I recommend that consideration is given to enacting such legislation as may be necessary to ensure the protection of children from the effects of harmful online content and the effective regulation of harmful online content,” he adds, before urging platforms not to wait for a change in the law.
“Although regulation would be a matter for Government I can see no reason why the platforms themselves would not wish to give consideration to self-regulation taking into account the matters raised above.”
Tech companies including Meta (Instagram’s owner), Pinterest, Snap and Twitter have been given 56 days to respond to the coroner’s report, with a deadline of December 8 to provide details of any actions taken or proposed (setting out a timetable for proposed actions), or else to explain to the coroner why no action is being proposed.
We reached out to the companies for a response to the coroner’s report.
At the time of writing, Meta had not responded.
A Pinterest spokeswoman told us the company has received the report and plans to respond by the due date. In a statement, the social sharing site added:
Our thoughts are with the Russell family. We’ve listened very carefully to everything that the Coroner and the family have said during the inquest. Pinterest is committed to making ongoing improvements to help ensure that the platform is safe for everyone and the Coroner’s report will be considered with care. Over the past few years, we’ve continued to strengthen our policies around self-harm content, we’ve provided routes to compassionate support for those in need and we’ve invested heavily in building new technologies that automatically identify and take action on self-harm content. Molly’s story has reinforced our commitment to creating a safe and positive space for our Pinners.
A Snap spokeswoman also confirmed it has received a copy of the coroner’s report, saying the company is reviewing it and will respond within the requested timeframe.
A spokeswoman for Twitter likewise confirmed it has received the report but said the company has nothing further to add.
The U.K. government has already proposed legislation it touts as making the U.K. the safest place in the world to go online: the Online Safety Bill, a piece of legislation that’s been years in the making and has a stated focus on children’s safety. The bill would also empower Ofcom, as a content-focused internet regulator, to enforce the rules.
However, the Online Safety Bill’s progress through parliament was put on pause by the recent Conservative Party leadership contest.
Since then, the new prime minister, Liz Truss, and Michelle Donelan, the new secretary of state she appointed to head the Department for Digital, Culture, Media and Sport (DCMS), have extended that pause, freezing the bill to make changes, specifically to provisions tackling ‘legal but harmful’ content, in response to concerns about the impact on freedom of expression.
There is no fresh timetable for restarting the bill. But with limited parliamentary time left before a general election must be called and, more pressingly, widespread chaos across Truss’ government, it is looking increasingly likely the bill will fail to pass, leaving platforms to continue self-regulating the bulk of their content moderation. (An age-appropriate design code for children is already being enforced in the U.K., though.)
We contacted DCMS for a response to the coroner’s report. A spokesman told us the department would send a statement “shortly”, but six hours (and minus one chancellor) later we’re still waiting to receive it. Calls to the DCMS press office line were being routed to voicemail. (Secretary of state Donelan was, however, spotted busily tweeting the latest Truss ‘hold-the-fractious-government-together’ line, which includes the unfortunate appeal that we “must come together and focus on delivering”.)
In a statement to the press following the coroner’s report, Molly Russell’s father, Ian, called for social media firms to get their houses in order without waiting to be ordered to do so by unruly lawmakers.
“We urge social media companies to heed the coroner’s words and not drag their feet waiting for legislation and regulation, but instead to take a proactive approach to self-regulation to make their platforms safer for their young users,” he said, adding: “They should think long and hard about whether their platforms are suitable for young people at all.”