
Social media firms won’t prioritise child safety until Ofcom is given the teeth to make them, warns NSPCC

Child welfare campaigners have accused social media companies of doing the bare minimum to protect children using their platforms, urging the Government to give regulator Ofcom “the teeth it needs” to force tech companies to prioritise child safety.

Andy Burrows, head of child safety online for the NSPCC, said that social media’s failure to get to grips with child safety had not resulted in major repercussions for the key players, with the exception of the huge degree of concern raised in the wake of Molly Russell’s death.

Molly Russell, 14, took her own life in November 2017 after viewing self-harm and suicide content on Instagram. Her father, Ian Russell, gave interviews in 2019 claiming the photo-sharing app had “helped kill my daughter”, after which the company met with Health Secretary Matt Hancock and banned self-harming images and videos.

“The succession of scandals that tech firms have borne, from Molly Russell and the failure to protect children to Cambridge Analytica and racism against Premier League footballers shows that users have not been put off from using their services,” Mr Burrows told i.

“There is no commercial drive or legal imperative to incentivise companies to make their platforms safer other than a sense of doing the right thing, and time and time again, that simply hasn’t been enough. Regulation is needed because the tech companies won’t do this themselves… and the regulatory powers have to give Ofcom the teeth that it needs to be able to take on this issue.”

The NSPCC is urging Culture Secretary Oliver Dowden to ensure the forthcoming Online Harms Bill legislation, which will give Ofcom the power to fine internet companies up to 10 per cent of their annual turnover for failing to comply with new rules, has adequate powers to protect children online.

The charity also called on the Prime Minister to bring the legislation forward after publishing research last year detailing how predators were increasingly using Instagram to contact children in online grooming offences, with offences increasing annually in the three years before the coronavirus lockdown in March 2020.

“I don’t get the sense there is palpable concern among tech companies to make their platforms safer for children,” Mr Burrows said.

Oliver Dowden has been urged to make the Online Harms Bill robust (Photo: Getty)

“There’s a recognition that regulation is coming, but I think their objective will be to deliver a piece of legislation that captures broadly what is already being done and that does not require a fundamental redesign of their systems and approach, and the Government needs to go further than that.”

Mr Dowden recently said that while the Government reserves its right to impose criminal sanctions against tech companies’ senior management for failure to comply with the legislation, it would only be pursued in “the most egregious cases”.

Having the power to hold tech firms legally liable is hugely important in terms of deterrence value because the industry has resisted the introduction of criminal sanctions “very, very strongly”, Mr Burrows said.

While he acknowledged there were “bright spots” in terms of companies making positive policy or product changes to protect children, there is “no indication this is going to happen on their own terms”, he added.

“Child safety is not something that’s front of mind for these companies – this regulation is so necessary because the child abuse threat has reached the hugely worrying scale it has precisely because these issues are not front and centre of their corporate decision-making.

“Yes, criminal sanctions are very much a last resort, but we’re not proposing anything that goes above and beyond expectations of directors across other parts of the economy… this isn’t some particularly draconian new approach that’s been targeted at tech bosses, this is about consistency with UK law.”

A new report from the NSPCC found that more than three in four UK adults (78 per cent) support prosecuting senior managers of social media companies if they consistently fail to protect children from abuse, with 80 per cent saying they wanted to see the bosses of social media companies fined for consistent failures.

The findings demonstrate “clear support” for the responsibility to be borne not only by tech firms as corporate entities, but by the senior managers who are responsible for making the decisions whether a service will be safe for children, Mr Burrows said.
