Social media “recommends graphic content to 13-year-olds”, online safety report says
Social media accounts linked to children were “directly targeted” with graphic content within 24 hours of their creation, according to a new report on online safety.
It says accounts created for the study, based on real children as young as 13, received content on eating disorders, self-harm and sexualised images.
The study, by child safety group the 5Rights Foundation and the Children’s Commissioner for England, Dame Rachel de Souza, described the findings as “alarming and upsetting” and called for mandatory rules on how online services are designed.
An age-appropriate design code comes into effect in September, with the Information Commissioner’s Office (ICO) able to impose fines and other penalties on services that do not build in new standards protecting the data of users under the age of 18.
But 5Rights said more needs to be done to build broader child safety into online platforms right from the design process.
It says that despite knowing the ages of younger users, social media platforms allowed them to be contacted, unsolicited, by adults, and recommended potentially harmful content to them.
Facebook, Instagram and TikTok were the platforms named in the report, which was produced with research firm Revealing Reality.
All three companies have been contacted for comment.
“The results of this research are alarming and heartbreaking. But just as risks are designed into the system, they can be designed out,” said 5Rights chair Baroness Kidron.
“It’s time to adopt mandatory design standards for all services that impact or interact with children, to ensure their safety and well-being in the digital world.
“In all other contexts, we provide agreed protections for children. A nightclub cannot serve a pint to a child, a retailer cannot sell them a knife, a cinema cannot allow them to watch an 18-rated film, a parent cannot deny them an education, and a pharmaceutical company cannot give them an adult dose of medication.
“These protections do not apply only when harm is proven, but in anticipation of the risks associated with their age and evolving capacities.
“These protections are rooted in our legal system, our treaty obligations and our culture. Everywhere except in the digital world.
She added that the study revealed a “profound recklessness and disregard for children” that was “embedded” in the features, products and services of the digital world.
Dame Rachel said: “This research highlights the huge range of risks children currently face online.
“We do not allow children to access services and content inappropriate for them, such as pornography, in the offline world.
“They shouldn’t be able to access it in the online world either. I look forward to working with government, parents, online platforms and organisations like 5Rights to create a child-friendly online world.”
Online safety campaigner Ian Russell, who set up a foundation in memory of his daughter Molly after she took her own life having viewed self-harm and suicide content online, said the research showed “how algorithmic amplification actively connects children to harmful digital content, as unfortunately I know only too well, sometimes with tragic consequences”.
“In our digital wilderness, young people need marked routes to explore, allowing them to roam while staying safe. Routes to trusted areas of support, especially with regard to mental health, should be better signposted so that help can be provided whenever it is needed,” he said.
“All of us – governments, businesses and individuals alike – need to act quickly to fix the digital world.
“We need to find ways to eliminate online harm and cultivate goodness, if our digital world is to flourish as it should.
“Above all, we need to put safety first, especially for children when they are online. We must work to prevent digital wolves from seeking out the vulnerable and destroying young lives.”