Social media algorithms exposing children to violent pornographic content, report shows | Computer Weekly

By Computer Weekly
August 26, 2025


Social media algorithms are pushing unsolicited pornographic content into children’s feeds, according to a report by the Children’s Commissioner.

The data was collected prior to the implementation of the Online Safety Act, but provides a snapshot of the types of harmful content being seen and accessed by children online, and of how that content affects them.

According to the report, 70% of respondents, aged between 16 and 21, had seen pornography. On average, respondents reported first seeing this type of content at the age of 13, and more than a quarter had seen it by the age of 11.

Among respondents exposed to pornographic content online, eight of the top 10 sources of that content were social media or social networking sites.

According to the report, X (formerly known as Twitter) was the platform where children most commonly encountered pornography, cited by 45% of respondents, making children more likely to find pornography there than on dedicated pornographic websites.

Other social media platforms popular among children also appear in the survey with what the report describes as “concerning frequency”. These include: Snapchat (29%), Instagram (23%), TikTok (22%) and YouTube (15%).

Strikingly, 59% reported seeing pornography online by accident, up from 38% in 2023. Mark Jones, a partner at law firm Payne Hicks Beach, said that “children are viewing harmful content due to algorithms used by platforms, rather than actively searching it out themselves”.

Harmful content

Jones, who works in the firm’s dispute resolution department and represents both individuals and corporations, added: “Under the Online Safety Act and the child safety duties, platforms are required to stop their algorithms from recommending harmful content. This, coupled with age assurance measures, aims to protect children in the online world. The algorithms should filter out harmful content from reaching children in the first place.”

The report actively supports the introduction of Ofcom’s new age verification measures and the implementation of the Children’s Code, which requires social media sites to make changes to prevent children from seeing this type of harmful content.

“The Children’s Code came into force from 25 July 2025,” said Jones. “It will be interesting to see what changes, if any, are seen in this area. In particular, whether platforms are effectively moderating content and no longer using toxic algorithms to filter out harmful content being accessed by children.”

Additionally, the report emphasises that the majority of pornographic content seen by respondents depicted acts that are illegal under existing pornography laws. For example, 58% of respondents had seen porn depicting strangulation when they were under the age of 18. Furthermore, 44% reported seeing a depiction of rape.

The report emphasises that this has a detrimental effect on children’s interactions with one another, affecting their expectations around sex and body image. 

A spokesperson for the Children’s Commissioner told Computer Weekly that, based on direct self-reporting from these children, the link between exposure to pornography and harm to children’s behaviour was very significant.

“Children have told the Children’s Commissioner they expect to be experiencing violence in a relationship, or they expect their first interactions of a sexual nature to be like what they’re seeing in pornography, because that’s what they’re exposed to,” they said.

Depiction of women

Particularly concerning is the depiction of women, who are shown on the receiving end of sexually aggressive acts more often than men, which the report finds fosters violent perceptions of sex that target women.

The spokesperson said the commission found through its surveys and research that, particularly for girls who had seen violent pornography, their own understanding of consent became clouded.

“Girls who have seen pornography were far more likely to agree with the statement that girls who say ‘no’ can be persuaded to have sex,” they said. “So, they might say ‘no’ to start with, but then are now expecting to be persuaded otherwise. The idea of consent that has been enshrined in our education system through sex education, and relationships education, over the last 10 to 15 years seems to be on rocky ground.”

While social media is the first point of contact for many of these children, the Children’s Commissioner reiterated that algorithms are not inherently harmful, but rather that tech companies are not optimising their systems to keep this content away from children.

“Tech companies know who their young users are,” said the spokesperson. “They do have the ability to recognise and monitor user activity. There must be a greater focus and less ambiguity about who you direct that algorithmic content to if it’s a young user. It should simply either be stopped before it even gets to their feed, or there has to be a much more stringent way of keeping them off the site, and we are yet to see that with sites like X.”

The report recommends that online pornography be made to meet the same content requirements as offline pornography, so that the depiction of non-fatal strangulation is outlawed.

It also calls on the government to explore options to prevent children from using virtual private networks (VPNs) to bypass the Online Safety Act’s regulations, and for further funding for schools to implement the new Relationships, Health and Sex Education (RHSE) curriculum, including a recruitment drive for specialist RHSE teachers. “This has to be a benchmark against the success of the Online Safety Act. We will repeat this survey again next year to see if there is any significant change in what children are able to access,” the report added.

The communications regulator, Ofcom, has not directly responded to the report’s findings, but has previously stated that “tech firms must introduce age checks to prevent children from accessing porn, self-harm, suicide and eating disorder content” and that it would “expect to launch any investigations into individual services” that fail to comply.

There have been several calls to implement stricter regulation of social media algorithms, which have reportedly fuelled misinformation and other harmful content.

The Commons Science, Innovation and Technology Committee has previously attributed the spread of misinformation to algorithms that prioritise advertising and engagement-based business models to generate revenue, without implementing tools to deprioritise harmful content.

Recommendation algorithms are built on machine learning models and can develop biases, prioritising shocking content that generates clicks. The same technology can be repurposed to reinforce positive social outcomes and prevent harmful content from being shown to users.
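The report and the committee describe these ranking systems only in general terms. As a purely illustrative sketch, with hypothetical field names and weights rather than any platform’s actual system, the snippet below shows how an engagement-based scorer of the kind criticised above can be inverted to down-rank, or exclude entirely, content flagged as harmful when the viewer is a child account.

```python
# Illustrative sketch only: a toy feed ranker showing how engagement-driven
# scoring can be inverted to deprioritise or exclude flagged content for
# child accounts. All names and weights are hypothetical.
from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    clicks: int
    shares: int
    harmful_flag: bool  # e.g. set by an upstream content-classification model


def rank_feed(posts: list[Post], viewer_is_minor: bool) -> list[Post]:
    def score(post: Post) -> float:
        # Engagement-based score: the behaviour criticised in the report.
        engagement = post.clicks + 3 * post.shares
        if post.harmful_flag:
            if viewer_is_minor:
                return float("-inf")  # exclude outright for child accounts
            engagement *= 0.1         # heavily down-rank for adult accounts
        return engagement

    ranked = sorted(posts, key=score, reverse=True)
    # Drop anything excluded for minors rather than merely sinking it.
    return [p for p in ranked if score(p) != float("-inf")]


if __name__ == "__main__":
    feed = [
        Post("a", clicks=900, shares=50, harmful_flag=True),
        Post("b", clicks=120, shares=10, harmful_flag=False),
        Post("c", clicks=40, shares=2, harmful_flag=False),
    ]
    print([p.post_id for p in rank_feed(feed, viewer_is_minor=True)])  # ['b', 'c']
```

The point of the sketch is simply that the filtering decision sits in the same ranking step that currently maximises engagement, so removing harmful content from children’s feeds is an optimisation choice rather than a technical impossibility.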


