April 27, 2024

Instagram Reels Offers ‘Risque Footage of Children’ Next to Ads for Major Companies

Instagram Reels, which was launched to compete with China’s TikTok, reportedly offers “risqué footage of children as well as overtly sexual adult videos” to adult users who follow children, with some of the content even being placed next to advertisements for major companies.

In one example, Mark Zuckerberg’s Meta-owned platform showed an ad for the dating app Bumble between a video of someone caressing a “life-size latex doll” and another video featuring an underage girl clad in an outfit that exposed her midriff, according to a report by the Wall Street Journal.

Another example involved a Pizza Hut commercial appearing next to a video of a man lying in bed with an alleged 10-year-old girl, and a Walmart ad appearing next to a video of a woman exposing her groin.

Instagram head Adam Mosseri (Steve Jennings/Getty)

Mark Zuckerberg discusses Instagram (AFP/Getty)

To test how major advertisers appear on Reels, the Journal set up test Instagram accounts and followed “only young gymnasts, cheerleaders and other teen and preteen influencers active on the platform.”

The followers of such young people’s accounts often include large numbers of adult men, the outlet reported, and many of the accounts that followed those children had also demonstrated interest in sexual content related to both children and adults.

Additionally, the Instagram Reels being displayed to the test accounts became more concerning after WSJ reporters began following adult users who were also following the content featuring children.

The algorithm reportedly appeared to be showcasing “a mix of adult pornography and child-sexualizing material, such as a video of a clothed girl caressing her torso and another of a child pantomiming a sex act.”

“We don’t want this kind of content on our platforms and brands don’t want their ads to appear next to it,” a Meta spokesperson told WSJ. “We continue to invest aggressively to stop it — and report every quarter on the prevalence of such content, which remains very low.”

“Our systems are effective at reducing harmful content and we’ve invested billions in safety, security and brand suitability solutions,” the spokesperson added. “We tested Reels for nearly a year before releasing it widely — with a robust set of safety controls and measures.”

The spokesperson went on to argue that the Wall Street Journal reporters had created “a manufactured experience” via their test accounts, one that does not reflect the experience of most users. The company even insisted that the prevalence of such content is small.

But current and former Meta employees told WSJ that staffers at the company have “known internally” for years that their algorithms have a tendency to display child sex content to users — even before the company launched Reels in 2020.

Most companies reportedly sign contracts specifying that their advertisements cannot be placed next to sexually charged or explicit content.

“We have no desire to pay Meta to market our brands to predators or place our ads anywhere near this content,” a spokesperson for Match Group, the parent company of Tinder, told the outlet.

Match Group has reportedly pulled all of its ads from platforms owned by Meta in response to such revelations involving children.

Meanwhile, a Bumble spokesperson told WSJ that the dating app “would never intentionally advertise adjacent to inappropriate content,” and has also ceased advertising on Meta platforms.

As Breitbart News reported, Zuckerberg’s Meta, the parent company of Instagram and Facebook, is facing a lawsuit from 33 states alleging that the company knowingly allowed and pursued users under the age of 13 on its platforms.

You can follow Alana Mastrangelo on Facebook and X/Twitter at @ARmastrangelo, and on Instagram.
