Key Highlights
- Sexual content was shown alongside advertisements for well-known US firms.
- The salacious content was delivered to the followers of teen influencers.
Instagram's Reels feature keeps the platform constantly in the news, but this time the attention is for something troubling. According to a report, Instagram's Reels algorithm serves streams of sexual content to accounts that follow only children. Worse, the platform displays ads for reputable brands alongside that content, which is a matter of serious concern.
The Wall Street Journal tested Instagram's algorithm by setting up accounts that followed only children's accounts, including young gymnasts, cheerleaders, and other teen and preteen influencers. The followed accounts reportedly featured kids and contained no explicit content.
As mentioned earlier, this sexual content was shown alongside advertisements for well-known US firms, including Disney, Walmart, Pizza Hut, Bumble, Match Group, and Hims.
The Canadian Centre for Child Protection is said to have obtained similar results in its tests.
The WSJ also revealed that Instagram went on to suggest overtly pornographic videos and "risqué footage of children" to its test accounts. According to the report, Walmart and Pizza Hut declined to comment, while Bumble, Match Group, Hims, and Disney pulled their ads from Meta in protest of their ads being displayed alongside such content.
The WSJ also reported that Instagram's tendency to aggregate child-sexualization content was a known internal issue even before Reels launched. Tech firms typically argue that tests like these don't accurately reflect real user experience, yet the same sources suggested that fixing the problem would require overhauling the algorithms that push related content to users. However, according to internal documents seen by the WSJ, Meta may have made it impossible for its safety staff to implement such sweeping changes.