Meta is apologizing for a mistake that caused graphic and violent videos to appear in the Reels feeds of some Instagram users.
"We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended," a Meta spokesperson said in a statement Thursday, without providing details about the nature of the issue.
The apology comes a day after multiple Instagram users reported being recommended content labeled "not safe for work," in the form of short videos showing gore and violence.
Meta says it aims to protect users from disturbing content and specifically removes especially graphic or violent material under the company's policies.
The error follows a policy change Meta announced in January to end its third-party fact-checking program and replace it with a community-driven system similar to the one used by Elon Musk's X social media platform.