In a recent legal filing, Meta, the parent company of Facebook and Instagram, faces allegations that it profits from corporate ads placed next to content promoting child sexual exploitation. The lawsuit, brought by the New Mexico attorney general, claims that Meta fails to prevent adults from exploiting minors on its platforms.
It follows an investigation by the Guardian in April, which exposed the tech giant's struggle to stop its platforms from being used to traffic children for sexual exploitation.
Corporate advertisers, including Walmart and Match Group, have reportedly objected to their ads appearing alongside such disturbing content. The filing also presents evidence that child predators use Instagram to find victims, emphasizing that while Meta is capable of identifying such content, it fails to address it effectively or prevent its reappearance.
Why does it matter?
The New Mexico lawsuit adds to the mounting pressure on Meta to address reported harm to young users on Facebook and Instagram. The company is already facing legal actions from school districts and state attorneys general over youth mental health, child safety, and privacy. In a recent Senate subcommittee hearing, a Meta whistle-blower testified that CEO Mark Zuckerberg and top executives ignored long-standing warnings about potential harm to teens, particularly on Instagram.