Facebook plans to roll out new policies and is testing two new tools to curb the sharing of content that victimises children. The move follows a study the company conducted on child sexual abuse material (CSAM) shared online, which found that a better understanding of users' intent could help prevent the revictimisation of children. The first tool is a pop-up shown to people who search for terms associated with child exploitation on the Facebook and Instagram apps. The second is a safety alert sent to people who share a viral meme featuring children or other CSAM, warning that their action violates the company's policies and could lead to legal consequences.
Facebook further announced that it is working to improve its CSAM detection tools and reporting mechanisms.