Facebook will launch a series of tools to stop revenge porn from spreading on its platforms.
The term refers to non-consensual pornography distributed online to shame, exploit or extort its victims.
On Wednesday, the company said it would apply photo-matching so that intimate, non-consensual images, once reported, cannot be uploaded again across Facebook's properties, including Messenger and Instagram.
Facebook said that once an image is reported, it is reviewed by the company's community operations team, and photo-matching is then applied to remove the photo or video and block future uploads of the same content.
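Facebook has not disclosed which matching technique it uses. As a hedged illustration only, the sketch below shows one simple form of perceptual photo-matching, an "average hash" (aHash), which can flag a re-upload of a reported image even after minor changes such as brightening or re-encoding. All function names and pixel values here are hypothetical, not Facebook's implementation.

```python
# Illustrative sketch: a basic perceptual "average hash" (aHash).
# This is NOT Facebook's algorithm, which has not been made public.

def average_hash(pixels):
    """Compute a 64-bit hash from an 8x8 grayscale image (values 0-255).

    Each bit records whether a pixel is brighter than the image's mean,
    so small uniform changes in brightness leave the hash unchanged.
    """
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests a near-duplicate."""
    return bin(h1 ^ h2).count("1")

# A reported image, and a slightly brightened re-upload of the same image.
reported = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
reupload = [[min(255, p + 3) for p in row] for row in reported]

h1 = average_hash(reported)
h2 = average_hash(reupload)
is_match = hamming_distance(h1, h2) <= 5  # below threshold: treat as the same image
```

In practice, a service would store the hashes of reported images and compare every new upload against that set, blocking uploads whose hash falls within the match threshold.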