Meta’s Instagram platform will begin testing new tools to combat so-called sextortion, a form of online blackmail involving intimate photos.
According to Ukrinform, this situation was reported by the BBC.
The tools include a feature that blurs nude images sent in private messages (DMs). It will be enabled by default for users under 18, and adult users will be able to turn it on as well. Instagram also emphasizes that the feature, where available to adults, will not automatically report nude images in DMs to the platform.
Pop-ups will also be tested to direct potential victims of sexual blackmail to a support service.
The nudity protection feature uses artificial intelligence to detect nudity in DMs and gives users the choice of whether to view the image. In a statement, Instagram said the feature was designed “not only to protect people from unwanted nudity in their DMs, but also to protect people from scammers who may send nude images to trick people into sending images of themselves in return.”
When the system detects that a user has sent a nude image, the user will be given security tips, including a reminder that recipients can take a screenshot or forward the image without the sender’s knowledge.
The platform will also look for signs that an account may be engaged in sextortion scams and take measures to make it harder for that account to interact with other users.
Instagram is also testing pop-up messages for people who may have interacted with accounts involved in sextortion, directing them to expert advice and support.
The announcement comes on the same day that another Meta platform, WhatsApp, lowered its minimum registration age from 16 to 13 in the UK and Europe.
As reported by Ukrinform, Instagram and Threads will no longer recommend political content to social network users.