The Australian Federal Police and Monash University have asked members of the public to share photos of themselves as children to help train an artificial intelligence system to detect abuse in images.
The researchers are collecting images of people under the age of 17 in safe situations. They say the photos must not contain nudity, even in relatively innocuous scenarios such as a baby taking a bath.
The images will be compiled into a dataset used to train an AI model to distinguish between children in normal, safe settings and children in abusive or unsafe situations. The researchers believe the software will help law enforcement agencies quickly identify child sexual abuse material among the thousands of photos under review, sparing officers much of the manual verification.
According to Australian Federal Police officer Janis Dalins, artificial intelligence has the potential to identify victims and uncover illegal material previously unknown to officers.
“In 2021, the Australian Centre to Counter Child Exploitation received more than 33,000 reports of online child abuse, and each report may contain a large number of images and videos,” he said.
Dalins added that reviewing such material is a time-consuming process, and that manual analysis can take a psychological toll on investigators.
The researchers say that crowdsourcing the photos will allow them to build an unbiased, ethically sourced dataset.
“Getting pictures from the Internet is problematic because there’s no way to know if the children in these photos actually consented to their pictures being uploaded or used for research,” said Campbell Wilson, co-director of AiLECS and associate professor at Monash University.
The My Pictures Matter crowdsourcing campaign is open to adults who consent to their photos being used. Contributors are also required to provide an email address.
Project lead and lab researcher Nina Lewis stressed that no other identifying information will be collected. “Email addresses are stored in a separate database,” she added.
“The images used by the researchers cannot reveal any personal information about the people depicted,” she said.
Contributors will be provided with updates at every stage of the project and can request the removal of their images from the dataset if they wish.
In November 2021, Australian authorities banned Clearview AI from collecting citizens’ data.
In August 2021, Apple announced plans to roll out a tool to scan user photos for child sexual abuse material on iOS, iPadOS, and macOS. The company later postponed the feature’s launch indefinitely.