Google API helping organisations fight child abuse online

New Delhi, Feb 25: In its effort to fight child sexual abuse online, Google helped its partners classify more than 2 billion images in 2020, enabling them to identify the small fraction of violative content faster and with more precision, and to tackle illegal content referred to as child sexual abuse material (CSAM).

In 2018, Google developed and launched the Content Safety API.

“Using AI classifiers we built for our own products, the API helps organisations classify and prioritise the most likely CSAM content for review,” Google said in a statement on Wednesday.
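In broad terms, such an API lets a partner score incoming imagery and surface the highest-risk items to human reviewers first. The sketch below illustrates only that prioritisation workflow under stated assumptions; the classify_image() call and the priority field are hypothetical stand-ins, not the actual request or response format of Google's Content Safety API.

```python
# Hypothetical sketch: prioritising a human review queue by classifier score.
# classify_image() and the "priority" field are illustrative assumptions,
# not Google's actual API schema.

from dataclasses import dataclass
from typing import List


@dataclass
class ReviewItem:
    image_id: str
    priority: float  # higher score = more likely to contain violative content


def classify_image(image_id: str) -> float:
    """Placeholder for a call to a content-classification service."""
    # In practice this would submit the image to the classifier and return
    # its score; here it simply returns a dummy value.
    return 0.0


def build_review_queue(image_ids: List[str]) -> List[ReviewItem]:
    # Score every image, then sort so the most likely CSAM content
    # reaches human reviewers first.
    items = [ReviewItem(i, classify_image(i)) for i in image_ids]
    return sorted(items, key=lambda item: item.priority, reverse=True)
```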

Today, the API is being used by NGOs like SaferNet Brazil and companies including Facebook and Yubo.

CSAI Match is a technology developed by YouTube engineers to identify re-uploads of previously identified child sexual abuse imagery in videos.

“We use both hash-matching software like CSAI Match and machine learning classifiers that can identify never-before-seen CSAM imagery,” Google said.
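The general idea behind combining the two is that previously confirmed material is fingerprinted, new uploads are fingerprinted the same way, and a match flags a re-upload without a reviewer having to see it again, while content that does not match can be passed to a classifier. The sketch below is a minimal illustration of that pipeline: the SHA-256 fingerprint and the classifier stub are assumptions for demonstration only, not CSAI Match or Google's classifiers, which rely on far more robust techniques.

```python
# Illustrative sketch: hash matching against known fingerprints, with a
# machine learning classifier as the fallback for never-before-seen content.

import hashlib

# Fingerprints of previously identified abusive content (illustrative only).
KNOWN_HASHES: set = set()


def fingerprint(data: bytes) -> str:
    # A cryptographic hash only catches exact re-uploads; production systems
    # use perceptual/video hashes that survive re-encoding and cropping.
    return hashlib.sha256(data).hexdigest()


def classifier_score(data: bytes) -> float:
    """Hypothetical ML classifier returning a probability of violative content."""
    return 0.0


def triage_upload(data: bytes, threshold: float = 0.9) -> str:
    if fingerprint(data) in KNOWN_HASHES:
        return "known_match"       # re-upload of previously identified material
    if classifier_score(data) >= threshold:
        return "flag_for_review"   # possible new material, send to human reviewers
    return "no_action"
```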

The tech giant recently launched a new transparency report on Google’s ‘Efforts to Combat Online Child Sexual Abuse Material’.

“We encourage interested organisations to apply to use CSAI Match or the Content Safety API,” the company said.

Google said it is working across industry and with leading child safety organisations such as the WeProtect Global Alliance, Thorn and the Global Partnership to End Violence Against Children.

“And we continue to work to empower and support organisations that are creating real and lasting change for children,” it added.

Disclaimer: This story is auto-generated from IANS service.