EU sets deadline for Meta to answer questions on child abuse material on Instagram


European Union (EU) regulators on Friday (Dec 1) pulled up Meta, asking the company to provide more details on the measures it has taken to tackle self-generated child sexual abuse material (SG-CSAM) circulating on its highly popular photo and video sharing app Instagram.

The EU has given the company a December 22 deadline to submit its answers. Failure to do so could result in a formal investigation under the bloc's new online content rules.

Notably, under the EU's Digital Services Act (DSA), big tech companies are required to do more to police illegal and harmful content on their platforms.

“The Commission is requesting Meta to provide additional information on the measures it has taken to comply with its obligations to assess risks and take effective mitigation measures linked to the protection of minors, including regarding the circulation of SG-CSAM on Instagram,” the European Commission said in a statement on its latest query.

“Information is also requested about Instagram’s recommender system and amplification of potentially harmful content.”

It was in October that the bloc first sent Meta a request for information regarding the measures taken to counter the spread of terrorist and violent content. A month later, a second request was sent, seeking information on measures to protect minors.

The EU's push for information from Meta comes after a Wall Street Journal report earlier this year claimed that Instagram was struggling to remove child sexual abuse material.

The report added that Instagram's algorithms were connecting a network of accounts used to make, buy and trade underage sex content.

A follow-up report by the publication a few days ago stated that Instagram had failed to rectify the problem, as paedophiles continued to roam the digital alleys of the popular social media platform.

"Five months later, tests conducted by the Journal as well as by the Canadian Centre for Child Protection show that Meta's recommendation systems still promote such content," the report said.

A year ago, Meta was fined nearly half a billion dollars after the company was found to have violated the bloc's data protection rules for minors. It was the largest General Data Protection Regulation (GDPR) penalty that the EU had imposed on a social media giant to date.

(With inputs from agencies)
