Tech giants join forces to fight AI election interference


A group of 20 tech companies said on Friday (Feb 16) that they had agreed to a joint effort to prevent deceptive AI content from interfering in elections around the world this year.

Generative AI tools have proliferated since ChatGPT kicked off the craze in 2022. These tools can now easily produce text and images that are uncannily similar to human-created work. That has raised fears the tools will be misused to fool voters in a year in which more than half the world’s population will take part in elections.

Companies like Microsoft, Adobe and OpenAI are among the signatories of the tech accord, which was announced at the Munich Security Conference. Other signatories include social media giants Meta, TikTok and X (formerly Twitter).


Under the agreement, the companies have committed to jointly develop tools to detect misleading AI-generated images, video and audio. They have also agreed to run public awareness campaigns to educate voters so they are better positioned to recognise deceptive content.

Technology to identify AI-generated content or certify its origin could include watermarking or embedding metadata, the companies said.
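To illustrate the metadata-certification idea in its simplest form, here is a minimal Python sketch: a provenance record is bound to the content bytes with a keyed hash, so anyone holding the key can detect tampering or a false origin claim. This is purely illustrative; the key, field names and record format are assumptions, and real provenance schemes (such as C2PA-style signed manifests) are considerably more elaborate.

```python
import hashlib
import hmac

# Assumed signing key; in practice this would be a managed secret or
# replaced by public-key signatures. Purely illustrative.
SECRET_KEY = b"publisher-signing-key"

def attach_provenance(content: bytes, generator: str) -> dict:
    """Return a metadata record binding `content` to its stated origin."""
    tag = hmac.new(SECRET_KEY, content, hashlib.sha256).hexdigest()
    return {"generator": generator, "sha256_hmac": tag}

def verify_provenance(content: bytes, record: dict) -> bool:
    """Recompute the tag over the content and compare in constant time."""
    expected = hmac.new(SECRET_KEY, content, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["sha256_hmac"])

image_bytes = b"...image payload..."
record = attach_provenance(image_bytes, "example-image-model")
print(verify_provenance(image_bytes, record))         # True: intact
print(verify_provenance(image_bytes + b"x", record))  # False: tampered
```

A shared, interoperable version of this check across platforms is exactly the kind of commitment the accord gestures at, as opposed to each company verifying only its own records.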

The accord did not specify a timeline for meeting the commitments or how each company would implement them.

“I think the utility of this (accord) is the breadth of the companies signing up to it,” said Nick Clegg, president of global affairs at Meta Platforms.

“It’s all good and well if individual platforms develop new policies of detection, provenance, labeling, watermarking and so on, but unless there is a wider commitment to do so in a shared interoperable way, we’re going to be stuck with a hodgepodge of different commitments,” Clegg said.

(With inputs from agencies)


