Meta Platforms, the owner of Facebook and Instagram, has announced specific measures to curb misleading content and deepfakes ahead of the federal elections in Australia, according to an official blog post.
Meta is partnering with major news agencies, including Agence France-Presse (AFP) and the Australian Associated Press (AAP), to fact-check content across its platforms. The agencies will review content for Meta and help identify false information, according to the company.
“When content is debunked by fact-checkers, we attach warning labels to the content and reduce its distribution in Feed and Explore so it is less likely to be seen,” Cheryl Seeto, Meta’s Head of Policy in Australia, says.
Although Meta ended its fact-checking initiatives in the United States (US) in January, the company apparently intends to continue the program in Australia, at least until the elections have been held.
Meta has stated it will remove deepfake content that violates the company’s policies, or label it as “altered” and rank the labelled content lower in feeds to reduce its distribution. Users who share AI-generated content will be asked to disclose that they have done so.
“For content that doesn’t violate our policies, we still believe it’s important for people to know when photorealistic content they’re seeing has been created using AI,” says Seeto.
The Australian federal election is scheduled for May this year.