Thursday, September 19, 2024

Meta mandates that political advertisers clearly indicate the presence of deepfakes


Meta, the company formerly known as Facebook, has announced new rules for political advertisers on Facebook and Instagram. From January, any ad relating to politics, elections, or social issues must disclose any use of artificial intelligence (AI) or digital manipulation. This covers digitally altered images or video, changing what someone says in a video, altering footage of real events, and creating realistic-looking people who do not exist.

The policy will apply globally and will be enforced by a combination of human and AI fact-checkers. Users will be notified when an ad has been flagged as digitally altered, although Meta has not said exactly how that information will be presented. Minor edits such as cropping or color correction do not need to be declared unless they significantly affect the claims or assertions made in the ad.

Meta already has deepfake rules that apply to all users, not just advertisers: deepfake videos that could mislead viewers into believing the subject said something they did not are removed. Threads, Meta's other social media platform, will follow the same policies as Instagram. Advertisers who fail to disclose digital alterations may face penalties, and their ads may be rejected.

The move follows a similar policy announcement by Google, while TikTok already prohibits political advertising altogether. Deepfakes, which use AI to fabricate or manipulate images and video, are a growing concern in politics. Examples that have circulated on social media include a fake image of former US President Donald Trump being arrested and a deepfake video of Ukrainian President Volodymyr Zelensky appearing to discuss surrendering to Russia. There have also been cases where claims of deepfakes were proven false, such as a video of US President Joe Biden that was later authenticated. With upcoming elections in major democracies around the world, addressing deepfakes and digital manipulation in political advertising has become crucial.