Meta has announced a new policy to help people understand when a social issue, election, or political advertisement on Facebook or Instagram has been digitally created or altered, including through the use of Artificial Intelligence (AI).
The policy would go into effect in the new year, across the world, according to Nick Clegg, Meta’s president for global affairs, who announced it in a series of tweets on Wednesday.
“In the new year, advertisers who run ads about social issues, elections & politics with Meta will have to disclose if image or sound has been created or altered digitally, including with AI, to show real people doing or saying things they have not done or said,” Clegg stated.
The new policy would require advertisers to make clear if their political ads contained an image, video or audio that looked real but was digitally created or altered so that it appeared someone said something they did not say, showed a person or event that was not real, or passed itself off as a depiction of a real event that did not actually happen.
The disclosure requirement would not apply, however, if the content was digitally created or altered in ways that “are inconsequential or immaterial to the claim, assertion, or issue raised in the ad,” he said.
Meta gave examples such as using technology to adjust the size of an image or sharpen it, but noted that even such changes could still be problematic if they altered the claim in the ad.
Meta, however, stated that such digitally created or altered videos, images and audio would still be allowed on its platforms.
Instead, Meta would add information on the ad when an advertiser disclosed in the advertising flow that the content was digitally created or altered.
It indicated that it would give further information about that process later.
It did not say how advertisers would flag such ads, what would be shown to users when they were flagged, or how those who failed to flag them would be punished.
Meta stated only that it would remove any ads that violated its policies, whether they were created by artificial intelligence or by real people.
If its fact checkers determined that a piece of content had been “altered”, Meta would prevent it from being run as an ad.
WARNING! All rights reserved. This material, and other digital content on this website, may not be reproduced, published, broadcast, rewritten or redistributed in whole or in part without prior express permission from ZAMBIA MONITOR.