According to a recent BBC report, an anonymous campaign using artificial intelligence to impersonate Sudan's former leader, Omar al-Bashir, has gained widespread attention on TikTok, adding to online confusion in a country already gripped by civil war. The campaign, operating under the name 'The Voice of Sudan', has been posting alleged 'leaked recordings' of Bashir since late August, although the authenticity of the voice has been questioned.
Bashir remains out of public view, reportedly due to illness, and faces accusations of war crimes. Experts believe voice conversion software has been used to mimic his voice, and some of the recordings were found to have been taken from the broadcasts of a Sudanese political commentator based in the United States.
The motive behind the campaign is unclear, but it may aim to deceive people into believing Bashir is involved in the ongoing war, or to promote a particular political viewpoint. TikTok took down the account after being contacted by the BBC, citing violations of its guidelines on posting false content and using synthetic media.
Why does it matter?
This campaign highlights how easily fake content can spread on social media, and AI experts warn that it raises the risk of disinformation and unrest. Detecting audio-based disinformation is particularly difficult, and the technology for spotting synthetic audio is still in its early stages. Similar cases are appearing worldwide: in Slovakia, deepfake videos featuring AI-generated voices of politicians circulated ahead of the parliamentary elections. As countries grapple with the issue, US senators from both parties have joined forces to back legislation aimed at preventing the use of AI to produce misleading political ads in federal elections.