The internet has served up a whole bunch of reactions and memes since Joe Biden announced on Sunday that he would not run for reelection and endorsed Vice President Kamala Harris as the Democratic nominee.
Harris supporters on social media have taken note of some of the vice president's funniest moments and quirkiest speeches over the past few years (the "you think you just fell out of a coconut tree" line, for example).
But some supporters of Republican presidential nominee Donald Trump have taken a different path: sharing manipulated media on social media that promotes speeches Kamala Harris never gave.
One video that has gone viral on TikTok and X shows Kamala Harris speaking to a crowd, but it's a deepfake: the footage has been edited and the audio replaced with what appears to be an AI-generated clone of her voice.
Media Matters for America reported on Monday about a deepfake video going viral on TikTok, where it had garnered millions of views. Shortly after the report, TikTok removed the post and the fake audio from its platform.
“TikTok has strict policies against harmful AI-generated content and misleading, edited media, and we proactively remove this content while partnering with fact-checkers to assess the accuracy of content on TikTok in real time,” a TikTok spokesperson said in a statement provided to Mashable.
Kamala Harris deepfake resurfaces after she becomes a presidential candidate
This isn't the first time a deepfake of Kamala Harris has circulated online. The deepfake video of Harris was first posted, and exposed as fake, last year.
The deepfake video uses real footage of Harris speaking to an audience at Howard University in 2023, but the footage has been digitally altered.
“Today is today, yesterday was yesterday,” Kamala Harris is heard slurring her words in the video. “Tomorrow is tomorrow’s today, so if you live for today, future today will become past today and tomorrow’s today.”
But Harris never said any of it.
The full video of the live event does not include the remarks heard in the viral deepfake. Experts have pointed out that there is digital noise around her mouth in the video, a sign of an attempt to edit the clip to match the fake audio, and that the fake audio lacks any background or crowd noise.
Still, more than a year after the Harris deepfake was exposed as fake, the video went viral again last week after a right-wing user uploaded it to Elon Musk's X platform. The post is still up on X, where it has been viewed more than 3.4 million times. Under its policies, X will not remove this type of content. However, X users have managed to add a user-generated Community Note to the post, informing others that the video is fake.
Unlike on X, AI-generated misinformation violates TikTok's platform rules. TikTok says it proactively removes 98% of content that violates its policies. Still, one viral upload of the Harris deepfake was viewed more than 4.1 million times before it was removed, according to a Media Matters report. TikTok says it is working to detect and remove other uploads of the Harris deepfake.
Deepfake videos have long been a concern for political campaigns. With AI-generated audio and video tools now freely available to the public, deepfakes are likely to become a bigger problem in 2024 than ever before.