The fake clip of Taylor Swift is doing the rounds on social media. Image by Facebook/AAP

Video of Taylor Swift calling California wildfires ‘divine retribution’ is fake

Kate Atkinson January 15, 2025
WHAT WAS CLAIMED

Taylor Swift said the LA wildfires are “divine retribution” for US support of Israel.

OUR VERDICT

False. The video is a deepfake.

AAP FACTCHECK – A fake video clip supposedly showing pop singer Taylor Swift calling the Los Angeles wildfires punishment for US support of Israel is circulating on social media.

The deepfake appears to show Swift commenting on the LA disaster during a televised interview, calling the fires “God’s punishment” and “divine retribution” for US support of Israel.

A news channel logo is visible and captions in Arabic overlay the video.

Social media users have shared the fake clip, which was created from real Tonight Show footage.

“For over a year and a half,” she appears to say, “Gaza has endured relentless bombing by missiles financed through the taxes paid by American citizens.

“This brutal assault has caused immense suffering for Gaza’s residents and left its infrastructure in ruins while the world remained largely silent.

“However, in just two days divine retribution struck the United States, as a natural disaster burned an area larger than Gaza itself …”

The clip is being shared widely on social media.

“Listen carefully to what the international artist Taylor Swift said: ‘America’s fires are the result of its actions in Gaza,’” one user captioned an X post.

The recent wildfires have caused catastrophic damage across the LA hills.

The footage is taken from a November 2021 interview on The Tonight Show Starring Jimmy Fallon (at the four-minute, 26-second mark), in which Swift discusses a recording of her song All Too Well.

Swift’s appearance, dress, jewellery and the set’s background match the deepfake video, but her speech is different.

Red flags indicating the clip is a deepfake include her inconsistent accent and her voice appearing out of sync with her mouth, which moves unnaturally.

The earliest version of the video AAP FactCheck could find was posted by TikTok account @roya24.ai on January 11, 2025.

The account has shared many AI videos in the past.

According to Google Translate, its bio reads: “Vision 24 channel. A Moroccan page with an Algerian flavour.”

The clip had been viewed 4.9 million times at the time of writing.

Location information shows it was posted from Riyadh, Saudi Arabia.

The Verdict

False – The claim is inaccurate.

AAP FactCheck is an accredited member of the International Fact-Checking Network. To keep up with our latest fact checks, follow us on Facebook, Twitter and Instagram.
