Wounded Bondi survivor again being targeted with AI-generated images

Matthew Elmas December 18, 2025
The image has clearly been edited from a photo shared by the Embassy of Israel in Australia. Image by AAP/Facebook/X

WHAT WAS CLAIMED

A photo shows that a Bondi Beach survivor's head injury is fake.

OUR VERDICT

False. The photo has been edited using artificial intelligence.

AAP FACTCHECK - A Bondi Beach survivor who's recovering from a bullet wound to his head is again being targeted with false claims based on AI-generated images.

An image purporting to show Arsen Ostrovsky without his head bandaged is being shared on social media, implying his injury is fake.

However, the image shows clear signs of digital manipulation and has been identified as AI-generated by Google's detection tools.

Sajid Akram, 50, and his 24-year-old son Naveed opened fire on a Jewish event at a park near Bondi Beach on December 14, 2025, killing at least 15 people and injuring dozens.

Sajid was shot dead by police, while Naveed remains under police guard in hospital, where he has been charged with 59 offences, including 15 counts of murder.

A wave of false claims about the shooters, victims and the political reactions to the terrorist attack has spread across social media.

An X post features a purported image of Mr Ostrovsky in hospital without any visible wound or bandage on his head alongside Israel's Ambassador to Australia, Amir Maimon.

An X post featuring the manipulated photo has been viewed nearly a million times. (AAP/X)

The image was posted alongside a similar photo of Mr Ostrovsky with a bandage on his head and Mr Maimon.

"When he didn't know he was being photographed, the bandage was suddenly gone — no wound, no marks, nothing," the X post reads.

The first image has also been shared on Facebook in posts casting doubt on Mr Ostrovsky's injuries.

A range of false information has been shared about Arsen Ostrovsky, who was wounded in the shooting. (AAP/Facebook)

"Wasn't he bleeding from his head? How come there is no bandage anymore?" the post reads. 

"Maybe has some super amazing stitches not visible to the human eye?"

An Instagram post sharing the image claims it's evidence that he is a crisis actor.

"CRISIS ACTOR ALL HEALED IN A DAY AND TAKES HIS 50 PIECES OF SILVER IN DONUTS INSTEAD," the post reads.

Several posts have claimed the manipulated photo is evidence that Mr Ostrovsky is a crisis actor. (AAP/Instagram)

However, the image of Mr Ostrovsky without the head bandage is an AI-edited version of a real photo.

The original photo, shared in an Israeli embassy X post on December 16, shows Mr Ostrovsky sitting in a hospital bed with Mr Maimon standing beside him.

The bandage is clearly visible and can also be seen in another image in the same X post.

Arsen Ostrovsky underwent surgery after a bullet grazed his head in the Bondi Beach shooting. (AAP/X)

The edited image shows both men in identical positions, indicating it's a cropped version of the original.

However, visible errors indicate it has been manipulated.

First, a coil in the cord visible behind Mr Ostrovsky's head in the genuine photo has disappeared from the edited version.

In the edited image, Mr Ostrovsky's face has a far smoother complexion, with less facial hair and fewer blemishes, a hallmark of AI generation.

When AAP FactCheck uploaded the original photo to the Google Gemini chatbot and asked it to remove the bandage, it produced an image strikingly similar to the edited version shared on social media.

The altered image has several hallmarks of AI, including smoothed facial features. (AAP/X/Facebook)

When the image was uploaded to Google Lens, the 'About this image' tool also identified it as "Made with Google AI".

This is because it contains a "SynthID watermark", which Google embeds into images generated using its AI tools.

The digital watermarks cannot be detected by the human eye but can be identified in the content's pixels, according to Google.

Mr Ostrovsky has been repeatedly targeted with misinformation involving AI-generated images since the shooting.

AAP FactCheck has debunked similar false claims about Mr Ostrovsky's injuries.

It's unclear who edited the image of Mr Ostrovsky in hospital, or who generated the earlier purported images of him.

AAP FactCheck is an accredited member of the International Fact-Checking Network. To keep up with our latest fact checks, follow us on Facebook, Instagram, Threads, X, BlueSky, TikTok and YouTube.
