Reporter Recalled Over Fake AI Video
20 Feb
Summary
- AI-generated images were used in a news report without proper labeling.
- A correspondent was recalled after using misleading video clips.
- The broadcaster stated the damage to its reputation was considerable.

A German public broadcaster has recalled a New York correspondent following the broadcast of AI-generated and outdated video clips during a news segment. The report, focusing on US immigration raids, included misleading footage that was not identified as AI-generated or accurately dated.
One of the clips featured a watermark from OpenAI's Sora platform. The broadcaster's editor-in-chief said the disregard for journalistic standards had caused considerable damage to the outlet's credibility. The correspondent's original report was accurate, but an updated version contained two misleading clips.
The incident follows earlier cases in which media outlets faced scrutiny over AI-generated content, with journalists having been caught using synthetic material before. It underscores the growing challenge of verifying digital content and maintaining journalistic integrity.