Fact check: False

False: AI video made with Google model mistaken by internet users for real TV news

The eight-second clip displays the logo of Veo, Google’s video generation model, in the bottom right corner. It also contains SynthID, an invisible watermark embedded in images and videos created with Google’s AI tools.


[Note: AI technologies and policies are evolving rapidly. The detection methods and other related information discussed in this article may become outdated in the near future.]

In mid-June, an eight-second video allegedly showing a motorcyclist falling into a rain-filled pothole in the road went viral. The clip features a TV reporter in a yellow raincoat, supposedly on location to report on the heavy rain in Mexico City. After witnessing the fall, she says to the camera in English, “Oh. Another one is gone.”

The video has been shared on Threads, Facebook, and TikTok, receiving thousands of likes and shares, and many commenters questioned whether the footage was real.

Annie Lab can confirm that the video was created using Veo, Google’s video generation model.

Here are some indicators that helped us identify the Veo clip. First, a Veo watermark is visible in the bottom right corner of the video.

A screenshot of the video and Veo’s watermark

This watermark alone does not confirm that the video was generated with Veo, as such a logo can also be added to authentic footage in post-production.

But a reverse image search result revealed in the “About this image” tab (archived here) that the video was indeed “Made with Google AI.”

The search result also indicated when Google first indexed the image, suggesting it “recently appeared online.”

Screenshots of “About this image” on Google that identified the video to be generated with Google’s AI product.

Relying on “About this image” is not foolproof, however. Some of our search attempts failed to flag the AI-generated content.

“About this image” in some cases did not identify the AI-generated content.

Meanwhile, through a TikTok handle spotted in one version of the video, we were able to identify a user who is likely the original creator. The user (@javvidoblev) describes themselves in Spanish as an “AI Content Producer & Creator.”

The account posted the pothole video on June 13, the earliest instance Annie Lab has found, with a caption mentioning the use of AI and Veo 3.

“Pothole videos” could be a new internet trend. A similar “news report” clip, also created with Veo, claiming to show Guatemala City, has recently surfaced on Instagram.

AI detection

Google says its search provides information about AI-generated content (AIGC) through two methods (archived here) — C2PA and SynthID.

C2PA, or Coalition for Content Provenance and Authenticity, is a joint project that builds a technical standard for metadata, often referred to as Content Credentials, to show the content’s origin, creation process, and subsequent modifications.
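For readers curious about what this provenance metadata looks like at the file level, the sketch below is a minimal heuristic, not an official validator: it walks a JPEG’s segment structure and reports APP11 segments that appear to carry a JUMBF box with a “c2pa” label, which is where Content Credentials manifests are normally stored. Real verification should rely on the coalition’s own open-source tooling (such as c2patool), and the file path used here is a placeholder.

```python
"""
Rough heuristic check for C2PA (Content Credentials) metadata in a JPEG.

NOT a full C2PA validator: it only scans the JPEG segment structure for
APP11 segments that look like JUMBF/C2PA data. Platforms often strip
this metadata on upload, so absence is not proof of anything.
"""
import struct
import sys

APP11 = 0xFFEB  # JPEG marker commonly used to carry JUMBF (and thus C2PA) data


def find_c2pa_segments(path: str) -> list[int]:
    """Return byte offsets of APP11 segments that look like C2PA data."""
    hits = []
    with open(path, "rb") as f:
        data = f.read()

    if not data.startswith(b"\xff\xd8"):      # SOI marker missing: not a JPEG
        raise ValueError("not a JPEG file")

    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:                   # lost sync with segment structure
            break
        marker = (data[i] << 8) | data[i + 1]
        if marker == 0xFFDA:                  # SOS: compressed image data follows
            break
        length = struct.unpack(">H", data[i + 2:i + 4])[0]
        payload = data[i + 4:i + 2 + length]
        if marker == APP11 and (b"c2pa" in payload or b"jumb" in payload):
            hits.append(i)
        i += 2 + length
    return hits


if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "screenshot.jpg"  # placeholder name
    offsets = find_c2pa_segments(path)
    if offsets:
        print(f"Possible C2PA/JUMBF segments at byte offsets: {offsets}")
    else:
        print("No C2PA-style metadata found (it may have been stripped on upload).")
```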

SynthID, developed by Google DeepMind, “embeds digital watermarks directly into AI-generated images, audio, text or video” across “Google’s generative AI consumer products,” according to its project page (archived here).

These watermarks are invisible to human eyes but detectable by Google’s detector tools.
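Google has not published how SynthID works, but the general idea behind invisible, pixel-level watermarking can be illustrated with a toy spread-spectrum scheme: a pseudorandom pattern derived from a secret key is added to the image at an amplitude too faint to see, and the detector checks how strongly an image correlates with that same pattern. The Python sketch below (using NumPy) is purely illustrative and bears no relation to SynthID’s actual algorithm; the image, key, and threshold are made-up stand-ins.

```python
"""
Toy illustration of invisible (spread-spectrum) image watermarking.
This is NOT SynthID; it only demonstrates the general principle of
embedding a key-derived pseudorandom pattern at an imperceptible
amplitude and detecting it later by correlation.
"""
import numpy as np


def _pattern(shape, key: int) -> np.ndarray:
    """Key-derived pseudorandom +/-1 pattern of the given shape."""
    rng = np.random.default_rng(key)
    return rng.choice([-1.0, 1.0], size=shape)


def embed(image: np.ndarray, key: int, strength: float = 2.0) -> np.ndarray:
    """Add a faint pattern (a couple of gray levels) that the eye cannot see."""
    marked = image.astype(np.float64) + strength * _pattern(image.shape, key)
    return np.clip(marked, 0, 255).astype(np.uint8)


def detect(image: np.ndarray, key: int, threshold: float = 1.0) -> bool:
    """Correlate the image with the key's pattern; a high score means watermarked."""
    pattern = _pattern(image.shape, key)
    centered = image.astype(np.float64) - image.mean()
    score = float((centered * pattern).mean())
    return score > threshold


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    photo = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)  # stand-in image
    marked = embed(photo, key=42)
    print("watermarked image detected:", detect(marked, key=42))   # expected: True
    print("original image detected:  ", detect(photo, key=42))     # expected: False
```

The point of the toy example is simply that detection requires the detector to know the key, which is why only Google’s own tools can reliably surface its watermarks.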

Annie Lab is among the early testers for the SynthID Detector tool, which identified invisible, pixel-level watermarks in a screen capture of the video in question.

A screenshot of the SynthID Detector result.

Despite significant improvements in the quality of AI-generated images and videos, generative models still occasionally leave telltale traces of machine generation.

In the case of this weather report video, while the movements of the characters and the TV reporter’s speech are perhaps convincing to most viewers, the logos on the microphone and raincoat, supposedly identifying the news channel and media group, do not appear to be authentic.

A screenshot of logos seen in the video.