
YouTuber arrested for AI-generated fake police videos

Posted February 03, 2026 09:09

Updated February 03, 2026 09:09


A YouTuber in his 30s has been arrested for using artificial intelligence to create and spread fabricated videos that falsely depict police using excessive force against civilians. The arrest reflects a tougher police response as AI-generated content increasingly triggers false reports and mistaken dispatches of police and fire services in South Korea and overseas.

On Feb. 2, the Gyeonggi Northern Provincial Police Agency said it arrested the YouTuber and referred him to prosecutors on charges including the distribution of false communications under the Framework Act on Telecommunications. Police said the suspect produced AI-generated videos designed to resemble police body camera footage and distributed them online. Investigators allege that he created and uploaded 54 fabricated videos starting in October last year.

According to police, the YouTuber used ChatGPT and other tools to generate prompts and produced the videos with Sora, an AI video generation program. He primarily uploaded short-form videos to social media platforms, where they drew a combined total of about 34 million views.

The videos included scenes showing police chasing a Chinese man armed with a weapon and subduing him with a Taser, as well as officers pushing a female streamer to the ground and arresting her after a street confrontation. Another video depicted participants at a rally opposing U.S. President Donald Trump shouting, “Corrupt police, step down,” at officers. The footage was crafted to closely mimic real police operations, complete with AI-generated voices and ambient background noise.

Authorities say a key concern is how easily such AI-generated videos impersonating state authority can be found across major platforms. As of Feb. 2, manipulated videos featuring police officers and firefighters were still accessible on leading social media sites without being taken down.

One video uploaded in October last year showed police scolding a young man holding the South Korean flag at a rally, telling him that waving the flag required permission and that he should instead carry the Chinese flag because a Chinese delegation was visiting that day. Many viewers mistook the footage for real, posting comments claiming the country was becoming communist.

Other provocative videos portrayed police firing water cannons at streamers or fire trucks ramming vehicles blocking roadways and forcing them aside.

Police warn that AI-manipulated videos pose risks beyond eroding public trust in law enforcement, including the misuse of limited policing and emergency response resources. Last year in South Korea, a family called emergency services after mistaking an AI-generated video of a homeless person breaking into a home for a real incident.

Similar cases have surfaced overseas. In Japan, an AI-generated fake police officer appeared in a video call and demanded money under the pretense of an investigation. In Australia, controversy erupted after an AI-generated video circulated false information about firearm permit requirements.

“Maliciously fabricated videos warrant punishment under charges such as obstruction of official duties or defamation, which carry heavier statutory penalties than violations of telecommunications law,” said Seo Jun-bae, a professor of public administration at the Korean National Police University.


By Cho Seung-yeon, reporter (cho@donga.com)