Fake AI experts mislead older YouTube viewers

Posted February 10, 2026, 08:33

Updated February 10, 2026, 08:34


A 78-year-old man, identified only as A, is an avid viewer of YouTube channels focused on senior health. He finds them helpful because a gray-haired, veteran doctor appears to zero in on his concerns with uncanny precision. What resonates most is the gentle tone. “You worry your grandchildren might avoid you because of an odor, don’t you?” the doctor asks. “You feel depressed, but you hesitate to show it because you do not want to burden your children, right?” Perhaps because the speaker seems to be a peer, the doctor appears to grasp instinctively the emotions that surface as mind and body weaken with age.

A woman in her mid-70s, identified as B, keeps YouTube channels featuring senior-centered stories playing throughout the day. Titles such as “My daughter-in-law slapped me in front of my son” and “The woman crying at my husband’s funeral turned out to be my friend” draw her in. Videos lasting one to two hours are more gripping than many television dramas. Their appeal is heightened by claims that they are based on real events, along with the convenience of rewinding and replaying.

According to the 2025 Statistics on Older Persons, people aged 65 and older have sharply increased their video viewing time in recent years. As a result, the number of channels targeting senior audiences has grown rapidly. The problem is that many of them are fake, crafted to appear authentic. Virtual figures created using generative artificial intelligence introduce themselves as a “doctor of Korean medicine with 46 years of experience” or an “endocrinologist with 35 years of clinical practice.” They emphasize that their videos are based on academic papers and statistical data, yet rarely provide clear or verifiable sources.

The gender and age of these AI experts are typically tailored to a channel’s theme. The most common persona is a retired physician portrayed as a former university professor. Channels focused on senior skincare sometimes feature a female doctor in her 50s. Older viewers, often anxious due to emotional isolation and health concerns, are easily drawn in. Even foreign AI doctors who speak fluent Korean attract long strings of comments reading, “Thank you, doctor.”

Stories featured on senior-focused channels are also highly likely to be generated by AI. A producer specializing in long-form content for older audiences said scripts from popular channels are frequently copied, with only the names of characters and locations changed before new stories are produced using AI. Even so, many channels fail to disclose that the stories are fictional. Some add phrases such as “Send us your story by email,” further reinforcing the impression that the content is authentic.

Older adults are particularly vulnerable to AI-generated content. Sophisticated virtual figures reduce emotional distance by using empathetic language to address illnesses commonly experienced in old age. White lab coats and references to clinical experience lend these personas a sense of authority. Once algorithms begin repeatedly recommending similar channels, viewers can quickly become immersed in a closed world of AI experts.

YouTube requires AI-generated content to be labeled as “altered or synthetic content,” and disclosure obligations for AI-produced material are being introduced under the AI Basic Act. However, few older viewers pay close attention to such notices. Even when comments warn that the content is not genuine, most viewers appear largely unfazed.

The government has said it will regulate advertisements that feature virtual experts. YouTube channels that generate revenue through views and advertising should likewise refrain from deceiving audiences with fabricated content. As courses on creating AI-generated videos proliferate, what older adults need most urgently is not instruction in production techniques but education on how to identify trustworthy information and on how algorithms deliver personalized content.