
Deceitful AI Videos Mislead Seniors on Important Health Issues

The script, visuals, and voiceover narration are generated by AI, and so are the fake studies brought up as evidence.

I had never received an email in Vietnamese before. My request for an interview had been written in English, and the channel I had reached out to made English-language videos, but the reply I received was a simple question written in Vietnamese: "What can we help each other develop?"

The man in the videos was white, but of course, he was never real in the first place.

There is a rapidly spreading plague of videos on YouTube aimed at older adults seeking medical advice. As their bodies age, they want to know the secrets to eating right, exercising effectively, and maybe regaining the energy they had between the sheets decades earlier, and a new industry has set up shop on Google's video platform to supply solutions, even if they are made up. Unlike the quacks of old, nothing is being sold here. Your attention is what puts money into their pockets.

Hallucinated medicine

It was after delivering a lecture to the McGill Community for Lifelong Learning, made up of senior citizens with a thirst for knowledge, that I was asked about a particular YouTube channel.

Every few days, the YouTube channel with 321,000 subscribers posts a new video with "science-backed health tips, surprising remedies, and powerful longevity secrets that most people over 60 have never been told." The channel's icon is a cartoonish grandpa with a finger on his smiling lips, swearing you to secrecy, and the thumbnails advertising each video are emblazoned with red and yellow banners, like crime scene tape. "Goodbye old age!" screams the latest. "Never take this after 65!"; "Just 1 cup before bed repairs your eyes overnight"; "99% of seniors don't know: huge mistake." One of the channel's trademarks is opening a video title with "Over 60?".

I began watching one of its videos, in which a top heart surgeon allegedly says to skip walking and do these five exercises instead, a video posted two months ago that has accumulated a whopping 3.3 million views. Tech-savvy consumers will recognize the anonymity of these types of videos, but for older adults who may be less familiar with how the sausage is made, it's worth pointing out a few red flags. There is an overreliance on stock footage, meaning video clips professionally shot for the purpose of being used by just about anyone. These shots lack character: they display a blandness, a sort of vanilla flavour that is immediately noticeable to the trained eye. The stock footage here is supplemented by simplistic cartoon drawings that jiggle left and right, and the voiceover narration sounds like a youngish man from North America. Except that I don't believe this is a recording.

The voiceover was likely generated by artificial intelligence (AI). It's very good, but it's a little monotonous and, once in a while, the emphasis is slightly off. In fact, this entire video appears to be constructed from parts generated via AI. In and of itself, that doesn't mean the information is bad. If the text was written by a human but the video was realized using AI, the advice could theoretically be good. So, I decided to look up the "groundbreaking" 2024 study out of Copenhagen that this entire video hangs on, and I found that it did not exist. There is a third issue in the 34th volume of the Scandinavian Journal of Medicine & Science in Sports, as listed in the video description, but the article itself does not appear in it. I checked on the journal website and I did a web search for the title of the article. It's fake.

Generative AI has been caught on multiple occasions hallucinating documents that do not exist. The MAHA Report released by the White House in May was riddled with hallucinated citations; meanwhile, librarians are now fielding requests for books that were never written, and judges are finding hallucinated cases in legal filings, as more people uncritically trust AI to answer their questions and do work on their behalf. Now, fake papers are being cited in YouTube videos aimed at seniors.

Senior Secrets is not the only channel delivering AI hallucinations to a hungry audience: I found dozens of similar channels. With its 17+ million total views, Senior Secrets is merely the tip of the AI iceberg, but unlike our world's actual icebergs, this one is quickly growing. It's actually more of a fatberg, those masses of wet wipes and grease that clog up our sewers, than something that benefits the world.

I picked four such channels and checked every scientific reference their most popular videos listed to see if they existed. Out of 65 references, five were real. I was unable to find the 60 others. As with the Copenhagen non-study, the journals, volumes, and issues were usually dead-on: the AI is simply inserting fake papers into real pages. Occasionally, a journal was made up. Often, the only authors listed were departments or institutes (like "Mayo Clinic Center for Aging" or "British Columbia University Exercise Science Department"), which is highly unusual and should serve as a red flag. People write papers, not departments.

A minority of these channels list an email address. I reached out to five of them, first with an interview request, then with a list of fake citations they had published in their videos. Only one wrote back. One channel, which has released an astounding 304 videos since it began posting content on August 8 of this year, has a Gmail address listed as a contact, and a "Nguyên Anh Phan" replied with the above-mentioned, business-friendly Vietnamese invitation to collaborate. I replied, but they stopped responding.

While the videos' aging audience may believe they are watching content made by English-speaking, North American healthcare professionals, what they are likely seeing is being manufactured halfway around the world by content farms.

Made in Asia

Detective work often hinges on a stupid mistake someone made once, because even if you diligently cover your tracks, you are likely to slip up at some point. I ran a dozen of these YouTube channels through an online tool that extracts all kinds of information from a channel's videos, including their geolocation. Most of the videos had no geotags; the person who uploaded the video did not indicate where it had been made. But here and there, I saw slip-ups. Most of the videos on the Senior Wellness channel had "Hoa Kỳ" as their location, which is Vietnamese for "United States," and one video from July was marked as "An Do," Vietnamese for "India." Multiple French-language channels named after fake doctors were geotagged "Pháp," Vietnamese for "France."

We can use this metadata to see where on a map a video was posted, but unfortunately, the resulting GPS coordinates can be misleading. A large number of AI videos masquerading as American have GPS coordinates that took me to a now-extinct mining camp roughly in the middle of an American state. The reason is that these coordinates are derived, via certain imperfect shortcuts, from the Internet protocol (or IP) address the video uploader is using. On top of this, the uploader is likely to be using a VPN (a virtual private network, which allows them to appear to be using the Internet from a different country, like the United States or France), and the GPS coordinates inferred from this IP address can be wrong. Indeed, there is a farm in Kansas that has been the subject of angry visits because bad actors on the Internet were erroneously traced to that specific location by people following their Internet footprint.

What we are witnessing with channels like Senior Secrets is the work of content farms, likely based in Vietnam. Sitting in front of dozens of computers are people with no formal training in science or medicine who write prompts for generative AI platforms like ChatGPT and Gemini. The AI creates scripts, animations, thumbnails, voiceover narration, and fake scientific papers, and these made-up elements are mashed together into a video that gets uploaded to YouTube.

Parallel to this, we see AI-generated videos mimicking real-life influencers, like Quebec's own Dr. Alain Vadeboncoeur, who writes for the magazine L'actualité, and selling a supplement, like CBD. If you're not paying close attention, it looks like him and sounds like him, and he is telling you exactly what to buy to improve your health.

Speaking of Dr. Vadeboncoeur, it only took me a few minutes to find French-language channels just like Senior Secrets, hosted by fictitious physicians, some old and wise, others young and fit; one of these channels has accumulated a stunning 5 million views. On this channel, a video's geotag took me to a dense urban area in Lahore, Pakistan, possibly another slip-up. A video on the channel of "Dr. Marc Belland" has the AI-generated host speaking English with French subtitles. One commenter remarked in French that this urologist is supposed to be in France, so why is his voice dubbed into English? The answer, of course, is that he is a computer creation.

YouTube has invited AI into its ecosystem. Last summer, the company made changes to its monetization policies, communicating that it would go after "mass-produced and repetitious content." But AI content, as long as it doesn't meet this standard of "mass production" and "repetitiveness," is allowed. The more people watch it, the more money is made in ad revenue for whoever owns the channel, because ads play before, during, and after many videos. Welcome to monetized AI slop.

Targeting older adults is particularly insidious. Not only are they less likely than teenagers to understand how good AI has become at mimicking us, but the visual and hearing impairments common at that age make detecting the subtle signs of AI even harder. Imagine watching a video like this on an old smartphone while dealing with hearing loss and macular degeneration, trying to spot whether the doctor you see onscreen has hair that is slightly too smooth or a voice that is a bit out of sync with his mouth. It's practically impossible.

And that is without considering newer AI models like Sora and Nano Banana Pro that generate videos often indistinguishable from reality. Even young, tech-savvy people are struggling to know if an apparently leaked photo from a movie set is genuine or AI-generated. They are reduced to scrutinizing the slightest pixel that looks wrong and imagining they have found the telltale sign of AI. The photos are likely to be fake, but the online sleuths themselves might also be hallucinating in their quest to catch a glitch in the matrix. In the case of movie set leaks, the damage is minimal; but when it comes to health advice, the harm can be significant.

One video from Senior Book claims that adding flaxseed to your daily oatmeal has the same benefit on your blood pressure as medication, without the nasty side effects, according to a University of Toronto study that does not appear to exist. (A meta-analysis of multiple trials on this question, including some from Canada, does not list a study that matches the criteria mentioned in the video, and its actual conclusion is that consuming whole flaxseeds may reduce blood pressure, but none of the trials pitted the seed against actual medication.) Another video from the same channel scares the elderly away from eating healthy vegetables because the AI says they increase the risk of a stroke, citing a Dr. Mei Tanaka from a paper that I could not find. And in a Senior Secrets video on juices that are claimed to heal your vision while you sleep, the calm AI voice casually mentions that "most doctors won't tell you this because, let's be honest, there's no money in natural remedies." Not only does this ignore the financial behemoth that is the wellness industry, but it teaches older adults to distrust their doctor.

We are facing a crisis right now where reality and hallucinated fantasy have become indistinguishable. The kinds of videos made for Senior Secrets will only improve, as their content farm manufacturers switch to more and more realistic video generators. Do not trust random videos for health information. Make sure the host is human and credentialed. Look up their medical license on the website of their medical college to see if they exist. Seek out their appearances on legitimate shows that prove they are real. Put more trust in in-person interactions than in what you see online. Ask health questions of your doctor, if you have one. Rely on professional orders and associations to find specialists who know the academic literature in their field and can give you evidence-based advice. Develop the healthy reflex, when watching a video from a source unknown to you, of asking yourself, "Could this be AI? Is this voice real?"

This technology will need to be better regulated before our collective grasp on reality slips. For now, we must all remain vigilant.

Take-home message:
- More and more videos offering medical advice to older adults on YouTube are entirely made using generative AI, from their script to the voiceover narration, and they cite scientific papers that do not exist but that superficially look real
- Many of these videos appear to be made by content farms in Vietnam
- Because they are not yet made with cutting-edge AI tools, these videos can still be recognized as fake: they present incorrect anatomy, unnatural-looking people, somewhat monotonous voices, and gibberish writing on screen
- Do not trust random videos online for health information
