How people are being tricked by deepfake doctor videos

What a pain!

Some of the UK’s most famous TV doctors are increasingly seeing their names and likenesses co-opted to sell scam products to unsuspecting social media users, new research warns.

The phenomenon is called deepfaking — using artificial intelligence to create sophisticated digital fabrications of real people. In these faux videos, a person’s head may be superimposed onto another person’s body, or their voice may be replicated in a convincing way.


Dr. Rangan Chatterjee has also been the subject of deepfake videos. Ken McKay/ITV/Shutterstock

The research, published as a feature article Wednesday in the BMJ, finds that general practitioners Hilary Jones and Rangan Chatterjee, along with the late health guru Michael Mosley, who died last month, are having their likenesses used to promote products without their consent.

In Jones’ case, that means unwittingly shilling blood pressure and diabetes cure-alls and hemp gummies.

Jones, 71, who is known for his work on “Good Morning Britain,” among other TV shows, said he employs a social media specialist to scour the web for deepfake videos that misrepresent his views and to try to get them taken down.

“There’s been a big increase in this kind of activity,” Jones shared. “Even if they’re taken down, they just pop up the next day under a different name.”

It can be tricky to discern which videos are forged. Recent research finds that 27% to 50% of people cannot distinguish authentic videos about scientific subjects from deepfakes.

It may be even more difficult if the video features a trusted medical professional who has long appeared in the media.


Before his death last month, Dr. Michael Mosley had his likeness co-opted for deepfakes. AP

John Cormack, a retired UK doctor, worked with the BMJ to try to get a sense of how widespread the deepfake doctor phenomenon is across social media.

“The bottom line is, it’s much cheaper to spend your cash on making videos than it is on doing research and coming up with new products and getting them to market in the conventional way,” Cormack said in the article. “They seem to have found a way of printing money.”

Cormack said the platforms that host the content — such as Facebook, Instagram, X, YouTube and TikTok — should be held accountable for the computer-generated videos.

A spokesperson for Meta, which owns and operates Facebook and Instagram, told the BMJ that it will investigate the examples highlighted in the research.

“We don’t permit content that intentionally deceives or seeks to defraud others, and we’re constantly working to improve detection and enforcement,” the spokesperson said. “We encourage anyone who sees content that might violate our policies to report it so we can investigate and act.”

What to do if you suspect a deepfake video

  • Look carefully at the content or listen to the audio to make sure your suspicions are justified
  • Contact the person shown endorsing the product to see if the video, image or audio is legitimate
  • Question its veracity with a comment on the post
  • Use the platform’s built-in reporting tools to share your concerns
  • Report the user or account that shared the post