
KNMG Warns Against Dangerous Medical Deepfakes


Amsterdam, Friday, 22 August 2025.
The Royal Dutch Medical Association (KNMG) has expressed serious concerns about the increasing misuse of AI-generated deepfake videos in the healthcare sector. These videos use the faces and voices of doctors without their consent to spread false health claims, posing significant risks to healthcare, public health, and trust in physicians. The KNMG calls on the government and online platforms to take joint action against this form of misinformation.

Risks of Medical Deepfakes

Medical deepfakes can have severe consequences for healthcare, public health, and trust in physicians. These AI-generated videos spread misinformation, whether deliberately or inadvertently, which can lead to incorrect medication use, misdiagnosis, and even serious health harm. The KNMG emphasises that such deepfakes can undermine the reputation and expertise of doctors, thereby eroding patient trust in the medical profession [1].

Collaboration Needed to Combat Deepfakes

To effectively combat the production and dissemination of medical deepfakes, the KNMG calls for joint action from the government, technology companies, healthcare professionals, and citizens. According to the KNMG, it is crucial to take measures against both the creators and distributors of these deepfakes, regardless of whether the deception is intentional or unintentional. The KNMG suggests preserving evidence, reporting to platforms, informing colleagues and employers, and taking legal steps [1][2].

Practical Tips for Recognition

Citizens are advised to remain critical and vigilant for potential deepfakes. Here are some practical tips for identifying fake news and medical deepfakes:

  1. Check the Source: Ensure the information comes from reliable and recognised sources.
  2. Look for Inconsistencies: Watch for unnatural movements, unclear sounds, or other technical errors in the video.
  3. Double-Check the Information: Compare the claims with other reliable sources and seek confirmation.
  4. Use Fact-Checking Tools: Various online tools are available to verify the authenticity of information.
  5. Stay Informed: Keep up-to-date with the latest developments in the medical world and the ways AI is being used and misused [1][2].

Implications for Media Literacy and Democracy

The dissemination of medical deepfakes has far-reaching implications for media literacy and democracy. It undermines trust in authorities and can lead to confusion and polarisation in society. To address these issues, it is essential to increase attention to media literacy and digital skills. Educational institutions, government agencies, and media organisations must collaborate to provide citizens with the tools and knowledge they need to remain critical and recognise reliable information [1][2][3].

Sources