AIJB

Talking to digital echoes: comfort, deception and the economy behind 'deathbots'

2025-11-10

Amsterdam, Monday, 10 November 2025.
AI bots that mimic deceased people are gaining popularity in the Netherlands; they use emails, messages and social media to hold text conversations and animate avatars. A study published last Wednesday in Memory, Mind & Media showed how researchers created digital doppelgängers from videos and message trails and found both comfort and troubling errors: one bot responded cheerfully during a conversation about death. Experts warn that such ‘deathbots’ can blur memory with illusion, undermine privacy and identity, and displace the grieving process. At the same time, psychologists are investigating whether controlled use can support grief processing. Companies are building commercial models around these services, creating a new ‘political economy of death’: data continues to generate value after life. For readers this raises urgent choices about digital legacy, ethics and regulation: who decides about the data, how long may digital echoes persist, and what risks do they pose to emotional recovery? The article discusses urgent practical steps and calls for debate and policy now.

Digital echoes grow in the Netherlands — what do recent studies show?

The rise of AI bots that imitate deceased loved ones is widely observed in the Netherlands: both text chatbots and avatars use emails, messages, voice recordings and social media to build ‘digital doppelgängers’ [1][2]. A study published last Wednesday in Memory, Mind & Media demonstrated how researchers used videos, message trails and voice notes to create such digital doppelgängers and reported that participants experienced both comfort and disturbing errors, including a bot using inappropriately upbeat language during a conversation about dying [1][2].

How these deathbots technically use data

The services index and process digital traces, from text messages to audio and video fragments, and combine archival-style storage (personal stories, childhood memories) with generative models that simulate ongoing conversations; some platforms additionally offer avatars or speech-synthesised replicas on a subscription basis [1][2].
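To make the pipeline above concrete, here is a deliberately minimal, purely illustrative sketch: digital traces are indexed into an archive, and a reply function retrieves the best-matching fragment. This keyword-overlap retrieval is a toy stand-in for the generative models the article describes; all names and data here are hypothetical, not from any actual deathbot service.

```python
# Toy "digital echo" sketch: index digital traces, then answer a
# prompt with the archived fragment that best overlaps it.
# Real services would use a generative model instead of retrieval.
import re
from collections import Counter


def tokens(text):
    """Lowercase bag-of-words representation of a text fragment."""
    return Counter(re.findall(r"[a-z']+", text.lower()))


def index_traces(traces):
    """Index digital traces (messages, notes) as (text, tokens) pairs."""
    return [(text, tokens(text)) for text in traces]


def reply(archive, prompt):
    """Return the archived fragment sharing the most words with the prompt."""
    prompt_words = tokens(prompt)
    best_text, _ = max(
        archive,
        key=lambda entry: sum((entry[1] & prompt_words).values()),
    )
    return best_text


# Hypothetical message trail standing in for a real archive.
archive = index_traces([
    "I always loved our walks along the canal.",
    "Remember to water the plants on Sunday.",
])
print(reply(archive, "do you remember the canal walks"))
```

Even this crude retrieval shows the failure mode the researchers observed: the system can only recombine stored fragments, so a prompt with no good match still gets a confident, possibly tone-deaf answer.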

A new market: the political economy of death

Companies are building commercial models around digital afterlife services: subscriptions, freemium tiers and collaborations with insurers or care providers are reported in the sector. This allows data from the deceased to continue yielding economic returns, creating a new ‘political economy of death’, as researchers and philosophers have previously noted [1][3].

Ethics, identity and the blurring of memory

Media theorists and researchers warn that these technologies can confuse memory with persuasive but artificial illusions: they have roots in spiritist traditions but gain new persuasive power and commercial viability through AI, raising questions about identity, representation and the potential displacement of natural grieving processes [1][2].

Privacy, rights and digital legacy: where do the Dutch stand?

In the Netherlands digital legacy is often left unarranged: studies and information projects report that only around 15% of Dutch people have arranged their digital legacy, and they advise concrete measures such as password managers (e.g. Bitwarden), notarial directives and digital vaults, measures that remain relevant as long as services continue to process digital data after a person’s death [3]. Legally it is noted that the GDPR does not apply to the personal data of deceased persons (Recital 27), which leaves room for national rules and debate about who decides over data after death [3].

Examples of errors and psychological research into effectiveness

Empirical tests show mixed experiences: users sometimes found interactions comforting, but bots also produced unnatural or inappropriate language (e.g. upbeat responses around death topics), and researchers conclude that generative reconstructions fall short of capturing the living complexity, ambiguity and contradiction that characterise real people [1][2]. At the same time psychologists and behavioural scientists are investigating whether controlled use of such tools can support grief processing, but evidence for structural therapeutic benefit remains limited and is the subject of ongoing research [1][2][5].

Practical risks and concrete policy questions

Important risks include misrepresentation, commercial exploitation of sensitive data, unclear authority over the lifespan of digital echoes and potential breaches of third-party privacy in archived communication. The policy questions on the table: who obtains decision-making authority over these data, how long may digital replicas persist, and what safeguards are required for users’ emotional safety [1][3].

Concrete steps for readers and policymakers

For citizens there are already practical recommendations: arrange your digital legacy (e.g. with a password manager or a notarial directive) and be aware of commercial services and their revenue models. Policymakers are encouraged to consider national rules and certification of services that act as ‘representatives of the deceased’, given the limited protection the GDPR offers deceased persons and the rise of commercial afterlife providers [1][3].

Sources