The App That Brings the Dead Back to Life

November 21, 2025 4:00 pm

A new artificial intelligence app designed to let people “speak” with digital versions of their deceased loved ones has ignited a wave of alarm, ethical concern, and outright horror across the internet — with many calling it one of the most disturbing uses of AI to date.

The app, called 2wai (pronounced “two-way”), was launched by Canadian actor Calum Worthy, best known for Austin & Ally and American Vandal. Worthy, who co-founded the company alongside Hollywood producer Russell Geyser, promoted the technology in a viral post on X (formerly Twitter) in which he asked:
“What if the loved ones we’ve lost could be part of our future?”

But while the company insists the platform is a "living archive of humanity," the public response has been overwhelmingly negative, with many users immediately likening it to a real-life version of the Black Mirror episode Be Right Back.

A Mother Recreated in AI — and Shown Raising Her Grandchild

The promotional video that triggered the backlash — which has been viewed more than 22 million times — depicts a pregnant woman revealing her baby bump to an AI-generated avatar of her late mother.

The AI “grandmother” then appears throughout the child’s life:
• reading bedtime stories,
• chatting after school,
• and speaking to him as he becomes a father himself.

At the end of the ad, viewers see the real mother recording a brief video to “train” her digital avatar. A caption promises:
“With 2wai, three minutes can last forever.”

What the App Actually Does

According to the company, 2wai allows users to create “HoloAvatars” — animated, conversational AI replicas of real people, fictional characters or historical figures. The app has pre-made avatars, including:
• Darius (fitness trainer)
• Celeste (astrologer)
• Luca (chef)
• Shakespeare
• King Henry VIII

But its most controversial feature involves creating an avatar of a real person using only a three-minute video recording.

Despite public assumptions, 2wai’s marketing lead Alex Finden told Newsweek that the app only allows people to create avatars of themselves while alive — supposedly to avoid misuse. After death, the company says, the person’s family would manage the avatar.

However, 2wai has not explained how such a small amount of data could authentically replicate a person's personality, speech patterns or identity, and the company did not respond to additional requests for clarification.

“Objectively one of the most evil ideas imaginable”

Across X and Reddit, thousands of users condemned the app as unethical, exploitative, and deeply unsettling.

Some of the standout reactions include:

• “It’s an entire business model based on exploiting the mentally ill and bereaved. The creators know it and are ghouls.” – HoneybeeXYZ
• “This seems absolutely ripe for abuse.” – trainwreck42
• “It isn’t your loved one. It’s a soulless LLM.” – SIGMA920
• “We should let the dead lay dead. It’s the most human solution.” – Kraien
• “The ad was genuinely one of the most evil things I’ve ever seen… the kind of unsettling you get before an eldritch god appears.” – MyPants

Others joked darkly about the commercial possibilities:
“Nothing says compassion like turning someone’s grief into a business opportunity.”
“Hey so what if we don’t do subscription-model necromancy?”

And many were alarmed by parallels to Black Mirror:
“I’d love to know which kind of entrepreneur looks at a Black Mirror episode and says: ‘Yes, let’s build that.’”

Fears of Manipulation, Identity Theft and Exploitation

Some Redditors raised concerns that religious groups or predatory organisations could easily exploit such technology:

“I can definitely see religious organisations using this to their advantage — promising members access to their departed loved ones for years of worship.” – ShevanelFlip

Others pointed out that unstable or grieving users may struggle to distinguish real memories from AI hallucinations:
“The problem is some people will believe the LLM output, and it could lead to very bad outcomes for those already unstable.”

Several people used the phrase “cyber psychosis” to describe the potential mental health fallout.

Experts Warn of “Digital Hauntings”

Well before 2wai appeared, researchers from the University of Cambridge’s Leverhulme Centre for the Future of Intelligence warned that “deadbots” — AI reconstructions of the deceased — could cause profound psychological harm.

Their study outlined three major risks:

  1. Commercial manipulation
    Bots could covertly advertise products “from beyond the grave.”
  2. Emotional harm to children
    AI versions of dead parents could insist they are still “with you,” confusing a child’s understanding of death.
  3. Digital stalking
    AI avatars might repeatedly contact the living with reminders or suggestions — what researchers describe as being “stalked by the dead.”
Co-author Dr Tomasz Hollanek warned:
“These services run the risk of causing huge distress… particularly at an already difficult time.”

A Growing “Digital Afterlife” Industry

Worthy is not the first tech figure to attempt AI resurrection.
In recent years, AI has been used to recreate voices of late celebrities including Edith Piaf, James Dean and Burt Reynolds. Companies such as Project December and Hereafter also allow users to simulate conversations with the dead by providing personal memories or written data.

But the scale, marketing angle, and emotional framing of 2wai have pushed public discomfort to a new level.

The Conversation Continues

Despite the backlash, 2wai insists its intention is to preserve human stories for future generations — not replace the dead or interfere with the grieving process.

For now, the debate around digital resurrection, identity ownership and emotional safety continues to grow, with many calling for strict regulation before AI “afterlife” tools become mainstream.

Not Available in Australia

Although the app is currently in beta for iOS in some countries, it does not appear in the Australian App Store at this time.
