
Thursday, March 12, 2026

AI Avatars

I recently discovered a long-imagined science fiction trope coming true -- interactive computer avatars of real people. While StoryFile may not be the only company offering this service, it happens to be the one I noticed in the news:

StoryFile

Their slogan: "Bringing History to Life Through Interactive Conversations." As the tagline on the main page explains it, the technology "transforms interviews into AI-powered, life-size conversations for museums and institutions."

The interview consists of a "cinematic, professionally filmed session capturing hundreds of thoughtful responses." In the second step, "StoryFile intelligently links each answer to natural conversational pathways." Users of the finished product "ask questions and receive instant, authentic video responses" in "real-time interaction." Thus the experiences of survivors from World War II, the Holocaust, etc. are preserved and made publicly available.

This process strikes me as legitimate and useful for museums and other repositories of historical resources. The computerized "conversations" are educational and don't claim to be anything they aren't. But what about StoryFile's "Legacies" program? You can arrange to have yourself interviewed at length to create a computerized, interactive avatar, thus enabling your heirs to access your memories after your death. How would this program essentially differ from, for instance, a tape recording or a final video message? In my opinion, the interactive component makes a qualitative difference. The grieving survivors might feel as if they're talking to the actual person.

Might there be a danger of some users being unable to accept the deaths of loved ones because in a sense they can still interact with them? Could mourners become paralyzed by this illusion, trapped in one stage of grief and unwilling or unable to move on?

The next phase of development for this technology, a common SF trope but currently impossible in the real world, would be to upload the actual consciousness of the deceased into a mainframe or, nowadays I suppose, into the Cloud. Would this process constitute survival after death or merely an electronic replica -- or a sort of ghost?

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, February 13, 2025

AI-Generated Persons

I've read more than one near-future story about people dealing with bereavement through interaction with computer simulations of the deceased. I had no idea they already existed. This company creates simulacra of clients by means of information, photos, and videos provided by the individual in life for that purpose:

You, Only Virtual

If you upload this kind of data to generate a Versona -- a virtual person -- your surviving loved ones can carry on conversations with "you." The Versona claims to personalize and "replicate" the unique conversational style the survivor had with a particular deceased loved one. It promises "an authentic connection that brings true comfort and familiarity" even after death. Supposedly, these AI "persons" will "continue to grow alongside you. . . . learn, evolve, and remember." Still more audaciously, the Versona purports to embody "the essence of your relationship, brought to life."

Are "griefbots" or "deadbots" a productive way of dealing with grief, though? This article explores potential problems:

Will We Live on in the Form of Virtual Avatars?

The more realistic the program becomes, the more likely users will forget they're talking to a machine, not the actual person. Even if the user doesn't get snared by this illusion, interacting with the software might delay or derail the normal progression of grief. One critic quoted in the article warns, "This could lengthen the mourning process and perpetuate the lack and suffering, because the object is there. It blurs the relationship with the machine. And you can't turn them off, because they represent someone you love." Another potential hazard arises from the AI tendency to "hallucinate." Until that can be reliably prevented, a Versona might say things jarringly unlike the real person, causing the living user painful cognitive and emotional dissonance.

Here's an article on the ethical and legal problems associated with digital avatars, such as issues around actors continuing to "appear" in films after their deaths:

Digital Avatars and Our Refusal to Die

Suppose it eventually becomes possible to upload a person's literal mind and self into a computer before death, as imagined in many SF works? Would that process avoid the problems and hazards inherent in an AI-created reincarnation? Or would the flesh-and-blood user still run the risk of being tempted to live in a fantasy world of virtual rather than real "relationship"? Would the uploaded personality be the "real" person in a meaningful sense?

There's also a commercially available program to let you talk with your future self:

Meet Future You

The website describes it as "an interactive experience for cultivating self-reflection and long-term thinking." They claim to use AI to create a "realistic conversation partner based on information you provide," which "lets you chat with a personalized version of your future self." This program seems less problematic than conversing with the posthumous avatar of a dead parent, spouse, etc. Users wouldn't be likely to get sucked into unrealistic expectations of Future You as long as they remember the software can't make actual predictions. It's "designed to aid exploration of aspirations" by producing "a vivid and realistic picture of what your future life could be like." It sounds like a fun way to explore options if the user doesn't mistake it for an infallible oracle. And it might be entertaining just to see and hear what the AI imagines you'll look and sound like years or decades down the road.

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.