Generative AI has found a host of uses since it exploded into the public consciousness with the release of ChatGPT in late 2022. From drafting essays for students to helping customer service agents field calls to writing code for software engineers, it seems there’s no end to the ways these tools can make life a little easier. But some entrepreneurs are trying to put AI to use in a very different way: making the afterlife a little easier. Not for us—that’s a whole other level of metaphysics and technology we have yet to reach—but for those we leave behind.
Losing a loved one sets off a process of grieving that’s different for everyone. But one component of grief is universal: we miss the person who’s gone and wish we could talk to them again, spend time with them, hear their voice and their laughter. A company called MindBank Ai is building a product that would let us do just that.
“When you lose someone you have moments where you want to look through photos or videos of them in a bittersweet way,” Emil Jimenez, the company’s founder, told Singularity Hub in an interview. “Now you can have those moments, but have a conversation with the person too.”
MindBank plans to create digital twins of its users, replicating their personalities, ways of thinking and speaking, and other characteristics as closely as possible. These twins will interact with our loved ones after we’re gone—and teach us about ourselves while we’re still here.
“What we’ve created is a personalized layer of AI that sits on top of a generalized model,” Jimenez said. “I like to call it AI-enhanced humanity.” That’s actually the title of his book, which comes out in October. It starts two million years ago with Homo habilis, the first hominid known to make tools, and one of its themes is how tools change our culture. “AI is just another tool that’s going to change our culture,” he said.
The Voice of a Database
One day when Jimenez’s daughter was four years old, she was playing games on an iPad when Siri, Apple’s AI assistant, popped up. The child started asking the AI questions, and within minutes had formed a bond with it. “I started thinking,” Jimenez said. “Siri is just an interface to a database. How can I become that database, so that my little girl can always ask me a question and get a response?”
He wondered what the world will look like 40 years from now, when his daughter is his age. What might the future hold for her, and for all the other kids out there who are interacting with technology, screens, and AI in a way no previous generation ever has?
While we don’t know quite what that future will look like, we can bet personal data will play a significant role. That data—and who can access it, and how—will likely be regulated and controlled differently than it is today. We’ll be able to opt in or out of sharing our data in various ways, and will get products and services (and ads, of course) that are correspondingly personalized.
“The digital twin is a data lake for you to connect to other services,” Jimenez said. “That same database of your stories and your life can also be art, or music, or medical records. It’s digitizing humanity, and being able to use all the tools available to offer you better services.”
The concept of a personal digital twin essentially boils down to representing you through your data, including data you provide directly to the MindBank app. To get a sense of your personality and how you think, the app asks you a series of questions; the more questions you answer, the better it gets to know you.
“You answer questions and it records your voice,” Jimenez said. “All it takes is one minute a day. The more data the better, but it’s more important that you do a consistent frequency at low doses.”
A Personalized Turing Test
So what are the technologies that will make personal digital twins possible? A key one is natural language processing, which underlies ChatGPT and other large language models.
Natural language processing (NLP) is, in short, the AI field of understanding and mimicking human speech. NLP algorithms are trained on massive datasets of text—in ChatGPT’s case, billions of pages from the internet—and they use that data to figure out the relationships between words. Text or speech generators with a transformer architecture (that’s what the “T” in GPT stands for) model the relationships between all the words in a sentence at once, weighing how likely it is that a given word will be preceded or followed by another word, and how much that likelihood changes based on the other words in the sentence.
By finding the relationships and patterns between words in their training data, these algorithms learn from their own inferences, in what’s called unsupervised (or, more precisely, self-supervised) machine learning.
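To make that concrete, here’s a minimal sketch—not MindBank’s system, just an illustration using the open-source Hugging Face transformers library and the small GPT-2 model—of a transformer assigning probabilities to the word that comes next, given all the words before it:

```python
# Illustrative only: a small pretrained transformer predicting the next word.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "When you lose someone, you want to hear their"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch, sequence_length, vocab_size)

# Probability distribution over the vocabulary for the next token.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, k=5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(int(token_id)).strip():>10}  {prob:.3f}")
```

Scaled up to billions of parameters and trained on vastly more text, this same next-word machinery is what lets models like ChatGPT hold a conversation—and it’s the generalized layer a personalized twin would sit on top of.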
To create convincing digital twins, MindBank will have to take things a step further. Rather than relying on a fixed set of rules to respond to questions and interact with others, the model would ideally evolve over time. That would mean having some level of understanding of a user’s experiences and how those experiences have shaped them, and being able to fold that understanding into the way the twin interacts. This is obviously more complicated than reflecting back the patterns in how someone talks or writes.
How easy is it to “learn” a person, anyway? While we all have our own unique speech patterns, go-to phrases, and ingrained habits and ways of thinking, we’re also dynamic beings who aren’t always predictable.
Think about the last time you asked friends or relatives for advice. You probably had an idea what kind of response you’d get from different people: friend X tells it like it is, so you only go to her if you’re up for some tough love. Friend Y tends to be gentle and say what he knows you want to hear. Every once in a while, though, those friends surprise you. Maybe the tough-love friend is going through a life challenge of her own that’s making her see things differently. Or the gentle friend is having a bad day and feeling less sympathetic than usual.
Humans are complex creatures, constantly influenced by our environment and interactions. To the extent that our thoughts and behavior fall into patterns, MindBank is aiming to capture them all. “We want the longitudinal data of emotions and personality,” Jimenez said. “Today you might have a good day, tomorrow a bad day. We want the ups and downs.”
Know Thyself
Understanding those ups and downs won’t just help your descendants know you better in the future—it can help you know yourself better in the present. Alongside being a trove of data on your personality and stories for others, Jimenez wants MindBank to be a tool for deeper self-awareness. “We can help people move from healthcare to self-care by empowering them with their health data, specifically their mental health data,” he said.
He likened MindBank’s digital twins to a highly advanced form of journaling. If you write down your thoughts and feelings while navigating a difficult experience, going back to read them could give you insights into why you made a certain decision, give you more awareness of how you process things, or remind you to behave differently in the future. “The product we have today is literally a dashboard of the mind,” he said. “Imagine Google Analytics of your mind, where you can see data of how you are emotionally and personality-wise, how you evolve, and you can track your progression over time.”
Personal digital twins will be like living versions of a journal you can talk to and learn from. Through techniques like identifying and labeling words—i.e., as positive or negative, intrinsic or extrinsic—MindBank will produce a model of your personality, traits, and emotions, and mirror it back to you.
“There’s a feedback loop to it,” Jimenez said. “We have the ability to scale that, make it data-driven, and give people interesting insights.”
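As a rough illustration of that idea—a toy sketch, not MindBank’s actual pipeline—here’s how labeling words in short daily entries and tallying them over time could start to produce that kind of emotional dashboard:

```python
# Toy example: label words in daily entries as positive or negative with a
# tiny hand-made lexicon, then track the balance over time.
from collections import Counter

# Hypothetical mini-lexicon; a real system would use trained models instead.
POSITIVE = {"grateful", "happy", "calm", "proud", "excited"}
NEGATIVE = {"anxious", "tired", "angry", "sad", "overwhelmed"}

def score_entry(text: str) -> dict:
    words = [w.strip(".,!?").lower() for w in text.split()]
    counts = Counter(
        "positive" if w in POSITIVE else "negative"
        for w in words
        if w in POSITIVE or w in NEGATIVE
    )
    return {"positive": counts["positive"], "negative": counts["negative"]}

# One-minute entries for three consecutive days (invented examples).
journal = {
    "day 1": "Felt anxious about the deadline but proud of the draft.",
    "day 2": "Tired and a bit overwhelmed today.",
    "day 3": "Calm morning, grateful for a quiet walk, happy overall.",
}

for day, entry in journal.items():
    print(day, score_entry(entry))
```

A production system would rely on trained language models rather than a hand-made word list, but the principle—turning daily language into longitudinal signals you can chart—is the same.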
Next-Gen Grieving?
Whether you’re using them to understand yourself better or to keep your memory alive after you’re gone, MindBank Ai and other platforms like it open up some intriguing possibilities. But there’s a definite creepiness factor, and some big questions worth considering.
Would it be healthy to keep talking to a loved one after they’re deceased? Would the grieving process become easier—or harder? Could bad actors find nefarious uses for digital twins? And social media has arguably made people more self-absorbed and less authentic; would digital twins amplify that effect?
In terms of grieving a loved one, Jimenez thinks MindBank will help people get through hard times. “Reliving is part of the therapeutic cycle,” he said. “It’s part of coming to grips with what happened and moving on. This will be another tool that can help us get past that stage.”
We already use texts, emails, letters, and voicemails to ‘engage’ with and remember the deceased while grieving. How different would it be to talk to them in the form of a digital twin? For some people, the opportunity could provide relief, keeping them from getting bogged down in sorrow or being unable to let go. It could even make it easier to prepare for a loved one’s passing, knowing that you’ll be able to “talk” to them after they’re gone.
The opposite risk may exist, too. What if the algorithm gets your loved one’s personality wrong or says something totally out of place at a particularly difficult moment? It would be a visceral reminder that the real person is gone and nothing can replace them, and would likely make the user feel even more grief and stress.
There’s also the question of how far we may let digital twins go. If they become commonplace, how much power would we want to give them? Say you need to make an important decision about your own children or your career—would you ask your deceased parent’s digital twin for advice? Could a digital twin remain on a board of directors after the real person has passed away?
Not So Sci-Fi
Creating and interacting with digital twins is already part of some people’s lives today, and multiple platforms are using AI to piece together likenesses of those who’ve departed.
You, Only Virtual recreates the “unique dynamics of a relationship” and generates what it calls an authentic essence of a loved one using real-time and archived communication. The company’s (somewhat eerie) tagline? “Never have to say goodbye.”
HereAfter AI calls itself a virtual biographer: the app interviews users by giving them story prompts, then uses their replies to design a “legacy avatar” of them. Their loved ones can then ask the app questions and get responses in the user’s voice. HereAfter’s founder started the company after using AI to create a chatbot version of his father, which he called the Dadbot.
A key difference between HereAfter and MindBank is that while the former sticks to a set of questions and stories, MindBank is aiming for something closer to open-ended conversation—a far more complex feat for AI to master.
Outside of these dedicated “grief tech” services, people are using ChatGPT to recreate their loved ones’ voices in written form, feeding the model old texts and emails as examples, then having it respond as if it were the relative or friend themselves. People have reported finding comfort in these interactions.
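In practice this is usually done through prompting rather than true training: you show the model samples of how the person wrote and ask it to reply in that style. Here’s a minimal sketch, assuming OpenAI’s official Python client; the model name and message excerpts are placeholders, not anything from a real person or service:

```python
# Hedged sketch of "recreating a voice" via prompting, not fine-tuning.
# Assumes the official `openai` package and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

# Invented placeholder excerpts standing in for real texts and emails.
style_examples = (
    "Examples of how Dad wrote:\n"
    "- 'Don't overthink it, kiddo. Sleep on it and call me tomorrow.'\n"
    "- 'Proud of you. Always have been.'\n"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; any chat-capable model would do
    messages=[
        {
            "role": "system",
            "content": (
                "Respond in the voice and style of the person shown in the "
                "examples. Keep replies short and warm.\n\n" + style_examples
            ),
        },
        {"role": "user", "content": "I got the job offer. What do you think?"},
    ],
)

print(response.choices[0].message.content)
```

The result mirrors surface style far more than it captures a person—which is exactly the gap companies like MindBank say they want to close with longitudinal data.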
Creepiness aside, perhaps our assessment of services like MindBank Ai should come down to one question: do they help people? But as with many technologies, interacting with digital twins in a way that’s enriching rather than destructive will ultimately be up to each user. While the platforms should try to build in guardrails to keep users from, say, getting addicted or using avatars for nefarious purposes, it seems they’ll primarily be self-policed; only you know if having a two-hour “conversation” with your deceased parent makes your grief more bearable, or less.
Jimenez, for his part, is nothing but optimistic about AI’s potential to help us remember our loved ones and stay connected to them after they’re gone. “I think it’s a beautiful thing,” he said.
Image Credit: Ahmet Sali on Unsplash
* This article was originally published at Singularity Hub