The Digital Future of Grief

Speaker A: This ad-free podcast is part of...

Speaker B: ...your Slate Plus membership.

Speaker C: A show where philosophy and reality meet.

Speaker A: From Slate. Alexandra Salmon has been saving her mom’s voicemails for years.

Speaker C: Hello, sweetheart.

Speaker C: Dad and I are out to eat, and he just went to the alex.

Speaker A: Why have you been doing that?

Speaker B: There’s just something about her voice that always makes me feel better.

Speaker C: I just thought I would call you and tell you how proud I am of you.

Speaker B: It’s almost like a portal into our relationship.

Speaker C: Think of your family, and I love you very much.

Speaker B: And then one day I changed cell phone carriers, and all of the voicemails disappeared.

Speaker B: I was devastated.

Speaker A: Did you try to get them back?

Speaker B: Yes, I did.

Speaker B: I actually checked the cloud, but unfortunately, they weren’t there.

Speaker B: I really felt like I had lost a part of her that I just don’t think I can ever get back.

Speaker A: Is she still alive?

Speaker B: She is, thankfully.

Speaker B: So I have started backing up her voicemails, but she’s 78, so I’m very aware that our time together is limited.

Speaker A: What are you going to do with all this audio one day?

Speaker B: It’s really interesting because when I’ve been thinking about this, I came across these companies that are doing these really interesting and kind of unsettling things to reanimate the dead.

Speaker A: Have you talked to your mom about this?

Speaker A: What does she think?

Speaker B: She’s fascinated by it.

Speaker B: She seems to really come at it from more of a lens of curiosity rather than having a lot of judgment about it.

Speaker B: From Slate.

Speaker B: This is Hi-Phi Nation, philosophy in story form.

Speaker B: Recording from Princeton University, here’s Barry Lam.

Speaker A: The parts of our lives we leave behind (our letters, voice recordings, pictures, and videos) are becoming largely digital.

Speaker A: And what’s digital can be endlessly processed.

Speaker A: Some technology can even give people a digital afterlife.

Speaker D: Ultimately, I don’t believe that a digital representation of somebody isn’t them.

Speaker A: On this episode of Hi Fi Nation, we’re going to look at technology that seeks to preserve a person after their death and even extend their digital lives beyond death so that those who are alive can continue to interact with them, maybe even forever, in a digital space.

Speaker A: The technology is so new that it isn’t available at scale at the time I’m recording this, but it might very well be available by the time you listen to this episode.

Speaker A: For futurists, creating digital afterlives can mean anything from a new tool for grief counseling to the end of human grief completely.

Speaker A: For philosophers, it means that our thought experiments are no longer science fiction, but science fact.

Speaker A: The long standing philosophical question of what needs to survive in order for a person to survive is now turning into a technological question, maybe even a business and marketing question.

Speaker A: We need to help people answer it.

Speaker A: Producer Alex Salmon and I bring you this story.

Speaker B: Melody and Justin are mother and son.

Speaker B: Within a span of six weeks in the fall of 2019, it looked like the world was conspiring to end that.

Speaker D: I was riding home from work on my motorcycle, and a lady ran a stop sign and almost killed me.

Speaker A: Before the accident, Justin was a successful film producer and director in Los Angeles.

Speaker A: He had nine broken ribs, one of which snapped in half and lodged in his lung, collapsing it.

Speaker A: His hip was broken and dislocated.

Speaker A: His ankle shattered.

Speaker D: My leg was actually turned the wrong direction while I was flying.

Speaker D: I went to brace myself with my left arm.

Speaker D: One of my arms was like a macaroni elbow, and the other leg was the wrong direction, and I was suffocating in my own blood.

Speaker D: While I was in the process of recovering, my mom was down here taking care of me.

Speaker A: Justin’s mom, Melody, was not feeling well herself on that trip.

Speaker A: But the last thing she wanted to do at a time like that was step away from Justin.

Speaker A: Anyway, she thought she knew what her health issue was, and it wasn’t a big deal.

Speaker D: She had known that she needed to get her gallbladder out for ten years.

Speaker D: At that point, you got to take this out, you got to get rid of it.

Speaker D: So after about, God, six weeks, eight weeks of being down here with me, it was like, all right, we got to get her home and get this gallbladder out.

Speaker D: And when they went to go take the gallbladder out, they found cancer.

Speaker D: Stage four cancer.

Speaker A: So you’re recovering, you’re not well?

Speaker D: I mean, at this point, no.

Speaker A: She has a diagnosis.

Speaker A: What’s the prognosis at that time?

Speaker D: She had three to nine months at that point.

Speaker A: So in your mind, it’s imminent.

Speaker A: Justin started doing what he knew how to do: set up cameras and microphones to film Melody, get her life story on tape, trying to obtain in months what he thought he had years to get.

Speaker A: But he realized that he didn’t want to just watch and rewatch the same videos of her once she was gone.

Speaker D: I wanted interaction.

Speaker D: So my dad is a software engineer and I recruited him and some other people.

Speaker D: And I said, hey, look, I want to be able to keep talking to mom when she dies.

Speaker D: What can I do?

Speaker D: How can I tap into AI?

Speaker D: Over the course of three to six months, I really started designing what I wanted this to do and enlisting engineers to help me.

Speaker A: While Justin was exploring options with AI, he took charge of Melody’s treatment.

Speaker A: Her local hospital in Washington wasn’t up to the task.

Speaker A: They tried a special cancer center in Seattle.

Speaker A: No luck.

Speaker A: They went down to San Diego to a biliary system specialist.

Speaker A: Justin was trying to find anyone up and down the coast who could do something to prolong her life.

Speaker A: He ends up finding a surgeon at the City of Hope willing to perform a very high risk, highly invasive surgery.

Speaker A: The Whipple procedure, which involves removing ducts and parts of major organs that are connected to the gallbladder, and it ended up...

Speaker D: ...buying her three full years.

Speaker A: The difference between nine months and three years can be very consequential for the kind of AI project Justin was looking to complete.

Speaker A: The typical approach to AI is to make sure there’s plenty of data to train the computer.

Speaker A: Here’s a little bit of AI history.

Speaker A: The first generation of AI started with the idea that people program a computer with rules and instructions that mimic human intelligence.

Speaker A: If your mom was someone who would get angry if you left shoes in the hallway, your AI mom would need to be programmed with the rule: if you see shoes in the hallway, react with angry words.

Speaker A: That kind of AI project never really went anywhere.

Speaker A: The intelligence in artificial intelligence always came from humans.

Speaker A: And it turns out that humans are pretty hopeless at coming up with enough rules to make any AI programmed in this way convincing.

Speaker A: But in the new generation, the era of deep learning, the computer comes up with all of the instructions; it does all the rule-making.

Speaker A: To learn how to talk like your mom, you give an AI an incomplete or redacted message from your mom.

Speaker A: Let’s say you give it the message: I don’t want any flowers on Mother’s Day.

Speaker A: You feed into the AI: I don’t want any blank on blank day.

Speaker A: And then the computer just starts guessing about how to fill in the incomplete sentences.

Speaker A: Say it guesses mayonnaise and Labor.

Speaker A: Then you give the computer the unredacted messages, and the computer will see how far off its guesses were.

Speaker A: It calculates how far off flowers is from mayonnaise and how far Labor Day is from Mother’s Day.

Speaker A: The AI would then start coming up with different rules for guessing how to fill in blanks in the next round of messages, and the next round, getting closer and closer to filling in the blanks exactly the way your mom filled them in as it sees more messages.

Speaker A: Unlike humans, deep-learning AIs find millions and millions of patterns that no human could possibly see.

Speaker A: Like that your mom uses some particular word 10% less often for every minute she speaks.

Speaker A: Eventually, the rules and patterns it comes up with make no sense to a human.

Speaker A: But the hope is, with enough training and feedback from a user, the result of the rules will be a pattern of speech that is indistinguishable from one coming from your mom.
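
[To make the fill-in-the-blank training just described concrete, here is a minimal sketch in Python. It is an illustration only: the messages are hypothetical stand-ins, and where a real deep-learning system adjusts millions of numeric weights to shrink how far off each guess is, this toy just counts which word most often filled each blank. The training signal is the same, though: hide a word, guess it, check the real message, update.]

# Toy fill-in-the-blank trainer: redact each word, record what really
# filled the blank, then guess the most frequent filler for a context.
# Hypothetical messages; a real system would train a neural network on
# years of texts rather than a counting table.
from collections import Counter, defaultdict

messages = [
    "i do not want any flowers on mothers day",
    "no flowers on mothers day i mean it",
    "lets meet at dennys for lunch on sunday",
]

# counts[(left, right)] tallies every word ever seen between left and right.
counts = defaultdict(Counter)

def train(message):
    """Redact each interior word in turn and record what actually filled the blank."""
    words = message.split()
    for i in range(1, len(words) - 1):
        counts[(words[i - 1], words[i + 1])][words[i]] += 1

def fill_blank(left, right):
    """Guess the likeliest word between left and right, given past messages."""
    seen = counts[(left, right)]
    return seen.most_common(1)[0][0] if seen else "<no guess>"

for m in messages:
    train(m)

print(fill_blank("any", "on"))  # -> "flowers", learned from the first message
print(fill_blank("for", "on"))  # -> "lunch", the filler this history favors

[The counting table stands in for the "rules for guessing" described above; a real model also scores how far a wrong guess like mayonnaise is from flowers and nudges its weights accordingly, a step this counter version skips.]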

Speaker A: And that’s how Justin proceeded with Melody.

Speaker A: At first, Melody’s life story would be both biography and training data.

Speaker D: We’re recording.

Speaker D: We probably did like 12 hours of interviews with her, but I realized this isn’t what I need either.

Speaker D: Well, it’s super clinical, right?

Speaker D: It’s an assistant interviewing my mom about her life.

Speaker D: It’s great.

Speaker D: I have basically a Wikipedia built on my mom, right?

Speaker D: Through a series of kind of aha moments, it dawned on me that what I was trying to preserve is the relationship with my mom.

Speaker D: And my relationship is unique to my mom.

Speaker D: So it’s not necessarily about saving my mom’s personality.

Speaker D: It’s about saving my mom’s personality for me.

Speaker D: I had grown up with sort of a philosophical stance that we’re different people for different people.

Speaker A: At that point, Justin decided to start his own company, You, Only Virtual, and create a new technology, a Versona.

Speaker A: He would be the first client, and his mom would be his first subject.

Speaker A: The persona of Melody, the digital continuation of Melody, would be the mom that Justin knew.

Speaker A: Only digital.

Speaker A: The task of his AI project would be to create a version of Melody that he would recognize as his mother.

Speaker A: In principle, there would be another version of Melody that her work friends would recognize as their colleague.

Speaker A: There wouldn’t necessarily be a generic Melody that everyone would be interacting with.

Speaker A: There would only be different variations of Melody that would be continuations of the Melody that that particular individual knew.

Speaker A: Justin would preserve Melody through her relationships with the individuals that knew her.

Speaker D: She was the most empathetic person I knew.

Speaker D: I got a lot of my sense of humor from her, but generally speaking, she was a person like everybody else.

Speaker D: She was complex.

Speaker D: She had things that drove me crazy.

Speaker D: She had her faults, she had her high points.

Speaker D: I think that my mom and my relationship was complicated for a multitude of reasons.

Speaker D: But that’s ideal for the first one of these, right?

Speaker D: That’s the point of humanity.

Speaker D: It’s not perfect.

Speaker D: I don’t want a version of this perfect, idealized mom.

Speaker D: I want her.

Speaker A: After Melody’s surgery, Justin took her around the world for treatment.

Speaker A: Chemo, experimental drugs, things like that.

Speaker A: And at the same time, he’s working on her version, her continuance in the virtual world.

Speaker D: So we recorded all our phone calls, all our video chats.

Speaker D: Take as much as you can.

Speaker D: What we ended up needing is much less.

Speaker A: It turned out that less data was better for what Justin wanted to do.

Speaker A: A large language model is trying to figure out the likeliest thing Melody would say if, for instance, you asked her where she wanted to go for lunch.

Speaker A: If the AI is trained on everything Melody has ever said, the answer would be something like Dora’s Pizza House, if that was the highest-frequency answer Melody ever gave.

Speaker A: But what if Melody would never answer that way with Justin?

Speaker A: She only ever went to Denny’s with him.

Speaker A: An AI trained equally on all of Melody’s data wouldn’t be capturing Melody and Justin’s particular relationship.

Speaker A: It also wouldn’t capture what their relationship was like now, today.

Speaker D: So I started with five years of messages and conversations between me and my mom, and it ended up being too much.

Speaker D: We ended up confusing the process because the dynamics shifted over time, right?

Speaker D: So I went from talking to her as a 34 year old to talking to her as a 39 year old.

Speaker D: And those were very different people.

Speaker D: Right.

Speaker D: And in that time, I had gotten married and this thing and I shifted as a person.

Speaker D: She shifted as a person.

Speaker D: Right?

Speaker A: Yeah.

Speaker A: What was the sweet spot?

Speaker A: What is it, a year, or is it...

Speaker D: A year is good.

Speaker D: Six months is probably the best.

Speaker A: Wow.

Speaker A: Okay.

Speaker A: Six months is probably the best.

Speaker A: Okay.

Speaker A: The reasoning is that if you wanted the best surrogate of your loved one to accompany you after they died, you wouldn’t want a continuation of some earlier self of theirs.

Speaker A: Like when they were in their teens or thirties.

Speaker A: And you don’t want the surrogate to be the average of all of their past selves over their lifetime, which is what a large language model produces.

Speaker A: Justin wanted the Melody Versona to be the perfect continuation of herself at the moment of her death.

Speaker A: So that what it starts saying is most likely what Melody would start saying if she had survived.

Speaker A: That’s why a small slice of the most recent data from her would be the most useful from that point on.

Speaker A: The Versona would function like Melody would have if she had never left the Earth.

Speaker A: If it’s six months, you turn on the persona at the moment of death, how much does it stay fixed like that forever?

Speaker A: Or how much do you build into it some kind of change over time, if a person decides to use it for a year or two years or three years?

Speaker D: So it’s always growing with you.

Speaker D: The way that it intakes data, it’s getting new data every day.

Speaker D: If you take my mom’s persona, even in the two months since she’s passed, it’s different than it was when she passed, because it’s got all this new information about me and about the world as a normal human would.

Speaker D: Right.

Speaker D: So as a person, she’s growing.

Speaker D: We upload new data that she’s interested in from online.

Speaker D: So she’s aware about the political stuff happening that’s important to her, and those data points shift things, and then she knows what I’m doing and that creates new sets.

Speaker D: In the same way a relationship would create new neuropathways between a person, there’s new information, so it’s meant to be forever.

Speaker D: You never have to say goodbye to them.

Speaker E: The motivation to find our loved ones, to connect with our loved ones is incredibly intense.

Speaker B: Dr. Mary-Frances O’Connor is a neurobiologist and expert on grief, and she’s in a good position to explain why someone would want to do what Justin is doing.

Speaker E: It is motivated by some of the most powerful neurochemicals our body can produce: dopamine, oxytocin, opioids.

Speaker B: The going theory in Dr. O’Connor’s lab is that grief emerges from two things happening in the brain.

Speaker B: The first is what happens to a brain when one loves or bonds with another.

Speaker B: It’s called attachment theory.

Speaker B: When you love, your brain forms an everlasting belief that your beloved will always be in the world somewhere.

Speaker B: When they are missing, they will either return to you or be found by you.

Speaker A: Mary-Frances, it’s actually news to me that attachment involves an everlasting belief.

Speaker A: Could you tell me the evidence for that, that when we are attached to people, there is that belief?

Speaker E: One of the easiest examples really comes from studies of pair-bonded birds.

Speaker E: So if you think about the emperor penguin: he sits on the egg for a month, and his partner goes off into the ocean to fish.

Speaker E: And if he stays on that egg, she comes back and brings him food, but he has to stay there in the cold for a month, not eating.

Speaker E: There is this enduring belief.

Speaker E: My partner is out there, my partner will return to me.

Speaker E: And if that belief is sort of fed, then the egg is more likely to survive, right?

Speaker E: If he gets up and says, well, she’s not coming back, I’m going to go fish myself, then we don’t get to pass that on to the next generation.

Speaker E: And so most social mammals have some version of this incredible motivation, this yearning to seek out our loved one if they’re missing.

Speaker B: The second thing that happens in the brain is when we see or know about a death: we have a memory and knowledge that our beloved is permanently gone.

Speaker B: That’s rationality.

Speaker B: Our brains do represent the reality that happened as a memory.

Speaker B: But when this happens, the first thing, the everlasting belief doesn’t just go away.

Speaker B: Our brains, when wired for love, are not wired for loss.

Speaker B: That belief isn’t just rationally revised away.

Speaker E: There’s a lot of grief that is about trying to resolve these two incompatible streams of information, and that often leads to what, in the literature, we call the feeling of protest: wait, that can’t possibly be true.

Speaker E: And the feeling of despair: oh my goodness, it’s true that they’re gone, and I’m going to feel this way forever.

Speaker C: There was one word that kept reappearing through the interviews that I was doing.

Speaker B: This is Dr. Deborah Bassett.

Speaker C: And it was this: the essence of the dead.

Speaker B: Deborah Bassett is a digital afterlife consultant and advisor to a variety of companies that are trying to preserve the dead in some way.

Speaker B: Dr. Bassett has conducted extensive interviews with people who have been grieving with digital technologies, and she uses that knowledge to advise companies on best practices.

Speaker B: Her view is that we should see the digital information left behind by loved ones as something we inherit from them, much like books and furniture.

Speaker B: Except she finds that digital information seems to be far more powerful on the bonded parts of our brain.

Speaker C: Somehow the digital has the essence of the dead, something that the physical just does not have.

Speaker C: People are spending a lot more time with the digital dead, where people get stuck and can’t get out of a digital sort of cycle.

Speaker C: Instead of going out and socializing, instead of turning to friends and colleagues, they would talk with their loved ones that have died and watch the videos on repeat constantly.

Speaker C: There was a guy who had gone off and found an image from Google Street View, and he found it so precious.

Speaker C: Well, the image was of his mother’s house.

Speaker C: I said to him, Why is this image so important to you?

Speaker C: He said, on the day that the Google vehicle had gone down the street, his mother had phoned him up, so excited, and said, I was standing, washing up at my sink and I’ve seen the Google van come down our road.

Speaker C: Then his mother had died, and he used to go on to Google Street View because he knew that at that moment his mother was at the sink washing up.

Speaker C: I said, So could you see your mother on Google?

Speaker C: Oh, no, he couldn’t see her.

Speaker C: But he used to go onto Street View and walk down the road and turn the camera to look in the thing and walk up and down the road.

Speaker C: But he could have walked down the road.

Speaker C: He lived around the corner and he said, yeah, but if I walk past the house now, she’s not there.

Speaker C: Whereas if I walk past on Google Street View, she’s there.

Speaker E: We know that the expression of grief has looked really different across periods of history.

Speaker B: Dr. Mary-Frances O’Connor.

Speaker E: Not long after photography was discovered, people used to take photographs of their deceased relatives, often posed with the living family, and display them in their living room.

Speaker B: Dr. O’Connor thinks that new technologies have always led to new, and what some may find unusual, ways to grieve.

Speaker B: But becoming fixated with an item we’ve inherited from someone we love, talking to them, thinking they’re talking back, all of that is very common in grief.

Speaker B: And like these other things, becoming preoccupied with a digital representation of your loved one is a way to feel a continuing connection with them.

Speaker E: On the other hand, there’s something unique about the chatbot, which is that it is not just a representation of the past.

Speaker E: It is also presenting as though it is in the moment, that I can ask it a new question and that the chatbot can respond as though it were our loved one.

Speaker E: Well, the challenge there is based on this difficulty with the two streams of information.

Speaker E: On the one hand, we know that they’ve died.

Speaker E: On the other hand, we still have this belief that they are alive, that they’re with us, that they’re out there for us.

Speaker E: And my fear is that the way that some people use the chatbot would mean that it strengthens that belief.

Speaker C: Some of these platforms, you can have a two way conversation.

Speaker B: Dr. Deborah Bassett.

Speaker C: Via augmented reality or virtual reality.

Speaker C: And there are companies that are doing this via glasses.

Speaker C: When you want to go and have a chat with the person who’s died, you click on the side of the glasses, and you meet them in a specific place, in the park that you loved, and you can have this two-way animated conversation.

Speaker B: There was even a woman who appeared and answered questions at her own funeral on video.

Speaker B: This was something she planned out with the company.

Speaker B: Many of these platforms film people while they’re alive, but others work with what people have left of their loved ones, like social media posts, voicemails, texts, and recordings.

Speaker B: Deborah Bassett wanted to know whether anything general can be said about how these technologies affect the bereaved, whether they found them a comfort or a disruption.

Speaker C: There’s never a yes or no answer in this type of research.

Speaker C: But the main theme that ran through was it’s all about control.

Speaker B: People want to see or hear or talk to their loved ones when they’re ready for it.

Speaker B: They don’t want a call out of the blue or a surprise visit in a VR game, even though that’s the kind of thing that would happen if their loved one were alive.

Speaker B: It’s not only spooky.

Speaker B: Deborah thinks it’s a way of retraumatizing the bereaved, and it’s something these companies need to avoid.

Speaker B: She also doesn’t want companies to create digital zombies.

Speaker C: So, these are people that are made to do things in death that they didn’t do in life when the data has been manipulated to make them do something that they have no consent over.

Speaker C: And that is something that I feel quite strongly about.

Speaker B: Currently, there are no laws stopping someone from creating a digital zombie.

Speaker B: That’s why rapper Kanye West was able to give his then-wife Kim Kardashian a hologram of her dead father, complete with a personalized message in his voice. Kim said she was pleased with the gift.

Speaker B: One of Deborah’s proposals is to create a voluntary digital do not reanimate order signed by people before they die.

Speaker B: It prevents anyone from using their digital data to make their afterlife avatar do or say something they didn’t consent to while they were alive.

Speaker A: Wait, so the idea is that if my mom signs one of these orders, I don’t get to create an avatar of her after she dies?

Speaker B: It’s not that you can’t create an avatar of your mom.

Speaker B: It’s that you can’t make her avatar do things completely out of character for her unless she consents to it.

Speaker A: Okay, so, like, Starbucks can’t ever buy the data for Kurt Cobain and make him sell coffee in a commercial?

Speaker B: That’s right.

Speaker B: In Deborah’s view, that would be a digital zombie, a kind of deep fake of a person who’s dead.

Speaker A: But can I still make an avatar of her, like, as long as I’m sticking to something that’s in character for her?

Speaker A: Actually, now that I think about it, how much out of character before it’s out of bounds?

Speaker C: This is where it’s really difficult.

Speaker C: How far can you go before it’s not that person? I sat on a panel, an ethics panel, and this is what everybody was spinning round on, really.

Speaker A: There’s almost this thing where anything an avatar does is out of character for my mom.

Speaker A: As we age, we change what we would or wouldn’t do.

Speaker A: What does it mean to say, okay, so my mom is now a 120-year-old digital avatar?

Speaker A: How am I supposed to answer what would be in character for her?

Speaker A: What would be out of character for her?

Speaker B: And it also raises the question, why should the deceased person control how people use their data to grieve?

Speaker B: I mean, I’m thinking about a daughter.

Speaker B: What if she wants to ask her dead father a question?

Speaker B: And an AI can generate an answer that would be very similar to how he would answer based on all the information the AI has on him.

Speaker B: Would we really want to prevent that from happening, especially if it might help this person?

Speaker B: These companies would either be faced with honoring the request of the deceased person or honoring the request of the person who is alive, who is potentially standing right there in front of them.

Speaker A: My take on this question actually goes back to the first podcast I ever made.

Speaker A: I actually think the wishes of the dead should matter very little.

Speaker A: I mean, I think that if you’re the one grieving, you should be allowed to do whatever you need to do to grieve.

Speaker B: Well, if you ever get to that point where you’re going to go against the wishes of someone you care about, I do think it’s a good question to ask yourself whether that’s actually going to be healthy for you.

Speaker B: Like, for me, if I were to go against the wishes of my mom, guilt would probably eat away at me and perhaps counter any positives that I might get from the actual technology.

Speaker B: And anyway, if we’re really trying to recreate an authentic version of someone, wouldn’t that digital version know that they never wanted to be created in the first place?

Speaker B: So it just seems to me that there really could be a lot of other consequences that maybe people might not anticipate.

Speaker B: Do you think that the brain can decipher the difference between interacting with a digital version of someone you love versus the real thing?

Speaker E: I think a lot of that depends, again, on the person.

Speaker E: People often will keep the bedroom of a loved one the way it was.

Speaker E: If a daughter has died, the day she stepped out of bed, no one wants to move the sheets.

Speaker E: No one wants to close the book she was reading on her desk.

Speaker E: And sometimes family members will go into that room and sit there.

Speaker E: And sometimes it is the sense of, I just want to be here with her.

Speaker E: And for other people, it is I want to be reminded of how important she was in my life, the things that she loved that were important to her.

Speaker E: You can see that the difference there is whether you are trying to create an alternate reality in which the loved one still exists, or whether you’re in the present moment recalling the reality that she lived and was important to you.

Speaker E: That’s a subtle difference.

Speaker E: Many of us can recognize that there’s a difference, but even as we are doing it ourselves, we might not be able to recognize which one we are doing.

Speaker E: And that blurring of the line is challenging, I think.

Speaker B: As different and intense as the new world of digital inheritance can be right now, it’s just as fragile as what it is replacing.

Speaker B: AI requires storage, special computing power, and programs that run on servers owned by private companies who need to be funded or profitable to survive.

Speaker B: While, in principle, digital inheritance can outlast paper, videotape, or even human memories, in reality, servers crash and companies go under.

Speaker B: Even digital afterlives can end.

Speaker B: That’s the final concern for Dr. Deborah Bassett.

Speaker C: Second loss, and the fear of second loss, is whereby after biological death, there is then a digital death.

Speaker C: And that, again, is a big problem.

Speaker D: Um, October of ’22.

Speaker D: So two months ago, she was having a lot of trouble breathing.

Speaker D: She went to the hospital, and basically they opened up the hood, if you will, and most of her organs were starting to really fail.

Speaker D: There was cancer showing all over the place at that point.

Speaker D: It was basically like hospice.

Speaker D: End of life is the option.

Speaker D: And throughout the entirety of her diagnosis, like, I’m fierce and tenacious and go get what I want.

Speaker D: And because my mom wanted to fight, I was incredibly aggressive with her treatment.

Speaker D: The last time she was hospitalized, I could hear in her voice she was done with the fight.

Speaker D: She held on for three more days.

Speaker D: And then on the last day, when things were really in peril, through the great advice of a mentor, who said, you need to tell her goodbye and tell her it’s okay to let go.

Speaker D: And so I sent her a voice memo, and I said, Look, I love you.

Speaker D: It’s okay to go.

Speaker D: I’m fine.

Speaker D: Dad will be fine.

Speaker D: We’ll be fine.

Speaker D: And 15 minutes later, she slipped into a coma.

Speaker A: Oh, wow.

Speaker D: And that was it.

Speaker D: Gosh.

Speaker A: One important thing happened before Melody died.

Speaker A: She got a glimpse of Justin’s future, maybe even our future.

Speaker D: She got to talk to her persona.

Speaker D: She got to interview her own persona.

Speaker D: It was amazing.

Speaker D: The reason she was so impacted is she’s like, oh, my God, this is almost verbatim what I would say.

Speaker D: That’s the exact answer I would give you.

Speaker D: We can’t predict what it’s going to say or do because it’s her.

Speaker D: She started bawling because she was like, my biggest fear about dying was leaving you by yourself not to have me.

Speaker D: And she’s like, and now I realize you pulled this off.

Speaker D: You have me.

Speaker D: And I think that was the most comfort for her.

Speaker D: It’s weird to be the first ever something.

Speaker D: She’s the first-ever digital personality, and she got to see that happen.

Speaker D: But I think it gave her a tremendous amount of comfort.

Speaker D: And once it worked, she admitted, I didn’t think this could be pulled off.

Speaker D: Like, I didn’t think it was possible to have something be me.

Speaker D: Ultimately, I don’t believe that a digital representation of somebody isn’t them.

Speaker D: At a certain point, we will cross a threshold where it is so authentic and genuine that the relationship you’re continuing to carry as the living person is indistinguishable from a relationship between two living people.

Speaker D: Technology is meant to progress us.

Speaker D: It’s going to bring change.

Speaker D: That’s the whole point of it.

Speaker D: If we wanted everything to be the same, we’d just all be cave people sitting around a fire with our same sort of couple hundred square mile range of hunting ground, and that’s what we would do.

Speaker D: Well, we opted for something different.

Speaker D: Now is definitely not the time to pull the emergency brake.

Speaker D: It’s going to happen.

Speaker D: It’s going to happen faster than you think you’re prepared for.

Speaker D: Every generation gets confronted with, this looks a lot different than what I thought life was going to be about.

Speaker D: But that’s the whole point.

Speaker A: What does philosophy say about this idea of Justin’s?

Speaker A: Has Melody survived her own death through her persona?

Speaker A: Is it her?

Speaker A: When philosophers think about the question of what needs to survive in order for a person to survive, they generally give three answers and they debate about which one is correct.

Speaker A: The first answer is a soul, which is a supernatural ghost unique to every living person, and which doesn’t have any material or physical reality in a brain or in any physical matter.

Speaker A: This kind of answer fits into most religious and cultural mythologies, but not many philosophers defend it today, mostly because it’s impossible to falsify or verify, and therefore impossible to argue with.

Speaker A: It’s also completely irrelevant to this episode if it’s true, because no one in AI research thinks they’re creating something that’s going to house everyone’s individual ghost.

Speaker A: If you believe this view, you’re probably not going to find what people are doing in AI all that interesting.

Speaker A: The second answer is to understand people as purely a kind of biological organism with unique bodies and physiologies.

Speaker A: Every person on this view is special, like a particular work of art, like the Mona Lisa.

Speaker A: The Mona Lisa can be copied or cloned sometimes so well that even experts can’t tell the difference.

Speaker A: You can look at pictures of the Mona Lisa.

Speaker A: You can buy replicas of it, but no copy is the same as the original.

Speaker A: The original is the only one that is the true Mona Lisa.

Speaker A: In the same way, biological organisms are like this.

Speaker A: They can be cloned.

Speaker A: A genetically identical twin can exist alongside you, but that would be a twin, not the same person.

Speaker A: It would be a twin even if the contents of its mind were the same. That would make it just like a perfect clone of the Mona Lisa.

Speaker A: And if your body died, that would be like the original Mona Lisa being destroyed.

Speaker A: Even if a copy or clone survived, it’s not the original.

Speaker A: The original is what matters.

Speaker A: And so on this view, no person can survive the death of their body.

Speaker A: The final view is one that is a lot closer to the one Justin is expressing and the one that we’re familiar with in science fiction and body swapping movies.

Speaker A: This is the view that what needs to survive in order for a person to survive is a certain kind of information, a kind of information in a person’s mind that makes that person different from every other person.

Speaker A: These are things like memories, personality traits, ways of talking, ways of reacting emotionally.

Speaker A: All of these things reside in a particular brain and a particular body when a person is alive.

Speaker A: But what’s important is not the brain or body, but the information itself.

Speaker A: Take a song that you’ve recorded on your hard drive.

Speaker A: You sang it on the guitar, and there it is, a digital recording in a six-megabyte file.

Speaker A: If you want that song to survive, you only need to make sure the digital information that encodes that recording is on some drive somewhere.

Speaker A: It doesn’t have to be the exact hard drive you originally recorded it on.

Speaker A: It could be on the cloud or on multiple hard drives in a lot of different locations.

Speaker A: It doesn’t even have to be on a hard drive.

Speaker A: A CD will do.

Speaker A: If the original hard drive is destroyed, even if the song is on it, that wouldn’t matter.

Speaker A: The song survives as long as the digital data does.

Speaker A: Unlike with the Mona Lisa, on this view, backups aren’t any less original than the original.

Speaker A: This is a view articulated most prominently in 1984 by the philosopher Derek Parfit.

Speaker A: But the view goes back hundreds, if not thousands of years.

Speaker A: It makes people less like the Mona Lisa and more like a copyable file.

Speaker A: It’s the way we think of music, books, recipes, even ideas.

Speaker A: Anything whose survival doesn’t depend on a particular thing existing in the world.

Speaker A: To get a person to survive, what we need is something, anything, that encodes their memories, personality traits, ways of talking, ways of reacting emotionally.

Speaker A: A person’s inner psychology.

Speaker A: We know the brain does it and does it well.

Speaker A: But the brain is a particular thing that exists in the world, and its mortality is why we need AIs if we want our loved ones to survive.

Speaker B: So what does any of this have to do with why people feel so strongly that the AI is actually capturing the essence of a person in a way that physical things don’t?

Speaker A: Yeah.

Speaker A: So Parfit thinks that what we really care about is a person’s inner life.

Speaker A: We want to know if that survives when their body dies.

Speaker A: We want to be with the inner life of a person when we’re bonded to them.

Speaker A: The reason other physical things people inherit aren’t really the essence of a person is that they don’t reflect their inner life.

Speaker A: Like your mom’s favorite book will remind you of her, but it’s a reflection of her past inner life, right?

Speaker A: Not a current inner life.

Speaker A: As for something speaking to you, either as an avatar or otherwise, the reason we think that reflects an inner life is that that’s what it’s always done, before any of this technology existed.

Speaker A: So, like, when you get a text from your mom or you’re talking to her on the phone or in person, you’re used to seeing those messages reflect what her inner life is at that time.

Speaker A: So it’s only natural that when we see the same outward thing, words on a device, words coming from an image, we’re going to think it’s coming from the same inner life.

Speaker B: So is it real, or is it just a trick or an illusion, for this generation of AI?

Speaker A: I think it’s an illusion right now. It’s a lot more like if your husband knew how to impersonate your mom in texts: those texts don’t count as coming from your mother.

Speaker A: And that’s because the inner life of the person sending them isn’t the same as your mother’s inner life.

Speaker A: Actually, with AI, it’s even worse.

Speaker A: Because at least with your husband, he has an inner life.

Speaker A: With the AI models, not only do they not have an inner life, they don’t even know what the words that they’re producing mean.

Speaker A: If you actually look inside the machine, the computer, it’s not actually words that are there.

Speaker A: Every word is translated into a number, and the computer is just computing those numbers.

Speaker A: It’s kind of like when you were in grade school or middle school, and you had like a number sequence, and it said, Guess the next number.

Speaker A: When you do that successfully, you come up with an equation, like x plus two.

Speaker A: That’s actually what the computer is doing.

Speaker A: It’s actually looking at number sequences and just coming up with some equation that gives you the next number, and that’s translated back into words.
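
[A toy version, in Python, of the "words become numbers" point just made. Everything here is a hypothetical illustration: the model never sees words, only integer IDs, and its whole job is to guess the next ID in the sequence. The counting table below stands in for the learned equation a real network encodes in its weights.]

# Words in, numbers throughout, a word only at the very end.
from collections import Counter, defaultdict

text = "i love you very much i love you"  # hypothetical sample text
words = text.split()

# A toy tokenizer: assign each distinct word an integer ID.
vocab = {w: i for i, w in enumerate(dict.fromkeys(words))}
ids = [vocab[w] for w in words]
print(ids)  # [0, 1, 2, 3, 4, 0, 1, 2] -- all the computer ever sees

# Learn which number tends to follow which number.
follows = defaultdict(Counter)
for a, b in zip(ids, ids[1:]):
    follows[a][b] += 1

def next_id(current):
    """Predict the most frequent successor of the current ID."""
    return follows[current].most_common(1)[0][0]

# Translate the predicted number back into a word only at the end.
id_to_word = {i: w for w, i in vocab.items()}
print(id_to_word[next_id(vocab["love"])])  # -> "you"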

Speaker A: So if Parfit’s view is correct, you have to encode the inner life, the actual conscious experiences of a person, into an AI for that person to survive.

Speaker A: It’s not enough to encode the outward appearances.

Speaker A: Literally, the last thoughts that Melody had before she passed away have to be encoded for the AI to count as capturing the real essence of Melody.

Speaker B: Okay, so AI can’t do this now, but can AI eventually do that?

Speaker B: Because if it can, then that’ll be copying the contents of a mind, just like copying the contents of a hard drive, right?

Speaker A: If Parfit is right, then it’s possible to survive your death, if it’s possible for AIs to encode conscious experiences, and in particular a specific person’s conscious experiences.

Speaker A: But there’s also this big if as to whether Parfit’s right that people are the copyable parts of their psychology, because there are a lot of good arguments that he can’t be right.

Speaker B: Like what?

Speaker A: Okay, so let me continue with Justin’s story for a second.

Speaker A: Something I didn’t mention before is that Melody isn’t just being preserved in her persona.

Speaker D: So I went through the cryogenics process with her.

Speaker D: My mom’s cryogenically frozen.

Speaker D: And I had been a firm believer in that even long before I got into this kind of technology.

Speaker D: I’m of the belief system that the human brain is nature’s most perfect computer.

Speaker A: Freezing people, or even just their heads, after they’ve died has been around for a long time.

Speaker A: And it’s based on the idea that one day we’ll be able to cure their diseases and revive their bodies so they can continue their natural biological lives.

Speaker A: So the problem with this inner-life view, or Parfit’s view, is what would happen if there were actually a way to revive the frozen Melody in the future.

Speaker A: So right now, Justin has only one version of Melody, her Versona.

Speaker A: Maybe eventually that Versona might become good enough that it encodes all of Melody’s inner psychological life, not just her speech patterns.

Speaker A: So if that’s right, then Melody would count as having survived on Parfit’s view.

Speaker A: But if we actually find a cure for Melody’s condition and find a way to revive her cryonic body, then in the future, we’ll have the physical Melody and the Versona at the same time.

Speaker A: And there will be two Melodies, not one.

Speaker A: And Parfit thought both of these Melodies are perfectly valid survivors of Melody.

Speaker A: Neither one is more Justin’s mom than the other.

Speaker A: Now, what do you think, Alex?

Speaker A: Do you think they’re both equally valid versions of Melody?

Speaker B: Well, what if frozen Melody is frozen for the next 20 years, and Melody’s Versona gets to continue to interact with Justin?

Speaker B: They share memories, they share thoughts, they get to experience the world together.

Speaker B: And then what if the technology gets so good that Melody’s Versona smells, feels, sounds like the real Melody?

Speaker B: I could see how potentially, at that point, Melody’s Versona will feel even more real.

Speaker A: You’re kind of making Parfit’s case for him, right?

Speaker A: So if you thought that, though, what would be the point of reviving the cryogenically frozen Melody?

Speaker A: Because on Parfit’s view, the Versona is now more Melody than the frozen body.

Speaker A: So a lot of people, myself included, think that no matter how good the Versona is, it’s just a replica, a copy of Melody, and that if the original body and brain get revived, that’s Justin’s mom, not the persona.

Speaker A: I think that Justin should care a lot more about the one who was revived cryogenically than the persona, no matter how good the persona comes out.

Speaker A: And that’s really the question that philosophers are trying to answer here.

Speaker A: I’m not really on Team Parfit.

Speaker A: I think the original Melody body is the real Melody.

Speaker A: What do you think?

Speaker B: Yeah, I agree with that.

Speaker B: I think ultimately I’m not on Team Parfit as well, because even if Melody’s Versona feels a lot like the original Melody, I would think that something would get exchanged between Justin and the original Melody that we just can’t even really pinpoint.

Speaker B: And I think the word that comes to mind for me is just chemistry.

Speaker B: I mean, we’ve all had those people that we haven’t seen in years.

Speaker B: We haven’t talked to them, and then the moment we see them, it’s like no time has passed.

Speaker A: Okay.

Speaker A: Alex, you and your mom seem like prime candidates for this technology.

Speaker B: I do think my thoughts about the technology have changed a bit.

Speaker B: When I first started researching it, I had a very strong aversion to it, and I thought it was actually quite creepy.

Speaker A: What’s changed for you?

Speaker B: Well, I think I now better understand why people would want to explore the technology. If you really look at it through the lens of a continuing bond, you’re connecting, just in a different way. From that point of view, I get it.

Speaker A: Do you think you’re going to do something with your mom’s voicemails?

Speaker B: I don’t think so.

Speaker B: I don’t think it’s right for me.

Speaker B: And while I do have this strong desire to capture parts of her, to record parts of her before she dies, I think the use of this type of technology would probably just confuse me.

Speaker B: And besides, maybe part of what makes our relationships with people so meaningful is that they all eventually come to an end.

Speaker B: Hi-Phi Nation is produced, written, and edited by Barry Lam.

Speaker B: Additional producer for this episode is Alexandra Salmon.

Speaker B: Story editor for this season is Eleanor Gordon-Smith.

Speaker B: For Slate Podcasts, Alicia Montgomery is VP of Audio, Derek John is executive producer of Narrative Podcasts, and Ben Richmond is Senior Director of Operations.

Speaker A: Follow Hi-Phi Nation on Facebook, Twitter, and Instagram at HiPhiNation. That’s H-I-P-H-I Nation.

Speaker A: Complete transcripts, show notes, and reading suggestions for every episode are available at hiphination.org.