• henfredemars
    41 points · edited · 5 months ago

    I don’t believe humans are meant to manage loss in this way — stretching out an imitation of our loved one. As painful as it is, I personally believe humans need to say goodbye. I feel this gets in the way of feeling and truly accepting the loss so that a person can move forward.

    Loss is truly heavy, but I do not believe this is better or healthy.

    • Mycatiskai@lemmy.ca
      23 points · 5 months ago

      My sister has hundreds of YouTube videos she used to help her students learn between music lessons. It will be two years soon since she died, I haven’t been able to watch even one.

      I like to remember her in my mind, it hurts less than seeing her when she was alive.

    • naevaTheRat@lemmy.dbzer0.com
      11 points · 5 months ago

      Yeah. I am not a Buddhist, but I’ve always found that something rings true in the reflections on impermanence. When we bond with someone we accept the pain of loss, and when we feel it, most people seem to describe relief once they’re able to “let go” and accept that it’s over.

      It seems to me that encouraging clinging and reminiscing stunts you a bit and only really provides temporary relief from the loss while drawing out the time it takes to process it.

      Idk though, maybe I’ll have the misfortune to feel differently some day. It’s hard to judge someone hanging out with their spouse watching death creep closer each day. I have approximately zero idea what my opinions would be in the face of that.

    • thingsiplay@beehaw.org
      10 points · 5 months ago

      People who can’t get over losing someone will sorrow for the rest of their lives, or until they get over it. And AI won’t help them get over it. Death is part of our life, and if you don’t accept it, it becomes pain.

      It was last year, I think, when I read that someone recreated a mother’s lost son (or some other family member, I forgot) in a VR environment, so she could see him again in VR. Absolute madness! What does this do to the person? Now couple that with an AI… man, the future is grim…

      • henfredemars
        8 points · 5 months ago

        I had this conversation with my wife once. I let her know that it is my advance wish that she allow me to complete the cycle of life. Anything else, any reconstruction of me that technology allows, is, to me, an abomination. Keep the pictures, keep the memories, but don’t keep me here when I am gone.

        I refrain from judging the decisions of others where possible, but this is my personal wish.

    • Scrubbles@poptalk.scrubbles.tech
      8 points · 5 months ago

      I tried things like Character AI to play with talking to “celebrities”. It was novel, and it was fun, for about 15 minutes. Then… eh. It’s not the person, and your brain knows it’s not them. It’s always an imitation. I got bored talking with people I’d always wanted to talk to.

      I can’t imagine it being a loved one who has passed. It would feel hollow, empty, and wouldn’t make the pain leave. Idk, it just wouldn’t be good at all.

    • FaceDeer@fedia.io
      5 points · 5 months ago

      I don’t believe humans are “meant” to do anything. We are a result of evolution, not intentional design. So I believe humans should do whatever they personally want to do in a situation like this.

      If you have a loved one who does this and you don’t feel comfortable interacting with their AI version, then don’t interact with their AI version. That’s on you. But don’t belittle them for having preferences different from your own. Different people want different things and deal with death in different ways.

      • frog 🐸@beehaw.org
        5 points · 5 months ago

        There may not have been any intentional design, but humans are still meant to eat food, drink water, and breathe oxygen, and going against that won’t lead to a good end.

        • FaceDeer@fedia.io
          4 points · 5 months ago

          Even with that, being absolutist about this sort of thing is wrong. People undergoing surgery have spent time on heart/lung machines that breathe for them. People sometimes fast for good reasons, or get IV fluids or nutrients provided to them. You don’t see protestors outside of hospitals decrying how humans aren’t meant to be kept alive with such things, though, at least not in most cases (as always there are exceptions, the Terri Schiavo case for example).

          If I want to create an AI substitute for myself it is not anyone’s right to tell me I can’t because they don’t think I was meant to do that.

          • frog 🐸@beehaw.org
            4 points · 5 months ago

            Sure, you should be free to make one. But when you die and an AI company contacts all your grieving friends and family to offer them access to an AI based on you (for a low, low fee!), there are valid questions about whether that will cause them harm rather than help - and grieving people do not always make the most rational decisions. They can very easily be convinced that interacting with AI-you would be good for them, but it actually prolongs their grief and makes them feel worse. Grieving people are vulnerable, and I don’t think AI companies should be free to prey on the vulnerable, which is a very, very realistic outcome of this technology. Because that is what companies do.

            So I think you need to ask yourself not whether you should have the right to make an AI version of yourself for those who survive your death… but whether you’re comfortable with the very likely outcome that an abusive company will use their memories of you to exploit their grief and prolong their suffering. Do you want to do that to people you care about?

            • Zaktor@sopuli.xyz
              3 points · edited · 5 months ago

              This is speculation of corporate action completely divorced from the specifics of this technology and particulars of this story. The result of this could be a simple purchase either of hardware or software to be used as chosen by the person owning it. And the person commissioning it can specify exactly who such a simulacrum is presented to. None of this has to be under the power of the company that builds the simulacrums, and if it is structured that way, then that’s the problem that should be rejected or disallowed, not that this particular form of memento exists.

              • intensely_human@lemm.ee
                2 points · 5 months ago

                It could still be a bad idea even if the profit motive isn’t involved.

                One might be trying to help by leaving a big surprise stash of heroin to their widow, and she might embrace it fully, but that doesn’t make it a good idea or good for her.

                • Zaktor@sopuli.xyz
                  1 point · 5 months ago

                  Sure, and that point is being made in multiple other places in these comments. I find it patronizing, but that’s neither here nor there as it’s not what this comment thread is about.

            • FaceDeer@fedia.io
              1 point · 5 months ago

              But when you die and an AI company contacts all your grieving friends and family to offer them access to an AI based on you (for a low, low fee!)

              You can stop right there, you’re just imagining a scenario that suits your prejudices. Of all the applications for AI that I can imagine that would be better served by a model that is entirely under my control this would be the top of the list.

              With that out of the way the rest of your rhetorical questions are moot.

      • henfredemars
        4 points · edited · 5 months ago

        “Meant,” in this context, refers to the conditions that humans have faced over a long period of time and may be better suited to coping with, from a survival point of view. I’m an atheist, so I find it strange that you chose to read my comment as highlighting intentional design. Certainly, AI has existed for a much shorter time than the phenomenon of a human encountering the death of a loved one. Indeed, death has been quite a common theme throughout history, and the tools and support available to cope with it and relate it to other human experiences far exceed those for coping with the potential issues that come with AI.

        I think one can absolutely speak of needs and adaptation for something as common a human experience as death. If you find something belittling about that opinion, I’m not sure how to address you further. I may simply have to be wrong.

        • frog 🐸@beehaw.org
          4 points · 5 months ago

          Just gonna say that I agree with you on this. Humans have evolved over millions of years to emotionally respond to their environment. There’s certainly evidence that many of the mental health problems we see today, particularly at the scale we see them, are in part due to the fact that we evolved to live in a very different way from our present lifestyles. And that’s not about living in cities rather than caves, but more to do with the amount of work we do each day, the availability and accessibility of essential resources, the sense of community and connectedness within small social groups, and so on.

          We know that death has been a constant of our existence for as long as life has existed, so it logically follows that dealing with death and grief is something we’ve evolved to do. Namely, we evolved to grieve for a member of our “tribe”, and then move on. We can’t let go immediately, because we need to be able to maintain relationships across brief separations, but holding on forever to a relationship that can never be continued would make any creature unable to focus on the needs of the present and future.

          AI simulacrums of the deceased give the illusion of maintaining the relationship with the deceased. It is entirely possible that this will prolong the grieving process artificially, when the natural cycle of grieving is to eventually reach a point of acceptance. I don’t know for sure that’s what would happen… but I would want to be absolutely sure it’s not going to cause harm before unleashing this AI on the general public, particularly on vulnerable people (which grieving people are).

          Although I say that about all AI, so maybe I’m biased by the ridiculous ideology that new technologies should be tested and regulated before vulnerable people are experimented on.

      • intensely_human@lemm.ee
        1 point · 5 months ago

        Recent science agrees sexual selection is a much bigger factor in recent human evolution than natural selection. And sexual selection is conscious.

        So, depending on what you consider “design” we have at least been consciously bred for traits by previous generations of humans.

    • intensely_human@lemm.ee
      3 points · 5 months ago

      Yes. Nothing about this idea sounds like a good idea. Honestly I’m kind of pissed at the dude for saddling his wife with this gift.