• raman_klogius@ani.social · 27 days ago

    The simulacrants wouldn’t realize the simulation is ever not running.

    Kurzgesagt made a video about how, in a universe dying of heat death, civilizations that uploaded their consciousness into a simulation could live forever by intermittently running the simulation and pausing it for greater and greater stretches of time as the universe’s expendable energy diminishes. The consciousnesses would not perceive the time the simulation isn’t running, so to them things just go on and on for eternity.

    • zbyte64@awful.systems · 26 days ago

      What I find interesting is how the way we abstract away the actual work needed to keep either scenario running reflects how billionaires justify their own excesses. The heat-death case is the most extreme example, since there is no “spare” energy left for other organisms to be conscious. The uploaded consciousness is detached from reality, living in a dying universe, and still insists it has a right to exist at the cost of new avenues of consciousness.

  • you_are_dust@lemmy.world · 26 days ago

    If this is a way for our simulation’s creator to decide to pull the plug without guilt, I guess just go ahead and do it. I was holding out hope that this was all real, but it’s been getting clearer that it’s not.

  • Tudsamfa@lemmy.world · 26 days ago

    You can “simulate” life inside your brain, too.


    [Alt text: this is Bob. Bob is a figment of your imagination. When you leave, Bob will leave too. “Don’t leave,” says Bob]

    The Bob in your head is intelligent; it can communicate in English. Is it unethical to stop thinking about Bob? Was it unethical of me to show you this picture, creating a “Bob” in your head? Is any story unethical to tell?

  • Iconoclast@feddit.uk · 27 days ago

    Intelligence isn’t the important factor there - consciousness is. Does it feel like something to be those entities in the simulation? If yes, then I’d argue that ending the simulation is like killing a person painlessly in their sleep.

    I personally don’t think ending the simulation is even the most troubling part. We could unintentionally create a simulation that’s effectively a hell and then populate it with entities that have subjective experiences we don’t realize exist. The only thing worse than ending a life is creating one just for it to suffer through its entire existence.

    • zikzak025@lemmy.world · 27 days ago

      We could unintentionally create a simulation that’s effectively a hell and then populate it with entities that have subjective experiences we don’t realize exist. The only thing worse than ending a life is creating one just for it to suffer through its entire existence.

      And this is basically the plot of the TV series Severance. Has me wondering how they intend to address it.

    • ZombieCyborgFromOuterSpace@piefed.ca · 27 days ago

      Didn’t scientists train brain cells to exclusively play Doom? It’s like their whole consciousness is stuck in a video game version of hell through a brain-in-a-vat experience.

      • LurkingLuddite@piefed.social · 26 days ago

        Not really. It’s not nearly enough cells to have any kind of consciousness as we know it. A few neurons learning to play a game is a far cry from tying a being into a simulation of hell.

  • FreshParsnip@lemmy.ca · 24 days ago

    Depends: are they sentient? If they are conscious beings, then yeah, I think it would be unethical to mass-murder them.