Assuming our simulation is not designed to auto-scale (and our Admins don’t know how to download more RAM), what kind of side effects could we see in the world if the underlying system hosting our simulation began running out of resources?

  • WhyAUsername_1@lemmy.world · 6 months ago

    The server shuts down. The admin adds a few more sticks of RAM and powers it on again.

    The day resets and we wake up again on the morning of the day the RAM shortage hit.

  • fidodo@lemmy.world · 7 months ago

    That would only be a problem if you need dynamically allocated memory. It could be a statically allocated simulation where every atom is accounted for.
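
    A minimal C sketch of that idea, with made-up names (ATOM_COUNT, struct atom are illustrative, not from any real system): if the whole world state is one fixed array sized up front, the host's memory footprint never changes no matter what happens inside the simulation, so there is nothing left to run out of mid-run.

        /* Statically allocated "universe": every atom's state lives in one
         * fixed array reserved at program load. Nothing is ever malloc'd,
         * so the simulation cannot exhaust memory partway through. */
        #include <stdint.h>

        #define ATOM_COUNT 1000000ULL  /* toy figure; illustrative only */

        struct atom {
            double  x, y, z;     /* position */
            double  vx, vy, vz;  /* velocity */
            uint8_t species;     /* which element this atom is */
        };

        static struct atom world[ATOM_COUNT];  /* the entire world state */

        /* Advance the simulation one tick, updating every atom in place;
         * no allocation ever happens here. */
        static void step(void) {
            for (uint64_t i = 0; i < ATOM_COUNT; i++) {
                world[i].x += world[i].vx;
                world[i].y += world[i].vy;
                world[i].z += world[i].vz;
            }
        }

        int main(void) {
            step();
            return 0;
        }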

  • Pons_Aelius@kbin.social · 7 months ago

    Simply put.

    We wouldn’t notice anything.

    Our perception of the world would be based only on the compute cycles and not on any external time-frame.

    The machine could run at a million billion hertz or at one clock cycle per century, and your perception of time inside the machine would be the same.

    Same with low RAM: we would have no indication if we were constantly being paged out to a hard drive and written back to RAM as required (sketched below).

    Greg Egan gave a great explanation of this in the opening chapter of his novel Permutation City.
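
    To make the clock-rate point concrete, here is a minimal C sketch (HOST_DELAY_SECONDS and subjective_ticks are illustrative names, not anything from a real system): the agent inside can only count its own update ticks, so however long the host stalls between ticks, whether from a slow CPU or from paging the whole state out to disk, its perceived time is unchanged.

        /* The only clock available inside the simulation is the tick count. */
        #include <stdio.h>
        #include <unistd.h>

        #define HOST_DELAY_SECONDS 1  /* arbitrary host-side stall per tick */

        static unsigned long subjective_ticks = 0;

        int main(void) {
            for (int i = 0; i < 5; i++) {
                subjective_ticks++;           /* simulated time: one tick */
                sleep(HOST_DELAY_SECONDS);    /* host stall: swapping, slow hardware... */
                /* Nothing in the simulated state records how long sleep() took. */
                printf("agent's clock reads: %lu ticks\n", subjective_ticks);
            }
            return 0;
        }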

  • flashgnash@lemm.ee · 7 months ago

    If our entire universe is a simulation, then so are our laws of physics. In the parent universe running our simulation, the hardware might be powered by pure imagination, and the concepts of memory, CPU cycles, or even electricity might not exist at all.