this post was submitted on 19 Feb 2024
98 points (85.0% liked)

Asklemmy


A loosely moderated place to ask open-ended questions


Assuming our simulation is not designed to auto-scale (and our Admins don't know how to download more RAM), what kind of side effects could we see in the world if the underlying system hosting our simulation began running out of resources?

top 50 comments
[–] [email protected] 2 points 5 months ago

The server shuts down. The admin adds in a few more sticks of RAM and powers it on again.

The day is reset, and we wake up again on the morning of the day the RAM shortage happened.

[–] [email protected] 1 points 6 months ago

I am the only person who lives in the simulation. You all are computer generated.

[–] [email protected] 1 points 6 months ago

Teleportation based on old location data being deleted

[–] [email protected] 1 points 6 months ago

This is why older people think more slowly and lose memories or cognitive functions as side effects. They are deprioritized and moved from RAM to the pagefile/swap disk.

If you're unfamiliar, the OS will move process memory onto disk when RAM runs out.
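A toy sketch of that paging idea (the class, the two-slot "RAM", and the memory names are all invented for illustration): when RAM fills up, the least-recently-used page is evicted to a slower swap store, and touching it again means a slow fault back in.

```python
from collections import OrderedDict

class TinyVM:
    """Toy memory manager: least-recently-used pages spill to 'swap'."""
    def __init__(self, ram_slots):
        self.ram = OrderedDict()   # page -> data, ordered by recency
        self.swap = {}             # slower storage for evicted pages
        self.ram_slots = ram_slots

    def access(self, page, data=None):
        if page in self.ram:                 # fast path: already in RAM
            self.ram.move_to_end(page)
        else:
            if page in self.swap:            # slow path: fault it back in
                data = self.swap.pop(page)
            if len(self.ram) >= self.ram_slots:
                old, old_data = self.ram.popitem(last=False)  # evict LRU
                self.swap[old] = old_data
            self.ram[page] = data
        return self.ram[page]

vm = TinyVM(ram_slots=2)
vm.access("childhood", "vivid memories")
vm.access("yesterday", "what I had for lunch")
vm.access("today", "where my keys are")   # evicts "childhood" to swap
print("childhood" in vm.swap)             # → True
```

Old, rarely-touched memories end up in `swap`; recalling them still works, it's just slower.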

[–] [email protected] 8 points 7 months ago

Have you ever noticed when you look into a telescope that it takes a little bit to position yourself right to see what you're looking at? And it seems like you used to be able to do it a lot faster? That's not age, that's actually lag time added to cover decompressing the data.

[–] [email protected] 5 points 7 months ago

This is a tricky question. Answering it requires assumptions about how perspectives emerge (if at all) from computation, a theory of time, an interpretation of quantum mechanics, and an account of the persistence of identity.

Of course, we can start at the simplest possible interpretation, that we live in a "Matrix" style simulation, where we actually have real bodies in the "real world". This sidesteps the question of how to get sentient beings to emerge in a simulation and what that would entail. In this case, running out of RAM would have immediate consequences, since our sense of time in the simulated world would be in 1:1 correspondence with the "real world". We would experience all the possible glitches running out of RAM entails. Imagine taking an Apple Vision Pro and scaling it out. These are your conventional computer glitches. At the point of running out of RAM, you could immediately tell you were in a simulation.

Let's take the next level of interpretation, though. Assume we live in an "OpenAI Sora" type of simulation, where the beings as well as the environment are generated on the fly, "randomly". At this point, I am just assuming that subjective perspectives can emerge just as they do in our world, where they are tied to beings that look very much like ourselves. In this case, the subjective time of the simulated beings is entirely uncorrelated with our own time. In a sense, we are just opening a "window" into another universe, like playing back a movie, but the beings themselves would exist whether or not we stumbled upon their particular sequence of bits. The problem of asking what the beings in this type of simulation would experience becomes obvious when you realize that multiple simulators can run the exact same simulation with exactly the same sequence of bits. The question then becomes: are the two simulations actually equivalent to each other? From the simulated beings' perspective, they could not tell which simulator is simulating them based on their experience, since each simulator can produce exactly the same bit sequence.

Now this comes to the question of self-locating uncertainty, of being uncertain about which simulator is simulating your own existence. If there were only two simulators in the "real world" simulating your own existence, it would seem to be most reasonable to assign 50% probability that you are being simulated by either simulator. Then the question of what happens when the simulator runs out of RAM turns into the question of which simulator is running out of RAM? If only one simulator runs out of RAM, then from a naive estimate, you would only experience a 50% chance of some sort of "glitch" happening in your world. But of course, we have no way of knowing how many simulators are running this exact sequence of bits. It could very well be infinite. The question then becomes what is the probability distribution over all such simulators running out of RAM? This question seems impossible to answer from the simulated being's point of view.
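The 50% estimate is just a uniform self-locating prior; as a trivial sketch (the function name is mine), the chance of experiencing a glitch under that assumption is simply the fraction of identical simulators that run out of RAM:

```python
from fractions import Fraction

def glitch_probability(num_simulators, num_out_of_ram):
    """Uniform self-locating prior: you are equally likely to be
    'inside' any one of the bit-identical simulations."""
    return Fraction(num_out_of_ram, num_simulators)

print(glitch_probability(2, 1))  # → 1/2, the two-simulator case above
```

As the text notes, the hard part is that `num_simulators` may be unknown or infinite, in which case no such distribution can be written down from the inside.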

I haven't even touched upon the question of continuity of identity, of what happens to your perspective when the simulation "crashes" or is paused. This really comes to the question of how conscious awareness supervenes on sequences of bits, or how our perspective gets tied to one sequence of events over another. In other words, this is similar in spirit to the question in the many worlds interpretation of quantum mechanics as to which branch your particular perspective gets tied to when the universe "splits" into different branches. In many worlds quantum mechanics, if there is one branch where the simulator runs out of RAM, there is still the possibility of other branches where your perspective continues unabated. You can see then that this question isn't really a question about simulations or quantum mechanics per se, but of how consciousness decides what perspective comes next.

I suspect the answer is already hidden in the data that we see already. You see, in quantum mechanics there is this notion of "no cloning" where the exact quantum state of a system cannot be cloned, or this would violate the uncertainty principle. I suspect that the solution to the problem of running out of RAM lies in the fact that our own conscious perspective cannot be cloned exactly. In other words, our own conscious experience as we experience it now, might be thought of in the following way. We cannot know what is generating our experience, so we naively assign a probability distribution over all such possible generators of our experience, including those of simulators of our own existence. Some of this probability mass includes situations where our own existence just fluctuates out of the vacuum, but this is vanishingly small. But then there is some other probability mass that is assigned to situations where our existence continues "normally". I suspect the conglomeration of all possible configurations that lead to the particular quantum state that specifies our particular perspective is actually the probability distribution as specified by quantum mechanics. That is, the origin of the probability distribution of quantum mechanics lies entirely in the fact that our own conscious experience can be generated by various possible simulators of various types that converge onto the fixed point probability distribution that is specified by the laws of quantum mechanics.

In this sense, then it is obvious why you cannot clone a quantum state, because a quantum state is a conglomeration of all possible "classical" sequences that have been simulated to such a sufficient degree to be called the same quantum state. In other words, you cannot clone a quantum state because a quantum state is the set of all possible clones that are indistinguishable from each other. Quantum mechanics is the end result of the fact that all possible clones have been carried out on every sequence of bitstrings.

The question then arises: why does quantum mechanics seem to obey probability amplitudes and not probability distributions? That is, why does it use complex numbers instead of ordinary real numbers? I suspect this has to do with the fact that quantum mechanics has a certain timeless quality to it, and it is this "time travel" quality that causes the probabilities to be complex-valued rather than real-valued. You see, if we just assigned classical probabilities to every event, we would have statistical mechanics instead of quantum mechanics. But statistical mechanics assumes that there is a single direction of time. I suspect if you relax the notion of single-valued time, you get quantum mechanics.

Thus, simulating a reality, is akin to building a time machine.

[–] [email protected] 8 points 7 months ago (1 children)

That would only be a problem if you need dynamically allocated memory. It could be a statically allocated simulation where every atom is accounted for.
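For illustration, a minimal sketch of that static-allocation idea (the particle count and property arrays are invented): every "atom" gets its slot up front, and the simulation loop itself never allocates, so it can never run out of memory mid-run.

```python
import array

NUM_ATOMS = 100_000  # fixed particle count, chosen for the sketch

# One flat, preallocated buffer per property: every atom has a slot
# from tick zero, so the memory footprint never changes.
pos = array.array("d", [0.0]) * NUM_ATOMS
vel = array.array("d", [0.0]) * NUM_ATOMS

def tick(dt=1.0):
    # Update in place: no allocation happens during the simulation loop.
    for i in range(NUM_ATOMS):
        pos[i] += vel[i] * dt

vel[42] = 3.0
tick()
print(pos[42])  # → 3.0
```

The trade-off is that you pay the full memory cost up front, whether or not every slot is ever used.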

[–] [email protected] 4 points 7 months ago* (last edited 7 months ago)

Given the whole "information can neither be created nor destroyed" aspect of atomic physics, taken literally, this theory checks out.

[–] [email protected] 5 points 7 months ago* (last edited 7 months ago)

The assumption that it isn't designed around memory constraints isn't reasonable.

We have limits on speed, so you can't go fast enough to cause pop-in.

As you speed up, things move more slowly for you, so less processing is needed in spite of there being more stuff (kind of like a frame-rate drop, but with a fixed number of frames produced).

As you get closer to denser collections of stuff, the same thing happens.

And even at the lowest levels, the conversion from a generative function to discrete units to track stateful interactions discards the discrete units if the permanent information about the interaction was erased, which is indicative of low-level optimizations.

The scale is unbelievable, but it's very memory considerate.
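The "frame rate drop" point maps onto special relativity's time-dilation factor, gamma = 1/sqrt(1 - v^2/c^2); a quick sketch (the function name is mine):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def proper_ticks(coordinate_ticks, v):
    """Ticks experienced by a clock moving at speed v: the faster it
    goes, the fewer 'frames' of its own it gets per coordinate second."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return coordinate_ticks / gamma

# At ~86.6% of c, gamma is about 2: the moving clock renders
# roughly half the frames.
print(round(proper_ticks(100, 0.866 * C), 1))  # → 50.0
```

Gravitational time dilation near dense masses has the same "fewer frames rendered locally" flavor, though the formula differs.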

[–] [email protected] 0 points 7 months ago

For a simulation as complex and powerful as the universe, we would have to be running on a real-time OS. So applications couldn't even run if the resources weren't sufficient.

[–] [email protected] 2 points 7 months ago

We download more RAM.

[–] [email protected] 4 points 7 months ago

I know exactly what would happen. It...uhh, what was I gonna say again? It just slipped out, it'll come back...

[–] [email protected] 4 points 7 months ago

Things will stop making sense, people will start to glitch and make horrible decisions that will affect millions, and...

Wait

[–] [email protected] 1 points 7 months ago

Well, if we're in a simulation, then any assumptions we have about definitions and limitations may not apply. So we think storage needs RAM, but outside our restricted simulation, it could be far different.

Like, I frequently ponder how something came from nothing. But I know I'm making assumptions when I ask that question. It may not be linear, may not be either/or; there's something crucial I'm not seeing.

[–] [email protected] 2 points 7 months ago (1 children)

You get stuff like https://en.wikipedia.org/wiki/Reality_Winner and the same movie/media coming out over and over again

[–] [email protected] 3 points 7 months ago

Mandela effect too

[–] [email protected] 0 points 7 months ago

How would you know what physics runs the host universe? For all we know, things like RAM limitations don't even apply there.

[–] [email protected] 1 points 7 months ago

We are the RAM

[–] [email protected] 3 points 7 months ago* (last edited 7 months ago)

Who is to say that the sim needs RAM? What if it were just a giant state machine, where the current state depends only on the previous state, and the entire universe is the "RAM"?
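A minimal sketch of that idea, using elementary cellular automaton Rule 110: the next state is a pure function of the current state, and the state itself is the only "memory" (a toy illustration, not a claim about physics).

```python
RULE = 110  # the update table for all 8 neighborhoods, packed into one byte

def step(state):
    """Next universe state depends only on the current state."""
    n = len(state)
    return tuple(
        (RULE >> (state[(i - 1) % n] * 4 + state[i] * 2 + state[(i + 1) % n])) & 1
        for i in range(n)
    )

# A tiny circular "universe" of 7 cells, evolved three ticks.
universe = (0, 0, 0, 1, 0, 0, 0)
for _ in range(3):
    universe = step(universe)
print(universe)  # → (1, 1, 0, 1, 0, 0, 0)
```

Rule 110 is known to be Turing complete, which is why it gets invoked in "universe as state machine" discussions.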

[–] [email protected] 11 points 7 months ago (1 children)

These answers are all really fun but I didn't see anyone point out one thing: why should we assume that our creators' "computer" architecture is anything remotely similar to our technology? I'm thinking of something like SETI: we can't just assume that all other life is carbon-based (though evidently it's a pretty good criterion). The simulation could be running on some kind of dark matter machine or some other exotic material that we don't even know about.

Personally I don't subscribe to the simulation theory. But if it were true, why would the system have any kind of limitation? I feel like if it can simulate everything from galactic superclusters down to strings vibrating in Planck Time, there are effectively no limits.

Then again, infinity is quite a monster, so what do I know?

[–] [email protected] 2 points 6 months ago (1 children)

all other life is carbon-based (though evidently it's a pretty good criterion)

The short version is that the only other element that allows 4 covalent bonds is silicon, but nobody has been able to find a solvent that allows complex silicon-based molecules to form without instantly dissolving any structures they form.

[–] [email protected] 1 points 6 months ago

I remember reading about how silicon is theoretically possible, but I had (erroneously) assumed there were more potential candidates. Thanks for the additional info. This stuff is so fascinating!

[–] [email protected] 3 points 7 months ago

This is the entire premise of No Man's Sky.

[–] [email protected] 16 points 7 months ago

If our entire universe is a simulation, so are our laws of physics. In the parent universe running our simulation, reality might be powered by pure imagination, and the concepts of memory, CPU cycles, or even electricity might not exist at all.

[–] [email protected] 18 points 7 months ago (1 children)

Maybe we're already there and death is just the garbage collector freeing up more space.
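A playful sketch of that idea using Python's own garbage collection (the `Being` class and the finalizer message are invented): once the last reference to an object is gone, the collector reclaims its slot.

```python
import gc
import weakref

class Being:
    pass

freed = []
soul = Being()
# Register a finalizer: it runs when the object is collected.
weakref.finalize(soul, lambda: freed.append("slot reclaimed"))

del soul      # the last reference is dropped...
gc.collect()  # ...and the collector frees the space
print(freed)  # → ['slot reclaimed']
```

In CPython the reclamation actually happens at the `del` (reference counting); `gc.collect()` is only needed for reference cycles, but it makes the point explicit.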

[–] [email protected] 6 points 7 months ago

I love this concept

Could make a good book

[–] [email protected] 9 points 7 months ago

All that shit you forgot? All that "forgotten" history? There you go.
