The universe is humongous.

  • The hard drive is bounded by the Big Bang on one end and the heat death of the universe on the other, but it holds all of the data for everything that exists. That’s massive.

  • The RAM is massive because it’s handling all the variables and changes of the present.

  • The cache is much smaller than the two above, as established by the Nobel-winning Bell test experiments that found the universe is not locally real. Things only resolve once they are observed, but that resolution happens almost instantaneously. Still, the cache is massive in absolute terms, because it is handling everything that is being observed at the same time. That’s a lot of things.

All of the specs above are massive extremes. However,

  • The processing speed is capped at the speed of light. Compared to all of the above, the speed of light is soooooo ridiculously slow, causing a bottleneck (see the sketch after this list).
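
For fun, here’s a minimal Python sketch of the spec sheet above. Everything in it (the Universe class, the propagate method) is made up for illustration; only the value of c is real.

```python
from dataclasses import dataclass, field

C = 299_792_458  # speed of light in m/s: the universe's clock speed

@dataclass
class Universe:
    hard_drive: dict = field(default_factory=dict)  # every event, Big Bang to heat death
    ram: dict = field(default_factory=dict)         # the ever-changing present
    cache: dict = field(default_factory=dict)       # only what's currently being observed

    def propagate(self, distance_m: float) -> float:
        """No update can cross distance_m faster than c: the bottleneck."""
        return distance_m / C  # seconds of unavoidable latency

u = Universe()
print(f"{u.propagate(1.0) * 1e9:.2f} ns")       # ~3.34 ns just to move an update 1 meter
print(f"{u.propagate(1.496e11) / 60:.1f} min")  # ~8.3 min for the Sun-to-Earth "bus"
```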

PS - Massive because it’s mass I’ve observed. Not really tho, you silly goat. Big bang while I swig Tang and watch a twig hang.

  • I'm back on my BS 🤪 (OP) · 1 day ago

    Yep! The probability code is baked into the quantum systems so that things are mostly predictable, but there’s still enough “randomness” to keep the whole system from being deterministic. The cache is basically resolving all of those probabilities whenever something is interacted with, on top of processing the more deterministic calculations of the macro world (toy sketch below).
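
    A toy sketch of that idea, assuming nothing beyond the standard library; the Qubit class and its observe method are invented for illustration:

    ```python
    import random

    class Qubit:
        """Probability code: fuzzy until observed, then cached."""
        def __init__(self, p_one: float = 0.5):
            self.p_one = p_one   # chance the observation comes up 1
            self._cached = None  # no definite value until first observation

        def observe(self) -> int:
            # First interaction collapses the state via the probability code;
            # later reads hit the cache, so the macro world stays consistent.
            if self._cached is None:
                self._cached = 1 if random.random() < self.p_one else 0
            return self._cached

    q = Qubit(p_one=0.3)
    print(q.observe())  # random on the first look...
    print(q.observe())  # ...deterministic (cached) ever after
    ```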

    This works out. I asked Ephen Stephen, and they gave me the 👍👍