• Commiunism@lemmy.dbzer0.com · 6 points · 3 days ago

    If a species progressed far enough in technology to simulate billions of years of a universe made of tiny atoms at a constant refresh rate, one that only gets harder to run as time goes on, there’s a 0% chance it would happen in a system where proprietary software and similar private and intellectual property can exist.

    • Lifter@discuss.tchncs.de · 6 points · 3 days ago

      The refresh rate doesn’t have to be constant, though. Each “step”, however long it takes to compute, would seem like an instant to us. Our consciousnesses are also simulated, which means we only ever perceive new frames as fast as we ourselves are simulated.

      The simulator could even break down and resume without us noticing. It also doesn’t have to be fast enough to simulate a second per second. Imagine a simulator actually running for (more) billions of years. It seems silly, but it’s possible.
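
      A toy sketch of that point, assuming a fixed-timestep loop (Python; the names, timings, and numbers are all made up for illustration):

      ```python
      import random
      import time

      SIM_DT = 1.0  # one simulated "second" per step, however long the step takes to compute

      def step(state):
          # stand-in for whatever physics a single step involves
          return state + 1

      state = 0
      sim_time = 0.0
      for _ in range(5):
          # the wall-clock gap between steps varies wildly (a long pause or outage would work too)
          time.sleep(random.uniform(0.0, 0.5))
          state = step(state)
          sim_time += SIM_DT  # from inside the simulation, exactly one second passed, every time
          print(f"sim_time={sim_time:.0f}s  wall_clock={time.time():.2f}")
      ```

      From inside, the frames are all there is; the gaps between them don’t exist.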

      • interdimensionalmeme@lemmy.ml · 1 point · 3 days ago

        Yes, time isn’t a limiting factor, but error-free, coherent processing is.

        The run could get so long that it becomes impossible for that much information to be processed without some number of errors creeping in, and then the simulation would start breaking down.

        The bigger the simulation is, the more information it holds, the longer it takes to compute the next quantum of in-simulation time, and the more the risk of error increases.
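
        A back-of-the-envelope version of that (Python; the per-operation error rate and the ops-per-step counts are made-up numbers, just to show the shape):

        ```python
        import math

        p = 1e-18  # assumed chance that any single operation goes wrong

        for n_ops in (1e12, 1e18, 1e21):
            # P(at least one error in a step) = 1 - (1 - p)**n_ops,
            # computed with log1p/expm1 so a tiny p doesn't get lost to rounding
            p_step_error = -math.expm1(n_ops * math.log1p(-p))
            print(f"{n_ops:.0e} ops/step -> P(error in step) ≈ {p_step_error:.3f}")
        ```

        The more operations each quantum of in-simulation time needs, the faster the chance of a flawless step drops toward zero, unless the simulator spends even more work on error correction.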