As a reminder, current estimates are that quantum cracking of a single 2048-bit RSA key would require a computer with 20 million qubits running in superposition for about eight hours. For context, quantum computers maxed out at 433 qubits in 2022 and 1,000 qubits last year. (A qubit is a basic unit of quantum computing, analogous to the binary bit in classical computing. Comparisons between qubits in true quantum systems and quantum annealers aren’t uniform.) So even when quantum computing matures sufficiently to break vulnerable algorithms, it could take decades or longer before the majority of keys are cracked.

The upshot of this latest episode is that while quantum computing will almost undoubtedly topple many of today's most widely used forms of encryption, that calamitous event won't happen anytime soon. It's important that industries and researchers move swiftly to devise quantum-resistant algorithms and implement them widely. At the same time, people should take steps not to get steamrolled by the PQC hype train.

  • humblebun@sh.itjust.works · 2 days ago

    If qubits double every year

    And then we need to increase coherence time, which is about 50 ms for the current 433-qubit chip. Error correction might work, but it might not.
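
    (A quick back-of-the-envelope sketch of that doubling assumption, using only the article's figures of roughly 1,000 qubits today and 20 million needed; it ignores coherence time and error rates entirely:)

    ```python
    # Sketch only: years of annual doubling to go from ~1,000 physical qubits
    # to the ~20 million the article cites for cracking a 2048-bit RSA key.
    import math

    current_qubits = 1_000
    needed_qubits = 20_000_000
    years = math.log2(needed_qubits / current_qubits)
    print(f"~{years:.1f} years of doubling")  # roughly 14 years, if the trend held
    ```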

    • WolfLink@sh.itjust.works · 2 days ago

      Error correction does fix that problem, but at the cost of increasing the number of qubits needed by a factor of 10 to 100 or so.

      • humblebun@sh.itjust.works · 1 day ago

        But who guarantees that EC will overcome the decoherence introduced by that number of qubits? That's not a trivial question, and nobody can answer it for certain.

        • WolfLink@sh.itjust.works · 1 day ago

          I mean the known theory of quantum error correction already guarantees that as long as your physical qubits are of sufficient quality, you can overcome decoherence by trading quantity for quality.

          It’s true that we’re not yet at the point where we can mass produce qubits of sufficient quality, but claiming that EC is not known to work is a weird way to phrase it at best.

          • humblebun@sh.itjust.works · 1 day ago

            It was shown to scale this year for how many, 47 qubits? How can you be certain this will hold for millions and billions?

            • WolfLink@sh.itjust.works · 23 hours ago

              Because the math checks out.

              For a high-level description, QEC works a bit like this:

              10 qubits with a 1% error rate become 1 EC qubit with a 0.01% error rate.

              You can scale this in two ways. First, you can simply have more and more EC qubits working together. Second, you can nest the error-correcting codes:

              10 EC qubits with a 0.01% error rate become one double-EC qubit with a 0.0001% error rate.

              You can repeat this indefinitely. The math works out.

              The remaining difficulty is mass producing qubits with a sufficiently low error rate to get the EC party started.

              Meanwhile, research on error-correcting codes continues in search of more efficient codes.
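
              (A minimal sketch of the toy numbers above, treating each level of nesting as costing 10x the qubits and buying roughly a 100x lower error rate; the function and figures are illustrative, not a real code's performance:)

              ```python
              # Toy model of nested error correction as described above:
              # each level uses 10 lower-level qubits per logical qubit and cuts
              # the error rate by ~100x, e.g. 10 qubits at 1% -> 1 EC qubit at 0.01%.
              def concatenate(error_rate, levels, qubit_overhead=10, error_improvement=100):
                  return error_rate / error_improvement**levels, qubit_overhead**levels

              for levels in range(3):
                  err, cost = concatenate(0.01, levels)
                  print(f"{levels} levels: {cost} physical qubits per logical qubit, "
                        f"error rate {err:.0e}")
              ```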

              • humblebun@sh.itjust.works · 20 hours ago

                While you describe how error correction works, there are other factors you fail to notice.

                It is widely known that a physical qubit's T2 time decreases when you place it among others. The ultimate question here is: when you add qubits, can you overcome this added decoherence with EC or not?

                Say you want to build a QC with 1,000 logical qubits, and you want to be sure the error rate doesn't exceed 0.01% after 1 second. You assemble it, and it turns out you're at 0.1%. You choose some simple code, say [[7,1]], and now you have to assemble a 7,000-qubit chip to run your 1,000 qubits of logic. You assemble it again, and the error rate is now higher (due to decoherence and crosstalk). But the question is: how much higher? If it's lower than your EC efficiency, then you just throw in a few more qubits, use a [[15,2]] code, and you're good to go. But what if not?
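
                (A small sketch of the arithmetic in that scenario; the [[n,k]] figures are the ones above, and the helper function is hypothetical:)

                ```python
                # Sketch of the scenario above: an [[n, k]] code stores k logical
                # qubits in n physical qubits, so scaling 1,000 logical qubits up
                # means a bigger chip whose own error rate may rise with crosstalk.
                def physical_qubits(logical_qubits, n, k):
                    return logical_qubits // k * n

                print(physical_qubits(1_000, 7, 1))   # 7,000-qubit chip for the [[7,1]] code
                print(physical_qubits(1_000, 15, 2))  # 7,500 for the [[15,2]] code
                ```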

                • WolfLink@sh.itjust.works · 20 hours ago

                  That's a good point, and it's part of why there is a lot of active research into quantum networking. Once you can connect two otherwise independent quantum computers, you no longer have the issue of increasing crosstalk and other difficulties in producing larger individual quantum chips. Instead, you can produce multiple copies of the same chip and connect them together.