Stop worrying about the Bitcoin quantum threat – Why Google can’t steal your BTC, and bad actors are decades behind
Burns Brief
The state of quantum computing and what it would take to threaten Bitcoin

Quantum computing has advanced materially over the past 18 months, but the field remains in transition from noisy hardware to early fault tolerance. The key shift is away from raw physical-qubit counts and toward logical qubits, gate fidelity, runtime, and error correction. That shift matters for Bitcoin because risk estimates are driven by logical qubits and fault-tolerant operations rather than headline hardware totals.

Related Reading: Google slashes quantum cracking estimates by 20X, creating $600 billion countdown for Bitcoin and Ethereum. Google used zero-knowledge proofs to verify quantum attack estimates without exposing the underlying attack circuits. Mar 31, 2026 · Oluwapelumi Adejumo

What is the actual state of quantum computing advancement?

Progress is visible across three fronts: below-threshold error correction, small logical-qubit demonstrations, and deeper circuits with lower noise. In late 2024, Google’s Willow chip demonstrated below-threshold error correction, in which error rates fell as the encoded system scaled up. IBM says its current systems can run certain circuits with more than 5,000 two-qubit gates and has published a roadmap to a 200-logical-qubit fault-tolerant system by 2029. Quantinuum has reported 48 error-corrected logical qubits and 64 error-detected logical qubits from 98 physical qubits, along with 50 error-detected logical qubits on Helios at better-than-break-even performance. Microsoft and Atom Computing reported 24 entangled logical qubits and computation with 28 logical qubits on neutral-atom hardware.

The sector remains short of a large-scale fault-tolerant machine. That is one reason DARPA’s Quantum Benchmarking Initiative exists: its target is a quantum computer whose computational value exceeds its cost by 2033, and the agency is still validating competing architectures rather than certifying that any team has already reached that point.
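The "below-threshold" behavior reported for Willow can be illustrated with a toy calculation: below threshold, each increase of the code distance d by 2 suppresses the logical error rate by a roughly constant factor. The suppression factor and starting error rate below are assumed values for illustration, not Willow's measured numbers.

```python
# Toy illustration of below-threshold error correction (assumed numbers).
# Below threshold, growing the code distance d by 2 divides the logical
# error rate by an approximately constant suppression factor.
suppression = 2.0      # assumed factor per distance step (not a measured value)
p_logical = 3e-3       # assumed logical error rate at d = 3

for d in range(3, 16, 2):
    print(f"d = {d:2d}  p_logical ≈ {p_logical:.1e}")
    p_logical /= suppression
# Above threshold the factor would be < 1, and errors would *grow* with d.
```

The point of the demonstration is direction, not absolute numbers: adding physical qubits makes the encoded qubit better, not worse, which is the precondition for scaling.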
What can quantum computers do today?

Today’s systems can do four things with credibility. They can run benchmark problems beyond classical brute-force methods, including Google’s random circuit sampling and more recent work on Quantum Echoes. They can perform limited, specialized simulations in physics and chemistry, often in hybrid workflows with classical high-performance computing. They can demonstrate logical qubits and fault-tolerant subroutines at small scales. They also function as testbeds for error correction, decoding, and control systems.

What they cannot do today is the part that matters for Bitcoin. No public system has anywhere near the logical-qubit count, fault-tolerant gate budget, or sustained runtime needed for cryptographically relevant attacks on secp256k1. Google’s Willow contains 105 physical qubits, and the leading public demonstrations of logical qubits remain in the tens, not the thousands. A recent estimate from Google researchers and co-authors puts a Bitcoin-relevant attack in the range of 1,200 to 1,450 logical qubits and tens of millions of Toffoli gates, leaving a large gap between current machines and a cryptographically relevant system.

What is required from here to create quantum computers that can crack Bitcoin on some level?

The critical threshold is a cryptographically relevant quantum computer capable of running Shor’s algorithm against the elliptic-curve discrete logarithm problem on secp256k1. According to the March 2026 Google paper, fewer than 1,200 logical qubits and 90 million Toffoli gates, or fewer than 1,450 logical qubits and 70 million Toffoli gates, could in principle solve ECDLP-256.
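A rough sense of why logical-qubit counts translate into far larger physical-qubit counts comes from surface-code overhead arithmetic. The sketch below is a naive back-of-envelope calculation under assumed parameters; the error-suppression formula, error budget, and overhead factor are standard textbook approximations, not figures from the paper.

```python
# Naive surface-code overhead sketch (assumed parameters, not from the paper).
# Assumptions: physical error rate p = 1e-3, threshold p_th = 1e-2, and a
# logical error rate per qubit per operation of ~0.1 * (p/p_th)**((d+1)/2).

p, p_th = 1e-3, 1e-2
n_logical = 1_450            # upper end of the cited logical-qubit estimate
toffoli_gates = 70e6         # gate count paired with 1,450 logical qubits
target_total_error = 0.1     # want < 10% chance of any logical fault overall

# Crude count of "error opportunities": every logical qubit, every gate.
per_op_budget = target_total_error / (n_logical * toffoli_gates)

# Smallest odd code distance d whose logical error rate meets the budget.
d = 3
while 0.1 * (p / p_th) ** ((d + 1) / 2) > per_op_budget:
    d += 2

physical_per_logical = 2 * d * d      # ~d^2 data qubits + ~d^2 measure qubits
total_physical = n_logical * physical_per_logical
print(d, total_physical)              # d = 23, total_physical = 1,534,100
```

This naive estimate lands near 1.5 million physical qubits; the paper's figure of fewer than 500,000 reflects more efficient fault-tolerance constructions, which is exactly why such optimizations dominate the timeline debate.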
Under superconducting assumptions, with physical error rates of 10⁻³ and planar connectivity, the authors estimate that such an attack could be executed in minutes with fewer than 500,000 physical qubits. That sets the engineering problem. The path forward is not simply a linear climb from about 100 physical qubits to 500,000. The harder challenge is building large numbers of stable logical qubits, sustaining tens of millions of fault-tolerant operations, achieving fast cycle times, and integrating all of that with real-time decoding, cryogenic or photonic interconnects, classical control, and manufacturable modules.

The same paper argues that fast-clock systems, such as superconducting and photonic platforms, are more relevant to on-spend attacks than slower-clock systems, such as ion traps and neutral atoms, because runtime can be decisive within a mempool window. For Bitcoin, “crack on some level” does not mean breaking the network in one step. The earlier risk is recovering private keys from already exposed public keys, or attacking spends while public keys are briefly visible in the mempool before confirmation.
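The "exposed public key" distinction can be made concrete. For pay-to-pubkey-hash outputs, the chain stores only a hash commitment to the key until the owner spends, and Shor's algorithm needs the public key itself as input. The sketch below uses SHA-256 alone as a portable stand-in for Bitcoin's actual HASH160 (SHA-256 followed by RIPEMD-160), and the example key is the well-known secp256k1 generator point, used purely as sample bytes.

```python
import hashlib

# Why unspent pay-to-pubkey-hash outputs hide the key (simplified sketch).
# Bitcoin addresses commit to HASH160(pubkey) = RIPEMD160(SHA256(pubkey));
# SHA-256 alone is used here as a portable stand-in for HASH160.

# 33-byte compressed secp256k1 generator point, used only as example bytes.
pubkey = bytes.fromhex(
    "02" + "79be667ef9dcbbac55a06295ce870b07029bfcdb2dce28d959f2815b16f81798"
)

# Before the owner spends, only this commitment appears on-chain.
commitment = hashlib.sha256(pubkey).hexdigest()
print("on-chain before spend:", commitment)

# Shor's algorithm needs the public key itself. It becomes public only when
# a spend is broadcast, so for hashed outputs the attack window is the gap
# between broadcast and confirmation.
print("revealed at spend time:", pubkey.hex())
```

This is why address-reuse and old pay-to-pubkey outputs (where the key is already on-chain) are considered the earliest exposure, while single-use hashed addresses confine the risk to the mempool window.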
Key Takeaways
- The key shift is away from raw physical-qubit counts and toward logical qubits, gate fidelity, runtime, and error correction
- That shift is important for Bitcoin because risk estimates are driven by logical qubits and fault-tolerant operations rather than headline hardware totals
- Progress is visible across three fronts: below-threshold error correction, small logical-qubit demonstrations, and deeper circuits with lower noise
- In late 2024, Google’s Willow chip demonstrated below-threshold error correction, in which error rates fell as the encoded system scaled up