AI was yesterday's news. Today: A chip that breaks time itself. What does it mean for your career?
This week on LinkedIn, I posted about quantum computing, a technology I have been following since its inception. It's very dear to me because the first physicists working on this breakthrough were getting started around the time I was taking my Quantum Mechanics exam as a physics undergraduate.
The quantum computing landscape just shifted dramatically, but not in the way most headlines suggest. Let me break down Google's Willow quantum chip announcement with the nuance you deserve.
First, the breakthrough is real – but needs context:
The Good:
Imagine trying to solve a complex math problem, but every few seconds someone randomly changes one of your numbers. That's what quantum computers deal with constantly. Quantum bits (qubits) are extremely sensitive. Even tiny disturbances from heat, vibration, or electromagnetic fields can corrupt them.
Without error correction, quantum calculations become meaningless within microseconds; a toy illustration of the fix follows the list below.
Google definitively proved error correction is possible with superconducting qubits
Willow is technically the most powerful quantum processor due to its superior connectivity
The results are unique to Google – no other superconducting qubit processor can replicate this
This confirms quantum computing is a "when," not "if" scenario
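To make the error-correction idea concrete, here is a minimal sketch in plain Python (no quantum libraries) of the simplest possible scheme: a 3-copy repetition code decoded by majority vote. It is a classical toy, not how Willow's surface code works, but it shows the core trade: spend extra (qu)bits to detect and undo random flips.

```python
import random

def noisy_copy(bit, p_flip):
    """Return the bit, flipped with probability p_flip (models a noisy physical qubit)."""
    return bit ^ 1 if random.random() < p_flip else bit

def run_trial(bit, p_flip):
    """Encode one logical bit into 3 noisy copies and decode by majority vote."""
    copies = [noisy_copy(bit, p_flip) for _ in range(3)]
    return 1 if sum(copies) >= 2 else 0

def logical_error_rate(p_flip, trials=100_000):
    """Fraction of trials where the majority vote returns the wrong bit."""
    errors = sum(run_trial(0, p_flip) != 0 for _ in range(trials))
    return errors / trials

for p in (0.01, 0.05, 0.1):
    print(f"physical error {p:.2f} -> logical error {logical_error_rate(p):.4f}")
```

The redundancy only helps while the underlying error rate is low enough; push it too high and the "protected" bit gets worse. That is the intuition behind the "below threshold" milestone Willow demonstrated for the far more sophisticated surface code.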
The Reality Check:
Progress is slower than headlines suggest: it took about five years to go from ~50 qubits (Sycamore) to ~100 qubits (Willow)
We need ~1 million physical qubits for practical applications (a rough estimate of why follows this list)
Significant fabrication challenges remain (uniformity issues between qubits)
Physical error rates tend to increase as we scale up (more qubits, more crosstalk and control complexity)
Read Qlabs’ How to Build a Quantum Supercomputer: Scaling Challenges and Opportunities
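Where does the "million qubits" figure come from? Below is a back-of-the-envelope sketch using the textbook surface-code heuristic: below threshold, the logical error rate falls roughly as (p/p_th)^((d+1)/2) with code distance d, and each logical qubit costs about 2d² physical qubits. All the specific numbers (physical error rate, threshold, target, logical qubit count) are illustrative assumptions of mine, not Google's published figures.

```python
# Illustrative assumptions -- not Google's published numbers:
P_PHYS = 1e-3              # physical error rate per operation
P_THRESHOLD = 1e-2         # rough surface-code threshold
TARGET_P_LOGICAL = 1e-12   # per-operation logical error rate for long algorithms
LOGICAL_QUBITS = 1_000     # logical qubits for a "useful" application

def distance_needed(p_phys, p_th, target):
    """Smallest odd code distance d with (p_phys/p_th)**((d+1)/2) <= target."""
    d = 3
    while (p_phys / p_th) ** ((d + 1) / 2) > target:
        d += 2
    return d

d = distance_needed(P_PHYS, P_THRESHOLD, TARGET_P_LOGICAL)
physical_per_logical = 2 * d * d - 1   # d*d data qubits + (d*d - 1) measurement qubits
total = physical_per_logical * LOGICAL_QUBITS
print(f"code distance:            {d}")
print(f"physical qubits/logical:  {physical_per_logical}")
print(f"total physical qubits:    {total:,}")
```

With these inputs you land at roughly a million physical qubits for a thousand good logical ones, which is why the gap between ~100 and ~1,000,000 dominates every serious roadmap.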
Here's what this means for different professionals:
For Investors & Executives:
Long-term horizon required – this is a marathon, not a sprint
Focus on quantum-resilient infrastructure now
Consider strategic partnerships with quantum research institutions
Invest in quantum literacy for your teams
For Technical Professionals:
Error correction expertise will be premium
Focus on quantum-classical integration skills
Understanding fabrication constraints is crucial
Quantum algorithm optimization will be key
For Early-Career & Students:
Perfect time to specialize in quantum error correction
Study superconducting qubit architectures
Learn quantum-classical hybrid approaches (a minimal sketch follows this list)
Focus on scalability challenges
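"Quantum-classical hybrid" can sound abstract, so here is a minimal sketch of the pattern behind variational algorithms such as VQE and QAOA: a classical optimizer repeatedly calls a quantum circuit evaluation and updates parameters from the results. The "quantum" part below is simulated with a one-qubit identity (the expectation of Z after Ry(θ) on |0⟩ is cos θ) so it runs anywhere; on real hardware that function would be a noisy call out to a QPU.

```python
import math

def expectation_z(theta):
    """Stand-in for the quantum processor: <Z> after Ry(theta) on |0> is cos(theta).
    On real hardware this value would come back from the QPU, with shot noise."""
    return math.cos(theta)

def parameter_shift_gradient(f, theta):
    """Parameter-shift rule: the gradient from two extra 'circuit' evaluations."""
    return 0.5 * (f(theta + math.pi / 2) - f(theta - math.pi / 2))

# Classical optimizer driving the (simulated) quantum evaluations: a VQE-style loop.
theta, learning_rate = 0.3, 0.4
for step in range(50):
    grad = parameter_shift_gradient(expectation_z, theta)
    theta -= learning_rate * grad          # gradient descent on the "energy" <Z>

print(f"theta -> {theta:.3f} (pi = {math.pi:.3f}), energy -> {expectation_z(theta):.4f}")
```

The design point to internalize is the loop itself: the quantum processor is a subroutine inside a classical program, which is why integration and interface skills matter as much as quantum theory.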
Your Strategic Roadmap:
Near-term (1-2 years):
Build foundational quantum computing knowledge
Focus on error correction mathematics
Understand quantum-classical interfaces
Mid-term (3-5 years):
Develop expertise in scalability challenges
Learn about fabrication processes
Specialize in quantum error mitigation
Long-term (5+ years):
Position yourself for the million-qubit era
Focus on practical quantum applications
Develop quantum-resistant systems
The Security Imperative:
While practical quantum computers are still distant, preparation for quantum-resistant cryptography should start now. The transition will take years – waiting isn't an option.
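One way to turn "waiting isn't an option" into a number is Mosca's inequality: if x (how long your data must stay secret) plus y (how long migration will take) exceeds z (time until a cryptographically relevant quantum computer), data encrypted today is already exposed to "harvest now, decrypt later". The years in the sketch below are placeholder assumptions, not forecasts.

```python
def mosca_check(shelf_life_years, migration_years, years_to_crqc):
    """Mosca's inequality: if x + y > z, data encrypted today is at risk
    ("harvest now, decrypt later"), so migration should already be underway."""
    exposure = shelf_life_years + migration_years - years_to_crqc
    return exposure > 0, exposure

# Placeholder assumptions -- plug in your own estimates:
x = 10   # years the data must remain confidential (e.g., health or financial records)
y = 7    # years to inventory systems and migrate to post-quantum cryptography
z = 15   # years until a cryptographically relevant quantum computer (nobody knows)

at_risk, exposure = mosca_check(x, y, z)
print(f"x + y = {x + y} vs z = {z} -> {'AT RISK' if at_risk else 'probably OK'}"
      f" (margin: {exposure:+d} years)")
```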
Bottom Line: Google's Willow chip is a crucial milestone, but not the quantum revolution yet. It's proof that quantum computing will happen, alongside a stark reminder of the engineering challenges ahead.
This is a perfect moment to position yourself in the quantum computing field – not because it's about to explode, but because you'll have time to grow with the technology as it matures.
Want to dig deeper?
A nice primer:
Something a bit deeper: https://quantum.country
For academics: https://global.oup.com/academic/product/quantum-information-science-9780198787488
Remember: The best time to start understanding quantum computing isn't when it's fully realized – it's now, while we're still solving the fundamental challenges.
Signing off,
—a
It's not just the number of qubits, the error correction, or the stability.
Google improved on its Sycamore hardware and did better on the 'hardware test'. The biggest problem of quantum computing is the one conveniently seldom mentioned: software. All we have are simulations of physics and the like (which is great) and a tiny handful of algorithms (not for want of trying for more than 30 years) that speed up digital problems. Isn't it interesting that when such algorithms and uses come up, only one is ever mentioned (Shor's)?
See https://ea.rna.nl/2019/03/13/to-be-and-not-to-be-is-that-the-answer/ (someone at a quantum computing institution told me: correct...) and its follow-up about Sycamore.
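For context on why Shor's is the algorithm that always gets named: the quantum computer only supplies the period-finding subroutine; everything else is classical number theory. Here is a minimal sketch of that classical wrapper with a brute-force stand-in for the quantum step (illustrative only, so there is obviously no speedup).

```python
from math import gcd
import random

def find_period_classically(a, n):
    """Brute-force stand-in for the quantum period-finding step:
    smallest r > 0 with a**r % n == 1."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_wrapper(n):
    """Classical part of Shor's algorithm: turn a period r into factors of n."""
    while True:
        a = random.randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d, n // d           # lucky guess already shares a factor
        r = find_period_classically(a, n)
        if r % 2 == 1:
            continue                   # need an even period
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue                   # trivial square root of 1, try another a
        p = gcd(y - 1, n)
        if 1 < p < n:
            return p, n // p

print(shor_classical_wrapper(15))   # e.g. (3, 5)
print(shor_classical_wrapper(21))   # e.g. (3, 7)
```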