The Quantum Computing Dawn: Are We Back in 1970?
Why tomorrow's quantum computing revolution strangely resembles yesterday's.
The tech world is abuzz. Almost every week, a headline-grabbing announcement promises the imminent arrival of “quantum supremacy.” Tech giants and the boldest startups are engaged in a frenzied race to build the machine that will render our current supercomputers obsolete. Yet, behind the catchy headlines and promises of near-infinite computing power, a more nuanced and fascinating reality is taking shape.
A recent, in-depth analysis published in the prestigious journal Science offers a sobering perspective on this unbridled optimism. Led by a coalition of experts from the University of Chicago, Stanford, MIT, and other premier institutions, this study proposes a bold historical reading: today’s quantum computing is not a ready-to-use technology, but a distant and striking echo of classical computing in the 1970s.
We are not at the end of the road, but at that pivotal moment when transistors were just beginning to replace vacuum tubes. Understanding this parallel is not merely an academic exercise; it is the key to grasping the true challenges, the patience required, and the immense potential that awaits us.
I. The Mirror of History: From ENIAC to Qubits
To understand where we are going, we must look at where we came from. The study published in Science goes beyond evaluating machines; it draws a structural comparison between two eras.
In the 1940s and 1950s, computing was a discipline of physical “behemoths.” Machines like the ENIAC occupied entire rooms, used thousands of fragile vacuum tubes that burned out regularly, and required heroic maintenance. Then came the 1960s and 1970s, the era of the transistor and the first integrated circuits. It was a time of violent transition: the theory was solid, the components worked individually, but the complexity of assembling them posed colossal problems.
The Return of the “Tyranny of Numbers”
William D. Oliver, a co-author of the study and researcher at MIT, points to a crucial concept well-known to tech historians: the “Tyranny of Numbers.” In the 1960s, engineers knew how to make high-performance transistors. The problem? They couldn’t connect enough of them without creating a wiring nightmare that was impossible to manage.
Today, quantum computing faces its own tyranny of numbers.
The Wiring Challenge: Just like early computers, current quantum processors require an astronomical amount of cabling to control each qubit.
Signal Management: Every wire is a potential entry point for thermal noise and electromagnetic interference—the mortal enemies of the quantum state.
The Thermal Obstacle: How do you feed thousands of control signals into a dilution refrigerator (which keeps chips near absolute zero) without heating the system?
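To get a feel for the scale of this new tyranny, here is a rough back-of-envelope sketch in Python. The per-qubit line count is an illustrative assumption for a superconducting-style machine, not a figure from the Science study:

```python
# Illustrative only: assume each qubit needs a few dedicated control
# lines (drive, readout, flux bias...) running from room temperature
# down to the millikelvin stage of a dilution refrigerator.
lines_per_qubit = 3          # assumed value; varies by architecture

for n_qubits in (100, 10_000, 1_000_000):
    total_lines = n_qubits * lines_per_qubit
    print(f"{n_qubits:>9,} qubits -> ~{total_lines:,} control lines into the fridge")

# ~100 qubits already means hundreds of cables; a million qubits would mean
# millions of lines unless control electronics are multiplexed or moved
# into the cryostat -- the modern "tyranny of numbers".
```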
The Science analysis highlights that, just as 1970s integrated circuits were limited compared to today’s chips, today’s quantum demonstrations—though “mature” for our time—are primitive compared to actual needs. A high Technology Readiness Level (TRL) today simply means we have achieved a modest demonstration, not that we are ready for industrial-scale deployment.
Lesson from History: It took decades to go from the first lab transistor to processors containing billions of transistors. The quantum “Moore’s Law” will not be decreed; it will be earned by solving hard engineering problems: mass manufacturing, wiring, and automated calibration.
II. Inside the Machine: The War of Qubits
If the bit (0 or 1) is the atom of classical computing, the qubit is its elusive elementary particle. Unlike a classical bit, which is like a switch (on or off), a qubit can exist in a superposition of states. Imagine a coin: a bit is a coin resting on the table (heads or tails). A qubit is a coin spinning on its edge; as long as it spins, it is a complex mix of heads and tails simultaneously.
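To make the coin analogy slightly more concrete, here is a minimal numerical sketch (plain Python with NumPy, not tied to any particular quantum SDK) of a qubit as a two-component state vector, its superposition, and the measurement probabilities that follow from it:

```python
import numpy as np

# Basis states: |0> ("heads") and |1> ("tails") as vectors.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# A classical bit is always exactly one of the two basis states.
# A qubit can be any normalized combination alpha*|0> + beta*|1>.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)   # an equal "spinning coin" superposition
psi = alpha * ket0 + beta * ket1

# Measuring the qubit "stops the coin": it collapses to |0> or |1>
# with probabilities |alpha|^2 and |beta|^2.
p0 = abs(np.vdot(ket0, psi)) ** 2
p1 = abs(np.vdot(ket1, psi)) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")   # -> P(0) = 0.50, P(1) = 0.50
```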
It is this property, coupled with entanglement, that allows quantum computers to attack certain classes of problems that traditional computing would take billions of years to solve, such as simulating new pharmaceutical molecules or optimizing global logistics networks.
However, not all qubits are created equal. The study compared six major platforms, revealing a fragmented technological landscape where each approach has its champions and its Achilles’ heels.
1. Superconducting Qubits (The Giants of the Cold)
This is the path favored by players like Google and IBM.
The Principle: They use electrical circuits cooled to temperatures near absolute zero (a few millikelvin) so that current flows without resistance.
Advantage: They are fast to manipulate and benefit from existing chip manufacturing techniques.
Disadvantage: They are physically large (at the quantum scale), have short coherence times, and require titanic cooling infrastructure.
2. Trapped Ions (Atomic Precision)
The Principle: Electromagnetic fields are used to trap individual charged atoms (ions) in a vacuum, which are then manipulated with lasers.
Advantage: Exceptional stability (coherence). Ions are identical by nature, which reduces manufacturing errors.
Disadvantage: They are slower to operate and difficult to integrate onto a compact chip at a very large scale.
3. Spin Defects and Others
The study also mentions spin defects in semiconductors (such as nitrogen-vacancy centers in diamond). These approaches promise easier integration with current classical electronics but suffer from delicate manufacturing and complex control requirements.
This diversity proves that we are still in an exploration phase. There is no “standard” yet, just as there were debates between vacuum tubes, relays, and transistors in the mid-20th century.
III. The Verdict of TRL: Between Lab and Reality
To bring rigor to this comparison, the researchers utilized the Technology Readiness Level (TRL) scale. This is a standardized tool (created by NASA) ranging from 1 to 9:
TRL 1: Observation of basic principles.
TRL 9: System proven in a real mission.
Applying this scale to today’s quantum platforms yielded results that temper media enthusiasm. The majority of quantum technologies today sit in the “messy middle” of innovation, between levels 4 and 6.
This means we have functional prototypes in the lab. We know the science works. But we are far from the TRL 9 required for critical industrial applications. For example, to perform useful chemical simulations (like modeling the nitrogenase enzyme to create less energy-intensive fertilizers), we would need millions of qubits operating with near-perfect error correction.
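As an illustration of why the qubit counts balloon so quickly, here is a simplified estimate of error-correction overhead. The surface-code approximation of roughly 2·d² physical qubits per logical qubit is a common rule of thumb; the code distance and logical-qubit count below are illustrative assumptions, not numbers taken from the study:

```python
# Simplified surface-code overhead: one logical (error-corrected) qubit
# needs about 2 * d**2 physical qubits, where d is the code distance
# (larger d suppresses errors more strongly).
def physical_qubits(logical_qubits: int, code_distance: int) -> int:
    return logical_qubits * 2 * code_distance ** 2

# Illustrative assumptions: ~1,000 logical qubits for a serious chemistry
# workload, protected at code distance 25.
print(f"{physical_qubits(1_000, 25):,}")   # -> 1,250,000 physical qubits
```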
Today, even the most advanced prototypes (like IBM’s Osprey or Google’s Sycamore) operate with only a few hundred to roughly a thousand noisy qubits.
The Maturity Paradox: The study warns against false interpretation. A high TRL in a sub-domain (like manufacturing a single qubit) does not guarantee the success of the entire system. It is akin to having a perfect Formula 1 engine (high TRL) but lacking a chassis capable of withstanding its speed without disintegrating.
IV. The Challenges of Scaling Up: Towards a Systemic Approach
If the 1970s taught us one thing, it is that raw power is not enough. Integration is what counts.
The Science study identifies three pillars the industry must focus on to break through the scaling wall:
High-Quality Mass Manufacturing: It is no longer about making one exceptional qubit, but about making a million with near-zero variability. This is the foundry challenge.
Interconnection (Wiring and Signals): We must invent the quantum equivalent of the multilayer printed circuit board. How do we route information without introducing noise?
System Control: Mastering power, temperature, and, above all, automated calibration. A quantum computer with a million qubits cannot be manually tuned by physicists; it will need to self-diagnose and self-correct in real-time.
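As a toy illustration of what “automated calibration” means in practice (the routine and parameter names below are hypothetical, not drawn from any vendor’s stack): software repeatedly sweeps a control parameter, scores the qubit’s response, and updates the setting without a physicist in the loop.

```python
import numpy as np

def measure_quality(drive_frequency: float) -> float:
    # Stand-in for a real experiment: pretend the qubit responds best
    # near 5.000 GHz, with a little measurement noise on top.
    return np.exp(-((drive_frequency - 5.000) / 0.002) ** 2) + np.random.normal(0, 0.01)

def calibrate(center: float = 5.01, span: float = 0.05, points: int = 201) -> float:
    """Sweep the drive frequency, pick the best response, return the new setting."""
    freqs = np.linspace(center - span, center + span, points)
    scores = [measure_quality(f) for f in freqs]
    return freqs[int(np.argmax(scores))]

# In a million-qubit machine, loops like this must run continuously,
# per qubit, with no human turning knobs by hand.
print(f"calibrated drive frequency ≈ {calibrate():.4f} GHz")
```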
The Need for “Tripartite” Collaboration
The Science article underscores that the road is long and cannot be traveled by a single player. Collaboration between academia (for fundamental research), governments (for patient funding and infrastructure), and industry (for engineering and scaling) is the engine that accelerated recent progress. It is this same virtuous triangle that enabled the emergence of Silicon Valley 50 years ago.
Final Thoughts: In Praise of Patience
Quantum computing is, without a doubt, one of the most exhilarating intellectual and technological adventures of our century. But comparing its current development to the 1970s is a healthy reminder: revolutions take time.
Innovations like lithography (which allows chip etching) or new semiconductor materials took decades to move from the status of laboratory curiosities to pillars of the global economy. Quantum technology is following this same demanding trajectory.
The researchers’ message is clear: we must cultivate patience. A systemic, rigorous, and coordinated approach is vital. We must not confuse media hype with industrial maturity. If we agree to solve these persistent technical obstacles with a long-term vision, without rushing, then the promises of quantum—from personalized medicine to materials science—will become, like today’s computers, a mundane and indispensable reality.
We are only at the beginning of the story. And if the history of classical computing is a reliable guide, the best is yet to come.