Meanwhile, Even Bigger Breakthroughs by Google Continue
By Ralph Losey, October 21, 2025.
The Nobel Prize in Physics was just awarded to quantum physics pioneers John Clarke, Michel H. Devoret, and John M. Martinis for discoveries they made at UC Berkeley in the 1980s. They proved that quantum tunneling, where subatomic particles can break through seemingly impenetrable barriers, can also occur in the macroscopic world of electrical circuits. So yes, Schrödinger's cat really could die.

Their experiments showed that entire circuits can behave as single quantum objects, bridging the gap between theory and engineering. That breakthrough insight paved the way for construction of quantum computers, including the latest by Google.
Both Devoret and Martinis were recruited years ago by Google to help design its quantum processors. Although John Martinis (right, in the image above) recently departed to start his own company, Qolab, Michel Devoret (center) remains at Google Quantum AI as the Chief Scientist of Quantum Hardware. Last year, two other Google scientists, John Jumper and Demis Hassabis, shared the Nobel Prize in Chemistry for their groundbreaking work in AI.
Google is clearly on a roll here. As Google CEO Sundar Pichai joked in his congratulatory post on LinkedIn: “Hope Demis Hassabis and John Jumper are teaching you the secret handshake.”

🔹 Willow Breaks Through Its Own Barriers
Less than a year ago, Google's new quantum chip, Willow, tunneled through its own barriers, performing in five minutes a calculation that would have taken ten septillion years (10²⁵) on the fastest classical supercomputers. That's far longer than anyone's estimate for the age of our universe—a good definition of mind-boggling.
This result led Hartmut Neven, director of Google's Quantum Artificial Intelligence Lab, to suggest it offers strong evidence for the many-worlds or multiverse interpretation of quantum mechanics—the idea that computation may occur across near-infinite parallel universes. Neven and a number of leading researchers subscribe to this view.
I explored that seemingly crazy hypothesis in Quantum Leap: Google Claims Its New Quantum Computer Provides Evidence That We Live In A Multiverse (Jan 9, 2025). Oddly enough, it became my most-read article of all time—thank you, readers.
Today's piece updates that story. The Nobel Prize recognition is icing on the cake, but progress has not slowed. Quantum computers—and the law—remain one of the most exciting frontiers in legal tech. So much so that I'm developing a short online course on quantum computing and law, with more courses on prompt engineering for legal professionals coming soon. Subscribe to e-DiscoveryTeam.com to be notified when they launch.
The work of this year's Nobel laureates—Clarke, Devoret, and Martinis—was done forty years ago, so delay in recognition is hardly unusual in this field. Perhaps someday Neven and other many-worlds interpreters of quantum physics will receive their own Nobel Prize for demonstrating multiverse-scale applications. In my view, far more evidence than speed alone will be required.
After all, it defies common sense to imagine, as the multiverse hypothesis suggests, that every quantum event splits reality, spawning a near-infinite array of universes. For example, one where Schrödinger's cat is alive and another, slightly different universe where it is dead. It makes Einstein's "spooky action at a distance" seem tame by comparison.

In the meantime—whatever the true mechanism—quantum computers and AI are already producing tangible social and legal consequences in cryptography, cybercrime, and evidentiary law. See The Quantum Age and Its Impacts on the Civil Justice System (RAND, April 29, 2025); Quantum-Readiness: Migration to Post-Quantum Cryptography (NIST, NSA, August 2023); Quantum Computing Explained (NIST, 8/22/2025); but see Keith Martin, Is a quantum-cryptography apocalypse imminent? (The Conversation, 6/2/25) ("Expert opinion is highly divided on when we can expect serious quantum computing to emerge," with estimates ranging from imminent to 20 years or more.).
Whether you believe in the multiverse or not, the practical implications for law and technology are already arriving.

🔹 Atlantic Quantum Joins Google Quantum AI
On October 2, 2025, Hartmut Neven, Founder and Lead, Google Quantum AI, announced in a short post titled "We're scaling quantum computing even faster with Atlantic Quantum" that Google had just acquired Atlantic Quantum, an MIT-founded startup developing superconducting quantum hardware. The announcement, written in Neven's signature understated style, framed the deal as a practical step on Google's long road toward "a large error-corrected quantum computer and real-world applications."
Neven explained that Atlantic Quantum's modular chip stack, which integrates qubits and superconducting control electronics within the cryogenic stage, will allow Google to "more effectively scale our superconducting qubit hardware." That phrase may sound routine to non-engineers, but it represents a significant leap in design philosophy: merging computation and control at the cold stage reduces signal loss, simplifies architecture, and makes modular scaling—the key to fault-tolerant machines—realistically achievable. This is another great acquisition by Google.
Independent reporting quickly confirmed the deal's importance. In Atlantic Quantum Joins Google Quantum AI, The Quantum Insider's Matt Swayne summarized the deal succinctly:
• Google Quantum AI has acquired Atlantic Quantum, an MIT-founded startup developing superconducting quantum hardware, to accelerate progress toward error-corrected quantum computers. . . .
• The deal underscores a broader industry trend of major technology companies absorbing research-intensive startups to advance quantum computing, a field still years from large-scale commercial deployment.
The article noted that the integration of Atlantic Quantum's modular chip-stack technology into Google's program was aimed at one of quantum computing's toughest engineering hurdles: scaling systems to become practical and fault-tolerant.
The MIT-born startup, founded in 2021 by a group of physicists determined to push superconducting design beyond incremental improvements, focused on embedding control electronics directly within the quantum processor. That approach reduces noise, simplifies wiring, and makes modular expansion far more realistic. For another take on the Atlantic story, see Atlantic Quantum and Google Quantum AI are "Joining Up" (Quantum Computing Report, 10/02/25).
These articles place the transaction within a broader wave of global investment in quantum technologies. Large-scale commercial deployment may still be years away, but the industry has already entered a phase of consolidation. Research-heavy startups are increasingly being absorbed by major technology companies, a predictable evolution in a field defined by extraordinary capital demands and complex technical challenges.
For Google, the acquisition is less about headlines and more about infrastructure control, owning every layer of the superconducting stack from design to fabrication. For the industry, it signals that the next phase of quantum development will likely follow the same arc as classical computing: early-stage innovation absorbed by large, well-capitalized firms that can bear the cost of scaling.
For lawyers and regulators, that pattern has familiar consequences: intellectual-property concentration, antitrust scrutiny, export-control compliance, and the evidentiary standards that will eventually govern how outputs from such corporate-owned quantum systems are regulated and presented in court.

🔹 Willow and the Many-Worlds Question
Before the Nobel bell rang in Stockholm, Google's Quantum AI group had already changed the conversation with its Willow processor.
In my earlier piece on Willow's mind-bending computations, I quoted Hartmut Neven's "parallel universes" framing to describe its behavior. Some heard music; others heard marketing. Others, like me, saw trouble ahead.
The Nobel Prize did not validate the many-worlds interpretation of quantum mechanics, nor did it disprove it. Neven has not backed away from the theory, nor have others, and Neven has just gotten the best talent from MIT to join his group. What the Nobel Prize did confirm—beyond any reasonable doubt—is that macroscopic superconducting circuits, at a size you can see, can exhibit genuine quantum behavior under controlled laboratory conditions. That is the solid foundation a judge or regulator can stand on: devices now exist in our world that generate outputs with quantum fingerprints reproducible enough to test and verify.
Meanwhile, the frontier continues to move. In September 2025, researchers at UNSW Sydney demonstrated entanglement between two atomic nuclei separated by roughly twenty nanometers. See "New entanglement breakthrough links cores of atoms, brings quantum computers closer" (The Conversation, Sept. 2025). Twenty nanometers is not big, but it is large enough to measure.
Moreover, even though the electrical circuits themselves are large enough to photograph, the quantum energy was not. That could only be measured indirectly. The researchers used coupled electrons as what lead scientist Professor Andrea Morello called "telephones" to pass quantum correlations and make those measurements.

The telephone metaphor is apt. It captures the engineering ambition behind the resultโconnecting quantum rooms with wires, not whispers. Whispers don’t echo. Entanglement is not a philosophical idea; it is a measurable resource that can be distributed, controlled, and eventually commercialized. It can even call home.
For the legal system, this is where things become concrete. When entanglement leaves the lab and enters communications or sensing devices, courts will be asked to evaluate evidence that can be measured and described but cannot be seen directly. The question will no longer be "Is this real?" but "How do we authenticate what can be measured but not observed?"
That's the moment when the physics of quantum control becomes the jurisprudence of evidence—and it's coming faster than most practitioners realize.

🔹 Defining the Echo: When Evidence Repeats With a Slight Accent
The many-worlds interpretation of quantum mechanics has always sat on the thin line between physics and philosophy. First proposed in 1957 by Hugh Everett, it replaces the familiar "collapse" of the wave-function with a more radical notion: every quantum event splits reality into separate branches, each continuing independently. Some brilliant physicists take it seriously; others reject it; many remain agnostic. Courts need not resolve that debate. For law, the relevant question is simpler: can a party show a method that reliably connects a claimed quantum mechanism to a particular output? If yes, the court's job is to hear the evidence. If not, the court's job is to exclude it.
In its early decades, the idea was mostly dismissed as metaphysical excess. Then Bryce DeWitt, David Deutsch, Max Tegmark, and Sean Carroll each found ways to refine and defend it. David Deutsch, known as the Father of Quantum Computing, first argued that quantum computers might actually use this multiplicity to perform computations—each universe branch carrying part of the load. See, e.g., Deutsch, The Fabric of Reality: The Science of Parallel Universes–and Its Implications (Penguin, 1997) (Chapter 9, Quantum Computers). Deutsch even speculates in his later (2011) book The Beginning of Infinity (p. 294) that some fiction, such as alternate history, could occur somewhere in the multiverse, as long as it is consistent with the laws of physics.
The many-worlds argument, once purely theoretical, gained traction after Google's Willow experiments. Hartmut Neven's reference to "parallel universes" was not an assertion of proof but a shorthand for describing interference effects that defy classical intuition. It is what he believes was happening—and that opinion carries weight because he works with quantum computers every day.
When quantum behavior became experimentally measurable in superconducting circuits that were large enough to photograph, the Everett question—"Are we branching universes or sampling probabilities?"—stopped being rhetorical. The debate moved from thought experiment to instrument design. Engineers now face what philosophers only imagined: how to measure, stabilize, and interpret outcomes that occur across many possible worlds and never converge on a single, deterministic path.
For the law, the relevance lies not in metaphysics but in method. Whether the universe splits or probabilities collapse, the data these machines produce are inherently probabilistic—repeatable only within margins, each time with a slight accent. The courtroom analog to wave-function collapse is the evidentiary demand for reproducibility. If the physics no longer promises identical outputs, the law must decide what counts as reliable sameness—echoes with an accent.
That shift from metaphysics to methodology is the lawyer's version of a measurement problem. It's not about believing in the multiverse. It's about learning how to authenticate evidence that depends on it.

🔹 The Law Listens: Authenticating Echoes in Practice
If each quantum record is an echo, the law's task is to decide which echoes can be trusted. That requires method, not metaphysics. The legal system already has the tools—authentication, replication, expert testimony—but they need recalibration for an age when precision itself is probabilistic.
1. Authentication in context.
Under Rule 901(b)(9), evidence generated by a process or system must be shown to produce accurate results. In a quantum context, that showing might include the type of qubit, its error-correction protocol, calibration logs, environmental controls, and the precise code path that produced the output. The burden of proof doesn't change; only the evidentiary ingredients do.
2. Replication hearings.
In classical computing, replication is binary—either a hash matches, or it doesn't. In quantum systems, replication becomes statistical. The question is no longer "Can this be bit-for-bit identical?" but "Does this fall within the accepted variance?" Probabilistic systems demand statistical fidelity, not sameness. A replication hearing becomes a comparison of distributions, not exact strings of bits.
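As a rough sketch of that difference, using hypothetical shot counts and an illustrative 5% tolerance (a real threshold would be set by the experts, not by me), a replication check could compare output distributions rather than hashes:

```python
import hashlib
from collections import Counter

def classical_match(file_a: bytes, file_b: bytes) -> bool:
    # Classical replication is binary: the hashes either match or they don't.
    return hashlib.sha256(file_a).digest() == hashlib.sha256(file_b).digest()

def total_variation(counts_a, counts_b) -> float:
    # Distance between two empirical distributions of measured bitstrings.
    total_a, total_b = sum(counts_a.values()), sum(counts_b.values())
    outcomes = set(counts_a) | set(counts_b)
    return 0.5 * sum(
        abs(counts_a.get(o, 0) / total_a - counts_b.get(o, 0) / total_b)
        for o in outcomes
    )

def quantum_match(counts_a, counts_b, tolerance=0.05) -> bool:
    # Quantum replication is statistical: is the variance within bounds?
    return total_variation(counts_a, counts_b) <= tolerance

# Hypothetical shot counts from two runs of the same circuit.
run1 = Counter({"00": 480, "11": 470, "01": 30, "10": 20})
run2 = Counter({"00": 492, "11": 462, "01": 26, "10": 20})

print(classical_match(b"run1-log", b"run2-log"))  # False: the bytes differ
print(quantum_match(run1, run2))                  # True: distributions agree
```

Here the two runs fail the bit-for-bit test yet pass the statistical one; a replication hearing would argue over the tolerance, not the hash.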
Similar logic already guides quantum sensing and metrology, where entanglement and superposition improve precision in measuring magnetic fields, time, and gravitational effects. See Quantum sensing and metrology for fundamental physics (NSF, 2024); Review of qubit-based quantum sensing (Springer, 2025); Advances in multiparameter quantum sensing and metrology (arXiv, 2/24/25); Collective quantum enhancement in critical quantum sensing (Nature, 2/22/25). Those readings vary from one run to the next, yet the variance itself confirms the physicsโeach measurement is a statistically faithful echo of the same underlying reality. The variances are within a statistically acceptable range of error.

🔹 Two Examples from the Quantum Frontier
1. Quantum Chemistry In Practice.
One of the most mature quantum applications today is the Variational Quantum Eigensolver (VQE), a hybrid quantum-classical algorithm used to estimate the ground-state energy of molecules. See, The Variational Quantum Eigensolver: A review of methods and best practices (Phys. Rep., 2023); Greedy gradient-free adaptive variational quantum algorithms on a noisy intermediate scale quantum computer (Nature, 5/28/25). Also see, Distributed Implementation of Variational Quantum Eigensolver to Solve QUBO Problems (arXiv, 8/27/25); How Does Variational Quantum Eigensolver Simulate Molecules? (Quantum Tech Explained, YouTube video, Sept. 2025).
VQE researchers routinely run the same circuit hundreds of times; each iteration yields slightly different energy readings because of noise, calibration drift, and quantum fluctuations. Yet the outputs consistently cluster around a stable baseline, confirming both the accuracy of the physical model and the reliability of the machine itself.
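That clustering can be reduced to simple statistics. The sketch below uses invented energy readings and an assumed margin, not real VQE output, to show the kind of check an expert could present:

```python
import statistics

def within_margin(readings, baseline, margin):
    # Do repeated noisy readings cluster around the baseline within the
    # agreed margin? Returns the verdict plus the statistics behind it.
    mean = statistics.fmean(readings)
    stderr = statistics.stdev(readings) / len(readings) ** 0.5
    return abs(mean - baseline) <= margin, mean, stderr

# Invented ground-state energy estimates (hartree) from eight repeated runs.
readings = [-1.137, -1.135, -1.139, -1.136, -1.138, -1.134, -1.137, -1.136]

ok, mean, stderr = within_margin(readings, baseline=-1.1373, margin=0.005)
print(ok)  # True: the runs cluster within the accepted margin of error
```

The individual readings all differ, yet the sample mean sits well inside the margin; that clustering, not any single number, is the echo a court would evaluate.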
Now picture a pharmaceutical patent dispute where one party submits quantum-derived binding data for a new molecule. The opposing side demands replication. A court applying Rule 702 may not expect identical numbers—but it could require expert testimony showing that results consistently fall within a scientifically accepted margin of error. If they do, that should become a legally sufficient echo.
This is reminiscent of prior e-discovery disputes concerning the use of AI to find relevant documents. Courts have consistently accepted that perfection, such as 100% recall, is never required; reasonable efforts are. Judge Andrew Peck, Hyles v. New York City, No. 10 Civ. 3119 (AT)(AJP), 2016 WL 4077114 (S.D.N.Y. Aug. 1, 2016). This also follows the official commentary to Rule 702, on expert testimony, where "perfection is not required." Fed. R. Evid. 702, Advisory Committee Note to 2023 Amendment.
Reasonable efforts can be proven by numerics and testimony. See, for instance, my writings in the TAR Course: Fifteenth Class — Step Seven — ZEN Quality Assurance Tests (e-Discovery Team, 2015) (Zero Error Numerics); ei-Recall (e-Discovery Team, 2015); Some Legal Ethics Quandaries on Use of AI, the Duty of Competence, and AI Practice as a Legal Specialty (May 2024).

2. Quantum-Secure Archives.
As quantum computing and quantum cryptography advance, most (but not all) of today's encryption will become obsolete. This means the vast amount of encrypted data stored in corporate and governmental archives—maintained for regulatory, evidentiary, and operational purposes—may soon be an open book to attackers. Yes, you should be concerned.
Rich DuBose and Mohan Rao, Harvest now, decrypt later: Why today's encrypted data isn't safe forever (HashiCorp, May 21, 2025), explain:
Most of today's encryption relies on mathematical problems that classical computers can't solve efficiently — like factoring large numbers, which is the foundation of the Rivest–Shamir–Adleman (RSA) algorithm, or solving discrete logarithms, which are used in Elliptic Curve Cryptography (ECC) and the Digital Signature Algorithm (DSA). Quantum computers, however, could solve these problems rapidly using specialized techniques such as Shor's Algorithm, making these widely used encryption methods vulnerable in a post-quantum world.
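The quoted point can be illustrated with a deliberately tiny toy RSA example (numbers far too small for real cryptography): once the modulus can be factored, which Shor's algorithm would do efficiently on a large quantum computer, the private key follows by ordinary arithmetic.

```python
def trial_factor(n: int):
    # Classical brute force stands in here for Shor's algorithm,
    # which performs this step efficiently at cryptographic scale.
    for p in range(2, int(n ** 0.5) + 1):
        if n % p == 0:
            return p, n // p
    raise ValueError("no nontrivial factor found")

# Toy RSA keypair: modulus n = p * q, public exponent e, private exponent d.
p, q, e = 61, 53, 17
n = p * q                              # 3233
d = pow(e, -1, (p - 1) * (q - 1))      # modular inverse of e

cipher = pow(42, e, n)                 # encrypt the message m = 42

# An attacker sees only (n, e, cipher). Factoring n reveals everything.
fp, fq = trial_factor(n)
d_recovered = pow(e, -1, (fp - 1) * (fq - 1))
print(pow(cipher, d_recovered, n))     # prints 42: plaintext recovered
```

A real RSA modulus is thousands of bits, which is exactly why the brute-force loop above is hopeless for classical machines and why Shor's algorithm changes the picture.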
Also see Dan Kent, Quantum-Safe Cryptography: The Time to Start Is Now (GovTech, 4/30/25) and Amit Katwala, The Quantum Apocalypse Is Coming. Be Very Afraid (Wired, Mar. 24, 2025), warning that cybersecurity analysts already call this future inflection point Q-Day—the day a quantum computer can crack the most widely used encryption. As Katwala writes:
On Q-Day, everything could become vulnerable, for everyone: emails, text messages, anonymous posts, location histories, bitcoin wallets, police reports, hospital records, power stations, the entire global financial system.
Most responsible organizations with large archives of sensitive data have been preparing for Q-Day for years. So too have those on the other side—nation-states, intelligence services, and organized criminal groups—who are already harvesting encrypted troves today to decrypt later. See Roger Grimes, Cryptography Apocalypse: Preparing for the Day When Quantum Computing Breaks Today's Crypto (Wiley, 2019). The race for quantum supremacy is on.
Now imagine a company that migrates its document-management system to post-quantum cryptography in 2026. A year later, a breach investigation surfaces files whose verification depends on hybrid key-exchange algorithms and certificate chains. The plaintiff calls them anomalies; the defense calls them echoes. The court won't choose sides by theory—it will follow the evidence, the logs, and the math.

🔹 Building the Quantum Record
Judicial findings and transparency. Courts can adapt existing frameworks rather than invent new ones. A short findings order could document:
(a) authentication steps taken;
(b) observed variance;
(c) expert consensus on reliability; and
(d) scope limits of admissibility.
Such transparency builds a common-law record—the first body of quantum-forensic precedent. I predict it will be coming soon to a universe near you!
Chain of custody for the probabilistic age. Future evidence protocols may pair traditional logs with variance ranges, confidence intervals, and error budgets. Discovery rules could require disclosure of device calibration history, firmware versions, and known noise parameters. The data once confined to labs will become essential for authentication.
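One way to picture such a protocol (the schema and field names below are my own illustration, not any existing standard) is a custody record that carries the statistical metadata alongside the traditional fields:

```python
from dataclasses import dataclass

@dataclass
class QuantumCustodyRecord:
    # Traditional chain-of-custody fields.
    item_id: str
    custodian: str
    device: str
    firmware_version: str
    calibration_date: str
    # Probabilistic-age additions.
    readout_error_rate: float        # known noise parameter for the device
    confidence_interval_95: tuple    # (low, high) bounds for reported values
    accepted_variance: float         # tolerance agreed by the parties

    def value_within_bounds(self, value: float) -> bool:
        # Authentication check: does a proffered reading fall inside
        # the disclosed 95% confidence interval?
        low, high = self.confidence_interval_95
        return low <= value <= high

record = QuantumCustodyRecord(
    item_id="EX-2027-014",
    custodian="lab custodian",
    device="superconducting processor",
    firmware_version="2.4.1",
    calibration_date="2027-03-01",
    readout_error_rate=0.012,
    confidence_interval_95=(-1.142, -1.131),
    accepted_variance=0.005,
)
print(record.value_within_bounds(-1.1365))  # True: inside the disclosed bounds
```

The point of the sketch is the pairing: the same record that names the custodian also discloses the noise parameters a court would need to judge whether a later reading is an echo or an anomaly.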
The law doesn't need new virtues for quantum evidence; it needs old ones refined. Transparency, documentation, and replication remain the gold standard. What changes is the expectation of sameness. The goal is no longer perfect duplication, but faithful resonance: the trusted echo that still carries truth through uncertainty.

🔹 Conclusion: The Sound of Evidence
The Nobel Committee rang the bell. Google's engineers added instruments. Labs in Sydney and elsewhere wired new rooms together. The rest of us—lawyers, paralegals, judges, legal technologists, investigators—must learn how to listen for echoes without hearing ghosts. That means resisting hype, insisting on method, and updating our checklists to match what the devices actually do.
Eight months ago in Quantum Leap, I described a canyon where a single strike of an impossible calculation set the walls humming. This time, the sound came from Stockholm. If the next echo is from quantum evidence in your courtroom—perhaps as a motion in limine over non-identical logs—don't panic. Listen for the rhythm beneath the noise. The law's task is to hear the pattern, not silence the world.
Science, like law, advances by listening closely to what reality whispers back. The Nobel Committee just honored three physicists for demonstrating that quantum behavior can be engineered, measured, and replicated—its fingerprints recorded even when the phenomenon itself remains invisible. Their achievement marks a shift from theory to tested evidence, a shift the courts will soon confront as well.
When engineers speak of quantum advantage, they mean a moment when machines perform tasks that classical systems cannot. The legal system will have its own version: a time when quantum-derived outputs begin to appear in contracts, forensic analysis, and evidentiary records. The challenge will not be cosmic. It will be procedural. How do you test, authenticate, and trust results that vary within the bounds of physics itself?
The answer, as always, lies in method. Law does not require perfection; it requires transparency and proof of process. When the next Daubert hearing concerns a quantum model rather than a mass spectrometer, the same questions will apply: Was the procedure sound? Were the results reproducible within accepted error? Were the foundations laid? The physics may evolve, but the evidentiary logic remains timeless.
In the end, what matters is not whether the universe splits or probabilities collapse. What matters is whether we can recognize an honest echo when we hear oneโand admit it into evidence.

🔹 Postscript.
Minutes before this article was published, Google announced an important new discovery called "Quantum Echoes." Yes, the same name as this article, written by Ralph Losey with no advance notice from Google of the discovery or name. A spooky entanglement, perhaps? Ralph will publish a sequel soon that spells out what Google has done. In the meantime, here is Google's announcement by Hartmut Neven and Vadim Smelyanskiy, Our Quantum Echoes algorithm is a big step toward real-world applications for quantum computing (Google, 10/22/25).
🔹 Subscribe and Learn More
If this exploration of Quantum Echoes and evidentiary method has sparked your curiosity, you can find much more at e-DiscoveryTeam.com — where I continue to write about artificial intelligence, quantum computing, evidence, e-discovery, and the future of law. Go there to subscribe and receive email notices of new blogs, upcoming courses, and special events — including an online course, with the working title "Quantum Law: From Entanglement to Evidence," that will expand on the ideas introduced here. It will discuss how quantum physics and AI converge in the practice of law, from authentication and reliability to discovery and expert testimony.
That program will be followed by two other, longer online courses that are also near completion:
- "Beginner 'GPT-4 Level' Prompt Engineering for Legal Professionals," a practical foundation in AI literacy and applied reasoning.
- "Advanced 'GPT-5 Level' Prompt Engineering for Legal Professionals," an in-depth study of prompt design, model evaluation, and AI ethics.
All courses are part of my continuing effort to help the legal profession adapt responsibly to the next wave of technology — with integrity, experience, and whatever wisdom I may have accidentally gathered from a long life on Earth.

Subscribe at e-DiscoveryTeam.com for notices of new articles, course announcements, and research updates.
Because the future of law won't be written by those who fear new tools, but by those who understand the evidence they produce.
Ralph C. Losey is an attorney, educator, and author of e-DiscoveryTeam.com, where he writes about artificial intelligence, quantum computing, evidence, e-discovery, and emerging technology in law.
© 2025 Ralph C. Losey. All rights reserved.
Posted by Ralph Losey 











