Modern quantum simulators have revealed a surprisingly strong form of apparent randomness called deep thermalization: large many-body systems come to look “thermal” in a very robust way, and strikingly, they seem to reach this regime after only short evolution times. Existing theoretical models, however, typically explain deep thermalization by invoking long, chaotic dynamics or random quantum circuits that generate extremely complex, highly entangled states. In our work, we offer a different explanation based on computational limits. Using tools from post-quantum cryptography, we construct shallow quantum circuits that output low-entanglement states which are computationally indistinguishable from fully random thermal states, even after measuring large parts of the system. That is, no realistic observer, even one equipped with an efficient quantum computer, can reliably tell them apart. This leads to the notion of computational deep thermalization, where thermal-looking behaviour emerges rapidly because no realistic observer can efficiently detect the hidden structure of the underlying state.

Classical and Quantum Thermalization

In everyday life, objects equilibrate with their environment. A cup of tea left on your desk eventually reaches room temperature, losing almost all memory of how it was heated. In physics, this process is called thermalization: many-particle systems evolve towards equilibrium, where only a few coarse properties, such as temperature or energy, matter, and the fine details of the past are effectively forgotten.

In quantum physics, the story becomes even more intriguing. A large quantum system can be in a perfectly well-defined pure state, evolving deterministically under Schrödinger’s equation, and yet any small part of it can still look completely random and “thermal.” A good mental picture is to imagine a giant quantum state spread over many particles and then split them into two groups: a small “system” you measure and a much larger “bath” that plays the role of the environment. If the joint state is what physicists call typical—roughly, one of the overwhelmingly many states you would get by picking a state at random from all states with the right energy—then the system alone already looks thermal. From its local point of view, all the microscopic information has been hidden in complicated correlations with the bath.
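
For readers who like a formula, the standard textbook way to phrase this (the notation below is ours, chosen for illustration rather than quoted from the paper) is that the state of the small system is obtained by “tracing out” the bath, and for a typical joint state it ends up essentially featureless:

\[
\rho_S \;=\; \operatorname{Tr}_B\bigl(\,|\psi\rangle\langle\psi|\,\bigr) \;\approx\; \frac{\mathbb{1}_S}{d_S},
\]

where \(|\psi\rangle\) is the joint pure state of system and bath, \(\operatorname{Tr}_B\) discards the bath, and \(d_S\) is the dimension of the small system. The right-hand side is the maximally mixed state, which is exactly how an infinite-temperature thermal state looks.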

Deep Thermalization

Recently, experiments with highly controllable quantum devices, such as programmable arrays of ultracold atoms or trapped ions, have revealed something even stranger. These platforms let you prepare large, interacting quantum states, evolve them for a short time, and then measure individual particles one by one. When experimentalists did this, they saw not only ordinary thermalization, but a stronger effect that theorists have dubbed deep thermalization. Here, you don’t just ask whether a small subsystem looks thermal before any measurements. You first measure part of the system, collapsing those particles onto definite outcomes, and then look at the state of the remaining particles conditioned on those outcomes. Deep thermalization means that even after this invasive measurement, the leftover state still looks, in a precise statistical sense, as if it had come from a completely random quantum state.
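
In the literature this is usually phrased in terms of the “projected ensemble”: the collection of post-measurement states of the system, weighted by how likely each bath outcome is. Sketched in symbols (again, notation chosen by us for illustration):

\[
p_z \;=\; \bigl\|(\mathbb{1}_S \otimes \langle z|_B)\,|\psi\rangle\bigr\|^2,
\qquad
|\psi_z\rangle \;=\; \frac{(\mathbb{1}_S \otimes \langle z|_B)\,|\psi\rangle}{\sqrt{p_z}},
\]

where \(z\) ranges over the possible measurement outcomes on the bath. Deep thermalization is the statement that the ensemble \(\{p_z, |\psi_z\rangle\}\) reproduces not just the average state but also higher statistical moments of a uniformly (Haar-) random state, i.e. it behaves like an approximate quantum state design.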

What makes this surprising is how quickly the experiments seem to reach this regime. In many setups, deep-thermal behaviour appears after relatively shallow evolution: a modest number of time steps or circuit layers. By contrast, the leading theoretical models that try to explain deep thermalization have mostly relied on extremely complex, long-time dynamics. They invoke random quantum circuits that approximate so-called Haar-random states or high-order “designs”, which in rigorous constructions typically require very large depths and enormous entanglement. There was a clear tension: nature seemed to get deep thermalization “on the cheap,” while our clean mathematical models insisted it should be expensive.

Computational Deep Thermalization

Our work offers a resolution to this puzzle. We propose that deep thermalization, at least in many situations, is not purely a statement about intrinsic physical chaos, but also about the limits of computation. We introduce the idea of computational deep thermalization: a quantum state is deeply thermal not because it is literally indistinguishable from a random state in an absolute sense, but because no realistic observer has the computational power to notice that it isn’t. Here, “realistic” means any observer that can be modelled as an efficient quantum computer: one that runs in polynomial time and may even have access to many copies of the state. If every such observer fails to distinguish a given ensemble of states from truly random thermal states, then, operationally, those states are deeply thermal.
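
The cryptographic way to express “no efficient observer can tell them apart” (a standard definition, written here in our own notation rather than quoted from the paper) is that for every polynomial-time quantum algorithm \(\mathcal{A}\) and any polynomially large number of copies \(t\),

\[
\Bigl|\,\Pr\bigl[\mathcal{A}(\rho^{\otimes t}) = 1\bigr] \;-\; \Pr\bigl[\mathcal{A}(\sigma^{\otimes t}) = 1\bigr]\,\Bigr| \;\le\; \operatorname{negl}(n),
\]

where \(\rho\) is drawn from the structured ensemble, \(\sigma\) from the truly random thermal one, and \(\operatorname{negl}(n)\) is a function of the number of qubits \(n\) that shrinks faster than any inverse polynomial. If this holds, the two ensembles are computationally indistinguishable, even though they may differ enormously in an information-theoretic sense.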

To make this idea concrete, we construct a family of highly structured quantum states using shallow, local quantum circuits. Moreover, these states carry very low entanglement, far less than the complex states produced by chaotic evolution or random circuits. And yet we prove that, to any efficient quantum observer, they are indistinguishable from genuinely random, “infinite-temperature” equilibrium states, even if that observer is allowed to perform measurements and is given access to many copies of the state.
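
To give a flavour of how a structured state can nevertheless look random, here is a well-known example from the literature on low-entanglement pseudorandom states (“pseudoentanglement”); we include it purely as an illustration, and it is not necessarily the exact family constructed in the paper:

\[
|\psi_{S,f}\rangle \;=\; \frac{1}{\sqrt{|S|}} \sum_{x \in S} (-1)^{f(x)}\,|x\rangle,
\]

where \(S\) is a small, pseudorandomly chosen set of bit strings and \(f\) is a keyed pseudorandom function. The entanglement of such a state across any cut is at most \(\log|S|\), yet under standard post-quantum cryptographic assumptions no efficient quantum algorithm, even with many copies in hand, can distinguish it from a Haar-random state.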

More strikingly, we show that if you mimic what the experiments do, measuring a large subset of the qubits and conditioning on the outcomes, the remaining qubits still look random in exactly the same strong sense, for almost every way of splitting the system from the bath. In other words, these simple, low-entanglement states exhibit deep thermalization, but for fundamentally computational reasons.

Seen through this lens, the tension between experiment and theory becomes less mysterious. Experiments can reach deep-thermal-looking states quickly because they do not need to generate truly Haar-random states or enormous entanglement; they only need to generate enough structured pseudorandomness that no physically reasonable observer, classical or quantum, can tell the difference. Computational assumptions do the rest of the work: they explain how deep thermalization can emerge on short timescales. This perspective blurs the boundary between physics and computer science. It suggests that part of what we call thermalization, and especially deep thermalization, may be as much about what is computationally feasible to detect as it is about microscopic dynamics. Our results point toward a universe that can look random and chaotic not necessarily because it truly is, but because even our best quantum computers would struggle to prove otherwise.

This work, co-authored by Shantanav Chakraborty (CQST, IIIT Hyderabad), Soonwon Choi (MIT), Soumik Ghosh (U. Chicago) and Tudor Giurgică-Tiron (QuICS, U. Maryland), titled Fast Computational Deep Thermalization, was published in Physical Review Letters, one of the leading physics journals.
