
Information Theory 2
contributed
Tue, 27 Jan 2026, 13:00 - 13:00
- Optimal Distillation of Qubit Clocks
  Sujay Kazi (Duke University); Iman Marvian (Duke University)
  Abstract: The problem of coherence distillation asks: given multiple copies of a coherent input state, produce a coherent output state with much less noise (that is, much closer to a pure coherent state), in such a way that the time evolution of the output state matches the time evolution of the input state. We thoroughly address this problem for the case of qubit states. Principally, we compute the distillation channel that is asymptotically optimal in the limit of a large number of input qubit states and compute the resulting infidelity (one minus fidelity) between the output qubit state and the desired pure state. We show that this infidelity is closely related to a resource known as purity of coherence, a quantity obtained from the Right Logarithmic Derivative (RLD) Fisher information metric. We thus demonstrate an operational meaning for purity of coherence as the quantity whose monotonicity under time-translation-invariant channels sets the strongest asymptotic bound on the capability of coherence distillation for qubit states. For a special choice of input and output state, we additionally investigate a number of extensions of this primary result. In particular, we present numerical schemes to produce the exact optimal distillation protocol for a fixed value of $N$, extend the computation of the optimal protocol and the minimum infidelity to the $1/N^2$ and $1/N^3$ orders, estimate the amount of purity of coherence that is dissipated by the optimal distillation protocol, and characterize the behavior of the protocol more precisely in the low-noise and high-noise limits.
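The infidelity figure of merit used in this abstract is easy to make concrete. The following is a minimal numerical illustration, not code from the paper; the target state $|+\rangle$ and the depolarizing noise level are hypothetical choices for demonstration only:

```python
import numpy as np

# Illustrative sketch (not from the paper): infidelity between a noisy
# qubit state and a target pure state |+> = (|0> + |1>)/sqrt(2).
plus = np.array([1.0, 1.0]) / np.sqrt(2)
target = np.outer(plus, plus.conj())          # pure target |+><+|

p = 0.05                                      # hypothetical depolarizing noise
rho = (1 - p) * target + p * np.eye(2) / 2    # noisy qubit state

fidelity = np.real(plus.conj() @ rho @ plus)  # F = <psi|rho|psi> for pure |psi>
infidelity = 1.0 - fidelity                   # the quantity bounded above
print(infidelity)                             # ≈ p/2 = 0.025 for this noise model
```

Distillation protocols of the kind analyzed in the abstract aim to drive this infidelity down as $1/N$ in the number of input copies.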
- On the optimization of quantum divergences
  Gereon Kossmann (RWTH Aachen University); René Schwonnek (Leibniz University Hannover); Mario Berta (RWTH Aachen University); Mark M. Wilde (School of Electrical and Computer Engineering, Cornell University)
  Abstract: Many fundamental quantities in quantum information processing are instances of quantum divergences - functionals on quantum states that satisfy natural axioms grounded in information-theoretic principles. Recently, a new class of divergences - the f-divergences - has gained prominence in quantum information theory and received operational interpretations, while being long established in the classical setting. Furthermore, Frenkel showed that the Umegaki relative entropy is a special case of a quantum f-divergence for the function f(x) = x log x; building on this, Hirche et al. introduced a parameterized family of f-divergences that, in appropriate regimes, recovers the sandwiched and Petz relative entropies as regularizations. Taken together, these results reveal a tight link between the best-understood quantum divergences - the Umegaki, Petz, and sandwiched relative entropies - and the general class of f-divergences on a technical level, thereby strongly motivating a program that connects f-divergences to concrete quantum information tasks, as already begun by Cheng et al. In this contribution, we develop a variational formulation that approximates general quantum f-divergences to arbitrary precision. These approximations yield (i) efficient evaluation of the quantum relative entropy of channels, already in use as a core numerical method in quantum many-body physics, and (ii) computation of asymptotic key rates in DIQKD, in particular in the scenario of two switches in routed Bell scenarios.
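The Umegaki relative entropy mentioned above, $D(\rho\|\sigma)=\operatorname{Tr}[\rho(\log\rho-\log\sigma)]$, is the f-divergence for $f(x)=x\log x$. For full-rank states it can be evaluated directly by eigendecomposition; this is a hedged sketch for orientation, not the variational approximation method of the contribution:

```python
import numpy as np

def matrix_log(rho):
    # Hermitian matrix logarithm via eigendecomposition (assumes full rank,
    # i.e. strictly positive eigenvalues).
    vals, vecs = np.linalg.eigh(rho)
    return vecs @ np.diag(np.log(vals)) @ vecs.conj().T

def umegaki(rho, sigma):
    # D(rho || sigma) = Tr[rho (log rho - log sigma)], in nats.
    return np.real(np.trace(rho @ (matrix_log(rho) - matrix_log(sigma))))

rho = np.diag([0.8, 0.2])
sigma = np.diag([0.5, 0.5])
# For commuting (here diagonal) states this reduces to the classical
# KL divergence sum_i p_i log(p_i / q_i).
print(umegaki(rho, sigma))
```

The variational formulation in the contribution is needed precisely where such direct spectral evaluation becomes impractical, e.g. for divergences of channels rather than states.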
- Entanglement theory with limited computational resources
  Lorenzo Leone (Freie Universität Berlin); Jacopo Rizzo (Freie Universität Berlin); Jens Eisert (Freie Universität Berlin); Sofiene Jerbi (Freie Universität Berlin)
  Abstract: The precise quantification of the limits to manipulating quantum resources lies at the core of quantum information theory. However, standard information-theoretic analyses do not consider the actual computational complexity involved in performing certain tasks. Here, we address this issue within the realm of entanglement theory, finding that accounting for computational efficiency substantially changes what can be achieved using entangled resources. We consider two key figures of merit: the computational distillable entanglement and the computational entanglement cost. These measures quantify the optimal rates of entangled bits that can be extracted from or used to dilute many identical copies of n-qubit bipartite pure states, using computationally efficient local operations and classical communication. We demonstrate that computational entanglement measures diverge significantly from their information-theoretic counterparts. While the information-theoretic distillable entanglement is determined by the von Neumann entropy of the reduced state, we show that the min-entropy governs the computationally efficient setting. On the other hand, computationally efficient entanglement dilution requires maximal consumption of entangled bits, even for nearly unentangled states. Furthermore, in the worst-case scenario, even when an efficient description of the state exists and is fully known, one gains no advantage over state-agnostic protocols. Our findings establish sample-complexity bounds for measuring and testing the von Neumann entropy, fundamental limitations on efficient state compression, and efficient local tomography protocols.
  merged with: Quantum Computational Entropies
  Noam Avidan (Weizmann Institute of Science); Thomas A. Hahn (Weizmann Institute of Science); Rotem Arnon (Weizmann Institute of Science); Joseph M. Renes (Institute for Theoretical Physics, ETH Zurich)
  Abstract: Quantum information theory, particularly its entropic formulations, has made remarkable strides in characterizing quantum systems and tasks. However, a critical dimension remains underexplored: computational efficiency. While classical computational entropies integrate complexity and feasibility into information measures, analogous concepts have yet to be rigorously developed in the quantum setting. In this joint submission, we advance a quantum computational information theory through two complementary works. The first introduces the quantum computational unpredictability entropy, a natural generalization of the min-entropy for classical-quantum states and of the classical unpredictability entropy, which quantifies the guessing probability of classical randomness given quantum side information and bounded computational power. The second work extends this to the fully quantum setting by defining fully quantum computational min- and max-entropies. The computational min-entropy generalizes unpredictability entropy and retains essential properties, including data processing, a fully quantum leakage chain rule, and a novel purification chain rule. The computational max-entropy is defined via a canonical duality relation and captures a notion of efficient entanglement distillation under bounded quantum circuits.
  With the introduction of these computational entropies and their analysis, this work marks a critical step toward a quantum information theory that incorporates computational elements.
  merged with: Computational relative entropy
  Johannes Jakob Meyer (Dahlem Center for Complex Quantum Systems, Freie Universität Berlin); Asad Raza (Dahlem Center for Complex Quantum Systems, Freie Universität Berlin); Jacopo Rizzo (Dahlem Center for Complex Quantum Systems, Freie Universität Berlin); Lorenzo Leone (Dahlem Center for Complex Quantum Systems, Freie Universität Berlin); Sofiene Jerbi (Dahlem Center for Complex Quantum Systems, Freie Universität Berlin); Jens Eisert (Dahlem Center for Complex Quantum Systems, Freie Universität Berlin)
  Abstract: Our capacity to process information depends on the computational power at our disposal. When it is unconstrained, information theory captures our ability to distinguish states or communicate messages with unrivaled beauty and elegance. For computationally bounded observers the situation is quite different -- they can, for example, be fooled into believing that distributions are more random than they actually are. Existing mathematical approaches in computational information theory largely follow the single-shot paradigm, which, while operationally meaningful, gives complicated statements and is difficult to build intuition for. In our work, we take a new direction in computational quantum information theory that captures the essence of complexity-constrained information theory while retaining the look and feel of the unbounded asymptotic theory. As our foundational quantity, we define the computational relative entropy as the optimal error exponent in asymmetric hypothesis testing when restricted to polynomially many copies and quantum gates, defined in a mathematically rigorous way. Building on this foundation, we prove a computational analogue of Stein's lemma, establish computational versions of fundamental inequalities such as Pinsker's bound, and demonstrate a computational smoothing property showing that computationally indistinguishable states yield equivalent information measures. We derive a computational entropy that operationally characterizes optimal compression rates for quantum states under computational limitations, and we show that our quantities apply to computational entanglement theory by proving a computational version of the Rains bound. Our framework reveals striking separations between computational and unbounded information measures, including quantum-classical gaps that arise from cryptographic assumptions, demonstrating that computational constraints fundamentally alter the information-theoretic landscape and open new research directions at the intersection of quantum information, complexity theory, and cryptography.
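The separation between the von Neumann entropy and the min-entropy that the merged entanglement-theory submission highlights can be seen on a toy example: for a bipartite pure state with Schmidt coefficients $\sqrt{p_i}$, the reduced state has spectrum $\{p_i\}$, and the two entropies of that spectrum generally differ. The spectrum below is hypothetical, chosen only to exhibit the gap:

```python
import numpy as np

# Illustrative sketch (not from the papers): for a bipartite pure state,
# information-theoretic distillable entanglement is the von Neumann entropy
# S(rho_A) of the reduced state, while (per the abstract above) the
# min-entropy H_min(rho_A) governs the computationally efficient setting.
p = np.array([0.7, 0.1, 0.1, 0.1])        # hypothetical Schmidt spectrum

von_neumann = -np.sum(p * np.log2(p))     # S(rho_A) in bits
h_min = -np.log2(np.max(p))               # H_min(rho_A) in bits

print(von_neumann, h_min)                 # H_min <= S, with a strict gap here
```

Since $H_{\min} \le S$ always holds, restricting to computationally efficient protocols can only lower the distillable rate, and the toy spectrum shows the gap can be substantial.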