We cast the discrimination of two coherent states of light as a reinforcement learning problem, in which an agent has to choose among a large number of configurations of a receiver composed of simple linear optics elements, on/off photodetectors and feedback, all within reach of current technology. The agent, though completely ignorant about the receiver, is asked to find its optimal configuration by repeating the experiment a finite number of times, based only on the information obtained from the photodetectors and on the correctness of its guess. Despite the fact that the quantum signals are not perfectly distinguishable and therefore an optimal configuration may lead to an incorrect guess (no reward), we construct agents that can both discover near-optimal configurations and achieve a high real-time success rate, even in the presence of several noise sources. This talk will be based on the work arXiv:2001.10283.
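As a toy illustration of the learning task described above (our own sketch, not the construction of the paper): a minimal ε-greedy bandit that tunes the displacement of an idealised single-shot displacement receiver for the signals |±α⟩, rewarded only by the correctness of its guess. The amplitude α = 0.6, the grid of candidate displacements, and the exploration rate are all assumptions made for this sketch.

```python
import math, random

random.seed(7)

ALPHA = 0.6  # assumed signal amplitude (hypothetical value for this sketch)

def success_prob(beta, alpha=ALPHA):
    # Idealised displacement receiver: displace the incoming state |s*alpha>
    # (s = +1 or -1) by beta; an on/off detector clicks with probability
    # 1 - exp(-(s*alpha + beta)^2).  Guess s = -1 on "no click", s = +1 on "click".
    p_no_click_minus = math.exp(-(beta - alpha) ** 2)      # s = -1 gives no click
    p_click_plus = 1.0 - math.exp(-(beta + alpha) ** 2)    # s = +1 gives a click
    return 0.5 * (p_no_click_minus + p_click_plus)

# Epsilon-greedy bandit over a grid of candidate displacements.
grid = [i * 0.1 for i in range(21)]          # beta in [0, 2]
wins = [0] * len(grid)
plays = [1e-9] * len(grid)

for trial in range(20000):
    if random.random() < 0.1:                 # explore a random arm
        k = random.randrange(len(grid))
    else:                                     # exploit the best empirical arm
        k = max(range(len(grid)), key=lambda j: wins[j] / plays[j])
    reward = 1 if random.random() < success_prob(grid[k]) else 0
    wins[k] += reward
    plays[k] += 1

best = max(range(len(grid)), key=lambda j: wins[j] / plays[j])
print("learned beta:", grid[best], " success:", success_prob(grid[best]))
```

With no displacement the guess is a coin flip (success 1/2); the agent learns a displacement whose success probability is close to the best available on the grid, using only binary rewards.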
Devising efficient means to generate nonlocal correlations is important for the development of device-independent technologies. It is known that unsharp measurements (which do not fully destroy entanglement) can be used to produce multiple instances of nonlocal correlations from a single entangled system. Hence, unsharp measurements could be utilized within device-independent protocols to reduce the required number of entangled quantum systems. Here we analyse the setting in which a single party, Alice, attempts to simultaneously violate the CHSH Bell-inequality with multiple other parties, Bobs, using only a single maximally entangled qubit pair. It was conjectured in [Phys. Rev. Lett. 114, 250401 (2015)] and claimed in [Mathematics, 4(3), 48 (2016)] that when the Bobs act independently and with unbiased inputs then at most two of them can expect to violate the CHSH Bell-inequality with Alice. Here we show, contrary to the conjecture and claim, that arbitrarily many independent Bobs can expect to violate the CHSH inequality with Alice. Our proof is constructive and we provide a sequence of explicit measurement strategies that allow a strictly increasing number of Bobs to violate it. Moreover, we show that these strategies can be generalized to work with a larger class of two-qubit states. In particular, we are able to show that the number of possible violations remains unbounded if Alice initially shares any pure entangled two-qubit state. We conclude with some comments on the application of our measurement scheme to device-independent tasks.
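For reference, the (sharp-measurement) Tsirelson value that each CHSH violation above is benchmarked against can be checked numerically; a minimal sketch, using the standard optimal settings for the maximally entangled state:

```python
import math

# Pauli matrices (real entries suffice for these settings)
Z = [[1.0, 0.0], [0.0, -1.0]]
X = [[0.0, 1.0], [1.0, 0.0]]

def kron(a, b):
    # Kronecker product of two 2x2 matrices -> 4x4 (row index 2i+k, column 2j+l)
    return [[a[i][j] * b[k][l] for j in range(2) for l in range(2)]
            for i in range(2) for k in range(2)]

def expval(m, v):
    # <v| M |v> for a real vector v
    return sum(v[i] * m[i][j] * v[j] for i in range(4) for j in range(4))

s = 1 / math.sqrt(2)
phi = [s, 0.0, 0.0, s]                       # |phi+> = (|00> + |11>)/sqrt(2)

A = [Z, X]                                   # Alice: Z, X
B = [[[(Z[i][j] + X[i][j]) * s for j in range(2)] for i in range(2)],  # (Z+X)/sqrt(2)
     [[(Z[i][j] - X[i][j]) * s for j in range(2)] for i in range(2)]]  # (Z-X)/sqrt(2)

S = (expval(kron(A[0], B[0]), phi) + expval(kron(A[0], B[1]), phi)
     + expval(kron(A[1], B[0]), phi) - expval(kron(A[1], B[1]), phi))
print(S)  # 2*sqrt(2), the Tsirelson bound
```

Unsharp measurements on the Bobs' side trade off some of this violation against the entanglement left for subsequent Bobs, which is the tension the abstract's construction resolves.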
In what way do quantum prescriptions for physical entities fundamentally deviate from those of classical theories? Such a question necessitates formal notions of classicality, and the ontological framework provides a vital ground for such notions. Based on an intuitive generalization of the principle of `the identity of indiscernibles', we introduce one such notion of classicality, called bounded ontological distinctness. Formulated as a principle, bounded ontological distinctness equates the distinguishability of a set of operational physical entities to the distinctness of their ontological counterparts. Employing three instances of two-dimensional quantum preparations, we demonstrate the violation of bounded ontological distinctness, or excess ontological distinctness, of quantum preparations. Moreover, our methodology enables the inference of tight lower bounds on the extent of excess ontological distinctness of quantum preparations. Similarly, we demonstrate excess ontological distinctness of quantum transformations, using three two-dimensional unitary transformations. However, to demonstrate excess ontological distinctness of quantum measurements, an additional assumption such as outcome determinism or bounded ontological distinctness of preparations is required. Unlike other well-known ontological principles, the operational prerequisite for the experimental demonstration of excess ontological distinctness, i.e., the distinguishability of physical entities, is inherently experimentally robust. Furthermore, we show that quantum violations of other well-known ontological principles imply quantum excess ontological distinctness. Finally, to showcase its operational significance, we introduce two distinct classes of communication tasks powered by excess ontological distinctness.
arXiv:1909.07293 [quant-ph].
Several optimal measurement schemes are closely related to discrete structures in Hilbert space. We show, by an explicit analytical construction, that a complete set of five isoentangled mutually unbiased bases exists in dimension four [1]. The reduced density matrices of these 20 pure states form a regular dodecahedron inscribed in a sphere of radius sqrt(3/20) located inside the Bloch ball of radius 1/2. This provides an example of a mixed-state 2-design — a discrete set of quantum states such that the mean value of any quadratic function of density matrices equals its average over the entire set of mixed states with respect to the flat Hilbert-Schmidt measure. We establish necessary and sufficient conditions that mixed-state designs need to satisfy and present general methods to construct them. We show that partial traces of a projective design in a composite Hilbert space form a mixed-state design, while decoherence of elements of a projective design yields a design in the classical probability simplex.
[1] J.Cz., D. Goyeneche, M. Grassl and K. Życzkowski, "Isoentangled Mutually Unbiased Bases, Symmetric Quantum Measurements, and Mixed-State Designs", PRL 2020, in press.
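For reference, the defining property of the mixed-state 2-design described above can be written (in our notation) as the requirement that the set of N reduced states reproduces the second moment of the Hilbert-Schmidt measure:

```latex
\frac{1}{N}\sum_{i=1}^{N} \rho_i \otimes \rho_i
  \;=\; \int \rho \otimes \rho \; d\mu_{\mathrm{HS}}(\rho),
```

so that the average of any function quadratic in the density matrix over the design equals its average over all mixed states.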
Genuinely entangled subspaces (GESs) are the class of completely entangled subspaces (CESs) that contain only genuinely multiparty entangled (GME) states. They constitute a particularly useful notion in the theory of entanglement but have also found diverse applications, for instance, in quantum error correction and cryptography. In the present talk, (i) we present different constructions of GESs and (ii) discuss their entanglement properties. (i) First, we show how GESs can be constructed in any multiparty scenario (importantly, including the qubit case) from the so-called unextendible product bases. This universal construction, although simple and efficient, is suboptimal with respect to the dimensions of the GESs it achieves. Thus, in the second part, we put forward an approach to building GESs that are optimal in this sense. This construction is based on a certain characterization of bipartite CESs and is also universal, albeit much more involved technically. Connections of this particular approach with other mathematical problems, such as spaces of matrices of equal rank and the numerical range, are discussed. (ii) We develop a quantitative study of the entanglement properties of GESs. First, we show how one can attempt to compute analytically the subspace entanglement, defined as the entanglement of the least-entangled vector from the subspace, of a GES and illustrate our method by applying it to a new class of GESs. Second, we show that certain semidefinite programming relaxations can be exploited to estimate the entanglement of a GES and apply this observation to a few classes of GESs, revealing that in many cases the method provides the exact results. Finally, we study the entanglement of certain states supported on GESs, which is compared to the obtained values of the entanglement of the corresponding subspaces, and find the white-noise robustness of several GESs. In our study we use the (generalized) geometric measure as the quantifier of entanglement.
MD, R. Augusiak, "From unextendible product bases to genuinely entangled subspaces", Phys. Rev. A 98, 012313 (2018);
MD, R. Augusiak, "Entanglement of genuinely entangled subspaces and states: Exact, approximate, and numerical results", Phys. Rev. A 100, 062318 (2019);
MD, R. Augusiak, "An approach to constructing genuinely entangled subspaces of maximal dimension", arXiv:1912.07536.
An overview of the most recent advances in theoretical methods of quantum metrology will be presented, in particular those that benefit from quantum-information concepts such as quantum error correction or the matrix product states formalism. The theory developed allows one to determine whether the Heisenberg scaling of precision is possible for a quantum sensor subject to general Markovian noise. The theory takes into account all possible quantum strategies, including entangling the sensor with ancillary systems and adaptive strategies such as quantum error correction protocols. Moreover, effective algorithms, based on the matrix product state/matrix product operator formalism, are developed that allow one to identify the optimal metrological protocols in the presence of noise (also correlated noise) in the limit of a large number of probes, a regime inaccessible by the state-of-the-art methods. These results are highly relevant for modern developments of quantum-enhanced sensing protocols, including NV-center magnetometry, squeezed-state-enhanced optical and atomic interferometry, and stabilization protocols for atomic clocks.
[1] R. Demkowicz-Dobrzański, J. Czajkowski, P. Sekatski, Adaptive quantum metrology under general Markovian noise, Phys. Rev. X 7, 041009 (2017).
[2] K. Chabuda, J. Dziarmaga, T. Osborne, R. Demkowicz-Dobrzański, Tensor Networks for Quantum Metrology, Nat. Commun. 11, 250 (2020).
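For context, the two precision scalings at stake above are, in standard notation with N the number of probes,

```latex
\Delta\theta_{\mathrm{SQL}} \propto \frac{1}{\sqrt{N}},
\qquad
\Delta\theta_{\mathrm{HL}} \propto \frac{1}{N},
```

the standard quantum limit achievable with uncorrelated probes versus the Heisenberg limit, whose attainability under Markovian noise is what the theory above decides.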
Mutually unbiased bases (MUBs) correspond to quantum measurements with myriad applications in quantum information processing. Among other tasks, they are optimal for quantum state determination, quantum key distribution, secret sharing and quantum random access coding. The underlying property that makes them so powerful is complementarity: if the outcome of one of the measurements on a quantum state is certain, then the outcome of another measurement, unbiased to the first one, is completely random. This is an operational characterisation of MUB pairs; however, the full definition also includes the Hilbert space dimension, which is not an observable quantity. For this reason, the usual MUB definition is not fully operational, and consequently, it is unclear how to characterise MUB measurements in a device-independent fashion. In our work, we introduce a fully operational definition of generalised MUBs by removing the dimension assumption from the above definition. We call measurements satisfying this condition mutually unbiased measurements (MUMs), and provide a simple algebraic characterisation of MUMs in terms of the measurement operators. Naturally, d-outcome MUMs have similar characteristics to those of d-dimensional MUBs. They exhibit precisely the same behaviour in the context of entropic uncertainty relations, and they are not jointly measurable to the same extent. Indeed, we show that 2- and 3-outcome MUMs are always isomorphic to a direct sum of 2- and 3-dimensional MUBs. However, we show explicit examples of MUMs with outcome numbers 4 and 5 that cannot be written as direct sums of MUBs, and there does not even exist a completely positive unital map that would map them to 4- and 5-dimensional MUBs. This shows that MUMs are a strictly more general concept than MUBs. Having this device-independent characterisation, we introduce a family of Bell inequalities, parametrised by an integer d >= 2, that are maximally violated by d-outcome MUMs.
Moreover, the maximum violation is achieved by a unique probability point, which must therefore be extremal in the set of quantum correlations. This has two implications: first, the maximal violation of these Bell inequalities certifies a pair of d-outcome MUMs. Second, since not even all pairs of MUBs are unique up to a unitary transformation and complex conjugation, this certification is not a self-test. That is, the maximal violation of the Bell inequality does not certify the measurements up to local unitaries and extra degrees of freedom. To our knowledge, this is the first example of an extremal point of the set of quantum correlations that is not a self-test. This raises the question: given an extremal point of the set of quantum correlations, to what extent does it determine the a priori unknown quantum realisation?
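For comparison, the standard, dimension-dependent definition that MUMs generalise: two orthonormal bases {|e_i⟩} and {|f_j⟩} of a d-dimensional Hilbert space are mutually unbiased if

```latex
|\langle e_i | f_j \rangle|^2 \;=\; \frac{1}{d}
\qquad \text{for all } i, j \in \{1,\dots,d\}.
```

It is precisely the explicit appearance of the dimension d in this condition that the operational MUM definition removes.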
We study entropy production in nanoscale devices, which are coupled to finite heat baths. This situation is of growing experimental relevance, but most theoretical approaches rely on a formulation of the second law valid only for infinite baths. We fix this problem by pointing out that already Clausius' paper from 1865 contains an adequate formulation of the second law for finite heat baths, which can also be rigorously derived from a microscopic quantum description. This Clausius inequality shows that nonequilibrium processes are less irreversible than previously thought. We use it to correctly extend Landauer's principle to finite baths, and we demonstrate that any heat engine in contact with finite baths has a higher efficiency than previously thought. Importantly, our results are easy to evaluate, requiring only knowledge of the average bath energy.
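Schematically, and in our own notation, a Clausius-type inequality for a finite bath takes the form

```latex
\Delta S_{\mathrm{sys}} \;\ge\; \int \frac{\delta Q}{T(t)},
```

where δQ is the heat absorbed by the system and T(t) is the instantaneous temperature of the finite bath, which changes as the bath exchanges energy; for an infinite bath T(t) is constant and the familiar form ΔS_sys ≥ Q/T is recovered.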
We present a fine-structure entanglement classification under stochastic local operation and classical communication (SLOCC) for multiqubit pure states. To this end, we employ specific algebraic-geometry tools that are SLOCC invariants, secant varieties, to show that for $n$-qubit systems there are $\lceil\frac{2^n}{n+1}\rceil$ entanglement families. By using another invariant, $\ell$-multilinear ranks, each family can be further split into a finite number of subfamilies. Not only does this method facilitate the classification of multipartite entanglement, but it also turns out to be operationally meaningful as it quantifies entanglement as a resource.
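The family count above is given by a closed formula; a quick sketch (ours) evaluating it for small numbers of qubits:

```python
import math

def entanglement_families(n):
    # Number of SLOCC entanglement families for n qubits,
    # as given by the secant-variety count: ceil(2^n / (n + 1)).
    return math.ceil(2 ** n / (n + 1))

for n in range(2, 8):
    print(n, entanglement_families(n))
```

For instance, three qubits fall into 2 families and four qubits into 4, and the number grows quickly with n.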
Many quantum information protocols require the implementation of random unitaries. Because it takes exponential resources to produce Haar-random unitaries drawn from the full n-qubit unitary group, one often resorts to t-designs. Unitary t-designs mimic the Haar measure up to t-th moments. It is known that Clifford operations can implement at most 3-designs. In this work, we quantify the non-Clifford resources required to break this barrier. We find that it suffices to inject O(t^4 log^2(t) log(1/ε)) non-Clifford gates into a polynomial-depth random Clifford circuit to obtain an ε-approximate t-design. Strikingly, the number of non-Clifford gates required is independent of the system size: asymptotically, the density of non-Clifford gates is allowed to tend to zero. We also derive novel bounds on the convergence time of random Clifford circuits to the t-th moment of the uniform distribution on the Clifford group. Our proofs exploit a recently developed variant of Schur-Weyl duality for the Clifford group, as well as bounds on restricted spectral gaps of averaging operators.
Private randomness is one of the most important resources in cryptography. While quantum-state-derived private randomness in the scenario with a single honest party was recognized early on as a practical and reliable source of secure random bits, a resource theory of distributed private randomness was established only recently, in [D. Yang, K. Horodecki and A. Winter, “Distributed private randomness distillation”, Phys. Rev. Lett. 123, 170501 (2019)]. That work shows how two mutually trusting parties can obtain private randomness from asymptotically many copies of a shared state in several natural scenarios, including a scenario with free local noise and free communication between the parties. An upper bound on distillable private randomness is provided in the case when no local noise is allowed but communication is; this upper bound leads to an appealing complementarity inequality between private key and private randomness. It is also shown that bound private randomness does not exist. The problem of optimally sending private randomness via a quantum channel in a scenario with a single sender and receiver is also solved; as a byproduct, this result gives an operational meaning to the so-called reverse coherent information. On the other hand, the work [K. Horodecki, R. P. Kostecki, R. Salazar and M. Studziński, “Limitations for private randomness repeaters”, in preparation] concentrates on a scenario with three honest parties and provides a quantitative no-go result. It is shown that the private randomness content of mixed states with positive partial transposition cannot always be distributed in a manner analogous to entanglement swapping. To prove this, repeated private randomness is defined, and an upper bound on its amount is derived in terms of the relative entropy distance from the maximally mixed state. For states with a maximally mixed subsystem, it reads (twice) the mutual information of the partially transposed state.
This upper bound enables us to demonstrate an extreme gap between localizable and repeated private randomness for separable Werner states. For example, the symmetric Werner state has almost 1 bit of localizable randomness, while its repeated randomness approaches 0 with growing local dimension. An upper bound for states with negative partial transposition is also shown, although only for a restricted class of operations. These results establish an analogy between private randomness and private key (see [S. Bäuml, M. Christandl, K. Horodecki and A. Winter, “Limitations on quantum key repeaters”, Nat. Commun. 6, 6908 (2015)]). This relationship turns out to be much deeper: as the final result, it is shown that the set of states containing ideal private randomness (so-called independent states) contains as a proper subset the set of private states (those containing an ideal secure key).
The concept of self-testing (or rigidity) refers to the fact that for certain Bell inequalities the maximal violation can be achieved in an essentially unique manner. In this work we present a family of Bell inequalities which are maximally violated by multiple inequivalent quantum realisations. We completely characterise the quantum realisations achieving the maximal violation and we show that each of them requires a maximally entangled state of two qubits. This implies the existence of a new, weak form of self-testing in which the maximal violation allows us to identify the state, but does not fully determine the measurements. From the geometric point of view, the set of probability points that saturate the quantum bound is a line segment. We then focus on a particular member of the family and show that the self-testing statement is robust, i.e. that observing a non-maximal violation allows us to make a quantitative statement about the unknown state. To achieve this we present a new construction of extraction channels and analyse their performance. For completeness we provide two independent approaches: analytical and numerical. The noise robustness of the analytical bound, i.e. the amount of white noise at which the bound becomes trivial, is rather small (~0.06%), but the numerical method takes us into an experimentally relevant regime (~5%). We conclude by investigating the amount of randomness that can be certified using these Bell violations. Perhaps surprisingly, we find that the qualitative behaviour resembles that of rigid inequalities like the Clauser-Horne-Shimony-Holt inequality. This shows that rigidity is not strictly necessary for device-independent applications.
We introduce and analyse the problem of encoding classical information into different resources of a quantum state. More precisely, we consider a general class of communication scenarios characterised by encoding operations that commute with a unique resource destroying map and leave free states invariant. Our motivating example is given by encoding information into coherences of a quantum system with respect to a fixed basis (with unitaries diagonal in that basis as encodings and the decoherence channel as a resource destroying map), but the generality of the framework allows us to explore applications ranging from super-dense coding to thermodynamics. For any state, we find that the number of messages that can be encoded into it using such operations in a one-shot scenario is upper-bounded in terms of the information spectrum relative entropy between the given state and its version with erased resources. Furthermore, if the resource destroying map is a twirling channel over some unitary group, we find matching one-shot lower bounds as well. In the asymptotic setting where we encode into many copies of the resource state, our bounds yield an operational interpretation of resource monotones such as the relative entropy of coherence and its corresponding relative entropy variance.
Beyond future practical applications in quantum information processing, quantum networks open interesting fundamental perspectives, notably regarding quantum correlations. The focus of my talk will be to investigate quantum correlations in networks from the perspective of the underlying quantum states and their entanglement. We address the question of which states can be prepared in the so-called triangle network. This network consists of three nodes that are connected pairwise by three sources. We derive necessary criteria for a state to be preparable in such a network, considering both the case where the sources are statistically independent and the case where they are classically correlated. This shows that the network structure imposes strong and non-trivial constraints on the set of preparable states, fundamentally different from the standard characterisation of multipartite quantum entanglement.
It is imperative to minimize the resources needed to implement quantum operations on existing near-term quantum devices. With this in mind, we propose a scheme to implement an arbitrary general quantum measurement, i.e. a positive operator-valued measure (POVM), in dimension d using only classical resources and a single ancillary qubit. The proposed method is based on the probabilistic implementation of d-outcome measurements, followed by postselection on some of the received outcomes. This is an extension of an earlier work that required dichotomic measurements, no additional ancillary qubits, and whose success probability scaled like 1/d. The success probability of our scheme depends on the operator norms of the coarse-grained POVM effects. Significantly, we show that for typical Haar-random rank-one POVMs with at most d^2 outcomes, the success probability of our simulation scheme does not go to zero with the dimension of the system. We conjecture that this is true for all POVMs in dimension d. This is supported by numerical computations suggesting constant success probability for SIC-POVMs and (non-symmetric) IC-POVMs in dimensions up to 1299. Additionally, for the gate noise model used in the recent demonstration of quantum computational advantage by Google, we prove that for typical Haar-random POVMs the noise compounding in circuits required by our scheme is substantially lower than in the scheme that directly uses Naimark’s dilation theorem.
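As a small self-contained example (ours, not the paper's construction): the qubit SIC-POVM built from tetrahedral Bloch vectors, checking completeness and computing the operator norms of the individual effects, the quantities that the success probability above depends on.

```python
import math

# Tetrahedral Bloch vectors (even-parity sign choices), normalised
s3 = 1 / math.sqrt(3)
ns = [(s3, s3, s3), (s3, -s3, -s3), (-s3, s3, -s3), (-s3, -s3, s3)]

def effect(n):
    # Rank-one SIC-POVM effect E = (1/4)(I + n . sigma) as a 2x2 complex matrix
    nx, ny, nz = n
    return [[(1 + nz) / 4, (nx - 1j * ny) / 4],
            [(nx + 1j * ny) / 4, (1 - nz) / 4]]

effects = [effect(n) for n in ns]

# Completeness: the four effects must sum to the identity
total = [[sum(E[i][j] for E in effects) for j in range(2)] for i in range(2)]

def op_norm(E):
    # Largest eigenvalue of a 2x2 Hermitian matrix, via the trace/determinant formula
    tr = (E[0][0] + E[1][1]).real
    det = (E[0][0] * E[1][1] - E[0][1] * E[1][0]).real
    return (tr + math.sqrt(max(tr * tr - 4 * det, 0.0))) / 2

norms = [op_norm(E) for E in effects]
print(total, norms)
```

Each rank-one effect here has operator norm 1/2; how such norms behave after coarse-graining is what governs the success probability of the scheme described above.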
Quantum hypothesis testing, first introduced for quantum state discrimination, has found interesting applications to quantum channels and dynamics. An example is quantum reading, which constitutes a quantum channel discrimination protocol with potential technological applications. Here we employ a variant of quantum reading to discriminate the dynamics of an optomechanical system subjected to unknown decoherence mechanisms. The setup consists of two cavities, one equipped with a movable mirror, into which we pump a two-mode squeezed vacuum light field. The two light modes interact with the two cavities and are subsequently combined by a beam splitter. By measuring the output modes, two Gaussian random variables can be obtained whose variance depends on the presence or absence of the unknown decoherence mechanism, constituting the two possible hypotheses on the dynamics. A standard χ^2 hypothesis test is then applied.
Information Causality is a physical principle which states that the amount of randomly accessible data over a classical communication channel cannot exceed its capacity, even if the sender and the receiver have access to a source of nonlocal correlations. This principle can be used to bound the nonlocality of quantum mechanics without resorting to its full formalism, a notable example being the recovery of the Tsirelson bound of the Clauser-Horne-Shimony-Holt inequality. Despite its promise, the latter result found little generalization to other Bell inequalities because of the limitations imposed by the process of concatenation, in which several nonsignaling resources are put together to produce tighter bounds. In this talk I show that concatenation can be successfully replaced by limits on the capacity of the communication channel. This allows us to rederive and in some cases significantly improve all the existing results in a simpler manner, as well as apply the Information Causality principle to previously unapproachable Bell inequalities.
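In one standard formulation (notation ours): if Alice holds n bits a_1, …, a_n and may send m classical bits to Bob, who on request k outputs a guess β_k for a_k, Information Causality demands

```latex
\sum_{k=1}^{n} I(a_k : \beta_k) \;\le\; m,
```

regardless of which nonsignaling correlations the parties share in addition to the channel.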
We study the problem of estimating distance measures between unknown quantum states, given M and N copies of each type. In the case of pure states, any distance measure can be expressed in terms of the overlap between the states. Overlap estimation is a fundamental primitive in quantum information processing that is commonly accomplished from the outcomes of N swap tests, a joint measurement on one copy of each type whose outcome probability is a linear function of the overlap. We show that an optimal estimate can be obtained by allowing for general collective measurements on all copies, in particular using weak Schur sampling. We derive the statistics of the optimal measurement and compute the optimal mean square error in the asymptotic pointwise and finite Bayesian estimation settings. In addition, we consider two strategies relying on the estimation of one or both of the states, and show that, although they are suboptimal, they outperform the swap test. In particular, the swap test is extremely inefficient for small values of the overlap, which become exponentially more likely as the dimension increases. We also show that the optimal measurement is less invasive than the swap test and study the robustness to depolarizing noise for qubit states. In the case of mixed states, distance measures cannot be expressed in terms of a single quantity. With the swap test, one can estimate the Hilbert-Schmidt distance and the overlap. We show that simultaneous weak Schur sampling can instead estimate the spectra of each of the states and of their convex combination, with weights M/(M+N) and N/(M+N). This estimate is much more informative than the outcome of swap tests, as it can also be used to estimate the Holevo quantity of an ensemble constructed from the states, which is also a proper measure of distinguishability.
We discuss how to obtain bounds on the trace distance using these estimates, and apply these bounds to distance testing, improving sample complexity with respect to the swap test.
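A minimal sketch (ours) of the swap-test baseline discussed above: each test passes with probability (1 + |⟨ψ|φ⟩|²)/2, so the squared overlap can be estimated from the empirical pass rate. The simulation below samples the binary outcomes directly rather than simulating the quantum circuit.

```python
import random

random.seed(11)

def swap_test_estimate(overlap_sq, n_trials):
    # Each swap test passes with probability (1 + |<psi|phi>|^2) / 2.
    passes = sum(random.random() < (1 + overlap_sq) / 2 for _ in range(n_trials))
    p_hat = passes / n_trials
    return max(0.0, 2 * p_hat - 1)   # estimator of the squared overlap

for c2 in (0.5, 0.01):
    print(c2, swap_test_estimate(c2, 100000))
```

For small overlaps the pass probability sits close to 1/2, so the estimator's relative error blows up, which is the inefficiency that the collective (weak Schur sampling) measurement avoids.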
Detecting entanglement between a qubit and its environment is known to be complicated [1]. To simplify the issue, we study the class of Hamiltonians that describe the interacting system in such a way that the resulting evolution of the qubit is of pure dephasing type. This leads to some loss of generality, but the pure dephasing Hamiltonian describes the dominant decohering mechanism for many types of qubits. When both subsystems are initially in pure states, such an interaction always leads to the creation of entanglement between the two [2]. We have shown that while the creation of qubit-environment entanglement in the pure dephasing case is possible when the environment is initially in a mixed state, its occurrence is by no means guaranteed [3]. We have also shown that the evolution of the environment conditional on the qubit state is qualitatively different in entangling and non-entangling scenarios [3]. This serves as a basis for possible detection of qubit-environment entanglement via measurements on only one of these subsystems. Obviously, such entanglement could be straightforwardly determined by measurements on the environment, but such measurements are rarely accessible.
Here, we propose two schemes for the detection of qubit-environment entanglement which require operations and measurements on the qubit subsystem alone [4,5]. The first relies on the fact that only for entangling evolutions does the environment behave differently in the presence of different qubit states; hence, only if an evolution is entangling can there be a difference in the evolution of qubit coherence when the environment was allowed to relax in the presence of either qubit pointer state prior to the excitation of a superposition state [4]. This scheme is indirect, and tests the ability of entanglement to be created for a given interaction and initial state of the environment. The second scheme [5] is more direct and relies on information about the system state being transferred into the environment during the joint qubit-environment evolution in a more robust way than for non-entangling evolutions. Both schemes are implementable using the current experimental state of the art for solid-state qubits.
References:
[1] B. Kraus, J. I. Cirac, S. Karnas, and M. Lewenstein, Separability in 2×N composite quantum systems, Phys. Rev. A 61, 062302 (2000).
[2] R. Horodecki, P. Horodecki, M. Horodecki, and K. Horodecki, Quantum entanglement, Rev. Mod. Phys. 81, 865 (2009).
[3] K. Roszak and Ł. Cywiński, Characterization and measurement of qubit-environment entanglement generation during pure dephasing, Phys. Rev. A 92, 032310 (2015).
[4] K. Roszak, D. Kwiatkowski and Ł. Cywiński, How to detect qubit-environment entanglement generated during qubit dephasing, Phys. Rev. A 100, 022318 (2019).
[5] B. Rzepkowski and K. Roszak, A scheme for direct detection of qubit-environment entanglement generated during qubit pure dephasing, in preparation.
The enhanced realignment criterion originates from a family of entanglement witnesses with a non-linear improvement. We present a family of entanglement witnesses (and hence positive maps) equivalent to this criterion. As the enhanced realignment criterion is stronger than many known criteria based on the correlation tensor, and is in fact the strongest operational simplification of the correlation matrix criterion, the new family of maps gives new insight into the geometry of the set of entanglement witnesses and separable states.
Wave-particle duality is one of the basic features of quantum mechanics, giving rise to the use of complex numbers in describing states of quantum systems, their dynamics, and interaction. Since the inception of quantum theory, it has been debated whether complex numbers are actually essential, or whether an alternative consistent formulation is possible using real numbers only. We attack this long-standing problem both theoretically and experimentally, using the powerful tools of quantum resource theories. We show that - under reasonable assumptions - quantum states are easier to create and manipulate if they only have real elements. This gives an operational meaning to the resource theory of imaginarity, for which we identify and answer several important questions. This includes the state-conversion problem for all qubit states and all pure states of any dimension, and the approximate imaginarity distillation for all quantum states. As an application, we show that imaginarity plays a crucial role for state discrimination: there exist quantum states which can be perfectly distinguished via local operations and classical communication, but which cannot be distinguished with any nonzero probability if one of the parties has no access to imaginarity. This phenomenon proves that complex numbers are an indispensable part of quantum mechanics, and we also demonstrate it experimentally with linear optics.
Self-testing protocols are methods to determine the presence of shared entangled states in a device-independent scenario, where no assumptions on the measurements involved in the protocol are made. A particular type of self-testing protocol, called parallel self-testing, can certify the presence of copies of a state; however, such protocols typically suffer from the problem of requiring a number of measurements that increases with the number of copies one aims to certify. Here we propose a procedure to transform single-copy self-testing protocols into a procedure that certifies the tensor product of an arbitrary number of (not necessarily equal) quantum states, without increasing the number of parties or measurement choices. Moreover, we prove that self-testing protocols that certify a state and rank-one measurements can always be parallelized to certify many copies of the state. Our results have immediate applications for unbounded randomness expansion.
Recent theoretical and experimental studies have shown the significance of quantum information scrambling (i.e., the spread of quantum information over a system's degrees of freedom) for problems encountered in high-energy physics, quantum information, and condensed matter. Due to the complexity of quantum many-body systems, it is plausible that new developments in this field will come from experimental exploration. Since noise is inevitably present in experimental implementations, a better theoretical understanding of quantum information scrambling in noisy systems is needed. To address this problem, I will discuss standard indicators of scrambling, out-of-time-ordered correlation functions (OTOCs), in open quantum systems. As most experimental protocols for measuring OTOCs are based on backward time evolution, two scenarios of joint system-environment dynamics reversal will be considered: in the first, the evolution of the environment is reversed; in the second, it is not. I will present a derivation of general formulas for OTOCs in both cases, as well as a study of a spin chain coupled to an environment of harmonic oscillators. In the latter case, I derive expressions for open-system OTOCs in terms of the Feynman-Vernon influence functional. Subsequently, assuming that dephasing dominates over dissipation, I provide bounds on open-system OTOCs and illustrate them for a spectral density known from the spin-boson problem. Beyond their relevance to quantum information scrambling, these results also advance our understanding of decoherence in processes involving backward time evolution. Based on Phys. Rev. A 100, 062106.
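For context, the standard (closed-system) definition of the OTOC used as a scrambling indicator, not specific to the noisy setting above, reads:

```latex
% Out-of-time-ordered correlator for operators W and V, with Heisenberg
% evolution W(t) = U^\dagger(t)\, W\, U(t) and thermal average at inverse
% temperature \beta:
F(t) = \bigl\langle W^\dagger(t)\, V^\dagger\, W(t)\, V \bigr\rangle_\beta ,
% whose decay tracks the growth of the squared commutator; for unitary W, V:
C(t) = \bigl\langle [W(t), V]^\dagger [W(t), V] \bigr\rangle_\beta
     = 2\bigl(1 - \mathrm{Re}\, F(t)\bigr).
```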
We investigate the relation between a refined version of the Leggett-Garg conditions for macrorealism, namely the no-signaling-in-time (NSIT) conditions, and the quantum mechanical notion of nondisturbance between measurements. We show that all NSIT conditions are satisfied for any state preparation if and only if certain compatibility criteria are met by the state-update rules associated with the measurements, i.e., the quantum instruments. This systematic treatment of NSIT conditions, supported by structural results on nondisturbance, provides a unified framework for discussing the clumsiness loophole. It extends previous approaches and allows the loophole to be tightened via a hierarchy of tests able to disprove a larger class of macrorealist theories. Finally, we discuss perspectives for a resource theory of quantum mechanical disturbance related to violations of macrorealism.
Entanglement is a fundamental property of quantum systems and one of the key dividing lines between the quantum and classical worlds. Multi-dimensional entanglement, in particular, has recently become one of the main characteristics of entangled states under investigation, since it can enable more efficient implementations of quantum protocols that would otherwise be beyond our technological capabilities. One of the most widespread methods to determine whether a quantum state is entangled, or to quantify its entanglement dimensionality, is to measure its fidelity with respect to a pure state. In our work, we find a large class of states whose (D-dimensional) entanglement cannot be detected in this manner: we call them (D-)unfaithful. We show that not only are most random bipartite states both entangled and unfaithful, but so are almost all pure entangled states when mixed with the right amount of white noise. This is particularly relevant, since such states are routinely used in experiments. We furthermore analyze properties of unfaithful states; in particular, we find that faithfulness can be self-activated, i.e., there exist unfaithful states whose tensor powers are faithful. In addition, to explore the entanglement dimensionality of D-unfaithful states, we introduce a complete hierarchy of semidefinite programming relaxations that fully characterizes the set of states of Schmidt rank at most D. This systematic method of constructing witnesses for D-dimensional entanglement may prove useful for quantifying entanglement in noisy experimental setups.
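As a minimal numerical sketch of the fidelity-based detection method discussed above (the function names and the isotropic-state example are illustrative, not taken from the talk): for a bipartite state of two qudits, a fidelity with a maximally entangled state exceeding 1/d certifies entanglement, and the unfaithful states of this work are precisely those that evade every witness of this form.

```python
import numpy as np

def max_entangled(d):
    """|phi+> = (1/sqrt(d)) * sum_i |ii> for two d-level systems."""
    v = np.zeros(d * d)
    for i in range(d):
        v[i * d + i] = 1.0
    return v / np.sqrt(d)

def fidelity_with_phi_plus(rho, d):
    """F = <phi+| rho |phi+>.  F > 1/d certifies entanglement; states with
    F <= 1/d for every maximally entangled target can still be entangled --
    these are the 'unfaithful' states."""
    phi = max_entangled(d)
    return float(np.real(phi @ rho @ phi))

# Illustrative example: a pure entangled state mixed with white noise
# (an isotropic state), as routinely prepared in experiments.
d = 2
phi = max_entangled(d)
rho_pure = np.outer(phi, phi)
p = 0.8  # visibility (fraction of the pure entangled state)
rho_iso = p * rho_pure + (1 - p) * np.eye(d * d) / (d * d)

F = fidelity_with_phi_plus(rho_iso, d)
# For the isotropic state, F = p + (1 - p)/d^2 analytically.
assert abs(F - (p + (1 - p) / d**2)) < 1e-12
assert F > 1.0 / d  # witness detects entanglement at this visibility
```

This sketch only covers the faithful case; the point of the talk is that for many states no choice of pure target state makes such a fidelity test conclusive.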
Entropy production plays a fundamental role in both classical and quantum thermodynamics: being related to the second law at a fundamental level, it enables us to identify and quantify the irreversibility of physical phenomena. This issue is of great interest for open quantum systems, where one studies the reduced dynamics of a system interacting with its environment. The theoretical description of this scenario is challenging, especially if one goes beyond the standard Born-Markov approximation and considers the possibility of a back-flow of information from the environment to the system. In our work, we focus on an open quantum system composed of two uncoupled harmonic oscillators undergoing non-Markovian dynamics. This minimal, yet insightful, setting allows us to investigate, both numerically and analytically, how different preparations of the initial state (separable, entangled, quantum correlated without being entangled) affect the entropy production rate. This sheds light on the interplay between initial correlations and thermodynamic irreversibility in a scenario of strong experimental relevance.
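For reference, a standard formulation (background material, not specific to this talk) splits the entropy change of an open system into an entropy flux and a non-negative entropy production rate:

```latex
% Second-law-like balance for an open quantum system with state \rho_t:
\frac{dS(\rho_t)}{dt} = \Pi(t) - \Phi(t),
% where S is the von Neumann entropy, \Phi is the entropy flux to the
% environment, and \Pi is the entropy production rate.  For Markovian
% dynamics with stationary state \rho_{\mathrm{ss}}, Spohn's formula gives
\Pi(t) = -\frac{d}{dt}\, S\!\left(\rho_t \,\|\, \rho_{\mathrm{ss}}\right) \ge 0 ,
% with S(\cdot\|\cdot) the quantum relative entropy; non-Markovian dynamics,
% as studied in this talk, can transiently violate this non-negativity.
```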