Entropic measures of joint uncertainty: effects of lack of majorization

We compute Rényi entropies for the statistics of a noisy simultaneous observation of two complementary observables in two-dimensional quantum systems. The relative amount of uncertainty between two states depends on the uncertainty measure used. These results are not reproduced by a more standard duality relation. We show that these behaviors are consistent with the lack of a majorization relation between the corresponding statistics.


I. INTRODUCTION
Historically, the joint uncertainty of pairs of observables has mostly been addressed in terms of the product of their variances. Nevertheless, there are situations where such a formulation is not satisfactory [1], so alternative approaches have been proposed, mainly in terms of diverse entropic measures [2]. In this work we consider in particular Rényi entropies [3] and the corresponding uncertainty relations for two complementary observables.
Previous works [4,5] have shown that entropic uncertainty relations may lead to surprising results. For example, the relative amount of uncertainty between two observables depends on the uncertainty measure used. This is an intriguing result that might jeopardize the usefulness of entropy as an uncertainty measure. In this regard, the aim of this work is twofold. On the one hand, we show that these unexpected behaviors can be fully and satisfactorily explained in terms of the lack of a majorization relation between the corresponding statistics. This connection holds because entropic measures are monotone with respect to majorization [6-9]. Thus, such surprising entropic results are not tricky features of entropic measures, but may have a deeper meaning that is actually overlooked by more popular measures of uncertainty or complementarity. On the other hand, we extend the application of entropic measures to the statistics of a simultaneous joint observation of two complementary observables in the same system realization [10-12]. This setting of complementarity in practice provides a rich arena to examine the interplay between entropic measures and majorization. The simultaneous measurement provides a true joint classical-like probability distribution that enables alternative assessments of joint uncertainty, different from those given by the product of individual statistics, either intrinsic or of operational origin.
For simplicity we address these issues in the simplest quantum system, described by a state in a two-dimensional Hilbert space. This comprises very relevant practical situations, such as path-interference complementarity in two-beam interference experiments. This allows us to contrast the performance of entropic measures with more standard assessments of complementarity [13-15]. The paper is organized as follows: in Sec. II we introduce the discussion on statistics of simultaneous measurements for spin observables. Sec. III exhibits results for entropic quantities that appear to be paradoxical, and an explanation for that behavior is given in Sec. IV. In Sec. V, a duality relation is analyzed. Finally, some concluding remarks are outlined in Sec. VI.

II. STATISTICS AND SIMULTANEOUS MEASUREMENTS
We consider two complementary observables represented by the Pauli spin matrices σ_x and σ_z. In practical terms they may represent phase and path, respectively, in two-beam interference experiments. The system state is described by a density operator acting on the Hilbert space H_S, which in the Bloch representation takes the form ρ = (1/2)(I + s · σ), where I is the identity matrix, σ represents the three Pauli matrices, and s = Tr(ρ σ) is a three-dimensional Bloch vector with |s| ≤ 1. When |s| = 1 the state is pure, ρ = |ψ⟩⟨ψ|, with

|ψ⟩ = cos(θ/2)|0⟩ + e^{iϕ} sin(θ/2)|1⟩, (1)

where |0⟩ and |1⟩ are the eigenstates of σ_z with corresponding eigenvalues +1 and −1, and s_x = |s| sin θ cos ϕ, s_y = |s| sin θ sin ϕ, and s_z = |s| cos θ. Since we focus on the observables σ_x and σ_z, we consider for simplicity a one-parameter family S of pure states with s lying in the xz plane, that is, s_y = 0 and ϕ = 0. In such a case, the intrinsic statistics for the observables σ_x and σ_z are

p^X_j = (1/2)(1 + j s_x), p^Z_k = (1/2)(1 + k s_z), (2)

with j = ±1 and k = ±1.
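As a concrete illustration, the intrinsic statistics (2) can be evaluated numerically. The following minimal sketch (the helper name `intrinsic_stats` is ours, not from the text) maps a state of the family S, parametrized by θ, to the distributions p^X and p^Z:

```python
import math

def intrinsic_stats(theta, s=1.0, phi=0.0):
    """Intrinsic statistics of sigma_x and sigma_z for a qubit with
    Bloch-vector components s_x, s_z; Eq. (2): p_j = (1 + j*s_i)/2."""
    sx = s * math.sin(theta) * math.cos(phi)
    sz = s * math.cos(theta)
    pX = {j: 0.5 * (1 + j * sx) for j in (+1, -1)}
    pZ = {k: 0.5 * (1 + k * sz) for k in (+1, -1)}
    return pX, pZ
```

For instance, θ = π/2 gives s_x = 1, so p^X is fully concentrated while p^Z is uniform, as expected for an eigenstate of σ_x.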
The simultaneous measurement of noncommuting observables requires involving auxiliary degrees of freedom, usually referred to as the apparatus. In our case we consider an apparatus described by a two-dimensional Hilbert space H_A. The measurement performed in H_A addresses that of σ_z, while σ_x is measured directly on the system space H_S. The system-apparatus coupling transferring information about σ_z from the system to the apparatus is arranged via the following unitary transformation acting on H_S ⊗ H_A:

U = |0⟩⟨0| ⊗ U_+ + |1⟩⟨1| ⊗ U_−, (3)

where U_± are unitary operators acting solely on H_A. For simplicity the initial state of the apparatus, |a⟩ ∈ H_A, is assumed to be pure, so that the system-apparatus coupling leads to

U(|ψ⟩ ⊗ |a⟩) = cos(θ/2)|0⟩ ⊗ |a_+⟩ + e^{iϕ} sin(θ/2)|1⟩ ⊗ |a_−⟩, (4)

where the states |a_±⟩ = U_±|a⟩ ∈ H_A are not orthogonal in general, with cos δ = ⟨a_+|a_−⟩ assumed to be a positive real number without loss of generality. The measurement in H_A introducing minimum uncertainty is given by projection on the orthogonal vectors |b_±⟩ placed symmetrically with respect to |a_±⟩ (see Fig. 1, and Ref. [16]), where φ = π/2 − δ. The joint statistics for the simultaneous measurement of σ_x acting on H_S and of σ_z addressed by projection on |b_±⟩ in H_A is

p̃^{X,Z}_{j,k} = (1/4)(1 + j s_x cos δ + k s_z sin δ), (6)

where j = ±1 represents the outcomes of the σ_x measurement, and k = ±1 those of the σ_z measurement.
The marginal statistics for both observables are

p̃^X_j = (1/2)(1 + j s_x cos δ) and p̃^Z_k = (1/2)(1 + k s_z sin δ). (7)

When contrasted with the intrinsic statistics in (2), we get that the observation of σ_x is exact for δ = 0, while the observation of σ_z is exact for δ = π/2. For δ = π/4, the extra uncertainty introduced by the unsharp character of the simultaneous observation is balanced between the observables.
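The joint statistics (6) and its marginals (7) can be checked numerically. The sketch below (helper names `joint_stats` and `marginals` are ours) verifies that summing the joint distribution reproduces the marginals, and that δ = 0 and δ = π/2 recover the intrinsic σ_x and σ_z statistics, respectively:

```python
import math

def joint_stats(sx, sz, delta):
    """Joint statistics of the simultaneous measurement, Eq. (6):
    p~_{j,k} = (1 + j*sx*cos(delta) + k*sz*sin(delta)) / 4."""
    return {(j, k): 0.25 * (1 + j * sx * math.cos(delta)
                              + k * sz * math.sin(delta))
            for j in (+1, -1) for k in (+1, -1)}

def marginals(p):
    """Marginal statistics, Eq. (7), obtained by summing the joint ones."""
    pX = {j: p[(j, +1)] + p[(j, -1)] for j in (+1, -1)}
    pZ = {k: p[(+1, k)] + p[(-1, k)] for k in (+1, -1)}
    return pX, pZ
```

At δ = 0 the σ_x marginal equals the intrinsic (1 + j s_x)/2, while at δ = π/2 the σ_z marginal equals (1 + k s_z)/2, illustrating the exactness statements above.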

III. ENTROPIC UNCERTAINTY ASSESSMENTS
We focus here on Rényi entropies to quantify the uncertainty related to a probability distribution. Let p = {p_i}_{i=1}^N be the statistics of some observable with N outcomes; then

R_α(p) = (1/(1 − α)) ln Σ_{i=1}^N p_i^α, (8)

where α ≥ 0 is the so-called entropic index. The limiting case α → 1 is well defined and gives the Shannon entropy R_1 ≡ −Σ_i p_i ln p_i. The minimum entropy R_α = 0 is achieved when all the probability is concentrated in a single outcome, p_i = δ_{i,i₀} for some i₀, whereas the maximum entropy R_α = ln N occurs when all the outcomes are equally probable, p_i = 1/N for all i. We notice that the operational entropies of the marginal statistics (7) are always larger than those of the intrinsic ones (2): R_α(p̃^X) ≥ R_α(p^X), and similarly for the probability distributions related to σ_z. Comparing Eqs. (2) and (7), we can appreciate that the observation amounts to a reduction of the absolute values of the x- and z-components of the Bloch vector, and Rényi entropies are decreasing functions of |s_i|. Their only extremum is at the uniform distribution, which is an absolute maximum.
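Definition (8), including the α → 1 Shannon limit, is straightforward to implement; the following sketch (the name `renyi` is ours) handles zero-probability outcomes explicitly, since they do not contribute for any α > 0:

```python
import math

def renyi(p, alpha):
    """Renyi entropy R_alpha of a probability distribution, Eq. (8).
    The alpha -> 1 limit is evaluated as the Shannon entropy."""
    p = [x for x in p if x > 0]  # zero entries do not contribute
    if abs(alpha - 1.0) < 1e-12:
        return -sum(x * math.log(x) for x in p)
    return math.log(sum(x ** alpha for x in p)) / (1.0 - alpha)
```

As a sanity check, the uniform four-outcome distribution gives ln 4 for every α, and a fully concentrated distribution gives 0.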
Another relevant property of the Rényi entropies (8) is additivity; that is, for the product of two statistics p and q,

R_α(p q) = R_α(p) + R_α(q). (9)

Following Ref. [5], the corresponding entropic uncertainty relation for the intrinsic statistics (2) reads

R_α(p^X) + R_α(p^Z) ≥ min{ln 2, 2R_α(q)}, (10)

where q = ((1 + 1/√2)/2, (1 − 1/√2)/2) and the two candidate bounds exchange roles at α_I ≈ 1.43. There are thus two sets of states that compete to be the minimum uncertainty states (as well as those of maximum uncertainty within the set S), depending on the value of the entropic index used. We refer to them as extreme and intermediate states:
• Extreme states are eigenstates of σ_x or σ_z. These are the pure states with θ = mπ/2 for integer m in Eq. (1), so that s_x = ±1, s_z = 0 or s_x = 0, s_z = ±1. They present full certainty for one of the observables, and complete uncertainty for the other one.
• Intermediate states are eigenstates of σ_x ± σ_z. These are the pure states with θ = (2m + 1)π/4 for integer m in Eq. (1), so that s_x = ±s_z = ±1/√2. They have essentially the same statistics for both complementary observables, so they might be considered as a finite-dimensional counterpart of the Glauber coherent states.

A. Joint uncertainty of σx and σz
In order to assess the joint uncertainty of σ_x and σ_z, we compute the Rényi entropies of the joint statistics p̃^{X,Z} (6) and of the product of marginal statistics p̃^X p̃^Z (7), for any given value of the entropic index α, as functions of θ within the set S of states (1). These quantities are calculated for the balanced measurement, δ = π/4. We also compute the Rényi entropies of the product of intrinsic statistics p^X p^Z (2). For the sake of clarity and to simplify comparisons, we mostly focus on normalized quantities of the form

R^norm_α = (R_α − R_α,min)/(R_α,max − R_α,min), (11)

where R_α,max and R_α,min are the maximum and minimum values of R_α, respectively, within the set S.
Figure 2(a) shows R^norm_α(p̃^{X,Z}) for α = 1 and 2.5 as functions of θ. We observe that for α = 1 the minimum uncertainty states are the intermediate states θ = π/4, whereas for α = 2.5 the minimum uncertainty states are the extreme states θ = 0, π/2. The opposite happens for the product of marginal statistics R_α(p̃^X p̃^Z), as illustrated in Fig. 2(b): for α = 1 the minimum uncertainty states are the extreme states θ = 0, π/2, whereas for α = 2.5 the minimum uncertainty states are the intermediate states θ = π/4. This result coincides with the conclusions derived from the intrinsic entropies R_α(p^X p^Z), as shown in Fig. 2(c) (see also Ref. [5]).
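The switch of minimizers for the joint statistics can be reproduced directly from Eq. (6); the following self-contained sketch (helper names are ours) compares the extreme state θ = π/2 with the intermediate state θ = π/4 at the two entropic indices used in Fig. 2:

```python
import math

def renyi(p, alpha):
    """Renyi entropy; alpha = 1 is the Shannon limit."""
    p = [x for x in p if x > 0]
    if alpha == 1:
        return -sum(x * math.log(x) for x in p)
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

def joint(theta, delta=math.pi / 4):
    """Joint statistics (6) for a pure state of S at angle theta."""
    sx, sz = math.sin(theta), math.cos(theta)
    return [0.25 * (1 + j * sx * math.cos(delta) + k * sz * math.sin(delta))
            for j in (+1, -1) for k in (+1, -1)]

extreme, intermediate = joint(math.pi / 2), joint(math.pi / 4)
# At alpha = 1 the intermediate states minimize the joint uncertainty,
# while at alpha = 2.5 the extreme states do.
assert renyi(intermediate, 1) < renyi(extreme, 1)
assert renyi(extreme, 2.5) < renyi(intermediate, 2.5)
```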

B. Intermediate versus extreme states
Now we analyze the intermediate-extreme competition for minimum uncertainty. In Fig. 3 we plot the difference of Rényi entropies between intermediate and extreme states,

∆R_α = R_α(intermediate) − R_α(extreme), (12)

for the joint, product of marginal, and product of intrinsic statistics, always for the balanced measurement δ = π/4, as functions of the entropic index α. ∆R_α > 0 implies that extreme states are of minimum uncertainty while, on the contrary, ∆R_α < 0 implies that intermediate states are the minimum uncertainty ones.
We observe that, for the joint statistics, ∆R_α(p̃^{X,Z}) is positive for α ∈ (2, 3), thus there are two critical values of the entropic index at which the minimizer changes. On the other hand, for the product of marginal and of intrinsic statistics, ∆R_α(p̃^X p̃^Z) and ∆R_α(p^X p^Z) change sign at a single critical value, α_M ≈ 1.34 in the former case and α_I ≈ 1.43 in the latter; in both situations the difference changes from positive to negative with increasing entropic index.

FIG. 3. Differences between the entropies for intermediate and extreme states, Eq. (12), for the joint statistics ∆R_α(p̃^{X,Z}) (solid line), the product of marginals ∆R_α(p̃^X p̃^Z) (dashed line), and the product of intrinsic statistics ∆R_α(p^X p^Z) (dotted line), as functions of α for the balanced joint measurement δ = π/4. A positive value of ∆R_α means that extreme states give the minimum uncertainty, while a negative value corresponds to minimizing intermediate states.
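The single sign change for the product of marginals can be bracketed numerically. This sketch (helper names are ours) evaluates the difference (12) for the product of marginal statistics (7) at two indices on either side of α_M ≈ 1.34:

```python
import math

def renyi(p, alpha):
    p = [x for x in p if x > 0]
    if alpha == 1:
        return -sum(x * math.log(x) for x in p)
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

def marginal_product(theta, delta=math.pi / 4):
    """Product of the marginal statistics (7) for a state of S."""
    sx, sz = math.sin(theta), math.cos(theta)
    pX = [0.5 * (1 + j * sx * math.cos(delta)) for j in (+1, -1)]
    pZ = [0.5 * (1 + k * sz * math.sin(delta)) for k in (+1, -1)]
    return [x * z for x in pX for z in pZ]

def delta_R(alpha):
    """Entropy difference (12), intermediate minus extreme."""
    return (renyi(marginal_product(math.pi / 4), alpha)
            - renyi(marginal_product(math.pi / 2), alpha))

# Positive below and negative above the critical index alpha_M ~ 1.34:
assert delta_R(1.0) > 0 and delta_R(2.0) < 0
```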

C. Entropy of the joint statistics versus entropy of the product of marginal and of intrinsic statistics
We report other behaviors regarding entropy-related quantities. In particular, we compute the Rényi entropy of the joint statistics p̃^{X,Z}, of the product of its marginals p̃^X p̃^Z, and of the product of intrinsic statistics p^X p^Z. The differences

R_α(p̃^X p̃^Z) − R_α(p̃^{X,Z}) (13)

and

R_α(p^X p^Z) − R_α(p̃^{X,Z}) (14)

are plotted in Fig. 4 as functions of α for θ = π/4 (intermediate states). The fact that the difference of entropies (13) is negative for some entropic indices, as can be seen in the solid curve plotted in Fig. 4, is not surprising, since it is known that the Rényi entropies do not satisfy the subadditivity property [17]. On the other hand, the negativity of the difference (14), shown in the dotted curve plotted in Fig. 4, reveals the paradoxical result that the entropy of the joint distribution p̃^{X,Z} can be larger than the entropy of the product of the intrinsic distributions p^X p^Z. Differences between the entropy of joint statistics and of the product of marginals have also been found in Ref. [12] for the Shannon entropy. We will see in Sec. IV that these behaviors can be interpreted in a more general framework given by majorization theory.
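Both sign changes can be checked for the intermediate state at δ = π/4, whose three relevant distributions follow from Eqs. (2), (6), and (7) with s_x = s_z = 1/√2. A sketch, with the specific index values chosen by us only to bracket the crossings:

```python
import math

def renyi(p, alpha):
    p = [x for x in p if x > 0]
    if alpha == 1:
        return -sum(x * math.log(x) for x in p)
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

# Intermediate state (s_x = s_z = 1/sqrt(2)) at delta = pi/4:
joint = [0.5, 0.25, 0.25, 0.0]                              # Eq. (6)
marg = [p * q for p in (0.75, 0.25) for q in (0.75, 0.25)]  # Eq. (7) product
r = 1 / math.sqrt(2)
half = [(1 + r) / 2, (1 - r) / 2]
intr = [p * q for p in half for q in half]                  # Eq. (2) product

# Difference (13): positive at alpha = 1, negative at large alpha
# (failure of subadditivity):
assert renyi(marg, 1.0) - renyi(joint, 1.0) > 0
assert renyi(marg, 8.0) - renyi(joint, 8.0) < 0
# Difference (14) also changes sign: the joint entropy can exceed
# that of the product of intrinsic statistics:
assert renyi(intr, 0.2) - renyi(joint, 0.2) > 0
assert renyi(intr, 1.0) - renyi(joint, 1.0) < 0
```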

IV. MAJORIZATION
Entropic uncertainty relations have a deep connection with the majorization of statistical distributions [7,8], which has already been applied to examine the uncertainty of thermal states [6]. (This is closely related to the idea of mixing character in Ref. [9].) In this section we show that majorization is compatible with the surprising results found above.
The statistics p majorizes the statistics p′, denoted p′ ≺ p, if, after forming with p an N-dimensional vector with components in decreasing order p_1 ≥ p_2 ≥ … ≥ p_N, and similarly with p′, we get

Σ_{i=1}^n p′_i ≤ Σ_{i=1}^n p_i for all n = 1, …, N.

In such a case R_α(p) ≤ R_α(p′) for any α ≥ 0; that is, Rényi entropies are monotone (Schur-concave) functions with respect to majorization. However, majorization is a partial-order relation, so there are distributions that cannot be compared. Next we see that this is the case for the extreme and intermediate states, and that the contradictions reported above hold when the two statistics cannot be compared by majorization.
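The partial-sum criterion translates directly into code; this sketch (names `majorizes` and `comparable` are ours) tests whether one distribution majorizes another, and whether two distributions are comparable at all:

```python
def majorizes(p, q, tol=1e-12):
    """True if p majorizes q (q < p): every partial sum of the
    decreasingly ordered p dominates the corresponding one of q."""
    ps, qs = sorted(p, reverse=True), sorted(q, reverse=True)
    cp = cq = 0.0
    for a, b in zip(ps, qs):
        cp += a
        cq += b
        if cp < cq - tol:
            return False
    return True

def comparable(p, q):
    """Majorization is only a partial order: two distributions may be
    incomparable, i.e. neither majorizes the other."""
    return majorizes(p, q) or majorizes(q, p)
```

For example, (1, 0) majorizes the uniform (1/2, 1/2), while (0.5, 0.25, 0.25, 0) and (0.45, 0.45, 0.05, 0.05) are incomparable: the first has the larger top entry but the smaller sum of the two largest entries.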

A. Lack of majorization relation between the statistics
Let us call λ = p̃^{X,Z}_ex and µ = p̃^{X,Z}_in the four-dimensional vectors obtained by arranging the values of p̃^{X,Z} in decreasing order, for extreme and intermediate states, respectively:

λ = (1/4)(1 + 1/√2, 1 + 1/√2, 1 − 1/√2, 1 − 1/√2) and µ = (1/2, 1/4, 1/4, 0).

Thus, while µ_1 > λ_1, we have µ_1 + µ_2 < λ_1 + λ_2, so that neither λ ≺ µ nor µ ≺ λ. Similar results happen if one considers the extreme and intermediate states of the products of marginal and of intrinsic statistics. This explains the change of sign of ∆R_α reported in Fig. 3.
Moreover, a majorization order relation is also absent between the joint and product statistics of the same intermediate state, explaining the change of sign of the solid curve reported in Fig. 4. More specifically, the decreasingly ordered vector ν associated to the product statistics p̃^X p̃^Z for the intermediate states is ν = (1/16)(9, 3, 3, 1), so that while ν_1 > µ_1 and ν_1 + ν_2 = µ_1 + µ_2, we also get ν_1 + ν_2 + ν_3 < µ_1 + µ_2 + µ_3, so that neither ν ≺ µ nor µ ≺ ν. This situation extends to all states within the subset S except the extreme states, for which p̃^{X,Z} = p̃^X p̃^Z.
Similarly, there is no majorization order relation between µ and the corresponding decreasingly ordered vector ξ associated to the product of intrinsic statistics p^X p^Z for the intermediate states:

ξ = (1/8)(3 + 2√2, 1, 1, 3 − 2√2).

Thus ξ_1 > µ_1 and ξ_1 + ξ_2 > µ_1 + µ_2, whereas ξ_1 + ξ_2 + ξ_3 < µ_1 + µ_2 + µ_3, so that neither ξ ≺ µ nor µ ≺ ξ. This explains the change of sign of the dotted curve plotted in Fig. 4. Therefore, the striking situations reported in Figs. 3 and 4 are of a rather fundamental character and not an artifact of the particular measures of uncertainty employed.
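All three incomparabilities discussed in this subsection can be verified at once; the following sketch (the helper `majorizes` is ours) checks that none of λ, ν, ξ is comparable with µ under majorization:

```python
import math

def majorizes(p, q, tol=1e-12):
    """True if p majorizes q: all partial sums of sorted p dominate."""
    ps, qs = sorted(p, reverse=True), sorted(q, reverse=True)
    cp = cq = 0.0
    for a, b in zip(ps, qs):
        cp += a
        cq += b
        if cp < cq - tol:
            return False
    return True

r = 1 / math.sqrt(2)
lam = [(1 + r) / 4, (1 + r) / 4, (1 - r) / 4, (1 - r) / 4]  # extreme, joint
mu = [0.5, 0.25, 0.25, 0.0]                      # intermediate, joint
nu = [9 / 16, 3 / 16, 3 / 16, 1 / 16]            # intermediate, marginals
xi = [(3 + 2 * math.sqrt(2)) / 8, 1 / 8, 1 / 8,
      (3 - 2 * math.sqrt(2)) / 8]                # intermediate, intrinsic

# Neither direction of majorization holds in any of the three cases:
for v in (lam, nu, xi):
    assert not majorizes(v, mu) and not majorizes(mu, v)
```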

B. Majorization uncertainty relations
Majorization provides a rather neat form of uncertainty relations in terms of suitable constant vectors majorizing the statistics of two or more observables for every system state [7,8]. In our case these are

p̃^{X,Z} ≺ ω, p̃^X p̃^Z ≺ ω′ and p^X p^Z ≺ ω″, (15)

where on the left-hand sides we understand the result of arranging the corresponding statistics into four-dimensional vectors, and ω, ω′, and ω″ are constant vectors. By readily applying the procedure outlined in Ref. [8] we get, for δ = π/4,

ω = (1/2, √2/4, (2 − √2)/4, 0), ω′ = (1/16)(9, 4√2 − 1, 7 − 4√2, 1), ω″ = (1/8)(3 + 2√2, 5 − 2√2, 0, 0). (16)

Then the expected uncertainty relations hold; for example, for the joint distribution, R_α(p̃^{X,Z}) ≥ R_α(ω).
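A numerical sketch of this construction, in the spirit of the procedure of Ref. [8] (the grid search and helper names are ours, not the paper's derivation): for each n, maximize the sum of the n largest joint probabilities over a grid of states of S, then take consecutive differences to build the bounding vector ω.

```python
import math

def joint(theta, delta=math.pi / 4):
    """Decreasingly ordered joint statistics (6) for a state of S."""
    sx, sz = math.sin(theta), math.cos(theta)
    return sorted((0.25 * (1 + j * sx * math.cos(delta)
                             + k * sz * math.sin(delta))
                   for j in (+1, -1) for k in (+1, -1)), reverse=True)

# Maximize each partial sum of the ordered joint statistics over S:
M = [0.0] * 4
for i in range(20001):
    p = joint(math.pi / 2 * i / 20000)
    c = 0.0
    for n in range(4):
        c += p[n]
        M[n] = max(M[n], c)

omega = [M[0]] + [M[n] - M[n - 1] for n in range(1, 4)]
# omega approximates (1/2, sqrt(2)/4, (2 - sqrt(2))/4, 0)
```

The grid contains θ = π/4 and θ = π/2, where the partial-sum maxima are attained, so the result matches the closed form to machine precision.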
It is worth noting that there is a definite majorization relation between ω″ and the other two vectors, that is,

ω ≺ ω″ and ω′ ≺ ω″. (17)
These two relations are quite natural, expressing that the uncertainty lower bound is larger for the statistics derived from the simultaneous joint measurement, either p̃^{X,Z} or p̃^X p̃^Z, than for the exact intrinsic statistics. This is the majorization counterpart of the well-known result that the variance-based lower bound for operational position-momentum uncertainty is at least four times the intrinsic one [10]. However, there is no majorization relation between ω and ω′ since, while ω′_1 > ω_1, we also have ω′_1 + ω′_2 + ω′_3 < ω_1 + ω_2 + ω_3. This might be related to the general lack of majorization relation between p̃^{X,Z} and p̃^X p̃^Z found above and illustrated in Fig. 4.
Finally, let us show that there is no system state ρ leading to statistics equating the distributions (16). To this end we can use Eqs. (2), (6), and (7) to determine the values of s_x and s_z that would lead to p̃^{X,Z}, p̃^X p̃^Z and p^X p^Z equating ω, ω′ and ω″, respectively. Without loss of generality we consider s_x and s_z to be positive. For the joint statistics, the null component in ω implies that s_x = s_z = 1/√2. Thus, according to Eq. (6), the other values for p̃^{X,Z} should be 1/2, 1/4 and 1/4, which are not equal to the corresponding values in ω. For the product of marginals p̃^X p̃^Z, the sum of the maximum and minimum components of ω′ implies that s_x = s_z = 1/√2. Thus, after Eq. (7), the other values for p̃^X p̃^Z should be both 3/16, which are not equal to the corresponding values in ω′. For the intrinsic statistics, the two zeros of ω″ imply that either s_x = 0 or s_z = 0. In any case, Eq. (2) would then imply that the other p^X p^Z values should be both 1/2, which is different from the corresponding values in ω″.

V. DUALITY RELATION
Following the approach in Ref. [15], we may compare these entropic results with some other assessments of joint uncertainty or complementarity. Among them, one of the most studied is the duality relation between path knowledge and visibility of interference in a Mach-Zehnder interferometric setting [13]. This fits our approach by regarding |±⟩ as representing the internal paths of a Mach-Zehnder interferometer, while |a_±⟩ represent the states of the apparatus monitoring the path followed by the interfering particle.
One of the most used duality expressions is [14]

D² + V² ≤ 1, (18)

where D is the so-called distinguishability. In our particular case, where the system and apparatus are in pure states, we have

D = √(1 − 4 w_+ w_− |⟨a_+|a_−⟩|²), (19)

where w_± are the probabilities of the two paths. This represents the knowledge available about the path followed by the particle, which is grosso modo inversely proportional to the path uncertainty. On the other hand, the interference is assessed by the standard fringe visibility V obtained when the relative phase ϕ is varied in Eq. (1),

V = 2 √(w_+ w_−) |⟨a_+|a_−⟩|. (20)

This roughly speaking represents the phase uncertainty, the counterpart of the uncertainty of σ_x in our approach. Note that in these duality relations path and interference are not treated symmetrically, contrary to the approach developed here in terms of entropic measures. After Eqs. (19) and (20) we can appreciate that D² + V² = 1 whenever the system and apparatus are in pure states. This is to say that this duality relation is blind to the differences between extreme and intermediate states, in sharp contrast to the more complete picture provided by the entropic measures with equal entropic indices. This was already shown in Ref. [15] regarding its intrinsic counterpart P² + V² ≤ 1, where P = |w_+ − w_−| is the predictability. Nevertheless, an equivalence with the duality relation is obtained using different entropic indices that lead to the so-called min-max entropies, as was recently shown in Ref. [18].
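Assuming the standard pure-state expressions for D and V, Eqs. (19) and (20), the saturation D² + V² = 1 can be verified numerically; the following sketch (helper names are ours) also checks that, for balanced detection |⟨a_+|a_−⟩| = 1/√2, one always has D ≥ V:

```python
import math

def distinguishability(wp, wm, overlap):
    """Eq. (19) for pure system and apparatus states;
    `overlap` stands for |<a_+|a_->| = cos(delta)."""
    return math.sqrt(1 - 4 * wp * wm * overlap ** 2)

def visibility(wp, wm, overlap):
    """Eq. (20): standard fringe visibility."""
    return 2 * math.sqrt(wp * wm) * overlap

# D^2 + V^2 = 1 for any path probabilities and any apparatus overlap:
for wp in (0.0, 0.2, 0.5, 0.9):
    for overlap in (0.0, 1 / math.sqrt(2), 1.0):
        D = distinguishability(wp, 1 - wp, overlap)
        V = visibility(wp, 1 - wp, overlap)
        assert abs(D * D + V * V - 1) < 1e-12
        # Balanced detection: D >= V for all states (up to rounding):
        if abs(overlap - 1 / math.sqrt(2)) < 1e-12:
            assert D + 1e-12 >= V
```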
Since the duality relation does not discriminate between pure states it may be interesting to complete the duality analysis by examining the states of maximum D or V , as well as those states with D = V .
From Eq. (19), the maximum distinguishability, D = 1, holds either when w_+ = 0, w_− = 0, or ⟨a_+|a_−⟩ = 0. These are all the cases where the particle actually follows just a single path, or where the apparatus can provide full information about the path followed. On the other hand, after Eq. (20), the maximum visibility, V = |⟨a_+|a_−⟩|, holds when both paths are equally probable, w_+ = w_− = 1/2. Furthermore, the maximum visibility reaches unity, V = 1, when |a_+⟩ is proportional to |a_−⟩. This is when both paths are equally probable and the apparatus provides no information about the path. Within the set S, the extreme states s_z = ±1 satisfy the requirements for maximum distinguishability, while those with s_x = ±1 reach maximum visibility. This agrees with the case of unobserved duality [15].
On the other hand, D = V holds provided that w_+ w_− |⟨a_+|a_−⟩|² = 1/8. For balanced detection, |⟨a_+|a_−⟩| = 1/√2, so that w_+ w_− = 1/4 and then w_+ = w_− = 1/2. Within the set S this is satisfied by the extreme states that are eigenstates of σ_x. Contrary to what happens for the unobserved duality relation, the intermediate states do not satisfy D = V. The extreme states s_x = ±1 can reach both maximum visibility and D = V since, for balanced joint detection, we get D ≥ V for all states.

VI. CONCLUDING REMARKS
We have presented several examples of application of Rényi entropies as measures of quantum uncertainty.We have explored those situations leading to unexpected or contradicting predictions for different entropies and states.We have shown that all the striking behaviors found derive from an underlying lack of majorization relation between statistics.Moreover, we have shown that none of the paradoxical features is reproduced by the most popular measure of complementarity.Thus, majorization emerges as a powerful tool to understand fundamental aspects of quantum uncertainty and complementarity in the most complete and simple form.

FIG. 2. Normalized Rényi entropies of (a) the joint statistics R^norm_α(p̃^{X,Z}), (b) the product of marginal statistics R^norm_α(p̃^X p̃^Z), and (c) the product of intrinsic statistics R^norm_α(p^X p^Z), for α = 1 (dashed line) and α = 2.5 (solid line), as functions of θ. In all cases we consider the balanced measurement δ = π/4.
