Revisiting the European sovereign bonds with a permutation-information-theory approach

In this paper we study the evolution of the informational efficiency, in its weak form, of seventeen European sovereign bond time series. We aim to assess the impact of two specific economic situations on the hypothetical random behavior of these time series: the establishment of a common currency and a wide and deep financial crisis. In order to evaluate informational efficiency we use permutation quantifiers derived from information theory. Specifically, time series are ranked according to two metrics that measure the intrinsic structure of their correlations: permutation entropy and permutation statistical complexity. These measures provide the rectangular coordinates of the complexity-entropy causality plane; the planar location of a time series in this representation space reveals its degree of informational efficiency. According to our results, the currency union helped to homogenize the stochastic characteristics of the time series and synchronized their random behavior. Additionally, the 2008 financial crisis uncovered differences within the apparently homogeneous European sovereign markets and revealed country-specific characteristics that were partially hidden during the monetary union's heyday.


Introduction
The study of informational efficiency is a classic topic in financial economics. According to the usual definition of Fama [1], a market is informationally efficient if prices reflect all relevant information. Based on the extent of the information set, efficiency can be divided into three nested categories. The first is the weak form of informational efficiency: a market is weak-form efficient if prices incorporate the past history of prices. The second is the semi-strong form of informational efficiency, which is fulfilled whenever prices reflect all publicly available information, in addition to previous prices. Finally, the most demanding definition of informational efficiency is its strong form, where the validation set comprises all kinds of information (both public and private).
It has been extensively documented, in both developed and emerging markets, that informational efficiency, at least in its weak form, is not constant through time [2]. In fact, Bariviera [3] finds that it is affected by liquidity. Moreover, Zunino et al. [4,5] show that informational efficiency varies according to the developmental stage of the country. Finally, Bariviera et al. [6] find that long-term memory in some European markets was affected by the 2008 credit crunch.
The aim of this paper is to study the evolution of the informational efficiency of seventeen European sovereign bond indices. Our sample covers European Monetary Union (EMU) and non-EMU countries. In particular, we are interested in studying the behavior of informational efficiency from the inception of the euro until the present, in order to determine the factors that affect the transmission of information through the markets.
The remainder of the paper is organized as follows. Section 2 covers a brief review of financial integration. Section 3 explains the methodology used in this paper. Section 4 details the data used in this article and is followed by Section 5, which describes the results. Finally, Section 6 draws the main conclusions of our paper.

Financial integration: a brief review
The interest of the European Central Bank (ECB) in a high degree of financial integration stems from the need for monetary policy to be applied and transmitted efficiently to consumption and investment decisions across the whole euro area. Solans [7] refers to the desirable level of financial integration as one where capital is allocated to the most profitable investment opportunities and all distortions that would hinder the efficient allocation of resources within the eurozone are removed.
The road toward the single currency among European Union (EU) member states began in the early 1990s with the signing of the Maastricht Treaty. It entailed the liberalization of capital movements and the beginning of a macroeconomic convergence process. Additionally, it provided for the substitution of national currencies by a common currency and established a set of rules for the convergence of some macroeconomic variables, such as price stability, exchange rates, public deficit and debt levels, for the member countries. A fixed exchange rate among euro partners became effective at the beginning of 1999, marking the birth of the euro as an accounting currency, completed in January 2002 with the circulation of euro banknotes.
Among the benefits of the EMU are the reduction in transaction costs associated with the existence of multiple currencies, greater efficiency due to the decline of uncertainty related to currency instability, and the increased credibility of monetary policy within Europe [8]. Baele et al. [9] highlight three important benefits related to financial integration: expanded opportunities to share risk, increased capital distribution and enhanced economic growth. Terceño and Guercio [10] show evidence that financial integration in Latin American markets favors economic growth in that region. Before the introduction of the euro, yield differentials within sovereign European debt instruments were mostly determined by four elements: exchange rate variation, differences in domestic tax treatments, liquidity and credit risk [11].
However, greater financial integration reduces the opportunities to diversify financial portfolios and increases the contagion effect in times of economic distress. According to Baele et al. [9], financial integration is achieved when all potential market participants in the euro area face identical rules and have the same access to financial instruments or services in the market. This definition entails three important features: the inclusion of financial market intermediaries, the absence of frictions in the process, and the presence of a single supply of and demand for investment opportunities, as in a typical market.
Since all products are denominated in a common currency, financial risk is reduced and the transparency of asset prices is increased. Since the inception of the EMU the exchange rate risk between member states disappeared and the financial markets became more integrated. Pagano and Von Thadden [12] consider that the EMU opened the possibility for the creation of a new and integrated financial market. Ca' Zorzi and Rubaszek [13] show that foreign exchange rates were the first to translate the effects of the euro introduction. Alonso and Cendejas [14] mention the interbank market, in addition to the money market, as one of the most integrated markets. Moreover, Baele et al. [9] consider two categories of measures of the degree of financial integration based on the law of one price: the first is related to yield spreads, and the second is associated with the response of asset prices in individual countries to common factors. Given the existence of different measures of financial integration, the influence of domestic and international factors as the main drivers of sovereign bond spreads is commonly studied.
Many studies focus on the degree of financial integration, especially since the start of the 2008 financial crisis in the United States, which propagated to several European financial markets. Most of them analyze the evolution of government bond markets since the start of the euro era and the subsequent impact of the financial crisis on the European Union, distinguishing between EMU and non-EMU countries [15]. Gómez-Puig [16] affirms that the EMU produced important changes related to financial integration, especially in the euro area sovereign securities market, even though market size produces differences between the spreads of the various members. It is evident that the process of integration is far from complete; in particular, the euro area still lacks a unified market infrastructure. Abad et al. [17] study integration in European government bond markets and show evidence that, since 2007, financial markets moved toward higher segmentation, especially within EMU members. The 2008 financial crisis produced different consequences across European countries. Sovereign bond spread levels and the introduction of new hedging instruments affected the evolution of financial markets. Zunino et al. [5] find that the stochastic characteristics of sovereign bond time series are different for EMU and non-EMU countries. The extent of the crisis forced European policy makers to undertake rigorous fiscal measures, to inject large amounts of money into financial institutions and to set up rescue packages for the most affected countries such as Greece, Ireland, Portugal and, to some extent, Spain.
The recent financial crisis produced considerable global impact, not only because of its scale but also because it originated in the largest economy in the world. The consequences differ from country to country. Abad et al. [18] analyze the impact of the introduction of the euro on the degree of integration of European government bond markets applying a CAPM-based model and find that EMU members are less vulnerable to the influence of world risk factors, but more susceptible to EMU ones. Bariviera et al. [6] find that the 2008 financial crisis affected the long-term memory of bond returns in several European markets. It seems clear that the global financial crisis modified investors' perception of risk and that the regional diversification of investment portfolios propagated market risks. Since 2008 European countries have been more unstable, and the single currency acted as a transmission mechanism of instabilities, becoming the Achilles' heel of the EMU and putting into question the convenience of close economic and financial integration. Alonso and Cendejas [14] study the effect of the financial crisis on European financial markets and reaffirm that this period led to financial divergence within the European Union.

Entropy and statistical complexity
The information content of a system is usually evaluated via a probability distribution function (PDF) describing the apportionment of some measurable or observable quantity, commonly a time series S(t) = {x_t; t = 1, ..., N}. An information measure can be roughly defined as a quantity that characterizes this given probability distribution. The Shannon entropy is very often used as the most "natural" one [19]. Given any arbitrary discrete probability distribution P = {p_i, i = 1, ..., M}, with M the number of degrees of freedom, Shannon's logarithmic information measure reads

S[P] = -\sum_{i=1}^{M} p_i \ln p_i.

It can be regarded as a measure of the uncertainty associated with the physical process described by P. If S[P] = S_min = 0, we are in a position to predict with complete certainty which of the possible outcomes i, whose probabilities are given by p_i, will actually take place. Our knowledge of the underlying process described by the probability distribution is maximal in this instance. In contrast, our knowledge is minimal for the equiprobable distribution P_e = {p_i = 1/M, i = 1, ..., M} and, consequently, the uncertainty is maximal, S[P_e] = S_max. It is widely known that an entropic measure does not quantify the degree of structure present in a process [20]. Moreover, it was recently shown that measures of statistical or structural complexity are necessary for a better understanding of chaotic time series because they are able to capture their organizational properties [21]. This specific kind of information is not revealed by randomness measures. The opposite instances of perfect order and maximal randomness (a periodic sequence and a fair coin toss, for example) are very simple to describe because they do not have any structure. The complexity should be zero in both cases. At a given distance from these extremes, a wide range of possible degrees of physical structure exists. The complexity measure allows one to quantify this array of behavior [21].
Rosso and coworkers introduced an effective statistical complexity measure (SCM) that is able to detect essential details of the dynamics and differentiate different degrees of periodicity and chaos [22]. This specific SCM, which provides important additional information regarding the peculiarities of the underlying probability distribution, is defined via the product

C_{JS}[P] = Q_J[P, P_e] \, H_S[P]

of the normalized Shannon entropy

H_S[P] = S[P]/S_max,

with S_max = S[P_e] = \ln M (0 ≤ H_S ≤ 1) and P_e the equiprobable distribution, and the so-called disequilibrium Q_J. This latter quantifier is defined in terms of the extensive (in the thermodynamical sense) Jensen-Shannon divergence J[P, P_e] that links the two PDFs. We have

Q_J[P, P_e] = Q_0 \, J[P, P_e] = Q_0 \left\{ S\left[ (P + P_e)/2 \right] - S[P]/2 - S[P_e]/2 \right\},

with Q_0 a normalization constant, equal to the inverse of the maximum possible value of J[P, P_e]. We stress the fact that the statistical complexity defined above is the product of two normalized entropies (the Shannon entropy and the Jensen-Shannon divergence), but it is a non-trivial function of the entropy because it depends on two different probability distributions, i.e., the one corresponding to the state of the system, P, and the equiprobable distribution, P_e (taken as the reference state). Furthermore, it has been shown that, for a given value of H_S, the range of possible SCM values varies between a minimum C_min and a maximum C_max [24]. Therefore, the evaluation of the complexity provides additional insight into the details of the system's probability distribution, which is not discriminated by randomness measures like the entropy. It can also help to uncover information related to the correlational structure between the components of the physical process under study [25,26].
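To make the definitions above concrete, a minimal Python sketch of the normalized Shannon entropy H_S and the statistical complexity C_JS is given below. This is an illustrative implementation, not the authors' code; the constant Q_0 uses the standard closed-form maximum of the Jensen-Shannon divergence between an arbitrary PDF and the equiprobable one, attained when P is a delta distribution.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy S[P] = -sum p_i ln p_i, with the convention 0 ln 0 = 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log(nz))

def normalized_entropy(p):
    """Normalized Shannon entropy H_S = S[P] / ln M."""
    return shannon_entropy(p) / np.log(len(p))

def statistical_complexity(p):
    """Statistical complexity C_JS = Q_J[P, P_e] * H_S[P], where Q_J is the
    Jensen-Shannon divergence to the equiprobable distribution P_e,
    normalized by its maximum possible value."""
    p = np.asarray(p, dtype=float)
    m = len(p)
    pe = np.ones(m) / m
    js = shannon_entropy((p + pe) / 2) - shannon_entropy(p) / 2 - shannon_entropy(pe) / 2
    # Q_0 = 1 / max J[P, P_e]; the maximum is attained for a delta distribution
    q0 = -2.0 / (((m + 1.0) / m) * np.log(m + 1.0) - 2.0 * np.log(2.0 * m) + np.log(m))
    return q0 * js * normalized_entropy(p)
```

Both quantifiers vanish for a delta distribution, while the equiprobable distribution yields H_S = 1 and C_JS = 0, as the definitions require.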

Complexity-entropy plane
In statistical mechanics one is often interested in isolated systems characterized by an initial, arbitrary, and discrete probability distribution, and the main objective is to describe their evolution towards equilibrium. At equilibrium, we can suppose, without loss of generality, that this state is given by the equiprobable distribution P_e. The temporal evolution of the statistical complexity measure (SCM) can be analyzed using a two-dimensional (2D) diagram of C_JS versus time t. However, the second law of thermodynamics states that, for isolated systems, entropy grows monotonically with time (dH_S/dt ≥ 0). This implies that H_S can be regarded as an arrow of time, so that an equivalent way to study the temporal evolution of the SCM is through the analysis of C_JS versus H_S. The complexity-entropy plane has been used to study changes in the dynamics of a system originated by modifications of some characteristic parameters (see, for instance, Refs. [4,5,27,28] and references therein).

Estimation of the probability distribution function
The study and characterization of time series S(t) by recourse to information theory tools assume that the underlying probability distribution function (PDF) is given a priori. Consequently, part of the concomitant analysis involves extracting the PDF from the data. Bandt and Pompe (BP), almost ten years ago, introduced a successful methodology for evaluating the PDF associated with scalar time series data using a symbolization technique [29]. The pertinent symbolic data are created by ranking the values of the series, reordering the embedded data in ascending order, which is equivalent to a phase space reconstruction with embedding dimension (pattern length) D and time lag τ (see definitions and methodological details in Ref. [30]). In this way it is possible to quantify the diversity of the ordering symbols (patterns) derived from a scalar time series. Note that the appropriate symbol sequence arises naturally from the time series, and no model-based assumptions are needed. In fact, the "partitions" are devised by comparing the order of neighboring amplitude values rather than by apportioning these amplitudes according to different levels. This technique, as opposed to most of those in current practice, takes into account the temporal structure of the time series generated by the physical process under study. This feature allows us to uncover important details concerning the ordinal structure of the time series [31-34].
It is clear that, in applying this prescription for symbolizing time series, some details of the original amplitude information and variability are lost. However, a meaningful reduction of the complex system to its basic inherent structure is provided. The symbolic representation of time series by recourse to a comparison of consecutive (τ = 1) or non-consecutive (τ > 1) points allows for an accurate empirical reconstruction of the underlying phase space, even in the presence of weak (observational and dynamical) noise [29]. Furthermore, the ordinal-pattern-associated PDF is invariant with respect to nonlinear monotonous transformations. Accordingly, nonlinear drifts or scalings artificially introduced by a measurement device will not modify the estimation of the quantifiers, a useful property when dealing with experimental data (see, e.g., Ref. [35]). These advantages make the BP approach more convenient than conventional methods based on range partitioning. Additional advantages of the method reside in its simplicity (we need few parameters: the embedding dimension D and the embedding delay τ) and the extremely fast nature of the pertinent calculation process.
The BP-generated probability distribution P is obtained once we fix the embedding dimension D and the embedding delay τ. The former parameter plays an important role in the evaluation of the appropriate probability distribution, since D determines the number of accessible states, given by D!. Moreover, it has been established that the length N of the time series must satisfy the condition N ≫ D! in order to achieve reliable statistics and a proper distinction between stochastic and deterministic dynamics [31,36]. With respect to the selection of the parameters, BP suggest in their cornerstone paper [29] to work with 3 ≤ D ≤ 7 and time lag τ = 1. Nevertheless, other values of τ might provide additional information. It has been recently shown that this parameter is strongly related, when it is relevant, to the intrinsic time scales of the system under analysis [37-40].
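As an illustration of the symbolization step, the Bandt-Pompe ordinal-pattern PDF can be sketched in a few lines of Python. This is illustrative code, not the authors'; ties between equal values are resolved by order of appearance via a stable sort, one common convention.

```python
from itertools import permutations
import numpy as np

def ordinal_pattern_pdf(x, D=5, tau=1):
    """Bandt-Pompe PDF: relative frequencies of the D! ordinal patterns
    found in the series x with embedding dimension D and delay tau."""
    x = np.asarray(x, dtype=float)
    counts = {perm: 0 for perm in permutations(range(D))}
    n = len(x) - (D - 1) * tau  # number of embedded vectors
    for i in range(n):
        window = x[i:i + D * tau:tau]
        # the ordinal pattern is the permutation that sorts the window
        counts[tuple(np.argsort(window, kind="stable"))] += 1
    return np.array([c / n for c in counts.values()])
```

For a monotonically increasing series only the identity pattern occurs, so the resulting permutation entropy is zero; a fully random series spreads mass over all D! patterns.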
In this work both quantifiers, namely the normalized Shannon entropy H_S and the SCM C_JS, are evaluated using the permutation probability distribution. Defined in this way, these quantifiers are usually known as permutation entropy and permutation statistical complexity [41]. They characterize the diversity and the correlational structure, respectively, of the orderings present in the complex time series. The complexity-entropy causality plane (CECP) is defined as the two-dimensional (2D) diagram obtained by plotting permutation statistical complexity (vertical axis) versus permutation entropy (horizontal axis) for a given system [31]. For further details about the estimation of the permutation quantifiers and an exhaustive list of their main biomedical and econophysics applications we refer the reader to reference [30].
The estimation of the permutation quantifiers considered in this work does not require stationarity of the time series under analysis. Actually, it has been shown that the distribution of ordinal patterns is time-invariant for processes with stationary increments; consequently, unbiased estimators of the ordinal pattern probabilities are obtained from their corresponding relative frequencies [42]. In particular, fractional Brownian motions, which are non-stationary stochastic processes, have been suitably characterized by estimating these symbolic information-theory-derived quantifiers [31,41,43,44].

Data
We use the JP Morgan government bond index (GBI). This index is made up of domestic sovereign fixed-rate bonds that give international institutional investors an opportunity to invest in liquid debt markets. This means that bonds are stable, active and regularly issued. We use the daily redemption yield of the indices of government bonds that mature in between seven and ten years. Our database, retrieved from Datastream, comprises seventeen European countries: Austria, Belgium, Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Ireland, Italy, Netherlands, Poland, Portugal, Spain, Sweden and the United Kingdom. The period of study goes from 01/01/1999 until 23/04/2013, for a total of N = 3733 datapoints, except for Poland, which begins on 01/12/1999 (N = 3495 datapoints), and Hungary and the Czech Republic, which begin on 01/01/2001 (N = 3212 datapoints). We perform the analysis dividing the sample into three non-overlapping periods. The rationale for dividing the sample is to evaluate the information content of the time series in different stages of the European Monetary Union. The first period goes from 01/01/1999 to 30/12/2003 and corresponds to the introduction of the euro as common currency. The second period corresponds to the consolidation of the common currency until the 2008 financial crisis. The date we select as the beginning of the financial crisis is 15/09/2008, the fall of Lehman Brothers; this date was also used in several studies, e.g. in references [45,46]. Finally, the third period comprises the aftermath of the crisis, from 15/09/2008 to the end of the sample. Daily prices are analyzed in this work. It is well known that most research on financial markets focuses on price returns because they are stationary. However, it has previously been shown that, in the particular case of the normalized permutation entropy and in order to discriminate time series, better results are obtained for prices than for returns [47].

Results
According to the aim of our paper, i.e. to analyze the behavior of informational efficiency under different economic environments, we compute a set of quantifiers (described in Sect. 3) for each country and for each subperiod (detailed in Sect. 4). In particular, we are interested in extracting the information endowment of the bond yield time series during the period of establishment of the euro, its consolidation, and the aftermath of the 2008 financial crisis.
Permutation entropy and permutation statistical complexity are estimated for pattern lengths (embedding dimensions) D = {3, 4, 5} and embedding delay τ = 1 (because we use daily data). The pattern length should be as large as possible in order to include more information in the permutations. Nevertheless, the sample size N should fulfill the condition N ≫ D! in order to obtain reliable statistics [31,36]. Consequently, the maximum pattern length under consideration is D = 5. Table 1 shows the results for the permutation entropy and permutation statistical complexity for each market and each subperiod, considering D = 5 and τ = 1. Results for the other pattern lengths are similar. We detect the presence of two outliers; the first corresponds to Ireland during the first subperiod and the second to Greece during the third period. We also observe clusters of points, which are roughly grouped by subperiod. Since the greater the embedding dimension, the greater the causality information, hereafter results are discussed only for D = 5. These clusters can be observed in greater detail in Figure 4. A high correlation is clearly observed between the two quantifiers, i.e. permutation entropy and permutation statistical complexity. This information redundancy is characteristic of stochastic processes. In order to exemplify this fact we have performed the analysis of fractional Brownian motions (fBms) with different Hurst exponents. These correlated stochastic processes are widely used for modeling financial time series. More precisely, we have generated one thousand independent numerical realizations of fBms with Hurst exponent H ∈ {0.10, 0.20, 0.30, 0.40, 0.45, 0.50, 0.55, 0.60, 0.65, 0.70, 0.80, 0.90} of length N = 3733 datapoints. The function wfbm of MATLAB, which follows the algorithm proposed by Abry and Sellan [48], was employed for the simulation task.
The curve described by fBms in the CECP with D = 5 and τ = 1 is depicted in Figures 3 and 4 (black continuous curve). The locations of the European sovereign bonds and of the fBms are very close in the CECP, supporting the fact that they share some dynamical properties. More precisely, according to this information-theory-derived methodology, the dynamics of the sovereign bond series are compatible with those of correlated stochastic processes.
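For readers without access to MATLAB's wfbm, fBm realizations can also be generated in Python. The sketch below uses the Davies-Harte circulant-embedding method, a different (but also exact) algorithm from the Abry-Sellan wavelet synthesis employed in this work, and can be used to trace the fBm reference curve in the CECP.

```python
import numpy as np

def fbm(n, hurst, rng=None):
    """Fractional Brownian motion of length n via the Davies-Harte
    (circulant embedding) method; a numpy stand-in for MATLAB's wfbm."""
    rng = np.random.default_rng() if rng is None else rng
    # Autocovariance of the fractional Gaussian noise increments
    k = np.arange(n)
    gamma = 0.5 * (np.abs(k - 1) ** (2 * hurst) - 2 * k ** (2 * hurst)
                   + (k + 1) ** (2 * hurst))
    # Circulant embedding of the covariance; its eigenvalues come from the FFT
    c = np.concatenate([gamma, gamma[1:-1][::-1]])
    eig = np.fft.fft(c).real
    eig[eig < 0] = 0.0  # guard against tiny negative round-off values
    m = len(c)
    z = rng.normal(size=m) + 1j * rng.normal(size=m)
    # Fractional Gaussian noise with unit increment variance, then integrate
    fgn = (np.fft.fft(np.sqrt(eig) * z) / np.sqrt(m))[:n].real
    return np.cumsum(fgn)
```

For hurst = 0.5 the increments become uncorrelated and the path reduces to ordinary Brownian motion; sweeping hurst over the values listed above mimics the numerical experiment described in the text.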
In Figure 5 we compare the locations in the CECP of the original and shuffled data. The shuffling procedure destroys all non-trivial temporal correlations by sorting the original data in random order. The estimated permutation entropy and statistical complexity for the shuffled realizations are close to 1 and 0, respectively. This result allows us to conclude that the underlying temporal correlations are significant and play a central role in bond price formation. Moreover, this simple test confirms that the values estimated from the original sovereign bond data are not obtained by chance. As explained in Section 3, the CECP allows for a discrimination of informational efficiency according to the planar location of the time series under analysis. Consequently, the distance to the maximal efficiency point, i.e. H_S = 1 and C_JS = 0, can be considered an inefficiency metric. Taking this into account, we propose the Euclidean distance to this point as our inefficiency index:

d = \sqrt{(1 - H_S)^2 + C_{JS}^2}.

Table 2 displays the inefficiency of each market for each subperiod. If we consider each subperiod as a whole, the first subperiod has a mean inefficiency of 0.215. Graphically (see Fig. 3), four outliers can be detected: Ireland, Poland, Hungary and the Czech Republic. These four countries exhibit the worst informational efficiency for this period. The last three did not belong to the EU during this subperiod. The remaining thirteen countries of the sample constitute a compact cluster with a mean inefficiency of 0.185. Within this group the most efficient are Germany and France, and the least efficient is Austria. In the second subperiod practically all markets in the sample enhance their informational efficiency; the only exception is the Czech Republic.
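As a minimal sketch (illustrative Python, not the authors' code), the inefficiency index is simply the Euclidean distance from a market's CECP coordinates to the point (1, 0):

```python
import math

def inefficiency(h_s, c_js):
    """Distance from (H_S, C_JS) to the maximal-efficiency point (1, 0);
    larger values indicate a less informationally efficient market."""
    return math.hypot(1.0 - h_s, c_js)
```

A perfectly efficient (fully random) series, with H_S = 1 and C_JS = 0, has inefficiency 0.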
We detect that three of the outliers of the previous period (Poland, Hungary and the Czech Republic) continue to show singular behavior, located apart from the other countries. It must be highlighted that these countries joined the EU at the beginning of this subperiod. On the other hand, Ireland approaches the other countries to form a compact cluster. This behavior reflects the period of consolidation of the monetary union. The mean inefficiency of this period for all markets in the sample is 0.177, reflecting an improvement in this metric. For comparison with the previous period, the mean inefficiency without the original four outliers is 0.169, also reflecting a more random behavior of the markets. Within this group the most efficient market is the United Kingdom (UK) and the least efficient is Sweden.
The third period, as defined in Section 4, begins with the spillover of the 2008 financial crisis across the EU. The main finding in this period is a deterioration of informational efficiency for all countries except the UK. The mean inefficiency is 0.220 and, considering all markets except the four original outliers, 0.217. This means that the disturbance produced by the financial crisis was common to all countries, with the aforementioned exception of the UK. Contrary to the previous periods, there is no cluster formation. Considering that the four original outliers do not alter our results, we continue our comparative static analysis with the remaining thirteen markets. In this case, there is a large increase in the standard deviation of the inefficiency index across countries: the standard deviation in this period is close to ten times greater than in the previous one. The most efficient market is again the United Kingdom and the least efficient is Greece.
An alternative analysis can be done by looking at the relative position of each market in each period. Table 3 ranks markets from the most to the least efficient in each subperiod. In the first subperiod, we can observe that Ireland and three countries that were not yet in the EU are at the bottom of the ranking. These countries were among the least developed of our sample, according to Gross Domestic Product (GDP) per capita. During the second period, the consolidation of the euro area helped some countries, such as Ireland and Greece, to increase their efficiency. The new EU countries (Hungary, Poland and the Czech Republic) remain at the bottom of the ranking. However, one remarkable feature of this period is that EU countries that do not belong to the euro area (Denmark and Sweden) reduced their efficiency vis-à-vis euro countries. In some way it seems that not belonging to the euro was a handicap for enhancing efficiency. The exception was the UK, which is the most efficient market in our sample for this period. In the third subperiod results changed radically. In the bottom part of the ranking there are, besides Poland and the Czech Republic, the PIIGS countries (Portugal, Ireland, Italy, Greece, Spain) and Belgium. Additionally, countries that did not adopt the euro are better off than euro countries; indeed, they are ranked in better positions than in the first and second subperiods.
In order to analyze the temporal evolution of the randomness degree associated with the European sovereign bonds, we estimate the permutation entropy for D = 4 and τ = 1 using a sliding window of N_s = 220 datapoints, which roughly corresponds to one business year. We move this window forward by δ_s = 20 points, approximately one business month, eliminating the first δ_s observations and including the next ones, and the quantifier is re-estimated. This procedure is repeated until the end of the time series. We cannot include Greece and Ireland in this analysis because these countries have constant indices over several consecutive days (see Fig. 6); in particular, the index corresponding to Ireland is constant from 20/05/1999 onward for an extended stretch, so both countries are excluded from the present dynamical analysis. With the estimated permutation entropy values for all the different countries in each sliding window, we evaluate the mean and standard deviation of the permutation entropy as a function of time.
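The sliding-window procedure described above can be sketched as follows. This is illustrative Python, not the authors' code; entropy_fn stands for any window-level estimator of the permutation entropy, such as one built from the ordinal-pattern PDF.

```python
import numpy as np

def sliding_window_quantifier(x, entropy_fn, n_s=220, delta_s=20):
    """Evaluate entropy_fn over windows of n_s points (~one business year),
    advancing delta_s points (~one business month) at each step."""
    x = np.asarray(x, dtype=float)
    return np.array([entropy_fn(x[start:start + n_s])
                     for start in range(0, len(x) - n_s + 1, delta_s)])
```

Applying this to every country and taking the cross-country mean and standard deviation at each step yields curves analogous to those shown in Figure 7.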
In this way we sketch a graph of the randomness evolution averages and deviations obtained for the whole period of data. Figure 7 exhibits the mean permutation entropy (red curve) and the corresponding standard deviation (blue vertical lines). Segmented black vertical lines indicate the three subperiods considered in the previous static analysis.
The results of the sliding-window analysis confirm that the common currency and the 2008 financial crisis affected the synchronization of the European markets. In fact, as can be observed in Figure 7, there is a reduction in the standard deviation of the permutation entropy until June 2004. This reduction is achieved during the first subperiod of our previous analysis. From this date until mid-2008 the standard deviation remains at low levels, reflecting the consolidation of the monetary union. The contagion effect of the financial crisis seems to generate a desynchronization of the markets, reflected in a constant increase in the standard deviation of the permutation entropy. The absolute minimum of the standard deviation over the whole period is reached just before the outbreak of the crisis. We believe that the separation between the three periods is confirmed by the sliding-window analysis. Last but not least, Figure 7 shows inconclusive signs of the start of a new period of synchronization among markets, reflected in decreasing standard deviation values. This could be a consequence of the bailout programs set up by the EU and the stronger measures adopted by the European Central Bank. The analysis should be continued in time in order to confirm this provisional finding.

Conclusions
This paper studies the evolution of the informational efficiency of seventeen European sovereign bond markets. The period under analysis (1999-2013) is long enough to scrutinize the impact of important economic events such as the establishment of the common currency and the 2008 financial crisis. The time series are analyzed using two symbolic quantifiers, permutation entropy and permutation statistical complexity, which are derived from information theory. The proposed methodology is new and complementary to standard econometric analysis. According to our results, we detect diverse informational efficiency levels. From 1999 until 2008 there is a trend toward synchronization of the time series of the different markets, reflected in the formation of a more compact cluster in the CECP. Moreover, this synchronization is observed as a reduction in the standard deviation of the unpredictability degree of the time series, quantified via the permutation entropy, when using sliding windows. This time series coupling seems to be a consequence of the establishment of the euro. In fact, during 2004-2008 the most efficient sovereign markets are those of the euro area, with the exception of the UK. The informational efficiency was negatively affected by the 2008 financial crisis, which produced a decoupling of the European sovereign bond time series, reflected in an increment of the standard deviation of the permutation entropy and a greater dispersion of the markets across the CECP. Additionally, the most affected markets are those of the euro area. On the contrary, several non-EMU markets, although less efficient than in the previous period, improved their relative efficiency position within our sample.