Gonzalo
Vazquez-Vilar
Ph.D. Electrical Engineering
Finite-Length Information Theory
Modern communication systems impose both short-delay and high-data-rate constraints. This motivates the search for novel finite-blocklength bounds on system performance that capture these constraints. Here we use classical and novel analysis techniques to derive accurate bounds on the performance of different communication systems.
Introduction
Shannon's Information Theory establishes the fundamental limits of information processing systems. A fact hidden in the mathematical proofs of most of the Information Theory literature is that achieving these fundamental limits requires sequences of infinite duration.
Nowadays, the fading nature of wireless channels, together with the high data rates and strict delay requirements of modern communication standards, has created a trend toward communicating over short temporal durations. While the asymptotic fundamental limits derived by Shannon can still be used as performance bounds in the finite-blocklength regime, new analytical techniques are required to derive tighter bounds for short codes. In this regime, analytical tools that are valid only for infinitely long transmissions, such as typicality or Fano's inequality, are inaccurate and ought therefore to be replaced by tools valid for arbitrary transmission lengths.
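As a rough illustration of the gap between the asymptotic and finite-blocklength regimes, the sketch below evaluates the well-known normal approximation R ≈ C − √(V/n)·Q⁻¹(ε) (due to Polyanskiy, Poor and Verdú) for a binary symmetric channel. The channel parameter and blocklength are arbitrary example values, and the (log n)/(2n) correction term is omitted for simplicity.

```python
import math

def q_inv(eps):
    # Inverse of the Gaussian tail function Q, computed by bisection
    # (avoids a scipy dependency).
    q = lambda x: 0.5 * math.erfc(x / math.sqrt(2))
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if q(mid) > eps:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def bsc_normal_approx(p, n, eps):
    # Normal approximation R ~ C - sqrt(V/n) * Q^{-1}(eps) for a BSC(p);
    # the O(log n / n) term is deliberately dropped in this sketch.
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)   # binary entropy
    C = 1 - h                                            # capacity (bits/use)
    V = p * (1 - p) * (math.log2((1 - p) / p)) ** 2      # channel dispersion
    return C - math.sqrt(V / n) * q_inv(eps)

# At n = 1000 and error probability 1e-3, a BSC(0.11) (capacity ~0.5 bit)
# supports a noticeably smaller rate than capacity.
print(round(bsc_normal_approx(0.11, 1000, 1e-3), 3))
```

Even at a blocklength of 1000, the backoff from capacity is substantial, which is precisely why refined finite-blocklength bounds matter.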
In our work we study general achievable rates and converse bounds in the finite-blocklength regime for several communication problems. The analytical techniques used include random coding, large-deviation theory, saddlepoint analysis, and statistical hypothesis testing. The results obtained offer new insights into how to approach these bounds using practical schemes.
Joint source-channel coding
Implicit in Shannon's source-channel coding theorem is the fact that reliable transmission of a source through a channel can be accomplished by using separate source and channel codes. That is, the concatenation of a (channel-independent) source code with a (source-independent) channel code achieves vanishing error probability as the blocklength goes to infinity, as long as the source entropy is smaller than the channel capacity.
However, in the non-asymptotic regime an optimally designed joint source-channel code can perform strictly better than a separate code. This improvement (i.e., the reduction in error probability) has been quantified in terms of error exponents and in terms of source and channel dispersion. Joint design has an error exponent at most twice that of separate codes, and a dispersion gain that depends on the target error probability; for vanishing values of the latter, the dispersion of the joint design is at best half that of the separate design. This potential gain justifies the interest in good finite-length joint source-channel codes.
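A back-of-envelope consequence of the dispersion comparison above: under the normal approximation, the blocklength needed to operate at a fixed rate gap Δ with error probability ε grows linearly with the dispersion V, so halving the dispersion halves the required delay. The dispersion values and rate gap below are hypothetical, chosen only to make the scaling visible.

```python
# Q^{-1}(1e-3) ~ 3.09 for the standard Gaussian tail function.
QINV = 3.09

def required_blocklength(V, delta):
    # Solving R = C - sqrt(V/n) * Q^{-1}(eps) for n at rate gap
    # delta = C - R gives n ~ V * (Q^{-1}(eps) / delta)^2.
    return V * (QINV / delta) ** 2

# Hypothetical dispersions: a joint design with half the dispersion
# of the corresponding separate design, at the same rate gap.
n_joint = required_blocklength(0.5, 0.05)
n_sep = required_blocklength(1.0, 0.05)
print(round(n_joint), round(n_sep))  # the joint design needs half the blocklength
```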
Novel accurate joint source-channel bounds are provided in [ISIT11] and [Allerton12]. In [ISIT11] we propose a new family of random-coding achievability bounds. In [Allerton12] we extend several finite-length converse results to the joint source-channel coding setting. This is related to the results in [ISIT13], where we prove that one of these converse bounds indeed coincides with the exact error probability of the best code.
In our work we also propose a novel source-channel coding scheme in which source messages are assigned to disjoint classes and encoded by codes that depend on the class index. Under MAP decoding, this scheme yields random-coding bounds that attain the joint source-channel reliability function in the cases where it is known to be tight [TIT14]. Compared to previous results, the proof techniques employed in [TIT14] are quite elementary and can be easily generalized to continuous channels. More details can be found in [ISIT12], [CISS12] and [TIT14].
However, practical communication systems cannot implement MAP decoding efficiently, so the applicability of the results in [TIT14] is in principle limited. In [ISIT14] we characterize the performance of this coding scheme under simpler, suboptimal decoding. First, the channel output is processed in parallel for each class using a bank of maximum-likelihood (ML) decoders. Then, the decoded message is selected from the outputs of the ML decoders based on a MAP criterion. While, for a fixed number of classes, this construction fails to achieve the best performance of joint source-channel coding, it can be implemented with reduced complexity using existing source and channel codes. This scheme is shown to improve on the error exponent of separate coding and, as the number of classes increases, to approach the error exponent of joint source-channel coding. More details can be found in [arXiv14a].
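The decoding architecture just described can be sketched in a few lines. The toy source, codebooks and channel below are invented for illustration; only the structure (a bank of per-class ML decoders followed by a MAP selection among the class winners) reflects the scheme discussed above.

```python
import numpy as np

# Toy setup (all numbers invented): four source messages with a skewed
# prior, sent over a BSC(0.1) using length-5 binary codewords.
p_source = np.array([0.7, 0.2, 0.06, 0.04])
crossover = 0.1
n = 5

# Messages partitioned into two classes; each class uses its own
# (hypothetical) codebook, as in the multi-class construction.
classes = [[0, 1], [2, 3]]
codebooks = {
    0: np.array([[0, 0, 0, 0, 0], [1, 1, 1, 1, 1]]),
    1: np.array([[0, 0, 1, 1, 1], [1, 1, 0, 0, 0]]),
}

def likelihood(y, x):
    # BSC channel law P(y | x) as a function of the Hamming distance.
    d = int(np.sum(y != x))
    return crossover ** d * (1 - crossover) ** (n - d)

def decode(y):
    # Bank of ML decoders: the most likely codeword within each class...
    winners = []
    for c, members in enumerate(classes):
        lls = [likelihood(y, cw) for cw in codebooks[c]]
        best = int(np.argmax(lls))
        winners.append((members[best], lls[best]))
    # ...followed by a MAP selection among the class winners.
    return max(winners, key=lambda t: p_source[t[0]] * t[1])[0]

print(decode(np.array([0, 0, 0, 0, 0])))  # noiseless codeword of message 0
```

The per-class ML stage is exactly what an off-the-shelf channel decoder provides, which is what makes the construction attractive at fixed complexity.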
Hypothesis testing and information theory
Statistical hypothesis testing problems appear in areas as diverse as information theory, image processing, signal processing, the social sciences and biology. In the context of reliable communication, binary hypothesis testing has been instrumental in the derivation of converse bounds on the error probability. Already in 1967, Shannon, Gallager and Berlekamp used an instance of binary hypothesis testing to derive the well-known sphere-packing bound. Later, Blahut and Omura emphasized the fundamental role of binary hypothesis testing by obtaining converse bounds for different information-theoretic problems. More recently, the hypothesis-testing method has been used to obtain accurate finite-length lower bounds for several communication problems.
In [ISIT13] we show that, for a fixed joint source-channel code and an appropriate choice of the test parameters, binary hypothesis testing gives the exact error probability of the code. While this expression is not computable in general, it identifies the weaknesses of several known converse bounds on the minimum achievable error probability.
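For discrete distributions the underlying binary hypothesis testing quantity is easy to compute. The sketch below evaluates β_α(P, Q), the smallest type-II error over all (possibly randomized) tests whose probability of deciding P under P is at least α, i.e. the quantity appearing in hypothesis-testing converse bounds; the distributions and α are arbitrary example values.

```python
import numpy as np

def beta(alpha, P, Q):
    # beta_alpha(P, Q): minimum probability under Q of deciding "P",
    # over all randomized tests that decide "P" with probability at
    # least alpha under P. By the Neyman-Pearson lemma the optimal
    # test thresholds the likelihood ratio P/Q.
    order = np.argsort(-(P / Q))  # outcomes by decreasing likelihood ratio
    p_acc = q_acc = 0.0
    for i in order:
        if p_acc + P[i] < alpha:
            p_acc += P[i]
            q_acc += Q[i]
        else:
            # randomize on the boundary outcome to meet alpha exactly
            q_acc += Q[i] * (alpha - p_acc) / P[i]
            return q_acc
    return q_acc

P = np.array([0.5, 0.3, 0.2])  # example distributions on a 3-letter alphabet
Q = np.array([0.2, 0.3, 0.5])
print(beta(0.5, P, Q))
```

Here the optimal test at α = 0.5 accepts only the first outcome, giving β = 0.2; plugging channel input/output distributions into this routine is the basic step behind hypothesis-testing converse bounds.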
In [arXiv14b] we further develop the connection between information-spectrum measures, hypothesis testing and converse bounds in information theory by providing a number of alternative expressions for the error probability of Bayesian M-ary hypothesis testing. We show that this probability can be equivalently described by the error probability of either a single binary hypothesis test with certain parameters or a bank of M binary hypothesis tests. We also provide an explicit alternative expression using information-spectrum measures and illustrate the connection with existing information-spectrum bounds. This framework is then used to study the performance of different communication problems and to identify the weaknesses of several converse results in the literature.
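For finite alphabets the minimum error probability of a Bayesian M-ary test admits a direct computation: the MAP decision is correct with probability Σ_y max_j P(j) P(y|j), and the Bayes error is one minus that sum. A small sketch with invented numbers:

```python
import numpy as np

# Illustrative M-ary Bayesian test (all numbers invented): M = 3
# hypotheses observed through a 4-letter observation alphabet.
priors = np.array([0.5, 0.3, 0.2])
# channels[j, y] = P(y | hypothesis j); each row sums to 1.
channels = np.array([
    [0.7, 0.1, 0.1, 0.1],
    [0.1, 0.6, 0.2, 0.1],
    [0.1, 0.1, 0.2, 0.6],
])

# MAP is correct with probability sum_y max_j priors[j] * channels[j, y],
# so the minimum (Bayes) error probability is one minus that sum.
joint = priors[:, None] * channels
p_error = 1.0 - joint.max(axis=0).sum()
print(round(p_error, 4))
```

In a channel-coding instance the hypotheses are the M codewords, so this same quantity is the error probability of the optimal decoder; the alternative expressions in [arXiv14b] rewrite it in binary hypothesis-testing and information-spectrum form.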
More to come...
This section is a work in progress. I will keep updating it with recent results in finite-length information theory.
References
[arXiv14b] 
Gonzalo Vazquez-Vilar, Adria Tauste Campo, Albert Guillen i Fabregas and Alfonso Martinez. Statistical Hypothesis Testing and Lower Bounds to the Error Probability. IEEE Transactions on Information Theory. In review. 

[arXiv14a] 
Irina E. Bocharova, Albert Guillen i Fabregas, Boris D. Kudryashov, Alfonso Martinez, Adria Tauste Campo and Gonzalo Vazquez-Vilar. Multi-Class Source-Channel Coding. IEEE Transactions on Information Theory. In review. 

[ISIT14] 
Irina E. Bocharova, Albert Guillen i Fabregas, Boris D. Kudryashov, Alfonso Martinez, Adria Tauste Campo and Gonzalo Vazquez-Vilar. Source-Channel Coding with Multiple Classes. 2014 IEEE International Symposium on Information Theory (ISIT 2014). Honolulu, HI, USA, Jun 29 - Jul 4, 2014. 

[TIT14] 
Adria Tauste Campo, Gonzalo Vazquez-Vilar, Albert Guillen i Fabregas, Tobias Koch and Alfonso Martinez. A derivation of the source-channel error exponent using non-identical product distributions. IEEE Transactions on Information Theory, vol. 60, no. 6, pp. 3209-3217, Jun. 2014, doi: 10.1109/TIT.2014.2315818. 

[ISIT13] 
Gonzalo Vazquez-Vilar, Adria Tauste Campo, Albert Guillen i Fabregas and Alfonso Martinez. For a Fixed Code the Meta-Converse Bound is Tight. 2013 IEEE International Symposium on Information Theory (ISIT 2013). Istanbul, Turkey, Jul 7-12, 2013. 

[Allerton12] 
Adria Tauste Campo, Gonzalo Vazquez-Vilar, Albert Guillen i Fabregas, Tobias Koch and Alfonso Martinez. Converse Bounds for Finite-Length Joint Source-Channel Coding. 50th Annual Allerton Conference on Communication, Control and Computing (Allerton 2012). Allerton, IL, USA, Oct 1-5, 2012. Invited. 

[ISIT12] 
Adria Tauste Campo, Gonzalo Vazquez-Vilar, Albert Guillen i Fabregas, Tobias Koch and Alfonso Martinez. Achieving Csiszár's exponent for joint source-channel coding with product distributions. 2012 IEEE International Symposium on Information Theory (ISIT 2012). Boston, MA, USA, Jul 1-6, 2012. 

[CISS12] 
Adria Tauste Campo, Gonzalo Vazquez-Vilar, Albert Guillen i Fabregas, Tobias Koch and Alfonso Martinez. Random Coding Bounds that Attain the Joint Source-Channel Exponent. 46th Annual Conference on Information Sciences and Systems (CISS 2012). Princeton, NJ, USA, March 21-23, 2012. Invited. 

[ISIT11] 
Adria Tauste Campo, Gonzalo Vazquez-Vilar, Albert Guillen i Fabregas and Alfonso Martinez. Random-Coding Joint Source-Channel Bounds. 2011 IEEE International Symposium on Information Theory (ISIT 2011). Saint Petersburg, Russia, Jul 31 - Aug 5, 2011. 