Ph.D. Electrical Engineering
Finite-Length Information Theory
Modern communication systems impose stringent constraints on both delay and data rate. This motivates the search for novel finite-blocklength bounds on system performance that capture these constraints. Here we use classical and novel analysis techniques to derive accurate performance bounds for different communication systems.
Shannon's Information Theory establishes the fundamental limits of information processing systems. A concept hidden in the mathematical proofs of most of the Information Theory literature is that achieving these fundamental limits requires sequences of infinite duration.
Nowadays, the fading nature of wireless channels, together with the high rates and stringent delay requirements of modern communication standards, creates a trend toward communicating over short temporal durations. While the asymptotic fundamental limits derived by Shannon can still be used as performance bounds in the finite-blocklength regime, new analytical techniques are required to derive tighter bounds for short codebooks. In this regime, analytical tools valid only for infinitely long transmissions, such as typicality or Fano's inequality, are inaccurate and ought therefore to be replaced by tools valid for arbitrary transmission lengths.
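As an illustration of the kind of refinement sought (a standard result from the finite-blocklength literature, quoted here for context rather than derived in this text), the logarithm of the maximal code size at blocklength n and error probability epsilon admits the normal approximation

```latex
% C: channel capacity, V: channel dispersion,
% Q^{-1}: inverse of the Gaussian tail function
\log M^*(n,\epsilon) \;=\; nC \;-\; \sqrt{nV}\, Q^{-1}(\epsilon) \;+\; O(\log n) .
```

The second-order term quantifies the rate penalty paid for operating at finite blocklength; it vanishes as n grows, in agreement with Shannon's asymptotic limit.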
In our work we study general achievable rates and converse bounds in the finite-blocklength regime for several communication problems. The analytical techniques used include random coding, large-deviation theory, saddlepoint analysis, and statistical hypothesis testing. The results obtained offer new insights into how these bounds can be approached by practical schemes.
Joint source-channel coding
Implicit in Shannon's source-channel coding theorem is the fact that reliable transmission of a source through a channel can be accomplished by using separate source and channel codes. This means that a concatenation of a (channel-independent) source code followed by a (source-independent) channel code achieves vanishing error probability as the block length goes to infinity, as long as the source entropy is smaller than the channel capacity.
However, in the non-asymptotic regime an optimally designed joint source-channel code can perform strictly better than a separate code. This improvement (i.e., a reduction in error probability) has been quantified in terms of error exponents and in terms of source and channel dispersion. Joint design has an error exponent at most twice that of separate design, and a dispersion gain that depends on the target error probability; for vanishing values of the latter, the dispersion of the joint design is at best half that of the separate design. This potential gain justifies the interest in good finite-length joint source-channel codes.
Novel, accurate joint source-channel coding bounds are provided in [ISIT11] and [Allerton12]. In [ISIT11] we propose a new family of random-coding achievability bounds. In [Allerton12] we extend several finite-length converse results to the joint source-channel coding setting. This is related to the results in [ISIT13], where we prove that one of these converse bounds coincides with the exact error probability of the best code.
In our work we also propose a novel source-channel coding scheme in which source messages are assigned to disjoint classes and encoded by codes that depend on the class index. Under MAP decoding, this scheme yields random-coding bounds that attain the joint source-channel reliability function in the cases where it is known to be tight [TIT14]. Compared to previous results, the proof techniques employed in [TIT14] are elementary and generalize easily to continuous channels. More details can be found in [ISIT12], [CISS12] and [TIT14].
However, practical communication systems cannot implement MAP decoding efficiently, so the applicability of the results in [TIT14] is in principle limited. In [ISIT14] we characterize the performance of this coding scheme under simpler, suboptimal decoding. First, the channel output is processed in parallel, one decoder per class, by a bank of maximum-likelihood (ML) decoders. Then, the decoded message is selected from the outputs of the ML decoders according to a MAP criterion. While this construction, for a fixed number of classes, falls short of the best joint source-channel performance, it can be implemented with reduced complexity using existing source and channel codes. The scheme is shown to improve on the error exponent of separate coding and, as the number of classes grows, to approach the error exponent of joint source-channel coding. More details can be found in [arXiv14a].
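As a toy illustration of this decoding architecture, the sketch below simulates a bank of ML decoders followed by a MAP selection step. All parameters are our own assumptions for the example (an 8-message nonuniform source split into two classes, random codebooks, and a binary symmetric channel), not the setup of [ISIT14]:

```python
import random

random.seed(0)
n, p = 16, 0.05                     # blocklength and BSC crossover probability
prior = [0.2, 0.2, 0.2, 0.2, 0.05, 0.05, 0.05, 0.05]  # source probabilities
classes = [[0, 1, 2, 3], [4, 5, 6, 7]]                # messages grouped by class

# One random binary codebook per class (distinct codewords within a class)
codebook = {}
for cls in classes:
    for m, w in zip(cls, random.sample(range(2 ** n), len(cls))):
        codebook[m] = [(w >> i) & 1 for i in range(n)]

def bsc(x):
    """Pass a binary word through the binary symmetric channel."""
    return [b ^ (random.random() < p) for b in x]

def likelihood(y, x):
    """Channel likelihood W(y|x) for the BSC."""
    d = sum(a != b for a, b in zip(y, x))
    return (p ** d) * ((1 - p) ** (n - d))

def decode(y):
    # Bank of ML decoders, one per class
    ml = [max(cls, key=lambda m: likelihood(y, codebook[m])) for cls in classes]
    # MAP selection among the per-class candidates
    return max(ml, key=lambda m: prior[m] * likelihood(y, codebook[m]))

trials = 2000
errors = sum(decode(bsc(codebook[m])) != m
             for m in random.choices(range(8), weights=prior, k=trials))
err = errors / trials
print(f"estimated error probability: {err:.4f}")
```

Only the final selection step uses the source prior, which is what lets the scheme reuse off-the-shelf per-class decoders while still exploiting the source statistics.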
Hypothesis testing and information theory
Statistical hypothesis testing problems appear in areas as diverse as information theory, image processing, signal processing, the social sciences and biology. In the context of reliable communication, binary hypothesis testing has been instrumental in the derivation of converse bounds on the error probability. Already in 1967, Shannon, Gallager and Berlekamp used an instance of binary hypothesis testing to derive the well-known sphere-packing bound. Later, Blahut and Omura emphasized the fundamental role of binary hypothesis testing by obtaining converse bounds for different information-theoretic problems. More recently, the hypothesis-testing method has been used to obtain accurate finite-length lower bounds for several communication problems.
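The binary-hypothesis-testing quantity behind these converse bounds, often written β_α(P, Q), is the smallest type-II error (probability of "pass" under Q) among randomized tests whose probability of "pass" under P is at least α. For finite alphabets it can be computed exactly via the Neyman-Pearson lemma; a minimal sketch (the function name is ours, not from the cited papers):

```python
def np_beta(P, Q, alpha):
    """Minimum Q-probability of 'pass' over randomized tests whose
    P-probability of 'pass' is at least alpha (Neyman-Pearson lemma)."""
    # Include outcomes in decreasing order of the likelihood ratio P/Q,
    # randomizing on the boundary outcome.
    order = sorted(range(len(P)),
                   key=lambda i: P[i] / Q[i] if Q[i] > 0 else float("inf"),
                   reverse=True)
    p_mass, beta = 0.0, 0.0
    for i in order:
        if p_mass >= alpha:
            break
        if p_mass + P[i] <= alpha:
            p_mass += P[i]
            beta += Q[i]
        else:
            beta += (alpha - p_mass) / P[i] * Q[i]   # randomized boundary
            p_mass = alpha
    return beta

P = [0.5, 0.3, 0.2]
Q = [0.2, 0.3, 0.5]
print(round(np_beta(P, Q, 0.6), 6))   # 0.3
```

In converse arguments such as the meta-converse, P plays the role of the distribution induced by the code over the true channel and Q that of an auxiliary distribution; β then lower-bounds the error probability of any code.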
In [ISIT13] we show that binary hypothesis testing, with an appropriate choice of the test parameters, gives the exact error probability of a fixed joint source-channel code. While this expression is not computable in general, it exposes the weaknesses of several known converse bounds on the minimum achievable error probability.
In [arXiv14b] we further develop the connection between information-spectrum, hypothesis-testing and converse bounds in information theory by providing a number of alternative expressions for the error probability of Bayesian M-ary hypothesis testing. We show that this probability can be equivalently described by the error probability of a single binary hypothesis test with certain parameters, or by that of a bank of M binary hypothesis tests. We also provide an explicit alternative expression in terms of information-spectrum measures and illustrate the connection with existing information-spectrum bounds. This framework is then used to study the performance of different communication problems and to identify the weaknesses of several converse results in the literature.
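The central quantity in that analysis, the minimum error probability of Bayesian M-ary hypothesis testing, has a direct expression on finite alphabets: the optimal (MAP) test succeeds with probability Σ_y max_m π(m) P(y|m). A small sketch (illustrative code with our own toy numbers, not from [arXiv14b]):

```python
def map_error(prior, channels):
    """Minimum (MAP) error probability of an M-ary Bayesian test.

    prior    -- list of M prior probabilities pi(m)
    channels -- M lists, channels[m][y] = P(y | hypothesis m)
    """
    num_outcomes = len(channels[0])
    p_correct = sum(max(pi * ch[y] for pi, ch in zip(prior, channels))
                    for y in range(num_outcomes))
    return 1.0 - p_correct

# Two equiprobable hypotheses observed through one use of a BSC(0.1)
prior = [0.5, 0.5]
channels = [[0.9, 0.1], [0.1, 0.9]]
print(round(map_error(prior, channels), 6))   # 0.1
```

Expressing this same quantity as the error probability of a single binary test, or of a bank of M binary tests, is what links it to computable converse bounds.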
More to come...
This section is work in progress. I will keep updating it with recent results in finite-length information theory.
[arXiv14b] Gonzalo Vazquez-Vilar, Adria Tauste Campo, Albert Guillen i Fabregas and Alfonso Martinez. Statistical Hypothesis Testing and Lower Bounds to the Error Probability. IEEE Transactions on Information Theory. In review.
[arXiv14a] Irina E. Bocharova, Albert Guillen i Fabregas, Boris D. Kudryashov, Alfonso Martinez, Adria Tauste Campo and Gonzalo Vazquez-Vilar. Multi-Class Source-Channel Coding. IEEE Transactions on Information Theory. In review.
[ISIT14] Irina E. Bocharova, Albert Guillen i Fabregas, Boris D. Kudryashov, Alfonso Martinez, Adria Tauste Campo and Gonzalo Vazquez-Vilar. Source-Channel Coding with Multiple Classes. 2014 IEEE International Symposium on Information Theory (ISIT 2014), Honolulu, HI, USA, Jun 29-Jul 4, 2014.
[TIT14] Adria Tauste Campo, Gonzalo Vazquez-Vilar, Albert Guillen i Fabregas, Tobias Koch and Alfonso Martinez. A derivation of the source-channel error exponent using non-identical product distributions. IEEE Transactions on Information Theory, vol. 60, no. 6, pp. 3209-3217, Jun. 2014, doi: 10.1109/TIT.2014.2315818.
[ISIT13] Gonzalo Vazquez-Vilar, Adria Tauste Campo, Albert Guillen i Fabregas and Alfonso Martinez. For a Fixed Code the Meta-Converse Bound is Tight. 2013 IEEE International Symposium on Information Theory (ISIT 2013), Istanbul, Turkey, Jul 7-12, 2013.
[Allerton12] Adria Tauste Campo, Gonzalo Vazquez-Vilar, Albert Guillen i Fabregas, Tobias Koch and Alfonso Martinez. Converse Bounds for Finite-Length Joint Source-Channel Coding. 50th Annual Allerton Conference on Communication, Control and Computing (Allerton 2012), Allerton, IL, USA, Oct 1-5, 2012. Invited.
[ISIT12] Adria Tauste Campo, Gonzalo Vazquez-Vilar, Albert Guillen i Fabregas, Tobias Koch and Alfonso Martinez. Achieving Csiszár's exponent for joint source-channel coding with product distributions. 2012 IEEE International Symposium on Information Theory (ISIT 2012), Boston, USA, Jul 1-6, 2012.
[CISS12] Adria Tauste Campo, Gonzalo Vazquez-Vilar, Albert Guillen i Fabregas, Tobias Koch and Alfonso Martinez. Random Coding Bounds that Attain the Joint Source-Channel Exponent. 46th Annual Conference on Information Sciences and Systems (CISS 2012), Princeton, USA, March 21-23, 2012. Invited.
[ISIT11] Adria Tauste Campo, Gonzalo Vazquez-Vilar, Albert Guillen i Fabregas and Alfonso Martinez. Random-Coding Joint Source-Channel Bounds. 2011 IEEE International Symposium on Information Theory (ISIT 2011), Saint Petersburg, Russia, Jul 31-Aug 5, 2011.