Information Theory


  • Gowtham R Kumar, Cheuk Ting Li and Abbas El Gamal, "Exact Common Information," presented at ISIT-2014, Honolulu, HI.
    pdf | slides
  • ― This paper introduces the notion of exact common information, which is the minimum description length of the common randomness needed for the exact distributed generation of two correlated random variables \((X,Y)\). We introduce the quantity \(G(X;Y)=\min_{X\to W\to Y}H(W)\) as a natural bound on the exact common information and study its properties and computation. We then introduce the exact common information rate, which is the minimum description rate of the common randomness for the exact generation of a 2-DMS \((X,Y)\). We give a multiletter characterization for it as the limit \(\bar{G}(X;Y)=\lim_{n\to\infty}(1/n)G(X^n;Y^n)\). While in general \(\bar{G}(X;Y)\) is greater than or equal to the Wyner common information, we show that they are equal for the Symmetric Binary Erasure Source. We do not know, however, if the exact common information rate has a single letter characterization in general.

  • Gowtham R Kumar and Thomas A Courtade, "Which Boolean Functions are Most Informative?", presented at ISIT-2013, Istanbul, Turkey.
    pdf | slides
  • ― We introduce a simply stated conjecture regarding the maximum mutual information a Boolean function can reveal about noisy inputs. Specifically, let \(X^n\) be i.i.d. Bernoulli(1/2), and let \(Y^n\) be the result of passing \(X^n\) through a memoryless binary symmetric channel with crossover probability \(\alpha\). For any Boolean function \(b:\{0,1\}^n\rightarrow \{0,1\}\), we conjecture that $$I(b(X^n);Y^n)\leq 1-H(\alpha).$$ While the conjecture remains open, we provide substantial evidence supporting its validity.
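
    As a sanity check (my own illustration, not code from the paper), the conjecture can be verified exhaustively for small \(n\). The sketch below, assuming \(n=2\) and \(\alpha=0.1\), enumerates all 16 Boolean functions, computes \(I(b(X^n);Y^n)\) by brute force, and confirms that none exceeds \(1-H(\alpha)\), a value achieved by the dictator function \(b(x)=x_1\).

    ```python
    import itertools, math

    def H(p):
        """Binary entropy in bits."""
        return 0.0 if p in (0.0, 1.0) else -p*math.log2(p) - (1-p)*math.log2(1-p)

    n, alpha = 2, 0.1
    xs = list(itertools.product([0, 1], repeat=n))   # X^n uniform over {0,1}^n
    ys = list(itertools.product([0, 1], repeat=n))

    def mutual_info(b):
        """I(b(X^n); Y^n) where Y^n is X^n through a per-bit BSC(alpha)."""
        joint = {}                                   # joint law of (b(X^n), Y^n)
        for x in xs:
            px = 1.0 / len(xs)
            for y in ys:
                py_x = 1.0
                for xi, yi in zip(x, y):
                    py_x *= (1 - alpha) if xi == yi else alpha
                key = (b[x], y)
                joint[key] = joint.get(key, 0.0) + px * py_x
        pu, py = {}, {}
        for (u, y), p in joint.items():
            pu[u] = pu.get(u, 0.0) + p
            py[y] = py.get(y, 0.0) + p
        return sum(p * math.log2(p / (pu[u] * py[y]))
                   for (u, y), p in joint.items() if p > 0)

    bound = 1 - H(alpha)
    best = 0.0
    for vals in itertools.product([0, 1], repeat=len(xs)):
        b = dict(zip(xs, vals))                      # one of the 16 Boolean functions
        I = mutual_info(b)
        assert I <= bound + 1e-9                     # conjectured bound holds
        best = max(best, I)
    print(best, bound)
    ```

    For \(n=2\) the maximum is attained by the dictator functions, since \(I(X_1;Y^2)=I(X_1;Y_1)=1-H(\alpha)\).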

  • Gowtham Kumar and Alexandros Manolakos, "No input symbol should occur more frequently than 1-1/e", arXiv:1201.6425 [cs.IT].
    pdf | poster
  • ― Consider any discrete memoryless channel (DMC) with arbitrary but finite input and output alphabets \(\mathcal{X}, \mathcal{Y}\) respectively. Then, under any capacity achieving input distribution, every symbol occurs with probability less than \(1-1/e\). That is, \[ \max\limits_{x \in \mathcal{X}} P^*(x) < 1-\frac{1}{e} \] where \(P^*(x)\) is a capacity achieving input distribution. We also provide sufficient conditions under which a discrete distribution can be a capacity achieving input distribution for some DMC. Lastly, we show that there is no similar restriction on the capacity achieving output distribution.
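
    The bound can be probed numerically with the standard Blahut-Arimoto algorithm. The sketch below (an illustration, not code from the paper) approximates the capacity achieving input distribution of a randomly generated 4x4 DMC and checks that its largest mass stays below \(1-1/e \approx 0.632\).

    ```python
    import numpy as np

    def blahut_arimoto(W, iters=2000):
        """Approximate the capacity achieving input distribution of a DMC.

        W[x, y] = P(y | x); rows of W must sum to 1.
        """
        nx = W.shape[0]
        p = np.full(nx, 1.0 / nx)                 # start from the uniform input
        for _ in range(iters):
            q = p @ W                             # induced output distribution
            D = (W * np.log(W / q)).sum(axis=1)   # KL( W(.|x) || q ) per input x
            p = p * np.exp(D)                     # multiplicative update
            p /= p.sum()
        return p

    rng = np.random.default_rng(0)
    W = rng.random((4, 4))                        # random 4x4 DMC (strictly positive)
    W /= W.sum(axis=1, keepdims=True)

    p_star = blahut_arimoto(W)
    print(p_star, p_star.max(), 1 - 1 / np.e)
    ```

    The random channel here has strictly positive transition probabilities, so the logarithms are always finite; a channel with zero entries would need the usual 0·log 0 = 0 convention.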

  • Paul Cuff, Thomas Cover, Gowtham Kumar, Lei Zhao, "A Lattice of Gambles", presented at IEEE ISIT-2011.
    pdf | slides | poster
  • ― A gambler walks into a hypothetical fair casino with a very real dollar bill, but by the time he leaves he's exchanged the dollar for a random amount of money. What is lost in the process? It may be that the gambler walks out at the end of the day, after a roller-coaster ride of winning and losing, with his dollar still intact, or maybe even with two dollars. But what the gambler loses the moment he places his first bet is position. He exchanges one distribution of money for a distribution of lesser quality, from which he cannot return. Our first discussion in this work connects known results of economic inequality and majorization to the probability theory of gambling and Martingales. We provide a simple proof that fair gambles cannot increase the Lorenz curve, and we also constructively demonstrate that any sequence of non-increasing Lorenz curves corresponds to at least one Martingale. We next consider the efficiency of gambles. If all fair gambles are available then one can move down the lattice of distributions defined by the Lorenz ordering. However, the step from one distribution to the next is not unique. Is there a sense of efficiency with which one can move down the Lorenz stream? One approach would be to minimize the average total volume of money placed on the table. In this case, it turns out that implementing part of the strategy using private randomness can help reduce the need for the casino's randomness, resulting in less money on the table that the casino cannot get its hands on.
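
    A toy numeric illustration of the first result (my own, not from the paper): each fair bet is a mean preserving spread, so the Lorenz curve can only move down. The hypothetical example below starts from a point mass of one dollar, applies two rounds of even-odds bets, and checks the pointwise ordering of the resulting Lorenz curves.

    ```python
    import numpy as np

    def lorenz(values, probs, grid):
        """Lorenz curve L(t): share of total wealth held by the poorest fraction t."""
        order = np.argsort(values)
        v = np.asarray(values, float)[order]
        p = np.asarray(probs, float)[order]
        cum_p = np.concatenate([[0.0], np.cumsum(p)])
        cum_w = np.concatenate([[0.0], np.cumsum(v * p)])
        cum_w /= cum_w[-1]
        return np.interp(grid, cum_p, cum_w)      # piecewise-linear Lorenz curve

    grid = np.linspace(0.0, 1.0, 101)

    # Before gambling: everyone holds exactly one dollar (perfect equality).
    L0 = lorenz([1.0], [1.0], grid)
    # One fair bet: win or lose half a dollar with equal odds (mean preserved).
    L1 = lorenz([0.5, 1.5], [0.5, 0.5], grid)
    # A second fair bet of the same size applied to each outcome.
    L2 = lorenz([0.0, 1.0, 1.0, 2.0], [0.25, 0.25, 0.25, 0.25], grid)

    assert np.all(L1 <= L0 + 1e-12)               # the Lorenz curve never rises
    assert np.all(L2 <= L1 + 1e-12)
    ```

    The first curve is the diagonal L(t) = t; each martingale step pushes the curve weakly downward, exactly the non-increasing sequence of Lorenz curves the abstract describes.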

  • Gowtham Kumar and Andrew Thangaraj, "Computation of Secrecy Capacity for More-Capable Channel Pairs", presented at IEEE ISIT-2008.
  • ― We prove that the Arimoto-Blahut-like algorithm introduced by Yasui et al. for computing the secrecy capacity of a less noisy broadcast channel can be extended to a more-capable DMC pair, subject to the availability of a suitable initial guess.


  • Gowtham R Kumar and Thomas A Courtade, "Which Boolean Functions are Most Informative?", poster presented at ITA-2013. Link
  • Gowtham Kumar, "On The Most Significant Bit w.r.t. Side Information", poster presented at CSoI-2012 workshop. Link
  • Gowtham Kumar, Alexandros Manolakos, "No input symbol should occur more frequently than 1-1/e", poster presented at CSoI-2011. Link
  • Paul Cuff, Tom Cover, Lei Zhao, Gowtham Kumar, "A Martingale Lattice", poster presented at ITA-2011. Link

Research Notes

These are simply results that I like. They have not been published and may never be, and I have not put in enough effort to free them from errors. They reflect my philosophy that research is meant to be fun, not necessarily meant for publication.

  • The Zhang-Berger region for multiple descriptions, when applied to broadcast coordination, coincides with the Paul Cuff region. Link

  • Computation of Rényi correlation using the singular value decomposition, and a simple proof of Witsenhausen's theorem. Link Matlab Code

  • Under correlation constraints, a Gaussian distribution is worst in the sense that it forces the highest error probability on the agreement of binary decisions. Link

  • On computation of secrecy capacity of Broadcast Channels using a 3-step Arimoto-Blahut-like alternating optimization algorithm. Link Matlab Code