Information Theory

Publications
― This paper introduces the notion of exact common information: the minimum description length of the common randomness needed for the exact distributed generation of two correlated random variables \((X,Y)\). We introduce the quantity \(G(X;Y)=\min_{X\to W\to Y}H(W)\) as a natural bound on the exact common information and study its properties and computation. We then introduce the exact common information rate, the minimum description rate of the common randomness for the exact generation of a 2-DMS \((X,Y)\), and give a multi-letter characterization for it as the limit \(\bar{G}(X;Y)=\lim_{n\to\infty}(1/n)G(X^n;Y^n)\). While in general \(\bar{G}(X;Y)\) is greater than or equal to the Wyner common information, we show that the two are equal for the Symmetric Binary Erasure Source. We do not know, however, whether the exact common information rate has a single-letter characterization in general.

― We introduce a simply stated conjecture regarding the maximum mutual information a Boolean function can reveal about noisy inputs. Specifically, let \(X^n\) be i.i.d. Bernoulli(1/2), and let \(Y^n\) be the result of passing \(X^n\) through a memoryless binary symmetric channel with crossover probability \(\alpha\). For any Boolean function \(b:\{0,1\}^n\rightarrow \{0,1\}\), we conjecture that $$I(b(X^n);Y^n)\leq 1-H(\alpha).$$ While the conjecture remains open, we provide substantial evidence supporting its validity.

― Consider any discrete memoryless channel (DMC) with arbitrary but finite input and output alphabets \(\mathcal{X}\) and \(\mathcal{Y}\), respectively. Then, for any capacity-achieving input distribution, every symbol occurs with probability less than \(1-1/e\). That is, \[ \max\limits_{x \in \mathcal{X}} P^*(x) < 1-\frac{1}{e}, \] where \(P^*\) is a capacity-achieving input distribution. We also provide sufficient conditions under which a discrete distribution can be a capacity-achieving input distribution for some DMC.
Lastly, we show that there is no similar restriction on the capacity-achieving output distribution.

― A gambler walks into a hypothetical fair casino with a very real dollar bill, but by the time he leaves he has exchanged the dollar for a random amount of money. What is lost in the process? It may be that the gambler walks out at the end of the day, after a rollercoaster ride of winning and losing, with his dollar still intact, or maybe even with two dollars. But what the gambler loses the moment he places his first bet is position. He exchanges one distribution of money for a distribution of lesser quality, from which he cannot return. Our first discussion in this work connects known results on economic inequality and majorization to the probability theory of gambling and martingales. We provide a simple proof that fair gambles cannot increase the Lorenz curve, and we constructively demonstrate that any sequence of nonincreasing Lorenz curves corresponds to at least one martingale. We next consider the efficiency of gambles. If all fair gambles are available, then one can move down the lattice of distributions defined by the Lorenz ordering. However, the step from one distribution to the next is not unique. Is there a sense of efficiency with which one can move down the Lorenz stream? One approach is to minimize the average total volume of money placed on the table. In this case, it turns out that implementing part of the strategy using private randomness can reduce the need for the casino's randomness, resulting in less money on the table that the casino cannot get its hands on.

― We prove that the Arimoto-Blahut-like algorithm introduced by Yasui et al. to solve for the secrecy capacity of a less noisy broadcast channel can be extended to a more-capable DMC pair, subject to the availability of a suitable initial guess.
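The conjectured bound \(I(b(X^n);Y^n)\leq 1-H(\alpha)\) from the second abstract can be checked exhaustively for small \(n\), since there are only \(2^{2^n}\) Boolean functions. The brute-force sketch below is my own illustration, not code from the paper; for \(n=3\) the maximum over all 256 functions is attained by a dictator function \(b(x)=x_1\), which already achieves \(1-H(\alpha)\).

```python
import itertools
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(n, alpha, b):
    """I(b(X^n); Y^n) where X^n is i.i.d. Bernoulli(1/2) and Y^n is X^n
    passed through a BSC with crossover alpha; b maps an n-bit tuple to 0/1."""
    seqs = list(itertools.product((0, 1), repeat=n))
    joint = {}  # joint distribution P(B = v, Y^n = y)
    for x in seqs:
        v = b(x)
        for y in seqs:
            d = sum(xi != yi for xi, yi in zip(x, y))
            p = 2.0 ** (-n) * alpha ** d * (1 - alpha) ** (n - d)
            joint[(v, y)] = joint.get((v, y), 0.0) + p
    pb, py = {}, {}
    for (v, y), p in joint.items():
        pb[v] = pb.get(v, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (pb[v] * py[y]))
               for (v, y), p in joint.items() if p > 0)

n, alpha = 3, 0.1
bound = 1 - h2(alpha)
seqs = list(itertools.product((0, 1), repeat=n))
best = 0.0
for bits in itertools.product((0, 1), repeat=2 ** n):  # all 256 functions
    table = dict(zip(seqs, bits))
    best = max(best, mutual_information(n, alpha, table.__getitem__))
print(best, bound)  # best matches the bound: the dictator b(x) = x_1 is tight
```

This only verifies tiny cases, of course; the abstract's point is that the general statement remains open.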
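The \(1-1/e\) bound on capacity-achieving input distributions can also be observed numerically. Below is a minimal textbook Blahut-Arimoto iteration (my own sketch, not the extension discussed in the last abstract) run on a Z-channel with crossover 1/2, whose optimal input distribution is the nonuniform \((0.6, 0.4)\); its largest mass, 0.6, sits strictly below \(1-1/e\approx 0.632\).

```python
import math

def blahut_arimoto(W, iters=3000):
    """Blahut-Arimoto iteration for DMC capacity.
    W[x][y] = P(Y = y | X = x). Returns (capacity in bits, input distribution)."""
    nx, ny = len(W), len(W[0])
    p = [1.0 / nx] * nx
    for _ in range(iters):
        q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        # d[x] = D( W(.|x) || q ), the per-input divergence against the output law
        d = [sum(W[x][y] * math.log2(W[x][y] / q[y])
                 for y in range(ny) if W[x][y] > 0) for x in range(nx)]
        cap = sum(p[x] * d[x] for x in range(nx))
        z = [p[x] * 2.0 ** d[x] for x in range(nx)]
        s = sum(z)
        p = [v / s for v in z]
    return cap, p

# Z-channel: input 0 is received noiselessly, input 1 is flipped half the time.
W = [[1.0, 0.0],
     [0.5, 0.5]]
cap, p = blahut_arimoto(W)
print(cap, max(p))  # capacity log2(5) - 2, and max p*(x) = 0.6 < 1 - 1/e
```

The uniform input of a symmetric channel like the BSC is even further from the bound, which is part of what makes the strict inequality interesting.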
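The claim that fair gambles cannot increase the Lorenz curve can be illustrated on a toy instance (my own construction, not from the paper): starting from a sure dollar, a single fair even-odds bet is a mean-preserving spread, and its Lorenz curve lies pointwise at or below the original.

```python
def lorenz(values):
    """Lorenz curve of a distribution given as equally likely atoms:
    L(k/n) = (sum of the k smallest atoms) / total, for k = 1..n."""
    v = sorted(values)
    total = sum(v)
    curve, run = [], 0.0
    for x in v:
        run += x
        curve.append(run / total)
    return curve

before = [1.0, 1.0, 1.0, 1.0]  # the gambler surely holds $1 (four equal atoms)
after = [0.5, 0.5, 1.5, 1.5]   # after one fair even-odds bet of $0.50
assert sum(after) == sum(before)  # the gamble is fair: the mean is preserved
dominated = all(a <= b + 1e-12 for a, b in zip(lorenz(after), lorenz(before)))
print(dominated)  # True: the bet pushed the Lorenz curve (weakly) down
```

This is one step of a martingale; the abstract's result says every fair gambling sequence produces such a nonincreasing chain of Lorenz curves, and conversely.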
Research Notes

These are simply results that I like. They haven't been published and may never be. I haven't put enough effort into freeing them from errors. They are based on my philosophy that research is meant to be fun and not necessarily meant for publishing.
