# A lower bound for expected value of log-sum

Lately, I have been working with Poisson matrix factorization models, and
at some point I needed to work out a lower bound for $\text{E}_q[\log \sum_k X_k]$. After seeing some people use this lower bound without a good explanation, I decided to write this blog post. It is also included as an appendix to my ECML-PKDD 2017 paper on a Poisson factorization model for recommendation.
The function $\log(\cdot)$ is concave, which means that:

$$\log(p_1 x_1 + p_2 x_2) \geq p_1 \log x_1 + p_2 \log x_2, \quad \forall p_1, p_2 \geq 0 \text{ with } p_1 + p_2 = 1$$

By induction this property generalizes to any convex combination of $x_k$ ( $\sum_k p_k x_k$ with $\sum_k p_k = 1$ and $p_k \geq 0$ ):

$$\log \sum_k p_k x_k \geq \sum_k p_k \log x_k$$
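As a quick numeric sanity check, Jensen's inequality for the log can be verified with arbitrary positive points and arbitrary convex weights (the particular numbers below are just an illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.5, 5.0, size=4)   # arbitrary positive points x_k
p = rng.dirichlet(np.ones(4))       # convex weights: p_k >= 0, sum_k p_k = 1

lhs = np.log(np.sum(p * x))         # log of the convex combination
rhs = np.sum(p * np.log(x))         # convex combination of the logs
assert lhs >= rhs                   # Jensen's inequality for a concave function
```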

Now, with random variables, we can create a similar convex combination by multiplying and dividing each random variable $X_k$ by $p_k$, then apply Jensen's inequality inside the expectation together with linearity of expectation:

$$\text{E}_q\left[\log \sum_k X_k\right] = \text{E}_q\left[\log \sum_k p_k \frac{X_k}{p_k}\right]$$

Since $\log \sum_k p_k \frac{X_k}{p_k} \geq \sum_k p_k \log \frac{X_k}{p_k}$, taking expectations gives

$$\text{E}_q\left[\log \sum_k p_k \frac{X_k}{p_k}\right] \geq \sum_k p_k \, \text{E}_q\left[\log \frac{X_k}{p_k}\right]$$
$$\Rightarrow \text{E}_q\left[\log \sum_k X_k\right] \geq \sum_k p_k \, \text{E}_q[\log X_k] - p_k \log p_k$$
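The inequality above holds for any valid choice of weights. A minimal Monte Carlo sketch, using independent Gamma variables with made-up parameters (the Gamma choice is just an assumption for illustration) and uniform weights:

```python
import numpy as np

rng = np.random.default_rng(1)
K, n = 3, 200_000
shape = np.array([2.0, 3.0, 1.5])               # hypothetical Gamma shapes
rate = np.array([1.0, 2.0, 0.5])                # hypothetical Gamma rates
X = rng.gamma(shape, 1.0 / rate, size=(n, K))   # samples of X_1..X_K

p = np.ones(K) / K                              # any convex weights work
lhs = np.mean(np.log(X.sum(axis=1)))            # estimate of E_q[log sum_k X_k]
rhs = np.sum(p * np.mean(np.log(X), axis=0) - p * np.log(p))
assert lhs >= rhs                               # the bound holds
```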

If we want a tight lower bound, we can use Lagrange multipliers to choose the set of $p_k$ that maximizes the lower bound subject to the constraint that the weights sum to 1:

$$L(p_1,\ldots,p_K) = \left(\sum_k p_k \text{E}_q[\log X_k] - p_k \log p_k\right) + \lambda\left(1 - \sum_k p_k\right)$$
$$\frac{\partial L}{\partial p_k} = \text{E}_q[\log X_k] - \log p_k - 1 - \lambda = 0$$
$$\frac{\partial L}{\partial \lambda} = 1 - \sum_k p_k = 0 \Rightarrow \sum_k p_k = 1$$
$$\Rightarrow \text{E}_q[\log X_k] = \log p_k + 1 + \lambda$$
$$\Rightarrow \exp\{\text{E}_q[\log X_k]\} = p_k \exp(1 + \lambda)$$
$$\Rightarrow \sum_k \exp\{\text{E}_q[\log X_k]\} = \exp(1 + \lambda)\underbrace{\sum_k p_k}_{=1}$$
$$\Rightarrow p_k = \frac{\exp\{\text{E}_q[\log X_k]\}}{\sum_j \exp\{\text{E}_q[\log X_j]\}}$$
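In other words, the optimal weights are a softmax over the $\text{E}_q[\log X_k]$ values. A small sketch, estimating $\text{E}_q[\log X_k]$ by Monte Carlo for hypothetical Gamma variables and computing the weights in a numerically stable way:

```python
import numpy as np

rng = np.random.default_rng(2)
shape = np.array([2.0, 3.0, 1.5])               # hypothetical Gamma shapes
rate = np.array([1.0, 2.0, 0.5])                # hypothetical Gamma rates
X = rng.gamma(shape, 1.0 / rate, size=(100_000, 3))

e_log_x = np.mean(np.log(X), axis=0)            # estimates of E_q[log X_k]
w = np.exp(e_log_x - e_log_x.max())             # shift by max for stability
p = w / w.sum()                                 # optimal weights: softmax of E_q[log X_k]
assert np.isclose(p.sum(), 1.0) and np.all(p > 0)
```

Subtracting the maximum before exponentiating does not change the ratio but avoids overflow when the $\text{E}_q[\log X_k]$ are large.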

The final formula for $p_k$ is exactly the same one we find for the parameters of the Multinomial distribution of the auxiliary variables in a Poisson model whose rate parameter is a sum of Gamma-distributed latent variables. Also, substituting this optimal $p_k$ yields a tight bound without the auxiliary variables:

$$\text{E}_q\left[\log \sum_k X_k\right] \geq \sum_k \frac{\exp\{\text{E}_q[\log X_k]\}}{\sum_j \exp\{\text{E}_q[\log X_j]\}}\text{E}_q[\log X_k] - \frac{\exp\{\text{E}_q[\log X_k]\}}{\sum_j \exp\{\text{E}_q[\log X_j]\}}\log\frac{\exp\{\text{E}_q[\log X_k]\}}{\sum_j \exp\{\text{E}_q[\log X_j]\}}$$
$$= \sum_k \frac{\exp\{\text{E}_q[\log X_k]\}}{\sum_j \exp\{\text{E}_q[\log X_j]\}}\log\sum_j \exp\{\text{E}_q[\log X_j]\}$$
$$= \log\sum_j \exp\{\text{E}_q[\log X_j]\}\underbrace{\sum_k \frac{\exp\{\text{E}_q[\log X_k]\}}{\sum_j \exp\{\text{E}_q[\log X_j]\}}}_{=1}$$
This results in the log-sum-exp bound:

$$\text{E}_q\left[\log \sum_k X_k\right] \geq \log \sum_k \exp\{\text{E}_q[\log X_k]\}$$
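As a final sanity check, this log-sum-exp bound can be verified by Monte Carlo with the same hypothetical Gamma setup as before:

```python
import numpy as np

rng = np.random.default_rng(3)
shape = np.array([2.0, 3.0, 1.5])               # hypothetical Gamma shapes
rate = np.array([1.0, 2.0, 0.5])                # hypothetical Gamma rates
X = rng.gamma(shape, 1.0 / rate, size=(200_000, 3))

lhs = np.mean(np.log(X.sum(axis=1)))            # estimate of E_q[log sum_k X_k]
e_log_x = np.mean(np.log(X), axis=0)            # estimates of E_q[log X_k]
m = e_log_x.max()
rhs = m + np.log(np.sum(np.exp(e_log_x - m)))   # stable log-sum-exp of E_q[log X_k]
assert lhs >= rhs                               # the log-sum-exp lower bound holds
```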

## 4 thoughts on “A lower bound for expected value of log-sum”

1. Yup (you mean the final bound without the $p_k$, right?). The nice thing about the proof here is that the intermediate bound with $p_k$ is useful for computing the ELBO in Poisson-Gamma matrix/tensor factorization models once you augment the model with latent Poisson counts.

1. There is a discussion of this particular bound and other similar bounds, and of how it is sometimes useful to retain the auxiliary variables, here: http://www.columbia.edu/~jwp2128/Teaching/E6720/Fall2016/papers/twobounds.pdf (what I was trying to accomplish with this post is just to point to that particular bound, because it is useful in Poisson-Gamma factorization, and maybe show a pedagogical derivation of it).