Online Linear Regression using Burg Entropy

We consider the problem of online prediction with a linear model. In contrast to existing work on online regression, which regularizes based on the squared loss or the KL-divergence, we regularize using divergences arising from the Burg entropy. We establish regret bounds for the resulting online gradient-descent algorithm; to our knowledge, these are the first online regret bounds involving the Burg entropy. We extend the analysis to the matrix case, where our algorithm employs LogDet-based regularization, and discuss an application to online metric learning. We demonstrate empirically that Burg-entropy regularization is useful in the presence of noisy data.
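
For orientation, the divergences in question are, in their standard forms, the Bregman divergences generated by the Burg entropy in the vector case and by the log-determinant in the matrix case; the conventions sketched below are the usual ones and may differ slightly in normalization from those adopted later in the paper. For a positive vector $x \in \mathbb{R}^n_{++}$,
\[
\phi(x) = -\sum_{i=1}^{n} \log x_i,
\qquad
D_\phi(x, y) = \sum_{i=1}^{n} \Bigl( \tfrac{x_i}{y_i} - \log \tfrac{x_i}{y_i} - 1 \Bigr),
\]
the latter being the Itakura--Saito divergence, and for positive definite $X, Y \in \mathbb{R}^{n \times n}$,
\[
\Phi(X) = -\log\det X,
\qquad
D_{\mathrm{ld}}(X, Y) = \operatorname{tr}\bigl(X Y^{-1}\bigr) - \log\det\bigl(X Y^{-1}\bigr) - n,
\]
the LogDet divergence.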