Advances in Logistic Regression
Logistic regression is one of the workhorses of statistical learning and has been used and studied for several decades. In this talk I will focus on two recent advances in logistic regression:
1) Logistic regression under sparseness priors (of the LASSO type) cannot be approached using the standard algorithms, due to the non-differentiability of these priors. I will describe bound-optimization algorithms for sparse logistic regression (with parallel or sequential updates). Experimental results on real benchmark data show that sparse logistic regression outperforms both support vector machines and relevance vector machines. Performance bounds for this type of classifier will be briefly mentioned.
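To illustrate the kind of sparsity a LASSO-type prior induces, here is a minimal sketch using scikit-learn's generic L1-penalized logistic regression (not the bound-optimization algorithm described in the talk); the data set and regularization strength are illustrative assumptions:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic data: 20 features, of which only 5 are informative.
X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           n_redundant=0, random_state=0)

# An L1 (LASSO-type) penalty drives many coefficients exactly to zero,
# yielding a sparse classifier that selects a subset of the features.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
clf.fit(X, y)

n_zero = int(np.sum(clf.coef_ == 0.0))
print(f"{n_zero} of {clf.coef_.size} coefficients are exactly zero")
```

The non-differentiability of the L1 penalty at zero is precisely what makes exact zeros possible, and also what rules out the standard (e.g. Newton-type) algorithms and motivates bound optimization.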
2) In the second part of the talk, I will present a semi-supervised extension of logistic regression, which can exploit both labelled and unlabelled data. A Bayesian formulation and an EM algorithm allow the tradeoff between the contributions of the labelled and unlabelled data to be adjusted automatically. Encouraging results are presented on benchmark data.
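The flavour of such an EM scheme can be sketched as follows: the E-step computes class posteriors (soft labels) for the unlabelled points, and the M-step refits the classifier on all points, weighting the unlabelled ones by those posteriors. This toy sketch uses a fixed labelled/unlabelled tradeoff, unlike the Bayesian formulation in the talk, which adjusts it automatically; the data and iteration count are illustrative assumptions:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=300, n_features=10, random_state=0)
labelled = rng.rand(300) < 0.1          # only ~10% of the labels observed
XL, yL = X[labelled], y[labelled]
XU = X[~labelled]

# Initialize from the labelled data alone.
clf = LogisticRegression().fit(XL, yL)

for _ in range(10):
    # E-step: class posteriors (soft labels) for the unlabelled points.
    pU = clf.predict_proba(XU)[:, 1]
    # M-step: refit on labelled data plus the unlabelled points, each
    # replicated once per class with its posterior as a sample weight.
    Xall = np.vstack([XL, XU, XU])
    yall = np.concatenate([yL, np.ones(len(XU)), np.zeros(len(XU))])
    w = np.concatenate([np.ones(len(yL)), pU, 1.0 - pU])
    clf = LogisticRegression().fit(Xall, yall, sample_weight=w)
```

Each iteration lets the decision boundary move toward low-density regions of the unlabelled data, which is the usual mechanism by which unlabelled points help.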