Reducing the Computational and Communication Complexity of a Distributed Optimization Method for Regularized Logistic Regression

In this paper, we propose a new distributed optimization method that computes a Lasso estimator for logistic regression in the setting where two parties hold explanatory variables corresponding to distinct attributes. An existing protocol based on the alternating direction method of multipliers (ADMM) for linear regression can be extended to logistic regression; however, that protocol requires an underlying iterative solver, such as the gradient method, for its subproblems. We show that the proposed protocol, based on the generalized Bregman ADMM, removes the need for such an inner iterative method and thereby achieves lower computational and communication complexity.
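To illustrate the algorithmic difference the abstract describes, the following is a minimal, centralized (single-machine) sketch, not the paper's two-party protocol: plain ADMM for Lasso-regularized logistic regression, whose weight subproblem has no closed form and thus needs an inner gradient loop, versus a linearized Bregman-type ADMM, whose weight update is a single closed-form step. All parameter names and choices (`rho`, proximal weight, step sizes, iteration counts) are illustrative assumptions.

```python
import numpy as np

def logistic_loss_grad(w, X, y):
    """Loss sum(log(1+exp(-y * Xw))) and its gradient; labels y in {-1,+1}."""
    m = -y * (X @ w)
    loss = np.sum(np.logaddexp(0.0, m))
    grad = X.T @ (-y / (1.0 + np.exp(-m)))
    return loss, grad

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso_logistic(X, y, lam, rho=1.0, outer=50, inner=25):
    """Plain ADMM: the w-subproblem min f(w) + (rho/2)||w - z + u||^2 has
    no closed form for the logistic loss f, so each outer iteration runs
    an inner gradient loop (the 'underlying iterative method')."""
    n = X.shape[1]
    Lf = 0.25 * np.linalg.norm(X, 2) ** 2      # Lipschitz constant of grad f
    lr = 1.0 / (Lf + rho)                      # safe inner step size
    w = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    for _ in range(outer):
        for _ in range(inner):                 # inner solver for the w-step
            _, g = logistic_loss_grad(w, X, y)
            w = w - lr * (g + rho * (w - z + u))
        z = soft_threshold(w + u, lam / rho)   # closed-form z-step
        u = u + (w - z)                        # dual update
    return z

def bregman_admm_lasso_logistic(X, y, lam, rho=1.0, iters=300):
    """Linearized (Bregman-type) ADMM: the logistic loss is replaced by its
    linearization plus a proximal (Bregman) term, so the w-update becomes a
    single closed-form step and no inner iterative solver is needed."""
    n = X.shape[1]
    mu = 0.25 * np.linalg.norm(X, 2) ** 2      # proximal weight >= Lipschitz const.
    w = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    for _ in range(iters):
        _, g = logistic_loss_grad(w, X, y)
        w = (rho * (z - u) + mu * w - g) / (rho + mu)   # closed-form w-step
        z = soft_threshold(w + u, lam / rho)
        u = u + (w - z)
    return z
```

In the plain version, every inner gradient step would translate into extra computation (and, in a two-party setting, extra communication) per outer iteration; the Bregman variant performs exactly one gradient evaluation per iteration, which is the source of the complexity reduction claimed in the abstract.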