Category: Seminars and Conferences
State: Archived
February 3, 2021

REGULARIZED EMPIRICAL RISK MINIMIZATION ON RANDOM SUBSPACES - ERNESTO DE VITO - UNIVERSITÀ DI GENOVA

at 05:30PM - Hosted on ZOOM

Regularized empirical risk minimization on reproducing kernel Hilbert spaces achieves optimal convergence rates, but it requires huge computational resources on high-dimensional datasets. In recent years there has been increasing interest in extensions of empirical risk minimization in which the hypothesis space is a low-dimensional random subspace. This approach naturally leads to computational savings, but the question is whether the corresponding learning accuracy is degraded. When the random subspace is spanned by a random subset of the data, this statistical-computational tradeoff has recently been explored for the least squares loss and for self-concordant loss functions, such as the logistic loss. In this talk, based on joint work with A. Della Vecchia, J. Mourtada and L. Rosasco, I will present some recent results dealing with non-smooth convex Lipschitz loss functions, such as the hinge loss.
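To make the setting concrete, the following is a minimal sketch (not the speaker's method) of regularized least-squares ERM restricted to a random subspace spanned by a Nyström-style random subset of the training points. All function names, the Gaussian kernel choice, and the parameter values (`m`, `lam`, `sigma`) are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=0.5):
    # Pairwise Gaussian kernel matrix between rows of A and rows of B.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-sq / (2.0 * sigma**2))

def nystrom_krls(X, y, m, lam, sigma=0.5, seed=0):
    """Regularized least-squares ERM on the span of kernel functions
    centered at m randomly chosen training points (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)  # random subset of the data
    Xm = X[idx]
    Knm = gaussian_kernel(X, Xm, sigma)              # n x m cross-kernel
    Kmm = gaussian_kernel(Xm, Xm, sigma)             # m x m subset kernel
    # Normal equations for the reduced problem:
    # (Knm^T Knm + n * lam * Kmm) a = Knm^T y
    A = Knm.T @ Knm + len(X) * lam * Kmm
    a = np.linalg.solve(A + 1e-10 * np.eye(m), Knm.T @ y)  # small jitter for stability
    return Xm, a

def predict(Xm, a, Xtest, sigma=0.5):
    # Predictor lives in the m-dimensional random subspace.
    return gaussian_kernel(Xtest, Xm, sigma) @ a

# Toy usage: fit a noisy sine with only m = 20 random centers out of n = 200.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, (200, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(200)
Xm, a = nystrom_krls(X, y, m=20, lam=1e-3)
yhat = predict(Xm, a, X)
```

The computational saving is the point of the construction: the linear system solved is m x m rather than n x n, while the statistical question raised in the abstract is how much accuracy this restriction costs.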