Regularization, Optimization, Kernels, and Support Vector Machines offers a snapshot of the current state of the art of large-scale machine learning, providing a single multidisciplinary source for the latest research and advances in regularization, sparsity, compressed sensing, convex and large-scale optimization, kernel methods, and support vector machines. It consists of 21 chapters authored by leading researchers in machine learning.
Regularization, Optimization, Kernels, and Support Vector Machines is ideal for researchers in machine learning, pattern recognition, data mining, signal processing, statistical learning, and related areas.
Contents
Preface
Contributors
An Equivalence between the Lasso and Support Vector Machines; Martin Jaggi
Regularized Dictionary Learning; Annalisa Barla, Saverio Salzo, and Alessandro Verri
Hybrid Conditional Gradient-Smoothing Algorithms with Applications to Sparse and Low Rank Regularization; Andreas Argyriou, Marco Signoretto, and Johan A.K. Suykens
Nonconvex Proximal Splitting with Computational Errors; Suvrit Sra
Learning Constrained Task Similarities in Graph-Regularized Multi-Task Learning; Rémi Flamary, Alain Rakotomamonjy, and Gilles Gasso
The Graph-Guided Group Lasso for Genome-Wide Association Studies; Zi Wang and Giovanni Montana
On the Convergence Rate of Stochastic Gradient Descent for Strongly Convex Functions; Cheng Tang and Claire Monteleoni
Detecting Ineffective Features for Nonparametric Regression; Kris De Brabanter, Paola Gloria Ferrario, and László Györfi
Quadratic Basis Pursuit; Henrik Ohlsson, Allen Y. Yang, Roy Dong, Michel Verhaegen, and S. Shankar Sastry
Robust Compressive Sensing; Esa Ollila, Hyon-Jung Kim, and Visa Koivunen
Regularized Robust Portfolio Estimation; Theodoros Evgeniou, Massimiliano Pontil, Diomidis Spinellis, Rafal Swiderski, and Nick Nassuphis
The Why and How of Nonnegative Matrix Factorization; Nicolas Gillis
Rank Constrained Optimization Problems in Computer Vision; Ivan Markovsky
Low-Rank Tensor Denoising and Recovery via Convex Optimization; Ryota Tomioka, Taiji Suzuki, Kohei Hayashi, and Hisashi Kashima
Learning Sets and Subspaces; Alessandro Rudi, Guillermo D. Canas, Ernesto De Vito, and Lorenzo Rosasco
Output Kernel Learning Methods; Francesco Dinuzzo, Cheng Soon Ong, and Kenji Fukumizu
Kernel Based Identification of Systems with Multiple Outputs Using Nuclear Norm Regularization; Tillmann Falck, Bart De Moor, and Johan A.K. Suykens
Kernel Methods for Image Denoising; Pantelis Bouboulis and Sergios Theodoridis
Single-Source Domain Adaptation with Target and Conditional Shift; Kun Zhang, Bernhard Schölkopf, Krikamol Muandet, Zhikun Wang, Zhi-Hua Zhou, and Claudio Persello
Multi-Layer Support Vector Machines; Marco A. Wiering and Lambert R.B. Schomaker
Online Regression with Kernels; Steven Van Vaerenbergh and Ignacio Santamaría
Index