Structured Sparsity Reading Group (SSRG) 
Spring 2011 

Day/time: 1-2pm, Tuesdays
Location: CS 3310
Mailing list: sign up here
Organizer: Junming Sui


In this reading group, we will discuss recent work on high-dimensional sparse learning with prior structural information. Sparse models, which perform prediction and feature selection simultaneously, are preferred in learning problems with high-dimensional data. In many applications, some intrinsic structural information is available on either the input or the output side. For example, several genes may belong to the same functional group; in multi-task learning, several estimators are expected to share common types of covariates. Incorporating such structural information into sparse learning can achieve better prediction performance and produce more interpretable results. We will cover several topics in this direction, including structured sparsity regularizers and optimization techniques, multi-task learning, varying-coefficient models, graphical models, dictionary learning, and the relation to submodular functions.
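As a concrete flavor of a structured sparsity regularizer, consider the group lasso, which penalizes the L2 norm of predefined groups of coefficients so that an entire group (e.g. a functional group of genes) is selected or discarded together. Below is a minimal sketch (not from any paper in the schedule) of its proximal operator, the block soft-thresholding step used inside proximal-gradient solvers; the function name and group encoding are illustrative choices.

```python
import numpy as np

def prox_group_lasso(w, groups, lam):
    """Proximal operator of the group-lasso penalty lam * sum_g ||w_g||_2.

    Applies block soft-thresholding to each group of coefficients:
        w_g -> max(0, 1 - lam / ||w_g||_2) * w_g
    (illustrative sketch; 'groups' is a list of index lists)
    """
    w = np.asarray(w, dtype=float).copy()
    for g in groups:
        norm = np.linalg.norm(w[g])
        if norm <= lam:
            w[g] = 0.0                  # the whole group is zeroed out together
        else:
            w[g] *= 1.0 - lam / norm    # shrink the group toward zero
    return w

# Example: two groups of three coefficients each
w = np.array([3.0, 4.0, 0.0, 0.1, 0.1, 0.1])
groups = [[0, 1, 2], [3, 4, 5]]
print(prox_group_lasso(w, groups, 1.0))  # -> [2.4 3.2 0.  0.  0.  0. ]
```

The first group (norm 5) is shrunk but survives; the second group (norm about 0.17) falls below the threshold and is eliminated as a block, which is exactly the group-level selection behavior that distinguishes structured from plain L1 sparsity.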


An up-to-date schedule will be maintained as a Google Calendar (see below), but the order of topics will roughly follow the plan mapped out below. The schedule is adjustable.

Tentative Plan

For some weeks, multiple papers are listed as 'suggested/optional'. The presenter can choose to focus on a single paper or to give an overview of several.

Other useful references:

Sparse methods for machine learning: Theory and algorithms, ECML/PKDD 2010 Tutorial
NIPS-2010 Workshop on Practical Applications of Sparse Modeling: Open Issues and New Directions