Authors
Julien Mairal, Francis Bach, Jean Ponce, Guillermo Sapiro
Publication date
2009/6/14
Book
Proceedings of the 26th Annual International Conference on Machine Learning
Pages
689-696
Description
Sparse coding, that is, modelling data vectors as sparse linear combinations of basis elements, is widely used in machine learning, neuroscience, signal processing, and statistics. This paper focuses on learning the basis set, also called the dictionary, to adapt it to specific data, an approach that has recently proven to be very effective for signal reconstruction and classification in the audio and image processing domains. This paper proposes a new online optimization algorithm for dictionary learning, based on stochastic approximations, which scales up gracefully to large datasets with millions of training samples. A proof of convergence is presented, along with experiments on natural images demonstrating that it leads to faster performance and better dictionaries than classical batch algorithms for both small and large datasets.
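A minimal sketch of the kind of online dictionary learning the abstract describes, using scikit-learn's MiniBatchDictionaryLearning, which updates the dictionary from mini-batches of samples in a streaming fashion. The toy data, parameter values, and the choice of this particular library are illustrative assumptions, not details taken from the paper.

import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning, sparse_encode

# Toy data standing in for image patches: 10,000 samples of dimension 64
# (e.g. flattened 8x8 patches), centred to zero mean. Assumed, not from the paper.
rng = np.random.default_rng(0)
X = rng.standard_normal((10_000, 64))
X -= X.mean(axis=1, keepdims=True)

# Learn an overcomplete dictionary (100 atoms for 64-dimensional data) online,
# one mini-batch at a time; alpha weights the l1 sparsity penalty on the codes.
dico = MiniBatchDictionaryLearning(
    n_components=100,   # number of dictionary atoms (assumed value)
    alpha=1.0,          # l1 penalty weight (assumed value)
    batch_size=256,     # samples per online update (assumed value)
    random_state=0,
)
D = dico.fit(X).components_          # learned dictionary, shape (100, 64)

# Sparse-code new samples against the learned dictionary with a Lasso solver.
codes = sparse_encode(X[:5], D, algorithm="lasso_lars", alpha=1.0)
print(codes.shape)                   # (5, 100), mostly zeros

Because each update touches only one mini-batch, memory use does not grow with the number of training samples, which is the property the abstract highlights for scaling to millions of samples.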
Total citations
[Per-year citation histogram, 2009–2025; individual counts not recoverable from the extracted text]