Graph signals offer a generic and natural representation for data that lives on networks or other irregular structures. The underlying graph is, however, often unknown a priori, though it can sometimes be inferred from knowledge of the application domain. ...
We propose a new statistical dictionary learning algorithm for sparse signals that is based on an α-stable innovation model. The parameters of the underlying model—that is, the atoms of the dictionary, the sparsity index α and the dispersion of the transfo ...
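The paper's statistical, α-stable estimation procedure is more involved than can be shown here, but the general alternating structure shared by most dictionary-learning algorithms (sparse-code the data with the current atoms, then update the atoms by least squares) can be sketched. Everything below is an illustrative simplification, not the authors' method: the data model, the 1-sparse coding rule, and all dimensions are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_atoms, n_samples = 8, 5, 200

# Synthetic training data: each sample is a random scaling of one hidden atom plus noise.
true_D = rng.standard_normal((d, n_atoms))
true_D /= np.linalg.norm(true_D, axis=0)
labels = rng.integers(0, n_atoms, n_samples)
X = true_D[:, labels] * rng.uniform(0.5, 2.0, n_samples)
X += 0.01 * rng.standard_normal((d, n_samples))

def code_1sparse(D, X):
    """Toy sparse-coding step: approximate each sample by its single best atom."""
    corr = D.T @ X
    idx = np.argmax(np.abs(corr), axis=0)
    Z = np.zeros((D.shape[1], X.shape[1]))
    # For unit-norm atoms, the correlation is the optimal 1-sparse coefficient.
    Z[idx, np.arange(X.shape[1])] = corr[idx, np.arange(X.shape[1])]
    return Z

D = rng.standard_normal((d, n_atoms))
D /= np.linalg.norm(D, axis=0)
err_init = np.linalg.norm(X - D @ code_1sparse(D, X)) / np.linalg.norm(X)

for _ in range(20):
    Z = code_1sparse(D, X)            # sparse-coding step
    D = X @ np.linalg.pinv(Z)         # dictionary-update step (least squares)
    norms = np.linalg.norm(D, axis=0)
    D /= np.where(norms > 0, norms, 1.0)  # renormalize atoms, guarding unused ones

err_final = np.linalg.norm(X - D @ code_1sparse(D, X)) / np.linalg.norm(X)
```

After a few alternations the reconstruction error drops well below that of the random initial dictionary, which is the qualitative behavior the statistical variants also exhibit.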
Many natural images have low intrinsic dimension, i.e. they are sparse: they can be represented with very few coefficients when expressed in an adequate domain. The recent theory of compressed sensing exploits this property, offering a powerful frame ...
Although our work lies in the field of random processes, this thesis was originally motivated by signal processing applications, mainly the stochastic modeling of sparse signals. We develop a mathematical study of the innovation model, under which a signal ...
While the recent theory of compressed sensing makes it possible to recover sparse signals sampled below the Nyquist rate, recovery is usually posed as an inverse problem for the unknown signal, whose solution depends crucially on specifi ...
In this paper we address the problem of learning image structures directly from sparse codes. We first model images as linear combinations of molecules, which are themselves groups of atoms from a redundant dictionary. We then formulate a new structure lea ...
Sparsity-based models have proven to be very effective in most image processing applications. The notion of sparsity has recently been extended to structured sparsity models where not only the number of components but also their support is important. This ...
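One concrete instance of structured sparsity is the group-lasso penalty, whose proximal operator shrinks or zeroes entire groups of coefficients at once, rather than individual entries as plain soft-thresholding does. The groups and threshold below are illustrative choices, not tied to any specific model in the abstract.

```python
import numpy as np

def group_soft_threshold(x, groups, lam):
    """Prox of lam * sum_g ||x_g||_2: scale each group's norm down by lam,
    setting groups whose norm is at most lam exactly to zero."""
    out = np.zeros_like(x)
    for g in groups:
        norm = np.linalg.norm(x[g])
        if norm > lam:
            out[g] = (1.0 - lam / norm) * x[g]  # survives, shrunk as a block
        # else: the whole group is zeroed, enforcing support structure
    return out

x = np.array([3.0, 4.0, 0.1, -0.1, 1.0, 0.0])
groups = [[0, 1], [2, 3], [4, 5]]
z = group_soft_threshold(x, groups, lam=1.0)
# The strong group [3, 4] is kept (shrunk), the two weak groups vanish entirely.
```

This is the mechanism by which structured models control not just the number of nonzero components but also which supports are admissible.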
We cast the query by example spoken term detection (QbE-STD) problem as subspace detection, where the query and background subspaces are modeled as unions of low-dimensional subspaces. The speech exemplars used for subspace modeling are class-conditional posteri ...
We propose to model the acoustic space of deep neural network (DNN) class-conditional posterior probabilities as a union of low-dimensional subspaces. To that end, the training posteriors are used for dictionary learning and sparse coding. Sparse represen ...
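A minimal sketch of subspace-based scoring under the union-of-subspaces view: assume, illustratively, that exemplars of a class span a low-dimensional subspace; a test vector is then scored by the residual of its least-squares projection onto that span, with a small residual indicating membership. The dimensions and random data below are assumptions for the example, not the paper's DNN posteriors.

```python
import numpy as np

rng = np.random.default_rng(2)

def residual(Q, v):
    """Distance from v to span(columns of Q): norm of the projection residual."""
    coef, *_ = np.linalg.lstsq(Q, v, rcond=None)
    return np.linalg.norm(v - Q @ coef)

d, r = 20, 3
basis = rng.standard_normal((d, r))        # exemplars spanning one class subspace
inlier = basis @ rng.standard_normal(r)    # lies exactly in that subspace
outlier = rng.standard_normal(d)           # generic vector, mostly outside it

r_in = residual(basis, inlier)    # near zero: detected as in-class
r_out = residual(basis, outlier)  # large: rejected
```

In the full system this score would be computed against sparse reconstructions over the learned dictionary rather than a single fixed basis, but the detection principle (small reconstruction residual means the frame belongs to the modeled union of subspaces) is the same.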