Overview

The sparsity of signals and images in a transform domain or dictionary has been exploited in many applications in signal and image processing, machine learning, and medical imaging. Analytical sparsifying transforms such as wavelets and the discrete cosine transform (DCT) have been widely used in compression standards. More recently, data-driven learning of sparse models such as the synthesis dictionary model has become popular, especially in applications such as denoising, inpainting, and compressed sensing. Our group’s research at the University of Illinois focuses on data-driven adaptation of the alternative sparsifying transform model, which offers advantages over the synthesis dictionary model, including exact and inexpensive sparse coding.
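As a concrete illustration of transform-domain sparsity (a sketch for intuition, not taken from our publications), the DCT of a smooth signal concentrates most of its energy in a few coefficients, so keeping only the largest-magnitude coefficients yields an accurate sparse approximation. The signal and the choice of 8 retained coefficients below are arbitrary:

```python
import numpy as np
from scipy.fft import dct, idct

# A smooth test signal: smooth signals are approximately sparse under
# the DCT, which is why the DCT underlies compression standards.
n = 64
t = np.linspace(0.0, 1.0, n)
x = np.exp(-8.0 * (t - 0.4) ** 2)  # smooth Gaussian bump

c = dct(x, norm='ortho')  # orthonormal DCT-II coefficients

# Sparse approximation: keep only the k largest-magnitude coefficients.
k = 8
c_sparse = np.where(np.abs(c) >= np.sort(np.abs(c))[-k], c, 0.0)
x_hat = idct(c_sparse, norm='ortho')

# The k-term approximation captures nearly all of the signal's energy.
rel_err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
print(f"relative error with {k}/{n} coefficients: {rel_err:.4f}")
```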

We have proposed several methods for batch learning of square or overcomplete sparsifying transforms from data. We have also investigated specific structures for these transforms, such as double sparsity, union-of-transforms, and filter bank structures, which enable efficient learning and usage. Apart from batch transform learning, our group has investigated methods for online learning of sparsifying transforms, which are particularly useful for big data or real-time applications. The proposed transform learning algorithms are highly computationally efficient.
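To make the transform model concrete, the following is a minimal sketch (an illustration, not our released software) of alternating minimization for learning a square transform, simplified here to the orthonormal case. In the transform model, the sparse coding step is exact and cheap: it is just hard thresholding of W X. The transform update step for an orthonormal W is an orthogonal Procrustes problem with a closed-form SVD solution. The function name, data sizes, and sparsity level are illustrative choices; the published algorithms also handle well-conditioned non-orthonormal and overcomplete transforms:

```python
import numpy as np

def _threshold(Z, sparsity):
    # Keep the `sparsity` largest-magnitude entries in each column.
    cutoff = np.sort(np.abs(Z), axis=0)[-sparsity]
    return np.where(np.abs(Z) >= cutoff, Z, 0.0)

def learn_orthonormal_transform(X, sparsity, n_iters=20, seed=0):
    """Toy alternating minimization for
        min_{W, Z} ||W X - Z||_F^2
    subject to W orthonormal and each column of Z having at most
    `sparsity` nonzeros.  Simplified orthonormal variant for illustration."""
    n = X.shape[0]
    rng = np.random.default_rng(seed)
    # Initialize with a random orthonormal matrix.
    W, _ = np.linalg.qr(rng.standard_normal((n, n)))
    for _ in range(n_iters):
        # Sparse coding step: exact hard thresholding of the
        # transform-domain coefficients W X.
        Z = _threshold(W @ X, sparsity)
        # Transform update step: orthogonal Procrustes problem,
        # solved in closed form via an SVD of X Z^T.
        U, _, Vt = np.linalg.svd(X @ Z.T)
        W = Vt.T @ U.T
    # Final sparse coding so the returned pair (W, Z) is consistent.
    return W, _threshold(W @ X, sparsity)

# Example: learn a transform for random "patch" data (illustration only).
X = np.random.default_rng(1).standard_normal((8, 200))
W, Z = learn_orthonormal_transform(X, sparsity=3)
```

Each step exactly minimizes the objective over one variable, so the objective is monotone nonincreasing over the iterations.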

Our research has demonstrated promising performance for transform learning methods in sparse representation, image and video denoising, classification, and compressed sensing (MRI and CT image reconstruction) tasks. We have also established several convergence guarantees for our transform learning and image reconstruction schemes, guarantees that were lacking for prior adaptive dictionary-based methods.

This website collects the manuscripts and theses published by our group on transform learning, along with conference posters and oral presentation slides on the proposed methods.

Software

Implementations of the proposed algorithms, along with the data used to generate the results in our publications, are available under the Software tab.

Funding

Our work on transform learning is supported in part by the National Science Foundation under grants CCF-1018660 and CCF-1320953.