Compressed Sensing MRI (MATLAB CODES)

Compressed sensing is an efficient sensing/sampling paradigm that has been widely studied across many fields in recent years. According to compressed sensing theory, a signal can be recovered from far fewer samples or measurements than the Shannon sampling theorem requires, provided certain conditions hold. In MRI, fewer samples mean reduced acquisition time, hardware cost, and energy consumption, so applying compressed sensing techniques to fast MRI has attracted great interest. This project aims to bridge the gap between the mathematical theory of compressed sensing, large-scale optimization techniques, and the physical practice of medical imaging, in order to maximize the potential of compressed sensing in MRI.
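To make the recovery idea concrete, here is a minimal Python/NumPy sketch (not the project's MATLAB code) of iterative soft-thresholding (ISTA) for sparse recovery. A random Gaussian matrix stands in for the undersampled Fourier sampling operator of CS-MRI, and all problem sizes and the regularization weight are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse ground-truth signal: n samples, only k nonzeros
n, m, k = 200, 80, 8
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

# Random Gaussian measurement matrix: a stand-in for the
# undersampled Fourier sampling operator used in CS-MRI
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = A @ x_true                            # m << n measurements

# ISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1
lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2    # 1 / Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(3000):
    z = x - step * (A.T @ (A @ x - b))    # gradient step on the data term
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold

print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))   # small relative error
```

With 80 measurements of a 200-sample signal with 8 nonzeros, the l1-regularized reconstruction recovers the signal almost exactly, which is the phenomenon exploited for faster MRI acquisition.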


Large Scale Convex Optimization (MATLAB CODES)

Large-scale convex optimization techniques are important for medical imaging, machine learning, and computer vision. We consider the minimization of a smooth convex function regularized by a composite of prior models. This problem is generally difficult to solve, even when each subproblem regularized by a single prior model is convex and easy. In this project, we propose a Fast Composite Splitting Algorithm (FCSA) to attack this problem. FCSA has been successfully applied to compressed MR image reconstruction and to low-rank tensor completion.
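The splitting idea can be illustrated with a simplified Python/NumPy sketch (not the released MATLAB code; the equal prox weights below are an illustrative assumption rather than FCSA's exact constants). The objective combines a smooth least-squares term with two simple priors, an l1 penalty and a nonnegativity constraint: take one gradient step, apply each prior's proximal operator separately, average the results, and add FISTA-style momentum.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 200, 100, 10
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.random(k) + 0.5  # sparse, nonnegative
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = A @ x_true

# min_x 0.5*||Ax - b||^2 + lam*||x||_1 + indicator(x >= 0)
lam = 0.01
L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of the gradient
x, y, t = np.zeros(n), np.zeros(n), 1.0
for _ in range(2000):
    xg = y - (A.T @ (A @ y - b)) / L        # gradient step on the smooth term
    # proximal step for each prior separately (composite splitting)
    x1 = np.sign(xg) * np.maximum(np.abs(xg) - 2 * lam / L, 0.0)  # prox of the l1 prior
    x2 = np.maximum(xg, 0.0)                # prox of the nonnegativity indicator
    x_new = 0.5 * (x1 + x2)                 # average the subproblem solutions
    t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
    y = x_new + ((t - 1) / t_new) * (x_new - x)   # FISTA-style momentum
    x, t = x_new, t_new

print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

Each prior's proximal map here is cheap in closed form, which is exactly why splitting the composite regularizer into subproblems pays off at scale.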


Learning With Structured Sparsity (MATLAB CODES)

Structured sparsity is a natural extension of the standard sparsity concept in statistical learning and compressive sensing. By allowing arbitrary structures on the feature set, it generalizes the group sparsity idea. We develop a general theory for learning with structured sparsity, based on the notion of coding complexity associated with the structure. Moreover, we propose a structured greedy algorithm to efficiently solve the structured sparsity problem, and demonstrate the advantage of structured sparsity over standard sparsity both theoretically and experimentally.
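For the simplest instance of a structure, disjoint groups, the greedy idea can be sketched in Python/NumPy (an illustrative toy, not the project's MATLAB code or its general coding-complexity machinery): repeatedly select the group most correlated with the current residual, enlarge the support, and refit by least squares.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 100, 80
groups = [list(range(g, g + 5)) for g in range(0, n, 5)]  # 20 disjoint groups of 5
x_true = np.zeros(n)
for g in rng.choice(len(groups), 2, replace=False):       # 2 active groups
    x_true[groups[g]] = rng.uniform(1.5, 2.5, 5) * rng.choice([-1.0, 1.0], 5)
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = A @ x_true

# Structured greedy selection: pick the group best correlated with the
# residual, enlarge the support, refit by least squares, repeat.
support, r = [], b.copy()
for _ in range(2):
    best = max(range(len(groups)),
               key=lambda j: np.linalg.norm(A[:, groups[j]].T @ r))
    support = sorted(set(support) | set(groups[best]))
    coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
    x = np.zeros(n)
    x[support] = coef
    r = b - A @ x

print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))  # ~0 if both groups found
```

Selecting whole groups rather than single coordinates reflects the lower coding complexity of structured supports, which is what the theory quantifies.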


Dynamic Group Sparsity (MATLAB CODES)

Dynamic group sparsity is a natural extension of the standard sparsity concept in compressive sensing, motivated by the observation that in many practical sparse signals the nonzero coefficients are not randomly placed but tend to cluster. Intuitively, better results can be achieved in these cases by exploiting both the clustering and the sparsity priors. Motivated by this idea, we have developed a new greedy sparse recovery algorithm that prunes data residues during the iterative process according to both the sparsity and the group clustering priors, rather than sparsity alone as in previous methods. Moreover, it can adaptively learn the dynamic group structure and the sparsity number when these are not available in practical applications.
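The key pruning step can be illustrated with a toy Python/NumPy sketch (the neighbor weighting below is an illustrative assumption, not the exact rule in the released MATLAB code): each entry is scored by its own energy plus a weighted share of its neighbors' energy, so an isolated spike loses out to a cluster of comparable magnitude.

```python
import numpy as np

def dgs_prune(z, k, w=0.5):
    """Keep the k entries with the largest neighbor-aware energy score."""
    e = z * z
    score = e + w * (np.roll(e, 1) + np.roll(e, -1))  # own + weighted neighbor energy
    keep = np.argsort(score)[-k:]
    out = np.zeros_like(z)
    out[keep] = z[keep]
    return out

n, k = 100, 12
z = np.zeros(n)
z[40:40 + k] = 1.0        # the true support: one cluster of 12 entries
z[5] = 1.2                # a spurious isolated spike (e.g. from noise)

# Plain magnitude pruning keeps the isolated spike at index 5 ...
idx = np.argsort(z * z)[-k:]
plain = np.zeros(n)
plain[idx] = z[idx]
print(plain[5] != 0)                                  # True

# ... while neighbor-aware pruning keeps the whole cluster instead.
pruned = dgs_prune(z, k)
print(pruned[5] == 0, np.count_nonzero(pruned[40:40 + k]))  # True 12
```

In the full recovery algorithm this pruning replaces the plain top-k step inside the greedy iteration, which is how the clustering prior is injected without knowing the group boundaries in advance.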


Transformation-Invariant Sparse Representation (MATLAB CODES)

We introduce a simple technique for obtaining transformation-invariant sparse representations of images. It is rooted in two observations: 1) if the aligned model images of an object span a linear subspace, their transformed versions with respect to some group of transformations span a linear subspace of higher dimension; 2) if a target (or test) image, once aligned with the model images, lies in this subspace, then its unaligned versions move closer to the subspace as the estimated transformation parameters become more accurate. We have applied the proposed methodology to two applications: face recognition and dynamic texture registration. The improved performance over previous methods demonstrates the effectiveness of the proposed approach.
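A 1-D Python/NumPy toy (cyclic shifts standing in for the transformation group; an illustration, not the project's MATLAB code) demonstrates observation 1): augmenting the model with its transformed versions yields a subspace that absorbs a misaligned test signal, while the single aligned model alone does not.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 64
base = rng.standard_normal(n)

# Columns: the model signal under a group of cyclic shifts
# (a 1-D stand-in for image transformations)
shifts = range(-3, 4)
M = np.stack([np.roll(base, s) for s in shifts], axis=1)

# A test signal misaligned by an unknown shift within the group
test = np.roll(base, 2)

# Residual after projecting onto the shift-augmented subspace is ~0 ...
coef, *_ = np.linalg.lstsq(M, test, rcond=None)
r_aug = np.linalg.norm(test - M @ coef)

# ... while the single aligned model leaves a large residual.
coef0, *_ = np.linalg.lstsq(base[:, None], test, rcond=None)
r_single = np.linalg.norm(test - base[:, None] @ coef0)

print(r_aug, r_single)
```

The gap between the two residuals is what observation 2) exploits: as alignment estimates improve, the distance to the subspace shrinks, giving a criterion for registration and recognition.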