|
Sparse coding, usually viewed as a method for rearranging the structure of the original data so that its energy becomes compact over a non-orthogonal, overcomplete dictionary, is widely used in signal processing, pattern recognition, machine learning, statistics, and neuroscience. Unfortunately, finding sparse codes and learning bases remain computationally challenging, and the performance of sparse coding is sensitive to the learned dictionary. In this paper, we propose a blockwise coordinate descent algorithm with guaranteed convergence that solves these two problems under a unified scheme. The variables in the optimization problem are partitioned into suitable blocks over which convexity is preserved, making an exact block coordinate descent possible. For each separable subproblem, a closed-form solution is obtained from the convexity and monotonicity of the parabolic (univariate quadratic) function. The resulting algorithm is therefore simple, efficient, and effective. Experimental results show that our algorithm not only significantly accelerates the learning process but also markedly improves performance in real applications.
|
Keywords: signal and information processing; sparse coding; blockwise coordinate descent; image classification
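The blockwise scheme described above can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: it assumes the standard dictionary-learning objective min over D, X of 0.5*||Y - D*X||_F^2 + lam*||X||_1 with unit-norm atoms, and all function names are our own. Each row of X and each column of D is treated as one block, and each block update is an exact closed-form minimization (soft-thresholding for the codes, a normalized least-squares step for the atoms), so the objective decreases monotonically.

```python
import numpy as np

def soft_threshold(z, lam):
    # Closed-form minimizer of the parabolic subproblem 0.5*(x - z)**2 + lam*|x|
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def block_coordinate_descent(Y, n_atoms, lam=0.1, n_iter=30, seed=0):
    """Sketch of blockwise coordinate descent for
    min_{D,X} 0.5*||Y - D X||_F^2 + lam*||X||_1,  ||d_j||_2 = 1.
    Names and structure are illustrative, not the paper's implementation."""
    rng = np.random.default_rng(seed)
    n_feat, n_samp = Y.shape
    D = rng.standard_normal((n_feat, n_atoms))
    D /= np.linalg.norm(D, axis=0)            # unit-norm atoms
    X = np.zeros((n_atoms, n_samp))
    R = Y - D @ X                             # running residual
    for _ in range(n_iter):
        # Code blocks: exact update of one row of X at a time.
        for j in range(n_atoms):
            R += np.outer(D[:, j], X[j])      # remove atom j's contribution
            # Unit-norm atom, so the closed form needs no division.
            X[j] = soft_threshold(D[:, j] @ R, lam)
            R -= np.outer(D[:, j], X[j])
        # Dictionary blocks: exact update of one atom at a time.
        for j in range(n_atoms):
            if np.any(X[j]):
                R += np.outer(D[:, j], X[j])
                d = R @ X[j]                  # unnormalized least-squares atom
                nrm = np.linalg.norm(d)
                if nrm > 0:
                    D[:, j] = d / nrm         # project back to the unit sphere
                R -= np.outer(D[:, j], X[j])
    return D, X
```

Because every block update is an exact minimization over a convex subproblem, no step size or line search is needed, which is the practical appeal of the approach.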
|