A Neurodynamic Optimization Approach to Constrained Sparsity Maximization Based on Alternative Objective Functions


In recent years, constrained sparsity maximization problems have received tremendous attention in the context of compressive sensing. Because the formulated constrained L0-norm minimization problem is NP-hard, constrained L1-norm minimization is usually used to compute approximate sparse solutions. In this paper, we introduce several alternative objective functions, such as the weighted L1 norm and the Laplacian, hyperbolic secant, and Gaussian functions, as approximations of the L0 norm. A one-layer recurrent neural network is applied to compute the optimal solutions to the reformulated constrained minimization problems subject to equality constraints. Simulation results in terms of time responses, phase diagrams, and tabular data are provided to demonstrate the superior performance of the proposed neurodynamic optimization approach to constrained sparsity maximization based on the problem reformulations.
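The surrogate objectives named in the abstract can be illustrated with a short sketch. The functional forms below are assumptions drawn from common smoothed-L0 surrogates in the compressive-sensing literature, not the paper's own equations; each smooth function approaches the L0 norm (the count of nonzero entries) as the shape parameter sigma tends to zero.

```python
import numpy as np

# Assumed surrogate forms (hedged; the paper's exact definitions may differ).

def weighted_l1(x, w):
    """Weighted L1 norm: sum_i w_i * |x_i|."""
    return np.sum(w * np.abs(x))

def laplacian_surrogate(x, sigma):
    """Laplacian approximation of ||x||_0: sum_i (1 - exp(-|x_i| / sigma))."""
    return np.sum(1.0 - np.exp(-np.abs(x) / sigma))

def sech_surrogate(x, sigma):
    """Hyperbolic-secant approximation: sum_i (1 - sech(x_i / sigma))."""
    return np.sum(1.0 - 1.0 / np.cosh(x / sigma))

def gaussian_surrogate(x, sigma):
    """Gaussian approximation: sum_i (1 - exp(-x_i^2 / (2 sigma^2)))."""
    return np.sum(1.0 - np.exp(-x**2 / (2.0 * sigma**2)))

if __name__ == "__main__":
    x = np.array([0.0, 3.0, 0.0, -1.5, 0.0])  # ||x||_0 = 2
    for sigma in (1.0, 0.5, 0.1):
        # Each surrogate tends toward the true L0 norm as sigma shrinks.
        print(sigma, gaussian_surrogate(x, sigma))
```

In the neurodynamic setting, minimizing such a smooth surrogate subject to the equality constraints replaces the nonsmooth, NP-hard L0 objective with a differentiable one that a recurrent network's gradient-like dynamics can handle.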

Meeting Name

6th IEEE World Congress on Computational Intelligence, International Joint Conference on Neural Networks (2010: Jul. 18-23, Barcelona, Spain)


Computer Science

Keywords and Phrases

Artificial Intelligence; Hyperbolic Functions; Network Layers; Phase Diagrams; Recurrent Neural Networks

Document Type

Article - Conference proceedings


© 2010 Institute of Electrical and Electronics Engineers (IEEE), All rights reserved.

Publication Date

01 Jul 2010