A Neurodynamic Optimization Approach to Constrained Sparsity Maximization Based on Alternative Objective Functions
In recent years, constrained sparsity maximization problems have received tremendous attention in the context of compressive sensing. Because the formulated constrained L0-norm minimization problem is NP-hard, constrained L1-norm minimization is usually used to compute approximate sparse solutions. In this paper, we introduce several alternative objective functions, such as the weighted L1 norm and the Laplacian, hyperbolic secant, and Gaussian functions, as approximations of the L0 norm. A one-layer recurrent neural network is applied to compute the optimal solutions to the reformulated minimization problems subject to equality constraints. Simulation results, presented as time responses, phase diagrams, and tabular data, demonstrate the superior performance of the proposed neurodynamic optimization approach to constrained sparsity maximization based on these problem reformulations.
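The abstract names four families of surrogate objectives for the L0 norm but does not give their functional forms. The sketch below uses common parameterizations of these surrogates (each with a hypothetical width parameter sigma, not specified in this record) purely to illustrate the idea that each smooth function approaches the nonzero count as sigma shrinks; it is not the paper's exact formulation.

```python
import math

# Illustrative smooth surrogates for the L0 "norm" ||x||_0 (number of
# nonzero entries). The width parameter sigma is an assumption for this
# sketch; as sigma -> 0, each surrogate tends to ||x||_0.

def weighted_l1(x, w):
    """Weighted L1 norm: sum_i w_i * |x_i|."""
    return sum(wi * abs(xi) for wi, xi in zip(w, x))

def laplacian_surrogate(x, sigma=0.1):
    """sum_i (1 - exp(-|x_i| / sigma))."""
    return sum(1.0 - math.exp(-abs(xi) / sigma) for xi in x)

def sech_surrogate(x, sigma=0.1):
    """sum_i (1 - sech(x_i / sigma))."""
    return sum(1.0 - 1.0 / math.cosh(xi / sigma) for xi in x)

def gaussian_surrogate(x, sigma=0.1):
    """sum_i (1 - exp(-x_i^2 / (2 * sigma^2)))."""
    return sum(1.0 - math.exp(-xi * xi / (2.0 * sigma * sigma)) for xi in x)

# A sparse vector with exactly two nonzeros: ||x||_0 = 2.
x = [0.0, 1.0, 0.0, -2.0]
for f in (laplacian_surrogate, sech_surrogate, gaussian_surrogate):
    # With a small sigma, each surrogate value is close to 2.
    print(f.__name__, round(f(x, sigma=0.01), 4))
```

Minimizing any of these surrogates subject to the equality constraints then replaces the intractable L0 problem with a smooth one that a recurrent neural network can solve by gradient-like dynamics.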
Z. Guo and J. Wang, "A Neurodynamic Optimization Approach to Constrained Sparsity Maximization Based on Alternative Objective Functions," Proceedings of the 6th IEEE World Congress on Computational Intelligence, International Joint Conference on Neural Networks (IJCNN), Barcelona, Spain, Institute of Electrical and Electronics Engineers (IEEE), Jul. 2010.
The definitive version is available at https://doi.org/10.1109/IJCNN.2010.5596553
6th IEEE World Congress on Computational Intelligence, International Joint Conference on Neural Networks (2010: Jul. 18-23, Barcelona, Spain)
Keywords and Phrases
Artificial Intelligence; Hyperbolic Functions; Network Layers; Phase Diagrams; Recurrent Neural Networks
Article - Conference proceedings
© 2010 Institute of Electrical and Electronics Engineers (IEEE), All rights reserved.
01 Jul 2010