Distributed Learning of Deep Sparse Neural Networks for High-Dimensional Classification
When analyzing high-dimensional data sets with deep neural networks (NNs), increased sparsity is desirable but requires careful selection of "sparsity parameters." In this paper, a novel distributed learning methodology is proposed to optimize the NN while addressing this challenge: the optimal sparsity in the NN is estimated via a two-player zero-sum game. In the proposed game, the sparsity parameter is the first player, aiming to increase sparsity in the NN, while the NN weights are the second player, aiming to improve performance in the presence of increased sparsity. To solve the game, additional variables are introduced into the optimization problem such that the output at every layer in the NN depends on these variables instead of on the previous layer. Using these additional variables, layer-wise cost functions are derived and then independently optimized to learn the additional variables, the NN weights, and the sparsity parameters. A novel computational algorithm is also proposed to implement the learning procedure in a parallelized and distributed environment. The efficiency of the proposed approach is demonstrated on a total of six data sets.
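The layer-wise decomposition described above can be sketched in a few lines: auxiliary variables stand in for each layer's output, so each layer's coupling cost can be minimized independently of the others. This is a minimal illustrative sketch only; the variable names, the quadratic coupling penalty, and the plain gradient steps are assumptions for exposition, and the paper's sparsity terms and game-theoretic updates are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
sizes = [20, 10, 5]   # widths of a toy two-layer network (illustrative)
n = 8                 # number of samples

# Auxiliary variables: one block per layer output (Z[0] holds the inputs).
# Each layer's cost depends on these variables rather than on the previous
# layer's actual output, which is what decouples the per-layer problems.
Z = [rng.normal(size=(d, n)) for d in sizes]
W = [rng.normal(scale=0.1, size=(sizes[i + 1], sizes[i]))
     for i in range(len(sizes) - 1)]

rho, lr = 1.0, 0.002  # penalty weight and step size (assumed values)

def layer_loss(l):
    # rho/2 * || Z[l+1] - tanh(W_l Z[l]) ||_F^2  -- coupling penalty only;
    # the paper's sparsity terms are omitted for brevity.
    return 0.5 * rho * np.sum((np.tanh(W[l] @ Z[l]) - Z[l + 1]) ** 2)

before = [layer_loss(l) for l in range(len(W))]
for _ in range(300):
    for l in range(len(W)):  # each layer's weights updated independently
        pre = W[l] @ Z[l]
        resid = np.tanh(pre) - Z[l + 1]
        grad = rho * (resid * (1.0 - np.tanh(pre) ** 2)) @ Z[l].T
        W[l] -= lr * grad
after = [layer_loss(l) for l in range(len(W))]
print(all(a < b for a, b in zip(after, before)))
```

Because the per-layer costs share no variables once the auxiliary blocks are fixed, the inner loop over `l` could run on separate workers, which is the property the distributed algorithm in the paper exploits.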
S. Garg et al., "Distributed Learning of Deep Sparse Neural Networks for High-Dimensional Classification," Proceedings of the 2018 IEEE International Conference on Big Data (2018, Seattle, WA), pp. 1587-1592, Institute of Electrical and Electronics Engineers (IEEE), Dec. 2018.
The definitive version is available at https://doi.org/10.1109/BigData.2018.8621888
2018 IEEE International Conference on Big Data, Big Data 2018 (2018: Dec. 10-13, Seattle, WA)
Electrical and Computer Engineering
Mathematics and Statistics
Intelligent Systems Center
Center for High Performance Computing Research
Keywords and Phrases
Big data; Clustering algorithms; Cost functions; Computational algorithm; Distributed environments; Distributed learning; High dimensional data; Learning procedures; Optimal sparsities; Optimization problems; Sparse neural networks; Deep neural networks
Article - Conference proceedings
© 2018 Institute of Electrical and Electronics Engineers (IEEE), All rights reserved.
01 Dec 2018