$\log$-Sigmoid Activation-Based Long Short-Term Memory for Time Series Data Classification
Abstract
With the increasing use of artificial intelligence (AI)-driven applications, researchers often face the challenge of improving the accuracy of data classification models while trading off model complexity. In this paper, we address the classification of time series data using the long short-term memory (LSTM) network, with a focus on its activation functions. While existing activation functions such as sigmoid and $\tanh$ are used as the LSTM's internal activations, their customizability is limited. This motivates us to propose a new family of activation functions, called $\log$-sigmoid, inside the LSTM cell for time series data classification, and to analyze its properties. We also present a linear transformation of the proposed $\log$-sigmoid activation (e.g., $\log \tanh$) as a replacement for the traditional $\tanh$ function in the LSTM cell. Both the cell activation and the recurrent activation inside the LSTM cell are modified with the $\log$-sigmoid activation family while tuning the $\log$ bases. Further, we report a comparative performance analysis of the LSTM model using the proposed and state-of-the-art activation functions on multiple public time-series databases.
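To make the idea of a base-tunable sigmoid family concrete, the sketch below shows one plausible parameterization. This is an assumed form for illustration only, not necessarily the authors' exact definition: the `base` parameter and the `log_sigmoid`/`log_tanh` names are ours. The chosen form maps into $(0, 1)$ for any base $b > 1$ and recovers the standard sigmoid as $b \to 1$; the $\tanh$-style variant mirrors the identity $\tanh(x) = 2\sigma(2x) - 1$.

```python
import numpy as np

def log_sigmoid(x, base):
    """Illustrative log-parameterized sigmoid with a tunable log base.

    Hypothetical form: log_b(1 + (b - 1) * sigmoid(x)).
    Since sigmoid(x) is in (0, 1), the log argument lies in (1, b),
    so the output lies in (0, 1). As base -> 1, a first-order
    expansion of log1p shows this reduces to the standard sigmoid.
    """
    s = 1.0 / (1.0 + np.exp(-x))                    # standard sigmoid in (0, 1)
    return np.log1p((base - 1.0) * s) / np.log(base)

def log_tanh(x, base):
    """tanh-style rescaling of log_sigmoid into (-1, 1),
    by analogy with tanh(x) = 2 * sigmoid(2x) - 1."""
    return 2.0 * log_sigmoid(2.0 * x, base) - 1.0
```

Under this assumed form, `base` would act as the tunable hyperparameter the abstract refers to: it reshapes the curve's steepness and asymmetry while preserving the bounded, monotone behavior that the LSTM's gating equations rely on.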
Recommended Citation
P. Ranjan et al., "$\log$-Sigmoid Activation-Based Long Short-Term Memory for Time Series Data Classification," IEEE Transactions on Artificial Intelligence, 2023.
The definitive version is available at https://doi.org/10.1109/TAI.2023.3265641
Department(s)
Computer Science
Keywords and Phrases
Activation; artificial intelligence; classification; data models; deep learning; LSTM; mathematical models; recurrent neural networks; sigmoid; time series analysis; tuning
International Standard Serial Number (ISSN)
2691-4581
Document Type
Article - Journal
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2023 Institute of Electrical and Electronics Engineers, All rights reserved.
Publication Date
01 Jan 2023