$\log$-Sigmoid Activation-Based Long Short-Term Memory for Time Series Data Classification


With the enhanced usage of artificial intelligence (AI)-driven applications, researchers often face challenges in improving the accuracy of data classification models while trading off complexity. In this paper, we address the classification of time series data using the long short-term memory (LSTM) network, focusing on the activation functions. While existing activation functions such as sigmoid and $\tanh$ are used as LSTM internal activations, the customizability of these activations stays limited. This motivates us to propose a new family of activation functions, called $\log$-sigmoid, inside the LSTM cell for time series data classification, and to analyze its properties. We also present the use of a linear transformation (e.g., $\log \tanh$) of the proposed $\log$-sigmoid activation as a replacement for the traditional $\tanh$ function in the LSTM cell. Both the cell activation and the recurrent activation functions inside the LSTM cell are modified with the $\log$-sigmoid activation family while tuning the $\log$ bases. Further, we report a comparative performance analysis of the LSTM model using the proposed and state-of-the-art activation functions on multiple public time-series databases.
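The abstract does not give the exact functional form of the proposed family. One plausible reading (an assumption for illustration, not the paper's definition) is a sigmoid with a tunable base $b$ in place of Euler's number, $\sigma_b(x) = 1/(1 + b^{-x})$, with the $\log \tanh$ counterpart obtained through the same linear transformation that relates $\tanh$ to the standard sigmoid, $\tanh(x) = 2\sigma(2x) - 1$. A minimal sketch under these assumptions:

```python
import numpy as np

def log_sigmoid(x, base=np.e):
    """Assumed log-sigmoid family with tunable log base b:
    sigma_b(x) = 1 / (1 + b**(-x)), which equals sigmoid(x * ln b).
    base=np.e recovers the standard sigmoid."""
    return 1.0 / (1.0 + np.power(base, -x))

def log_tanh(x, base=np.e):
    """Assumed log-tanh counterpart, built via the same linear
    transformation that gives tanh(x) = 2 * sigmoid(2x) - 1.
    base=np.e recovers the standard tanh."""
    return 2.0 * log_sigmoid(2.0 * x, base) - 1.0
```

With this parameterization, tuning the base is equivalent to scaling the input slope by $\ln b$, so the gate saturates faster for $b > e$ and more slowly for $1 < b < e$; the paper's actual construction may differ.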


Computer Science

Keywords and Phrases

Activation; artificial intelligence; classification; data models; deep learning; LSTM; mathematical models; recurrent neural networks; sigmoid; time series analysis; tuning

Document Type

Article - Journal

© 2023 Institute of Electrical and Electronics Engineers, All rights reserved.

Publication Date

01 Jan 2023
