Abstract
Auto-encoders typically require batch learning to be effective, and online learning mechanisms for auto-encoders are largely absent from the literature. To address this shortcoming, we propose an auto-encoder that not only learns on a sample-by-sample basis without back-propagation but also retains a memory that lets it benefit from past learning. This memory can be adapted to the current state of the data by varying the auto-encoder's memory length. Simulations support our approach, especially when the data are nonstationary.
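The abstract does not specify the update rule, but the idea of a sample-by-sample, back-propagation-free auto-encoder with a tunable memory can be sketched with an assumed mechanism: a linear auto-encoder whose encoder/decoder is the top-k eigenspace of an exponentially weighted covariance estimate, where the forgetting factor `lam` plays the role of the memory length (a hypothetical illustration, not the paper's actual method):

```python
import numpy as np

rng = np.random.default_rng(0)

d, k = 5, 2          # input dimension, code dimension
lam = 0.98           # forgetting factor: smaller lam -> shorter memory
C = np.zeros((d, d)) # exponentially weighted covariance estimate

def update(x):
    """One sample-by-sample step: no back-propagation, just a
    rank-1 covariance update with exponential forgetting."""
    global C
    C = lam * C + (1.0 - lam) * np.outer(x, x)
    # encoder/decoder weights = top-k eigenvectors of the weighted covariance
    vals, vecs = np.linalg.eigh(C)
    return vecs[:, -k:]  # d x k

# stream of samples drawn from a low-rank source
A = rng.normal(size=(d, 2))
for _ in range(500):
    x = A @ rng.normal(size=2) + 0.01 * rng.normal(size=d)
    W = update(x)

# reconstruct a fresh sample through the learned code
x = A @ rng.normal(size=2)
x_hat = W @ (W.T @ x)
err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
```

Because older samples are down-weighted by `lam` at every step, lowering `lam` shortens the effective memory and lets the representation track nonstationary data more quickly, at the cost of noisier estimates.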
Original language | English |
---|---|
DOIs | |
Publication status | Published - Jul 2021 |
Event | The International Joint Conference on Neural Networks 2021 - Virtual |
Duration | 18 Jul 2021 → 22 Jul 2021 |
Internet address | https://www.ijcnn.org/ |
Conference
Conference | The International Joint Conference on Neural Networks 2021 |
---|---|
Abbreviated title | IJCNN 2021 |
Period | 18/07/21 → 22/07/21 |
Internet address | https://www.ijcnn.org/ |