A Variable Memory Length Auto Encoder. / Mohomad Ibunu, Mohomad; Weller, Samuel; Cheong Took, Clive.

2021. Paper presented at The International Joint Conference on Neural Networks 2021.

Research output: Contribution to conference › Paper › peer-review

Published

Standard

A Variable Memory Length Auto Encoder. / Mohomad Ibunu, Mohomad; Weller, Samuel; Cheong Took, Clive.

2021. Paper presented at The International Joint Conference on Neural Networks 2021.

Research output: Contribution to conference › Paper › peer-review

Harvard

Mohomad Ibunu, M, Weller, S & Cheong Took, C 2021, 'A Variable Memory Length Auto Encoder', Paper presented at The International Joint Conference on Neural Networks 2021, 18/07/21 - 22/07/21. https://doi.org/10.1109/IJCNN52387.2021.9533475

APA

Mohomad Ibunu, M., Weller, S., & Cheong Took, C. (2021). A Variable Memory Length Auto Encoder. Paper presented at The International Joint Conference on Neural Networks 2021, . https://doi.org/10.1109/IJCNN52387.2021.9533475

Vancouver

Mohomad Ibunu M, Weller S, Cheong Took C. A Variable Memory Length Auto Encoder. 2021. Paper presented at The International Joint Conference on Neural Networks 2021. https://doi.org/10.1109/IJCNN52387.2021.9533475

Author

Mohomad Ibunu, Mohomad ; Weller, Samuel ; Cheong Took, Clive. / A Variable Memory Length Auto Encoder. Paper presented at The International Joint Conference on Neural Networks 2021.

BibTeX

@conference{12f10b30d45f4900836cca334fbd59d4,
title = "A Variable Memory Length Auto Encoder",
abstract = "Auto-encoders typically require batch learning to be effective. There is a lack of online learning mechanisms for auto-encoders. To address this shortcoming in the literature, we propose an auto-encoder that can not only learn on a sample-by-sample basis without back-propagation but also has a memory to benefit from past learning. The memory can be adapted to fit the current state of the data by varying the memory length of the auto-encoder. Simulation supports our approach, especially when the data is nonstationary.",
author = "{Mohomad Ibunu}, Mohomad and Samuel Weller and {Cheong Took}, Clive",
year = "2021",
month = jul,
doi = "10.1109/IJCNN52387.2021.9533475",
language = "English",
note = "The International Joint Conference on Neural Networks 2021, IJCNN 2021 ; Conference date: 18-07-2021 Through 22-07-2021",
url = "https://www.ijcnn.org/",

}

RIS

TY - CONF

T1 - A Variable Memory Length Auto Encoder

AU - Mohomad Ibunu, Mohomad

AU - Weller, Samuel

AU - Cheong Took, Clive

PY - 2021/7

Y1 - 2021/7

N2 - Auto-encoders typically require batch learning to be effective. There is a lack of online learning mechanisms for auto-encoders. To address this shortcoming in the literature, we propose an auto-encoder that can not only learn on a sample-by-sample basis without back-propagation but also has a memory to benefit from past learning. The memory can be adapted to fit the current state of the data by varying the memory length of the auto-encoder. Simulation supports our approach, especially when the data is nonstationary.

AB - Auto-encoders typically require batch learning to be effective. There is a lack of online learning mechanisms for auto-encoders. To address this shortcoming in the literature, we propose an auto-encoder that can not only learn on a sample-by-sample basis without back-propagation but also has a memory to benefit from past learning. The memory can be adapted to fit the current state of the data by varying the memory length of the auto-encoder. Simulation supports our approach, especially when the data is nonstationary.

UR - https://www.ijcnn.org/draft-program-ijcnn-2021

U2 - 10.1109/IJCNN52387.2021.9533475

DO - 10.1109/IJCNN52387.2021.9533475

M3 - Paper

T2 - The International Joint Conference on Neural Networks 2021

Y2 - 18 July 2021 through 22 July 2021

ER -
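
Note on the abstract above: it describes an auto-encoder that learns online, sample by sample, without back-propagation, and whose memory length can be varied to track nonstationary data, but this record does not spell out the update rule. Below is a minimal sketch of one plausible reading, assuming a linear auto-encoder whose tied encoder/decoder is re-extracted from an exponentially weighted covariance estimate; the forgetting factor lam stands in for the memory length. All class and variable names here are hypothetical, and the sketch illustrates the general idea only, not the authors' algorithm.

# Hypothetical illustration (assumption, not the cited method): an online
# linear auto-encoder trained without back-propagation. A linear auto-encoder
# with tied weights is optimal when its encoder spans the top principal
# subspace, so we track an exponentially weighted covariance matrix and
# re-extract that subspace after each sample. The forgetting factor `lam`
# acts as the memory length: lam near 1 gives a long memory, smaller lam
# forgets old samples faster (effective window roughly 1 / (1 - lam)).

import numpy as np

class OnlineLinearAutoEncoder:
    def __init__(self, dim, code_dim, lam=0.95):
        self.lam = lam                      # forgetting factor (memory length)
        self.mean = np.zeros(dim)           # exponentially weighted mean
        self.cov = np.zeros((dim, dim))     # exponentially weighted covariance
        self.code_dim = code_dim
        self.W = np.eye(dim)[:, :code_dim]  # tied encoder/decoder weights

    def partial_fit(self, x):
        """Update on a single sample; no back-propagation involved."""
        x = np.asarray(x, dtype=float)
        self.mean = self.lam * self.mean + (1.0 - self.lam) * x
        xc = x - self.mean
        self.cov = self.lam * self.cov + (1.0 - self.lam) * np.outer(xc, xc)
        # Encoder = top eigenvectors of the weighted covariance (descending order).
        _, eigvec = np.linalg.eigh(self.cov)
        self.W = eigvec[:, ::-1][:, :self.code_dim]
        return self

    def encode(self, x):
        return self.W.T @ (np.asarray(x, dtype=float) - self.mean)

    def decode(self, z):
        return self.W @ z + self.mean


# Usage: stream nonstationary data sample by sample; the subspace changes
# abruptly halfway through, and a shorter memory (smaller lam) adapts faster.
rng = np.random.default_rng(0)
ae = OnlineLinearAutoEncoder(dim=5, code_dim=2, lam=0.95)
basis = rng.standard_normal((5, 2))
for t in range(200):
    if t == 100:                            # abrupt change in the generating subspace
        basis = rng.standard_normal((5, 2))
    x = basis @ rng.standard_normal(2) + 0.05 * rng.standard_normal(5)
    ae.partial_fit(x)
x_hat = ae.decode(ae.encode(x))
print("reconstruction error:", float(np.linalg.norm(x - x_hat)))

With lam = 0.95 the effective window is roughly 1 / (1 - lam) = 20 samples, so the encoder re-adapts within a few dozen samples after the change at t = 100; a lam closer to 1 lengthens the memory and smooths the estimate at the cost of slower tracking, which is the trade-off the abstract refers to when the data is nonstationary.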