Continuous Online Sequence Learning with an Unsupervised Neural Network Model

Created on 2021-01-31T13:51:17-06:00


This card pertains to a resource available on the internet.


Reiteration of HTM theory.

HTM is robust to signal noise, which the theory attributes to its sparse representations.
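This robustness can be illustrated with sparse distributed representations (SDRs): two encodings of the same input still overlap heavily after random bit flips, while unrelated encodings barely overlap at all. A minimal sketch, with illustrative sizes and noise levels that are not taken from the paper:

```python
import random

SIZE = 2048    # total bits in the representation
ACTIVE = 40    # active bits (~2% sparsity, typical for HTM)

def make_sdr(seed=None):
    # An SDR modeled as the set of active bit indices.
    rng = random.Random(seed)
    return set(rng.sample(range(SIZE), ACTIVE))

def flip_noise(sdr, flips, seed=0):
    # Corrupt an SDR: deactivate `flips` active bits and
    # activate the same number of previously inactive bits.
    rng = random.Random(seed)
    active = list(sdr)
    rng.shuffle(active)
    kept = set(active[flips:])
    inactive = [i for i in range(SIZE) if i not in sdr]
    kept.update(rng.sample(inactive, flips))
    return kept

a = make_sdr(seed=1)
noisy = flip_noise(a, flips=10, seed=2)   # 25% of active bits corrupted
unrelated = make_sdr(seed=3)

print(len(a & noisy))      # large overlap despite the noise
print(len(a & unrelated))  # near-zero overlap by chance
```

With 40 active bits out of 2048, the corrupted SDR still shares 30 of 40 bits with the original, while the expected chance overlap between two unrelated SDRs is under one bit, so noisy matches remain easy to distinguish from false ones.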

Temporal memory is not robust to temporal noise.

HTM applied to grammar learning was 98.4% accurate, while LSTMs were 100% accurate.

I don't consider being ~2% behind LSTM a considerable flaw, given that HTM can learn continuously in real time and LSTMs cannot.