Long Short-Term Memory (LSTM) Networks
People don’t start their thinking from scratch every second. For example, as you read this article, you understand each word based on your understanding of the previous words.
Traditional neural networks can’t do this, and it seems like a major drawback. Imagine you want to classify what kind of event is happening at every point in a movie. It’s unclear how a traditional neural network could use its reasoning about previous events in the film to inform later ones.
Recurrent neural networks address this problem effectively. They are networks with loops in them, allowing information to persist.
When the loop is unrolled, an RNN looks like a chain of repeating modules, one per time step. This chain-like nature reveals that recurrent neural networks are intimately related to sequences and lists; they’re the natural neural network architecture to use for such data.
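The loop can be made concrete in a few lines of NumPy. This is a minimal sketch of a vanilla (Elman-style) recurrent step; the weight names and dimensions here are illustrative assumptions, not something from this article:

```python
import numpy as np

def rnn_forward(inputs, W_xh, W_hh, b_h):
    """Run a simple recurrent network over a sequence of input vectors."""
    h = np.zeros(W_hh.shape[0])                 # hidden state starts empty
    for x in inputs:                            # the loop that lets information persist
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)  # new state depends on input AND old state
    return h                                    # final state summarizes the whole sequence

# Hypothetical dimensions: 3-dimensional inputs, 4-dimensional hidden state.
rng = np.random.default_rng(0)
W_xh, W_hh, b_h = rng.normal(size=(4, 3)), rng.normal(size=(4, 4)), np.zeros(4)
sequence = [rng.normal(size=3) for _ in range(5)]
print(rnn_forward(sequence, W_xh, W_hh, b_h))
```

Each pass through the loop mixes the current input with the previous hidden state, which is exactly the information-persistence mechanism described above.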
Key advantages of using Long Short-Term Memory
The key advantage of LSTM networks is that they address the vanishing gradient problem, which makes ordinary recurrent networks difficult to train on long sequences of words or integers. Because the gating mechanism keeps gradients from vanishing, the parameters of the recurrent network can still be updated meaningfully by backpropagation through time, even across long sequences.
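A toy calculation shows why the gradient vanishes in the first place: backpropagation through time multiplies the gradient by a per-step factor, and any factor below 1 shrinks it exponentially. The factor 0.9 below is an assumed value chosen purely for illustration:

```python
# Assumed per-step gradient scale; in a plain RNN this factor comes from the
# recurrent weights and the tanh derivative, and is often below 1.
factor = 0.9
for steps in (10, 50, 100):
    print(steps, factor ** steps)
# 10 -> ~0.35, 50 -> ~0.005, 100 -> ~0.00003: early steps get almost no gradient
```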
LSTM networks are widely used for classifying, processing, and making predictions based on time-series data.
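As one example of the classification use case, here is a hedged sketch of a sequence classifier built around a single LSTM layer in Keras. The vocabulary size, layer widths, and binary task are assumptions for illustration, not a recipe from this article:

```python
import tensorflow as tf

# Hypothetical setup: binary classification of integer-encoded word sequences.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=10_000, output_dim=32),  # assumed 10k-word vocabulary
    tf.keras.layers.LSTM(64),                                    # 64 assumed memory cells
    tf.keras.layers.Dense(1, activation="sigmoid"),              # probability of positive class
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(padded_sequences, labels, epochs=3)  # data preparation omitted
```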
LSTMs use a gating mechanism that controls the memorizing process. Information in an LSTM can be written, stored, or read via gates that open and close. These gates work in an analog fashion: each gate performs an element-wise multiplication by the output of a sigmoid, which ranges between 0 and 1. Because this analog gating is differentiable, it is well suited to backpropagation.
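To see the gates in action, here is a minimal NumPy sketch of a single LSTM step. The stacked weight layout and the shapes are assumptions made for illustration; real implementations differ in detail:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    """One LSTM step. W maps the concatenation [x; h] to the four gate
    pre-activations; x is (n_in,), h and c are (n_hid,)."""
    z = W @ np.concatenate([x, h]) + b
    n = h.size
    f = sigmoid(z[0 * n:1 * n])  # forget gate: how much old memory to keep (0-1)
    i = sigmoid(z[1 * n:2 * n])  # input gate: how much new information to write
    o = sigmoid(z[2 * n:3 * n])  # output gate: how much memory to expose
    g = np.tanh(z[3 * n:4 * n])  # candidate values to write into the cell
    c = f * c + i * g            # element-wise multiplication: the analog gating
    h = o * np.tanh(c)           # new hidden state, read out through the output gate
    return h, c

# Hypothetical shapes: 3-dimensional input, 4 memory cells.
n_in, n_hid = 3, 4
rng = np.random.default_rng(1)
W, b = rng.normal(size=(4 * n_hid, n_in + n_hid)), np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
h, c = lstm_step(rng.normal(size=n_in), h, c, W, b)
print(h, c)
```

Because every gate value is produced by a sigmoid, writing, keeping, and reading memory are all smooth multiplications rather than hard on/off switches, which is what makes the whole mechanism trainable by backpropagation.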