Recurrent neural networks (RNNs) add an intriguing twist to ordinary neural networks. A vanilla neural network takes a fixed-size vector as input, which restricts its use in scenarios where the inputs have no predetermined size.
RNNs, by contrast, can remember the past: the decisions an RNN makes are influenced by what it has previously seen. An RNN can take one or more input vectors and produce one or more output vectors, and each output is shaped both by the weights applied to the inputs and by the context built up from prior inputs and outputs.
In other words, the same input can produce a different output depending on the earlier inputs in the series. A vanilla neural network maps a fixed-size input vector to a fixed-size output vector; the network becomes recurrent when the same transformation is applied repeatedly across a sequence of inputs.
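The idea above can be sketched in a few lines of NumPy. This is a minimal illustrative example, not a production implementation: the weight names (`W_xh`, `W_hh`, `W_hy`) and sizes are assumptions chosen for clarity. The key point is that one set of weights is reused at every step, and the hidden state `h` carries context forward, so the same input produces different outputs depending on what came before.

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size, output_size = 4, 8, 3

# One shared set of weights, reused at every time step (this reuse is
# what makes the network "recurrent").
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1   # input -> hidden
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # hidden -> hidden
W_hy = rng.standard_normal((output_size, hidden_size)) * 0.1  # hidden -> output

def rnn_step(x, h):
    # The new hidden state mixes the current input with the previous context.
    h_new = np.tanh(W_xh @ x + W_hh @ h)
    y = W_hy @ h_new  # the output depends on the whole history through h
    return y, h_new

# A variable-length sequence poses no problem: the same step is just
# applied once per input.
h = np.zeros(hidden_size)
sequence = [rng.standard_normal(input_size) for _ in range(5)]
for x in sequence:
    y, h = rnn_step(x, h)

# The same input vector yields different outputs under different contexts:
y_fresh, _ = rnn_step(sequence[0], np.zeros(hidden_size))
y_in_context, _ = rnn_step(sequence[0], h)
```

Here `y_fresh` and `y_in_context` differ even though the input vector is identical, which is exactly the context-dependence described above.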
Compared to a vanilla RNN, a long short-term memory network (LSTM) adds extra gates and a cell state, which address the problem of deciding when to keep and when to reset the context.
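A single LSTM step can be sketched as follows, again as an assumed minimal formulation rather than any particular library's implementation. The forget gate `f` decides how much of the old cell state to keep, the input gate `i` how much new information to write, and the output gate `o` how much of the cell state to expose as the hidden state:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
input_size, hidden_size = 4, 8

# One weight matrix per gate, each acting on the concatenated [h, x] vector
# (sizes are illustrative assumptions).
Wf = rng.standard_normal((hidden_size, hidden_size + input_size)) * 0.1
Wi = rng.standard_normal((hidden_size, hidden_size + input_size)) * 0.1
Wo = rng.standard_normal((hidden_size, hidden_size + input_size)) * 0.1
Wc = rng.standard_normal((hidden_size, hidden_size + input_size)) * 0.1

def lstm_step(x, h, c):
    z = np.concatenate([h, x])
    f = sigmoid(Wf @ z)  # forget gate: keep (near 1) or reset (near 0) old state
    i = sigmoid(Wi @ z)  # input gate: how much of the new candidate to write
    o = sigmoid(Wo @ z)  # output gate: how much of the cell state to expose
    c_new = f * c + i * np.tanh(Wc @ z)  # cell state updated under gate control
    h_new = o * np.tanh(c_new)
    return h_new, c_new

h = np.zeros(hidden_size)
c = np.zeros(hidden_size)
for x in [rng.standard_normal(input_size) for _ in range(5)]:
    h, c = lstm_step(x, h, c)
```

The cell state `c` is the extra memory track mentioned above: because it is updated additively, gated by `f` and `i`, the network can carry context over many steps or discard it, which is harder for a plain RNN whose state is overwritten at every step.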