Recurrent neural networks suffer from vanishing gradients, which makes it hard for them to retain important information over long sequences. To overcome this, a modified version of the RNN called the LSTM (long short-term memory) network was introduced. An LSTM operates much like an RNN, but the key difference is its cell state: a separate memory pathway, updated additively through learned gates, that lets information and gradients flow across many time steps.
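To make the cell-state idea concrete, here is a minimal sketch of a single LSTM time step in numpy. The gate ordering (input, forget, candidate, output), the weight layout, and all sizes are illustrative assumptions, not a reference implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step. W has shape (4H, D+H), b has shape (4H,).
    Assumed gate order in W: input, forget, candidate, output."""
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0:H])        # input gate: how much new info to write
    f = sigmoid(z[H:2*H])      # forget gate: how much old state to keep
    g = np.tanh(z[2*H:3*H])    # candidate cell update
    o = sigmoid(z[3*H:4*H])    # output gate: how much state to expose
    c = f * c_prev + i * g     # cell state: additive update eases gradient flow
    h = o * np.tanh(c)         # hidden state passed to the next step
    return h, c

# Tiny usage example with random weights (hypothetical sizes)
rng = np.random.default_rng(0)
D, H = 3, 4                       # input and hidden dimensions
W = rng.standard_normal((4 * H, D + H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(5):
    x = rng.standard_normal(D)
    h, c = lstm_step(x, h, c, W, b)
```

Note how the cell state `c` is updated by addition rather than by repeated matrix multiplication, which is what distinguishes it from a plain RNN hidden state and mitigates the vanishing-gradient problem.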