The concept of temporal dependencies is important here.
An LSTM is one option for overcoming the Vanishing Gradient problem in RNNs.
Please use these resources if you would like to read more about the Vanishing Gradient problem, or to further understand the concept of a Geometric Series and how its values can decrease exponentially.
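To build a little intuition before you dive into those links, here is a minimal sketch (a hypothetical illustration, not taken from the linked resources) of how repeatedly multiplying a gradient by a per-step factor smaller than 1 shrinks it geometrically toward zero, which is the essence of the vanishing gradient problem. The factor of 0.5 is an arbitrary choice for demonstration.

```python
# Hypothetical illustration: gradients shrinking like a geometric series.
# The per-step factor of 0.5 is an assumption chosen only for demonstration.
ratio = 0.5        # assumed magnitude of each step's local derivative (< 1)
gradient = 1.0     # gradient magnitude at the most recent time step

for step in range(1, 11):
    gradient *= ratio   # backpropagating one more time step multiplies by the ratio
    print(f"after {step:2d} steps back in time: {gradient:.6f}")

# With ratio = 0.5, ten steps back the gradient is 0.5 ** 10, roughly 0.00098,
# so the earliest time steps receive almost no learning signal.
```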
If you are still curious and would like more information on the important milestones mentioned here, please take a peek at the following links:
Here is the original Elman Network publication from 1990. The link is provided because it is a significant milestone in the world of RNNs. For a simpler overview, you can take a look at the following additional info.
In this LSTM link you will find the original paper written by Sepp Hochreiter and Jürgen Schmidhuber. Don't get into all the details just yet. We will cover all of this later!
As mentioned in the video, Long Short-Term Memory Cells (LSTMs) and Gated Recurrent Units (GRUs) provide a solution to the vanishing gradient problem, helping us build networks that capture temporal dependencies. In this lesson we will focus on RNNs and continue with LSTMs; we will not be focusing on GRUs. More information about GRUs can be found in the following blog. Focus on the overview titled: GRUs.
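To make this concrete, here is a minimal sketch, assuming PyTorch as the framework (the lesson itself does not prescribe one), showing that LSTM and GRU layers can stand in for a vanilla RNN layer on the same sequence data. The sizes are arbitrary and chosen only for illustration.

```python
import torch
import torch.nn as nn

# Arbitrary sizes for demonstration: a batch of 4 sequences, 10 time steps,
# 8 input features per step, and a hidden state of size 16.
batch_size, seq_len, input_size, hidden_size = 4, 10, 8, 16
x = torch.randn(batch_size, seq_len, input_size)

rnn = nn.RNN(input_size, hidden_size, batch_first=True)    # vanilla RNN: prone to vanishing gradients
lstm = nn.LSTM(input_size, hidden_size, batch_first=True)  # gated cell state helps gradients survive long sequences
gru = nn.GRU(input_size, hidden_size, batch_first=True)    # a lighter-weight gated alternative

rnn_out, _ = rnn(x)
lstm_out, (h_n, c_n) = lstm(x)   # the LSTM also returns a separate cell state
gru_out, _ = gru(x)

print(rnn_out.shape, lstm_out.shape, gru_out.shape)  # each is (4, 10, 16)
```

All three layers accept the same input shape, which is why trying an LSTM or GRU in place of a vanilla RNN is usually a one-line change.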
There are so many interesting applications; let's look at a few more!
Are you into gaming and bots? Check out the DotA 2 bot by OpenAI.
Here is a cool tool for automatic handwriting generation.
Amazon Lex, Amazon's voice-to-text service, uses high-quality speech recognition.
Facebook uses RNN and LSTM technologies for building language models.
Netflix also uses RNN models; here is an interesting read.