Attention!
Recurrent Neural Networks have an "attention deficiency": they struggle to remember information over long sequences. Along came LSTMs, with their ability to store information for longer. But, as they say, all good things have even better alternatives, and along came Attention. The Attention Mechanism is a way to let a model focus on the most relevant parts of its input when producing each piece of output, rather than compressing everything into a single fixed-size state.
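To make this concrete, here is a minimal sketch of scaled dot-product attention, the standard formulation from the Transformer literature (the array names `Q`, `K`, `V` and the toy values below are illustrative, not from any particular model):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # similarity of each query to each key
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights

# Toy example: 2 queries attend over 3 key/value pairs.
Q = np.array([[1.0, 0.0], [0.0, 1.0]])
K = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
V = np.array([[10.0, 0.0], [0.0, 10.0], [5.0, 5.0]])
out, w = attention(Q, K, V)
```

The attention weights act like a soft lookup: each output is a weighted average of the values, with more weight on the values whose keys best match the query.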




