12 January 2021

In an RNN, the output Yt is given by the equation:

Yt = Why · ht

where Why is the weight matrix associated with the output layer and ht is the current hidden state.
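To make the role of Why concrete, here is a minimal NumPy sketch of one step of a vanilla RNN. The sizes and weight names (W_xh, W_hh, W_hy) are illustrative assumptions, not taken from any specific library; W_hy plays the role of Why in the equation above.

```python
import numpy as np

rng = np.random.default_rng(0)

hidden_size, input_size, output_size = 3, 4, 2
W_xh = rng.normal(size=(hidden_size, input_size))   # input -> hidden weights
W_hh = rng.normal(size=(hidden_size, hidden_size))  # hidden -> hidden weights (the "memory" path)
W_hy = rng.normal(size=(output_size, hidden_size))  # hidden -> output weights (Why in the question)

h = np.zeros(hidden_size)          # previous hidden state h_{t-1}
x = rng.normal(size=input_size)    # current input x_t

# Current state: h_t = tanh(W_xh · x_t + W_hh · h_{t-1})
h = np.tanh(W_xh @ x + W_hh @ h)

# Output: y_t = W_hy · h_t  — Why projects the hidden state into the output space
y = W_hy @ h
```

Note that the memory lives in the recurrent weights W_hh and the state h; W_hy (Why) only maps that state to the output dimension, for example to class scores of a different size than the hidden state.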

Why is this weight needed at the output layer of an RNN? Is it related in any way to the memory of the network?
