In an RNN, the output Yt at time step t is given by the equation

Yt = Why · ht

where Why is the weight matrix associated with the output layer and ht is the hidden state at time t.
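To make the question concrete, here is a minimal sketch of how I understand the output computation. The dimensions and random values are my own assumptions, just for illustration:

```python
import numpy as np

# Hypothetical dimensions (assumed for illustration only)
hidden_size, output_size = 4, 3

rng = np.random.default_rng(0)
Why = rng.standard_normal((output_size, hidden_size))  # output-layer weight matrix
h_t = rng.standard_normal(hidden_size)                 # current hidden state ht

# Output at time t: Yt = Why . ht
Y_t = Why @ h_t
print(Y_t.shape)  # (3,)
```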
Why is this weight needed at the output layer of an RNN? Does it have anything to do with the memory of the network?