I am doing time series forecasting with an LSTM network. My data contain many outliers and change points, so the LSTM is not performing well. The idea is to detect suspicious points in the network input; when such a point is found, the corresponding LSTM gradient is downweighted (for outliers) or upweighted (for change points). Do I need to implement the LSTM from scratch (to be able to customize the gradient weights), or is there a way to do this with Keras?
I refer to the paper "Robust Online Time Series Prediction with Recurrent Neural Networks" for further details.
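For reference, here is a minimal sketch of what I have in mind in Keras, without a from-scratch LSTM. Since scaling a sample's loss scales its gradient contribution by the same factor, weighting the per-sample losses is equivalent to weighting the gradients. The weight values (0.1 for outliers, 2.0 for change points) and the toy data are just placeholders, not from the paper:

```python
import numpy as np
import tensorflow as tf

# Toy data: 32 sequences of length 10 with 1 feature.
x = np.random.randn(32, 10, 1).astype("float32")
y = np.random.randn(32, 1).astype("float32")

# Hypothetical detector output: one weight per sample.
w = np.ones(32, dtype="float32")
w[:4] = 0.1   # suspected outliers -> downweight their gradients
w[4:8] = 2.0  # suspected change points -> upweight their gradients

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10, 1)),
    tf.keras.layers.LSTM(16),
    tf.keras.layers.Dense(1),
])
optimizer = tf.keras.optimizers.Adam()

with tf.GradientTape() as tape:
    pred = model(x, training=True)
    # Per-sample squared error, then a weighted mean: scaling the loss
    # of a sample scales its gradient contribution identically.
    per_sample = tf.reduce_mean(tf.square(pred - y), axis=-1)
    loss = tf.reduce_mean(w * per_sample)
grads = tape.gradient(loss, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))
```

As I understand it, passing the same weights via `model.fit(x, y, sample_weight=w)` should have the same effect, as long as the weights only depend on the input and not on intermediate LSTM states.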