Dear researchers,

Let's say we have a time-series input signal that is applied to a transform X whose output is a set of vectors, each of which has the same length as the input signal (this is how Empirical Mode Decomposition works). If the energy of each output vector is calculated, and those energies are found to follow a decaying (monotonically decreasing) trend, how could we calculate a confidence interval for the output energies? Any suggestion would be appreciated.
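
To make the setting concrete, here is a minimal Python sketch of what I mean by the energies, assuming the output vectors are stacked row-wise in an array `imfs` (random placeholder data stands in for the actual decomposition); the naive bootstrap percentile interval at the end is only one possible baseline, not necessarily the appropriate interval for this situation.

import numpy as np

rng = np.random.default_rng(0)

# Placeholder: rows stand in for the transform's output vectors
# (e.g. IMFs from EMD), each the same length as the input signal.
imfs = rng.standard_normal((6, 1024))

# Energy of each output vector: sum of squared samples.
energies = np.sum(imfs ** 2, axis=1)

def bootstrap_energy_ci(vec, n_boot=2000, alpha=0.05):
    # Bootstrap percentile confidence interval for one vector's energy,
    # resampling its samples with replacement.
    n = len(vec)
    boot = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)
        boot[b] = np.sum(vec[idx] ** 2)
    lo, hi = np.quantile(boot, [alpha / 2, 1 - alpha / 2])
    return lo, hi

for k, vec in enumerate(imfs):
    lo, hi = bootstrap_energy_ci(vec)
    print(f"vector {k}: energy = {energies[k]:.1f}, 95% CI = [{lo:.1f}, {hi:.1f}]")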

Best, 
