I want to use the DTW algorithm from the Java Machine Learning (Java-ML) library; therefore, I have to transform my feature matrix into a TimeSeries. How can I do that?
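For concreteness, here is a sketch of what I think the conversion could look like, assuming the FastDTW classes bundled with Java-ML (`net.sf.javaml.distance.fastdtw.timeseries.TimeSeries` and `TimeSeriesPoint`). The constructor and method signatures here are my assumption based on the FastDTW code that ships with Java-ML, and the row/column layout of the matrix is also assumed, so treat this as a guess rather than verified usage:

```java
import net.sf.javaml.distance.fastdtw.timeseries.TimeSeries;
import net.sf.javaml.distance.fastdtw.timeseries.TimeSeriesPoint;

public class MatrixToTimeSeries {

    // Assumed layout: rows = time steps (N), columns = features (M).
    static TimeSeries fromMatrix(double[][] matrix) {
        int numFeatures = matrix[0].length;
        // Assumption: the constructor takes the number of dimensions per point.
        TimeSeries ts = new TimeSeries(numFeatures);
        for (int t = 0; t < matrix.length; t++) {
            // Assumption: addLast(time, point) appends one M-dimensional sample.
            ts.addLast(t, new TimeSeriesPoint(matrix[t]));
        }
        return ts;
    }
}
```

Is this roughly the intended way to use these classes, or is there a different conversion I'm missing?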
I once worked on an N x M matrix, where M is the (fixed) number of visual speech features and N varies with the length of the time series. I treated the matrix as M signals over time and applied DTW to each signal of length N; the final output distance was the sum of all M per-signal distances. A minimal sketch of this per-feature approach is given below.
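Here is a small, self-contained Java sketch of the idea: plain textbook DTW (absolute-difference local cost) applied to each of the M feature columns, with the per-column distances summed. The matrix layout (rows = time steps, columns = features) and all class/method names are mine for illustration; this does not use the Java-ML API:

```java
public final class PerFeatureDtw {

    // Classic O(n*m) DTW between two 1-D signals.
    static double dtw(double[] a, double[] b) {
        int n = a.length, m = b.length;
        double[][] cost = new double[n + 1][m + 1];
        for (double[] row : cost) {
            java.util.Arrays.fill(row, Double.POSITIVE_INFINITY);
        }
        cost[0][0] = 0.0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= m; j++) {
                double d = Math.abs(a[i - 1] - b[j - 1]); // local cost
                cost[i][j] = d + Math.min(cost[i - 1][j - 1],
                             Math.min(cost[i - 1][j], cost[i][j - 1]));
            }
        }
        return cost[n][m];
    }

    // Extract feature column f from an N x M matrix (rows = time steps).
    static double[] column(double[][] matrix, int f) {
        double[] signal = new double[matrix.length];
        for (int t = 0; t < matrix.length; t++) {
            signal[t] = matrix[t][f];
        }
        return signal;
    }

    // Total distance = sum of the M per-feature DTW distances.
    static double distance(double[][] x, double[][] y) {
        int features = x[0].length; // both matrices must have M columns
        double total = 0.0;
        for (int f = 0; f < features; f++) {
            total += dtw(column(x, f), column(y, f));
        }
        return total;
    }

    public static void main(String[] args) {
        double[][] x = {{1, 0}, {2, 1}, {3, 1}};          // N1 = 3, M = 2
        double[][] y = {{1, 0}, {1, 0}, {2, 1}, {3, 2}};  // N2 = 4, M = 2
        System.out.println("summed DTW distance = " + distance(x, y));
    }
}
```

Note that the two matrices may have different lengths N1 and N2 (DTW handles that), but they must have the same number of feature columns M.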
For more information, you may refer to my thesis at: https://www.researchgate.net/publication/210333268_Visual_Words_for_Automatic_Lip-_Reading
What is your data matrix exactly? What size? A matrix can be seen as a stack of time series if each sample of a row (or column) represents a value at a particular time. So I am not sure what you mean by "transform the feature matrix".