I am simulating a system that involves many random variables, so the response differs from run to run. The system is always stable in a non-asymptotic sense: the response eventually fluctuates around the equilibrium point, but the magnitude of that fluctuation varies from one run to another.

The attached figure illustrates what two different runs of the same simulation can look like.

I would like an automated function that computes the fluctuation magnitude and the settling time of each run from its time response.
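Something along these lines is what I have in mind: a rough sketch in Python/NumPy, assuming the response is available as arrays `t` and `y`, that "settled" means the response stays inside a fixed tolerance band around the equilibrium, and that the fluctuation magnitude can be taken as half the peak-to-peak excursion after settling. The band width, the equilibrium value, and the choice of fluctuation measure are my own assumptions, not requirements.

```python
import numpy as np

def settling_metrics(t, y, y_eq=0.0, band=0.05):
    """Estimate settling time and steady-state fluctuation magnitude.

    t    : 1-D array of time stamps
    y    : 1-D array of response samples
    y_eq : equilibrium value the response fluctuates around
    band : half-width of the tolerance band around y_eq (same units as y)
    """
    t = np.asarray(t)
    y = np.asarray(y)
    dev = np.abs(y - y_eq)

    # Settling time: first instant after the last excursion outside the band,
    # i.e. from t_settle onward the response stays inside the band.
    outside = np.where(dev > band)[0]
    if outside.size == 0:
        t_settle = t[0]              # already inside the band from the start
    elif outside[-1] == len(t) - 1:
        t_settle = np.nan            # never settles within the recorded window
    else:
        t_settle = t[outside[-1] + 1]

    # Fluctuation magnitude: half the peak-to-peak excursion after settling
    # (an RMS value over the same tail would be an alternative measure).
    if np.isnan(t_settle):
        fluct = np.nan
    else:
        tail = y[t >= t_settle]
        fluct = 0.5 * (tail.max() - tail.min())

    return t_settle, fluct
```

For example, on a synthetic decaying oscillation with added noise:

```python
t = np.linspace(0, 50, 5001)
y = np.exp(-0.2 * t) * np.sin(2 * t) + 0.02 * np.random.randn(t.size)
t_s, a = settling_metrics(t, y, y_eq=0.0, band=0.05)
print(f"settling time ~ {t_s:.2f} s, fluctuation magnitude ~ {a:.3f}")
```

My open question is how to choose the band width robustly when the fluctuation magnitude itself changes from run to run.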
