This problem was posed to me by Russ Rinehart. He is interested in the expected number of iterations that a stochastic optimization algorithm called leapfrogging takes to stop. A simplified version of the problem can be stated formally as follows.

  • Let Y(0) = 1, let U(1), U(2), ... be independent and identically distributed Uniform(0, a) random variables, and let Y(t) = Y(t-1)U(t) for t = 1, 2, ....
  • Let the stopping time S be the first time that Y(t) falls below a small tolerance ε in (0, 1), i.e., S = min{t ≥ 1 : Y(t) < ε}. The quantity of interest is E(S); a simulation sketch is given after this list.
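
Since ln Y(t) = ln U(1) + ... + ln U(t) is a random walk with per-step mean E[ln U(t)] = ln a − 1, the walk drifts downward whenever a < e, so Y(t) → 0 and S is finite with probability 1. Below is a minimal Monte Carlo sketch in Python for estimating E(S), assuming the Y(t) < ε stopping rule above; the values of a, eps, and n_trials are illustrative assumptions, not part of the original problem.

```python
import random

def estimate_expected_stopping_time(a=0.5, eps=1e-3, n_trials=100_000, seed=1):
    """Monte Carlo estimate of E(S), where S = min{t >= 1 : Y(t) < eps}.

    Y(0) = 1 and Y(t) = Y(t-1) * U(t) with U(t) ~ Uniform(0, a).
    The tolerance eps is an assumed stopping threshold; the default
    parameter values are illustrative, not from the original problem.
    """
    rng = random.Random(seed)
    total_steps = 0
    for _ in range(n_trials):
        y, t = 1.0, 0
        while y >= eps:
            y *= rng.uniform(0.0, a)  # one multiplicative Uniform(0, a) step
            t += 1
        total_steps += t
    return total_steps / n_trials

if __name__ == "__main__":
    # For a < e the drift E[ln U] = ln a - 1 is negative, so each run stops.
    print(estimate_expected_stopping_time())
```

As a sanity check, when a = 1 the variables −ln U(t) are Exp(1), so S − 1 is Poisson(ln(1/ε)) and E(S) = 1 + ln(1/ε) exactly; running the simulation with a = 1 and ε = 0.001 should return roughly 1 + ln 1000 ≈ 7.91.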