Maybe I have misunderstood something, but once you choose an equilibrium point as the initial condition, you have at least one solution corresponding to this initial data, namely the constant one; this is precisely the definition of an equilibrium point. And whenever the problem has a unique solution, this constant solution is the only one.
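To make this concrete, here is a small sketch of my own (the logistic equation is my choice of example, not from the question): for $\dot{y}=y(1-y)$, the point $y=1$ is an equilibrium since $f(1)=0$, and starting the integration exactly there reproduces the constant solution.

```python
# Demo (my own example): for the logistic ODE y' = y(1 - y),
# y = 1 is an equilibrium point because f(1) = 0.  Starting a
# forward-Euler integration exactly at y0 = 1 keeps y constant,
# since every increment h * f(y) is exactly zero.

def f(y):
    return y * (1.0 - y)

def euler(y0, t_end, steps=10000):
    h = t_end / steps
    y = y0
    for _ in range(steps):
        y += h * f(y)
    return y

print(euler(1.0, 10.0))   # stays at the equilibrium value 1.0
```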
Ravi's second question reads more like a question about whether the given Cauchy problem has a solution for a given initial condition, and perhaps whether that solution is unique. Miodrag has already given an answer to this.
An initial value problem (also called the Cauchy problem by some authors) is an ordinary differential equation together with a specified value, called the initial condition, of the unknown function at a given point in the domain of the solution:
$$\dot{y}=f(y,t).$$ We think that choosing equilibrium points as initial conditions is convenient.
The Picard–Lindelöf theorem guarantees a unique solution on some interval containing $t_0$ if $f$ is continuous on a region containing $t_0$ and $y_0$ and satisfies a Lipschitz condition in the variable $y$. The proof of this theorem proceeds by reformulating the problem as an equivalent integral equation. The integral can be considered an operator which maps one function into another, such that the solution is a fixed point of the operator. The Banach fixed point theorem is then invoked to show that there exists a unique fixed point, which is the solution of the initial value problem.
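As a rough illustration of this fixed-point construction (my own sketch, not part of the thread), the code below runs Picard iteration on the test problem $y'=y$, $y(0)=1$, whose exact solution is $e^t$; the iterates of the integral operator converge to it.

```python
import numpy as np

# Picard iteration for y' = f(t, y), y(t0) = y0, on a grid.
# Each step applies the integral operator
#   (T y)(t) = y0 + integral from t0 to t of f(s, y(s)) ds,
# whose unique fixed point is the solution of the IVP.

def picard(f, t, y0, iterations=20):
    y = np.full_like(t, y0, dtype=float)   # start from the constant function y0
    for _ in range(iterations):
        integrand = f(t, y)
        # cumulative trapezoidal integral from t[0] to each grid point
        increments = 0.5 * (integrand[1:] + integrand[:-1]) * np.diff(t)
        y = y0 + np.concatenate(([0.0], np.cumsum(increments)))
    return y

t = np.linspace(0.0, 1.0, 201)
y = picard(lambda t, y: y, t, 1.0)         # test problem y' = y, y(0) = 1
print(np.max(np.abs(y - np.exp(t))))       # small residual: iterates approach e^t
```

The remaining error comes from the trapezoidal quadrature, not from the Picard iteration itself, which converges factorially fast on a bounded interval.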
where $g$ is the acceleration due to gravity, $l$ is the length of the pendulum, and $y=\theta$ is the angular displacement. Set $k=\sqrt{g/l}$.
Using the small-angle approximation $\sin\theta \approx \theta$, we get the equation of a harmonic oscillator, $$y'' + k^2 y = 0.$$ Given the initial conditions $y(0) = y_0$ and $y'(0) = 0$, the solution becomes $$y(t) = y_0 \cos(kt).$$
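A quick numerical sanity check of this closed form (my own sketch, with illustratively chosen values of $k$ and $y_0$): integrate $y'' + k^2 y = 0$ with a classical RK4 step and compare against $y_0\cos(kt)$.

```python
import math

# Integrate y'' + k^2 y = 0 as the first-order system
#   y' = v,  v' = -k^2 y
# with classical RK4, starting from y(0) = y0, y'(0) = 0.

def harmonic(k, y0, t_end, steps=10000):
    h = t_end / steps
    y, v = y0, 0.0
    def rhs(y, v):
        return v, -k * k * y
    for _ in range(steps):
        k1y, k1v = rhs(y, v)
        k2y, k2v = rhs(y + 0.5 * h * k1y, v + 0.5 * h * k1v)
        k3y, k3v = rhs(y + 0.5 * h * k2y, v + 0.5 * h * k2v)
        k4y, k4v = rhs(y + h * k3y, v + h * k3v)
        y += h * (k1y + 2 * k2y + 2 * k3y + k4y) / 6
        v += h * (k1v + 2 * k2v + 2 * k3v + k4v) / 6
    return y

k, y0, t_end = 3.0, 0.1, 2.0               # assumed illustrative values
numeric = harmonic(k, y0, t_end)
exact = y0 * math.cos(k * t_end)
print(abs(numeric - exact))                # numeric and closed-form solutions agree
```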
Inspired by all the excellent answers, I would like to add my simple interpretation of the original question. Let us analyze a set of autonomous ordinary differential equations (ODEs) written in vector form
$$\dot{x}=f(x)$$
(ODEs of this kind can be obtained, for example, after spatial discretization of a large problem described by partial differential equations.)
Now we define $\delta x=x-x^{*}$, where $x^{*}$ is an equilibrium point, i.e. $f(x^{*})=0$; taking the derivative of $\delta x$ we obtain
$$\dot{\delta x}=\dot{x}$$
With this result, expanding $f$ in a Taylor series about the equilibrium, $f(x)=f(x^{*})+J(x^{*})\,\delta x+O(\|\delta x\|^{2})$, where $J$ is the Jacobian of $f$, and using $f(x^{*})=0$, we can rewrite our ODEs to first order in the form
$$\dot{\delta x}=J(x^{*}) \delta x$$
The above equation governs the time evolution of perturbations from the equilibrium point. The analysis of the eigenvalues of $J$ is the foundation of stability analysis.
where we can find an excellent graphical illustration of the problem; looking at these graphs it is easy to see how a system starting near the equilibrium point will evolve in time, depending on the eigenvalues of $J$.
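As a worked illustration of this eigenvalue analysis (my own example, with assumed parameter values): for the damped pendulum system $\dot{x}_1=x_2$, $\dot{x}_2=-k^2\sin x_1 - c\,x_2$, the Jacobian at the equilibrium $x^{*}=(0,0)$ determines the local behavior of perturbations.

```python
import numpy as np

# Stability of the equilibrium x* = (0, 0) of the damped pendulum
#   x1' = x2
#   x2' = -k^2 sin(x1) - c x2
# via the eigenvalues of the Jacobian J(x*).

k, c = 3.0, 0.5          # illustrative parameter values (assumed)

# Jacobian of f at (0, 0): d(-k^2 sin x1)/dx1 = -k^2 cos(0) = -k^2
J = np.array([[0.0, 1.0],
              [-k * k, -c]])

eigvals = np.linalg.eigvals(J)
print(eigvals)
# Both eigenvalues have negative real part, so perturbations
# delta x decay and the equilibrium is asymptotically stable.
print(np.all(eigvals.real < 0))
```

With these values the eigenvalues are complex with real part $-c/2$, so trajectories spiral into the equilibrium, which matches the familiar phase portrait of a damped pendulum.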
For a more in-depth discussion I would suggest Arnold's textbook on ODEs (https://mitpress.mit.edu/books/ordinary-differential-equations).
His book offers an excellent approach based on the analysis of phase portraits; this technique is an essential element of the theory of dynamical systems.