Let $\mathbb{G}(\mathbb{R}^n, \mathbb{R}^m)$ be the vector space of computable
functions, that is, every function in $\mathbb{G}$ can be computed by an
idealized but finite computer.
Then there is a mapping $t:\mathbb{G} \times \mathbb{R}^n \rightarrow \mathbb{N}$
that maps a function and an input value to the processor time required
to compute the result.
Let $g: \mathbb{R}^n \rightarrow \mathbb{R}^m$ be a computable function,
and $f: \mathbb{R} \rightarrow \mathbb{N};\ x \mapsto \sup\limits_{\|y\| < x} t(g, y)$.
Then the asymptotic runtime exists if there is a continuous function
$r: \mathbb{R} \rightarrow \mathbb{R}$ with
$$ \lim\limits_{x \rightarrow \infty} \frac{f(x)}{r(x)} = c \in \mathbb{R}, \quad c \neq 0. $$
In that case $\mathcal{O}(f) = \mathcal{O}(r)$ (for $c = 0$ one only gets $\mathcal{O}(f) \subseteq \mathcal{O}(r)$).
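As a worked illustration of the definition (the concrete functions here are my own choice, not part of the question), suppose some computable $g$ has worst-case time $f(x) = 3x^2 + 5x$ over inputs of norm below $x$, and pick the continuous comparison function $r(x) = x^2$:

```latex
\[
f(x) = 3x^2 + 5x, \qquad r(x) = x^2,
\]
\[
\lim_{x \rightarrow \infty} \frac{f(x)}{r(x)}
  = \lim_{x \rightarrow \infty} \left( 3 + \frac{5}{x} \right)
  = 3 = c \in \mathbb{R},
\]
so the asymptotic runtime exists and $\mathcal{O}(f) = \mathcal{O}(x^2)$.
```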
See the rendered output here:
http://quicklatex.com/cache3/a5/ql_ca3ebd33cf7839def6a51c5c735512a5_l3.png