To be specific, this relates to growth/complexity functions. We say a growth function f(n) is in O(1) if and only if there exist a positive real c and a positive integer n_0 such that for all n >= n_0, f(n) <= c.
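As a concrete instance (my own illustration, not from the answer above): take f(n) = 10 + sin(n). With c = 11 and n_0 = 1 we have f(n) <= 11 for all n >= 1, so f(n) is in O(1) even though f is not a constant function.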
Can we use this notation to refer to space usage as well? In other words, can we say with O(n) that the space we are going to occupy in memory or on disk will increase linearly with the size of the data?
Yes, we can, when we talk about problems and algorithms. That is called space complexity. It often becomes important when talking about memory, about sub-linear time algorithms (where space becomes the limiting resource), or when your algorithm targets memory-constrained applications. It is also defined with respect to the input size. It is an important metric to keep in mind, as there is usually a time-space complexity trade-off.
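A minimal sketch of the distinction (my own example, not part of the answer above): both functions below read the whole input, but only the second allocates memory proportional to n.

    def total(numbers):
        # O(1) extra space: a single accumulator, regardless of len(numbers)
        s = 0
        for x in numbers:
            s += x
        return s

    def doubled(numbers):
        # O(n) extra space: the result list grows linearly with the input
        return [2 * x for x in numbers]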
I don't want to be polemical or rude, but I don't see the point of adding a lot of answers that do not add any useful information to Daniel Page's one. I thought Q&A was meant to help each other, not to increase scores. Sorry again if I offended someone.
O(1) = {f(n) | there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c for all n ≥ n0}, which means the complexity does not depend on the size of the input.
The number of execution steps of the algorithm is approximately constant, i.e. not dependent on the size of the input data (with cardinality n). In practice it could mean there is not even a loop over the n data items... it's something like a "best case" situation...
In short, O(1) means that it takes a constant time, like 14 nanoseconds or three minutes, no matter the amount of data in the set. O(n) means it takes an amount of time linear in the size of the set, so a set twice the size will take twice the time. You probably don't want to put a million objects into one of these.
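To make that concrete (an illustration of mine, not from the original answer): membership testing in a Python set is O(1) on average, while membership testing in a list is O(n), because the list must be scanned element by element.

    data_list = list(range(1_000_000))
    data_set = set(data_list)

    # O(n): scans the list until it finds the value (or reaches the end)
    print(999_999 in data_list)

    # O(1) on average: a hash lookup, independent of the set's size
    print(999_999 in data_set)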
According to the strict mathematical definition, f(n) = O(1) just means that there exists a constant c > 0 such that f(n) ≤ c for all sufficiently large n. But this assertion does not mean that f(n) must itself be a constant.
Extending this definition to complexity analysis, one can say that if the running time of the analysed algorithm is O(1), then this quantity has a fixed UPPER BOUND that does not depend on the instance data.
In short: you are right. It means that the time (needed by your algorithm) is constant, i.e. independent of the input.
It is the Landau notation (also called Big-O or Landau-Bachmann notation). A short overview is found on Wikipedia: http://en.wikipedia.org/wiki/Big_O_notation
To conclude our "constant vs non-constant" discussion about the running time of algorithms having an O(1) complexity bound, consider the following toy problem:
Input: a natural number N.
If N is an odd number, print "Hello, World!" once; if N is even, print "Hello, World!" one million times.
It is easy to verify that the running time of the trivial algorithm solving this problem has a constant upper bound (which is surely independent of the input length). Therefore, its complexity is O(1). But this running time is not itself a constant.
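A minimal sketch of that trivial algorithm (my own rendering of the toy problem above): the running time is bounded above by the cost of one million prints, so it is O(1), yet the time taken clearly differs between odd and even inputs.

    def hello(n):
        # At most 1,000,000 prints no matter how large n is,
        # so the running time is O(1) -- but it is not a constant.
        if n % 2 == 1:
            print("Hello, World!")
        else:
            for _ in range(1_000_000):
                print("Hello, World!")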
This question is about the method known as the Big O method. There are two ways to measure the efficiency of an algorithm: relative efficiency and absolute efficiency. Absolute efficiency means calculating the exact time the CPU has taken to complete the given program; it depends on the make of the computer. Relative efficiency means calculating the time independently of the machine, and it is what the Big O method measures.
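As an illustration of that distinction (a sketch of mine, not from the answer): absolute efficiency is a wall-clock measurement that varies between machines, while relative efficiency counts abstract steps and does not.

    import time

    def count_comparisons(items):
        # Relative efficiency: count the comparisons a linear scan makes.
        # This number is the same on every machine.
        comparisons = 0
        for x in items:
            comparisons += 1
            if x == -1:
                break
        return comparisons

    items = list(range(100_000))

    start = time.perf_counter()
    count_comparisons(items)
    elapsed = time.perf_counter() - start

    # Absolute efficiency: machine-dependent wall-clock time.
    print(f"{elapsed:.6f} seconds, {count_comparisons(items)} comparisons")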
A simple way to understand it is:
If f(n) = n^3 + n^2 + 1, then f(n) = O(n^3). Here the term with the highest power is the one considered.
In bubble sort, the total number of comparisons is n(n-1)/2 = (n^2 - n)/2. Discard the constant factor: that leaves n^2 - n. The highest power is 2, hence bubble sort is O(n^2).
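Here is a minimal bubble sort sketch (my own, for illustration) whose inner comparison runs exactly (n-1) + (n-2) + ... + 1 = n(n-1)/2 times, matching the count above.

    def bubble_sort(items):
        # The inner comparison executes n(n-1)/2 times in total:
        # (n-1) + (n-2) + ... + 1 iterations across all passes.
        n = len(items)
        for i in range(n - 1):
            for j in range(n - 1 - i):
                if items[j] > items[j + 1]:
                    items[j], items[j + 1] = items[j + 1], items[j]
        return items

    print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]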
For any monotonic functions f(n) and g(n) from the positive integers to the positive integers, we say that f(n) = O(g(n)) when there exist constants c > 0 and n0 > 0 such that:
f(n) ≤ c * g(n), for all n ≥ n0
Intuitively, this means that the function f(n) does not grow faster than g(n), or that the function g(n) is an upper bound for f(n), for all sufficiently large n. [Graphic representation of the f(n) = O(g(n)) relation omitted.]
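A worked instance of that definition (my own example): take f(n) = 3n^2 + 2n and g(n) = n^2. Since 2n ≤ 2n^2 for all n ≥ 1, we get f(n) = 3n^2 + 2n ≤ 5n^2 = 5 * g(n), so the definition holds with c = 5 and n0 = 1; hence 3n^2 + 2n = O(n^2).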
Constant Time: O(1)
An algorithm is said to run in constant time if it requires the same amount of time regardless of the input size. Examples: accessing any element of an array; push and pop on a fixed-size stack; enqueue and dequeue on a fixed-size queue.
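A minimal sketch of such constant-time operations (my own example): each line below takes the same time whether the list holds ten elements or ten million.

    items = list(range(10_000_000))

    x = items[5_000_000]   # O(1): indexing a list by position
    items.append(42)       # O(1) amortized: push onto the end
    items.pop()            # O(1): pop from the end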