If you're just doing an element-wise operation, the number of steps needed is just the number of elements in the matrices, that is, NxM. Since here N = M/2, that gives (M/2)xM = M^2/2 steps. In big-O notation, this is O(M^2).
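To make the counting concrete, here is a minimal sketch in Python (the matrices and the step counter are just illustrative, not anything from the question) that does one arithmetic operation per entry, so the total work is exactly N*M steps:

```python
# Minimal sketch: element-wise division of two N x M matrices stored as
# lists of lists. One division per entry, so the total work is N * M steps,
# i.e. O(N*M); with N = M/2 that is M^2/2 = O(M^2).
def elementwise_divide(a, b):
    n, m = len(a), len(a[0])
    c = [[0.0] * m for _ in range(n)]
    steps = 0
    for i in range(n):
        for j in range(m):
            c[i][j] = a[i][j] / b[i][j]  # one step per entry
            steps += 1
    return c, steps

# Example: N = 2, M = 4 (so N = M/2), giving M^2/2 = 8 steps.
a = [[1.0, 2.0, 3.0, 4.0], [5.0, 6.0, 7.0, 8.0]]
b = [[1.0, 1.0, 1.0, 2.0], [5.0, 2.0, 7.0, 4.0]]
c, steps = elementwise_divide(a, b)
print(steps)  # 8
```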
I have three different stages in my algorithm: one with O(N^2 log N) complexity and the other two with O(N^2) complexity. I need to know the overall complexity of the algorithm.
Is there any tutorial from which I can learn how to estimate the big-O complexity of algorithms? Please help!
If you are really interested in the properties of algorithms, perhaps you should consult the bible of algorithms: The Art of Computer Programming by Donald Knuth. It's all there! ;-)
About big-O notation: let f(N) be the complexity of your algorithm, for example f(N) = N^2/2, and let g(N) be some simple comparison function such as N^2, N^3, or 2^N. You can write f(N) = O(g(N)) if there exist a positive constant C and a number N0 such that |f(N)| < C g(N) for all N > N0. Thus, if f(N) = O(N^2) + O(N^2) + O(N^2 log N), as in your example, then |f(N)| < C N^2 + B N^2 log N (the two N^2 terms are absorbed into the single constant C), and for N large enough that log N > 1 this is < (C+B) N^2 log N = A N^2 log N. Hence f(N) = O(N^2 log N). See https://en.wikipedia.org/wiki/Big_O_notation
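As a quick numerical sanity check of why the N^2 log N stage dominates, here is a small Python sketch; the constants C1, C2, B are hypothetical, only the growth rates come from the question:

```python
import math

# Sum the costs of the three stages (two ~ N^2, one ~ N^2 log N) and
# compare with the single bound A * N^2 * log N. Any A >= C1 + C2 + B
# works once log N > 1.
C1, C2, B = 1.0, 1.0, 1.0
A = C1 + C2 + B

for N in (10, 100, 1000, 10000):
    f = C1 * N**2 + C2 * N**2 + B * N**2 * math.log(N)
    bound = A * N**2 * math.log(N)
    # The ratio tends to B as N grows, showing the N^2 log N term dominates.
    print(N, f <= bound, f / (N**2 * math.log(N)))
```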
I would like to clarify your question a bit more: what do you mean by element-wise division or element-wise multiplication? Suppose the two matrices are A and B with entries a_{i,j} and b_{i,j} for i = 1, 2, ..., N and j = 1, 2, ..., M. Do you mean one of the following two cases? (There is a short sketch after them below.)
(i) You want the entries of the output matrix C to be c_{i,j} = a_{i,j}/b_{i,j}?
(ii) You want the entries of the output matrix C to be c_{i,j} = a_{i,j}*b_{i,j}?
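If it helps, here is a minimal sketch of the two interpretations using NumPy's element-wise operators; the small 2x2 matrices A and B are just placeholders, not the N x M matrices from the question:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[2.0, 4.0], [6.0, 8.0]])

# (i) element-wise division: c_{i,j} = a_{i,j} / b_{i,j}
C_div = A / B

# (ii) element-wise (Hadamard) multiplication: c_{i,j} = a_{i,j} * b_{i,j}
C_mul = A * B

print(C_div)  # [[0.5 0.5] [0.5 0.5]]
print(C_mul)  # [[ 2.  8.] [18. 32.]]
```

Either way, each output entry costs one arithmetic operation, so both interpretations take N*M steps.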