Not at all. Consider, for example, the case where only X = -1 and X = 1 are possible, with P(X = -1) = P(X = 1) = 1/2. Then E(X) = 0 and |X|^3 = |X| = 1, so E(|X|^3) = 1 while |E(X)|^3 = 0; the two quantities are clearly not equal.
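The two-point counterexample above can be checked with a few lines of Python (the values and probabilities are the ones from the example):

```python
# Two-point distribution from the counterexample:
# X takes the values -1 and +1, each with probability 1/2.
values = [-1.0, 1.0]
probs = [0.5, 0.5]

mean = sum(p * x for p, x in zip(probs, values))                  # E(X)
abs_third = sum(p * abs(x) ** 3 for p, x in zip(probs, values))   # E|X|^3

print(mean)       # 0.0
print(abs_third)  # 1.0
```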
There is no simple formula beyond the definition. In the most common case of a mixture of a discrete and an absolutely continuous probability distribution, the definition reads

E|X - E(X)|^k = SUM_i p_i |x_i - E(X)|^k + INTEGRAL |x - E(X)|^k f(x) dx,

where the atoms x_i carry the discrete probabilities p_i and f is the (sub-)density of the continuous part, with SUM_i p_i + INTEGRAL f(x) dx = 1.
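As a numerical sketch of that definition, the following Python snippet computes the third central absolute moment of an illustrative mixture (not one from the thread): a point mass at 0 with weight 1/2 plus a Uniform(0, 1) density with weight 1/2, integrating the continuous part with a simple midpoint rule.

```python
def abs_central_moment(k, atoms, density, a, b, n=100_000):
    """E|X - E(X)|^k for a mixture of atoms [(x_i, p_i)] and a
    (sub-)density on [a, b], integrated with a midpoint rule."""
    h = (b - a) / n
    grid = [a + (i + 0.5) * h for i in range(n)]
    # E(X): discrete part plus continuous part
    mean = sum(p * x for x, p in atoms) + sum(x * density(x) for x in grid) * h
    disc = sum(p * abs(x - mean) ** k for x, p in atoms)
    cont = sum(abs(x - mean) ** k * density(x) for x in grid) * h
    return disc + cont

# Point mass at 0 (weight 1/2) + Uniform(0,1) density scaled by 1/2:
m3 = abs_central_moment(3, atoms=[(0.0, 0.5)], density=lambda x: 0.5,
                        a=0.0, b=1.0)
print(m3)  # close to the exact value 0.0478515625
```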
Or do you have some more information about when and how often X < E(X)? Because then you can condition your expectations on that event, which also removes the absolute values. For example,
E{|X-E(X)|^3} = E{(X-E(X))^3 | X > E(X)} P(X > E(X)) - E{(X-E(X))^3 | X < E(X)} P(X < E(X)).
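The splitting identity above can be verified exactly for a small discrete example (an illustrative distribution, not one from the thread): X = 0 with probability 2/3 and X = 3 with probability 1/3, so that E(X) = 1.

```python
from fractions import Fraction as F

# Illustrative distribution: X = 0 w.p. 2/3, X = 3 w.p. 1/3.
dist = {F(0): F(2, 3), F(3): F(1, 3)}
mu = sum(p * x for x, p in dist.items())  # E(X) = 1

# Direct definition: E|X - E(X)|^3
direct = sum(p * abs(x - mu) ** 3 for x, p in dist.items())

# Splitting: E{(X-mu)^3 | X > mu} P(X > mu) - E{(X-mu)^3 | X < mu} P(X < mu),
# each product computed as a sum over the corresponding atoms.
upper = sum(p * (x - mu) ** 3 for x, p in dist.items() if x > mu)
lower = sum(p * (x - mu) ** 3 for x, p in dist.items() if x < mu)
split = upper - lower

print(direct, split)  # both equal 10/3
```

Exact rational arithmetic via `fractions` makes the agreement of the two sides an identity rather than a floating-point coincidence.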
I like your remarks about the even moments and about the conditional moments for odd orders. BUT for calculating the latter it is not enough to know the probabilities of the inequalities X < E(X). Indeed, the calculation of the remaining conditional expectation requires the same integrals as the definition. The only gain is that your proposal reduces the calculation to ONE integral. However, from somewhere you need to know the probability P{X < E(X)}, which is again an integral of the pd of X over the set {X < E(X)}.
Let me note that Mohamed I. Riffi's condition "X >= E(X) a.s." and the condition "X = E(X) a.s." are equivalent, since a nonnegative random variable with zero expectation vanishes almost surely. Under these conditions the only possible value for all central absolute moments is zero.
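A one-line argument for this equivalence, sketched in LaTeX:

```latex
% Set Y := X - E(X). If X >= E(X) a.s., then Y >= 0 a.s. and E(Y) = 0,
% and a nonnegative random variable with zero expectation vanishes a.s.:
\[
  Y \ge 0 \ \text{a.s.}, \quad \mathbb{E}(Y) = 0
  \;\Longrightarrow\; \Pr(Y > 0) = 0
  \;\Longrightarrow\; X = \mathbb{E}(X) \ \text{a.s.}
\]
```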
Remarks completing the suggested splitting into two subcases are given by Claude.