I want to prove that if f is an increasing function on [a,b], then the function g(x)=f(x)+f(a+b-x) is decreasing on [a,(a+b)/2] and increasing on [(a+b)/2,b].
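To see what the claim actually requires of f, here is a short derivative computation; it assumes in addition that f is differentiable, which the question does not state.

$$
g'(x) = \frac{d}{dx}\Big(f(x)+f(a+b-x)\Big) = f'(x) - f'(a+b-x).
$$

So g is decreasing on [a,(a+b)/2] precisely when f'(x) \leq f'(a+b-x) for x \leq (a+b)/2, which holds, for instance, when f' is nondecreasing, i.e. when f is convex. It is a condition on the shape of f, not merely on its monotonicity.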
If x=(a+b)/2, then clearly x \in [a,b], but f(x)+f(a+b-x) = 2f((a+b)/2), so equality holds at this point. Therefore the strict inequality f(x)+f(a+b-x) > 2f((a+b)/2) cannot hold for arbitrary values of x in [a,b].
You can construct a counterexample in the following way. Take a continuous function on [0,1] which is linear on [0,1/2] and on [1/2,1]. Let the derivative of your function be c>0 on (0,1/2) and d>0 on (1/2,1). If you take c>d, then you have a counterexample: the derivative of f(x)+f(1-x) on (0,1/2) is c-d>0, so the sum is increasing there instead of decreasing.
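As a quick numerical sanity check, here is a minimal Python sketch of that counterexample with the illustrative slopes c = 2 and d = 1 (any c > d > 0 would do); f and g below are the functions from the question with a = 0 and b = 1.

```python
import numpy as np

# Piecewise-linear, continuous, increasing f on [0, 1]:
# slope c on (0, 1/2) and slope d on (1/2, 1).
# The concrete values c = 2, d = 1 are illustrative (any c > d > 0 works).
c, d = 2.0, 1.0

def f(x):
    return np.where(x <= 0.5, c * x, c * 0.5 + d * (x - 0.5))

def g(x):
    # g(x) = f(x) + f(1 - x): the question's g with a = 0, b = 1
    return f(x) + f(1 - x)

xs = np.linspace(0.0, 0.5, 6)
print(np.round(g(xs), 3))                 # values of g across [0, 1/2]
print(bool(np.all(np.diff(g(xs)) > 0)))   # True: g strictly increases there,
                                          # contradicting the claimed decrease
```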
If f(x) is a convex function on [a,b], then for any c,d in [a,b] the inequality f(c)+f(d) \geq 2f((c+d)/2) holds, with strict inequality when f is strictly convex and c \neq d. Now put c=x and d=b-(x-a)=a+b-x, so that (c+d)/2=(a+b)/2. If f(x) is a concave function, then the opposite inequality is valid.
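Spelled out, the substitution is just the two-point (midpoint) form of Jensen's inequality; nothing beyond convexity of f is used here.

$$
f(x)+f(a+b-x) \;\geq\; 2f\!\left(\frac{x+(a+b-x)}{2}\right) \;=\; 2f\!\left(\frac{a+b}{2}\right),
$$

with equality at x=(a+b)/2, and with strict inequality for x \neq (a+b)/2 when f is strictly convex.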
In the given inequality you may have missed the equality case. For example, take the very simple function f(x)=x, which is increasing. Substituting this function into your inequality gives f(x)+f(a+b-x) = x+(a+b-x) = a+b on the left and 2f((a+b)/2) = a+b on the right, so the strict inequality would read a+b > a+b, which is not true.
Note that if f is an increasing function on [a,b], then the function f(a+b-x) is a decreasing function of x on [a,b]: for x_1 < x_2 we have a+b-x_1 > a+b-x_2, hence f(a+b-x_1) \geq f(a+b-x_2). So g is the sum of an increasing function and a decreasing function, and such a sum can be increasing, decreasing, or neither on a subinterval of the domain; the monotonicity of f alone does not determine the behaviour of g.