I am solving a natural-convection problem for a tube. While plotting the graph of the heat transfer coefficient, Fluent offers the two terms mentioned in the question (surface heat transfer coefficient and wall function heat transfer coefficient). What is the difference between them, and how is each calculated?
I do not work with the Fluent package and am not familiar with this term (wall function heat transfer coefficient). I am engaged only in experimental research.
The difference is actually simple. In both the laminar and the turbulent case there is a thin fluid layer very close to the wall (you can check this yourself), and the h evaluated across that layer is very high compared to the h evaluated against the fluid farther away, which is the surface heat transfer coefficient.
Wall heat flux: q [W/m^2] = h * (T_wall - T_fluid)
Because the temperature of that near-wall layer is close to the wall temperature, the temperature difference is small, which automatically makes h large at the wall even though q is the same. The h calculated from the usual correlations is the surface heat transfer coefficient, which is referenced to the fluid temperature away from the wall, because those correlations were built from experimental data long before modern simulation software existed.
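To illustrate the point numerically, here is a minimal Python sketch (all values are invented; it only applies the q = h * delta-T relation above with two different temperature choices):

# Same wall heat flux, two different temperature references (illustrative numbers only).
q_wall = 500.0        # wall heat flux [W/m^2]
T_wall = 350.0        # wall temperature [K]
T_near_wall = 348.0   # fluid temperature in the layer right next to the wall [K]
T_ref = 300.0         # reference (far-field) temperature [K]

# Coefficient based on the near-wall fluid temperature: small delta-T -> large h
h_near_wall = q_wall / (T_wall - T_near_wall)   # = 250 W/m^2-K

# Surface heat transfer coefficient based on the reference temperature
h_surface = q_wall / (T_wall - T_ref)           # = 10 W/m^2-K

print(f"h from near-wall fluid temperature: {h_near_wall:.1f} W/m^2-K")
print(f"h from reference temperature:       {h_surface:.1f} W/m^2-K")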
Surface Heat Transfer Coef. is defined in Eq. (41.42) on p. 3600 in the ANSYS Fluent User's Guide v.2021R1.
Wall Func. Heat Tran. Coef. is defined in Eq. (41.61) on p. 3609.
The former is based on the wall heat flux and the difference between the wall temperature and a reference temperature, whereas the latter is based on the turbulence wall-function quantities (u* and T*).
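A compact sketch of the contrast, in Python (values are made up; the wall-function form below is the generic relation h = rho * cp * u* / T*, which may differ in detail from Eq. (41.61), so check the User's Guide for the exact definition):

# Surface HTC vs. wall-function HTC (illustrative values only).
rho = 1.2        # fluid density [kg/m^3]
cp = 1005.0      # specific heat [J/kg-K]
q_wall = 500.0   # wall heat flux [W/m^2]
T_wall = 350.0   # wall temperature [K]
T_ref = 300.0    # user-specified reference temperature [K]
u_star = 0.05    # wall-function velocity scale u* [m/s], from the turbulence model
T_star = 6.0     # dimensionless wall-function temperature T* [-]

# Surface heat transfer coefficient: flux over (wall - reference) temperature difference
h_surface = q_wall / (T_wall - T_ref)

# Wall-function heat transfer coefficient: built from the wall-function scales u* and T*
h_wall_function = rho * cp * u_star / T_star

print(h_surface, h_wall_function)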
Sverre Gullikstad Johnsen, by reference temperature, what do you mean? The free-stream temperature parameter set at the boundary conditions, or the operating temperature under the menu Physics > Operating Conditions?
Daniel Lemos Marques, it is not what I mean; it is what the ANSYS Fluent User's Guide means. Please read the reference I provided.
The reference temperature is a user input to the Fluent simulation, specified in the Reference Values Task Page (see p. 4306 in the ANSYS Fluent User's Guide v.2021R1).
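One practical consequence, again as a hedged sketch with invented numbers: the wall heat flux from the converged solution is fixed, so changing the user-specified reference temperature only rescales the reported surface heat transfer coefficient.

q_wall = 500.0   # wall heat flux from the solution [W/m^2] (illustrative)
T_wall = 350.0   # wall temperature [K]

for T_ref in (300.0, 320.0, 340.0):
    h_surface = q_wall / (T_wall - T_ref)
    print(f"T_ref = {T_ref:.0f} K  ->  reported surface h = {h_surface:.1f} W/m^2-K")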