I assume that the difference is caused by the different inputs the tools require. The Contour tool needs a raster as input, meaning the input already has a resolution that defines the final resolution and shape of the contours. A GA layer, however, is only a representation of the surface and its properties/models in ArcGIS, which can later be converted to a raster or layer at a defined resolution. Because of this, the resulting contours might differ.
So contours from a geostatistical layer differ from contours generated from a raster layer. Which is more relevant (accurate), and why? Which procedure is used for producing topographic maps?
You can create a raster from the GA layer and set its resolution (granularity) etc., which then determines how the result will look. You can hardly get contours that represent the real surface 100% (though you can get close with, e.g., LIDAR), and even the best data needs to be generalised for use in topographic maps, so both are relevant. The precision of the contours also depends on the method used to create the raster: you can calculate, e.g., the RMSE of several methods and pick the one with the lowest value. If you have a raster layer, you can, e.g., compare it against measured elevation points.
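For what it's worth, here is a minimal arcpy sketch of that RMSE comparison idea. It assumes you already have several GA layers to compare (the layer names below are hypothetical placeholders) and that the Geostatistical Analyst extension is available; the Cross Validation tool re-predicts each input point from the remaining ones and reports summary errors such as RMSE:

```python
# Minimal sketch: compare interpolation methods by cross-validation RMSE.
# Requires the Geostatistical Analyst extension; the layer names
# ("idw_layer", "kriging_layer") are hypothetical placeholders.
import arcpy

arcpy.CheckOutExtension("GeoStats")

candidates = ["idw_layer", "kriging_layer"]  # existing GA layers

results = {}
for ga_layer in candidates:
    # Cross Validation leaves each point out in turn and re-predicts it;
    # the result object exposes summary statistics, including RMSE.
    cv = arcpy.CrossValidation_ga(ga_layer)
    results[ga_layer] = float(cv.rootMeanSquareError)

# Pick the method with the lowest RMSE, as suggested above.
best = min(results, key=results.get)
print(results, "-> best:", best)
```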
Lukas' answer is correct in my opinion. A GA layer on screen is a simplified representation. When the export to raster from GA is invoked, the entire GA layer is recomputed at the output resolution and then contoured. Contouring a given grid at a given cell size should be near-identical between programs, because contouring is a very well-documented and straightforward function.
I'm not sure at what scale GA will export the contours. My process would be to output a grid at the optimal resolution, which is dictated by the density of the input data, and then generate the contours from that.
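That grid-then-contour workflow can be scripted along these lines. A minimal sketch, assuming the Geostatistical Analyst and Spatial Analyst extensions are available; the GA layer name, paths, cell size, and contour interval are hypothetical values you would adapt to your data:

```python
# Minimal sketch of the grid-then-contour workflow described above.
# "ga_layer", the paths, the cell size, and the interval are placeholders.
import arcpy

arcpy.CheckOutExtension("GeoStats")
arcpy.CheckOutExtension("Spatial")

# 1) Export the GA layer to a raster at an explicit resolution.
#    The cell size should reflect the density of the input points.
arcpy.GALayerToGrid_ga("ga_layer", "C:/data/surface.tif", "25")

# 2) Contour the exported grid at the chosen interval (here 10 units).
arcpy.sa.Contour("C:/data/surface.tif", "C:/data/contours.shp", 10)
```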
The user defines the interval of the contour classes in the GA layer, so that is actually the part where the user defines the scale, before exporting to vector data. It could surely be determined in relation to the input data by some algorithm... I am wondering whether that could provide a more accurate model...
Contour interval may not be the same as scale. The Contour tool also lets the user set an interval, but this does not mean that a smaller contour interval will give you a more accurate result; it only means the result is divided into smaller intervals. If your data is coarse to begin with, a smaller interval will still be based on the coarse input data.
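To illustrate, here is a minimal sketch (hypothetical paths, Spatial Analyst assumed) that contours the same coarse DEM at two intervals; the 1 m run just slices the same surface more finely, it does not add accuracy:

```python
# Same coarse raster, two intervals: the finer interval only divides
# the same surface into more slices; the underlying data stays coarse.
import arcpy

arcpy.CheckOutExtension("Spatial")
arcpy.sa.Contour("C:/data/coarse_dem.tif", "C:/data/contours_10m.shp", 10)
arcpy.sa.Contour("C:/data/coarse_dem.tif", "C:/data/contours_1m.shp", 1)
```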