Grid resolution alone does not determine accuracy. A sensor with coarse resolution (e.g., 1 km) may accurately represent the average surface temperature over an area, and it may reproduce this result both accurately and consistently over time; however, the observation is not very precise. Finer-resolution grids may offer greater precision, although not necessarily greater accuracy. For example, a sensor with a 10 m grid that suffers from systematic bias, noise, or other random errors, or that delivers data on temperature variations of no consequence, may be less accurate than a 1 km grid: more precise, but less accurate. Select the combination of accuracy and precision to fit the error tolerances that meet the objectives of your study.
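A toy numeric sketch of that distinction (the bias and noise magnitudes below are invented purely for illustration, not taken from any real sensor):

```python
import numpy as np

rng = np.random.default_rng(42)
true_temp = 20.0  # true area-average surface temperature (deg C)

# Coarse 1 km sensor: unbiased, but with larger random scatter (less precise).
coarse = true_temp + rng.normal(0.0, 0.5, size=1000)

# Fine 10 m sensor: tight scatter (very precise) but a systematic +1.5 C bias.
fine = true_temp + 1.5 + rng.normal(0.0, 0.1, size=1000)

for name, obs in (("coarse, 1 km", coarse), ("fine, 10 m", fine)):
    print(f"{name}: mean error = {obs.mean() - true_temp:+.2f} C, "
          f"spread (std) = {obs.std():.2f} C")
# The fine grid is more precise (small spread) yet less accurate (large bias).
```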
If you're dealing with a large grid cell size (e.g., 1 km), the temperature signal of the city itself will be mixed with that of the surrounding area. Finer-resolution grids would be best for this kind of analysis.
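As a hedged sketch of that mixing effect (the field size and temperatures are made up for the demonstration): block-averaging a fine-scale field into one coarse cell dilutes the urban signal.

```python
import numpy as np

# Hypothetical 100 x 100 field at 10 m resolution (= one 1 km grid cell).
field = np.full((100, 100), 18.0)        # rural background, deg C
field[40:60, 40:60] = 24.0               # 200 m x 200 m urban hot spot

city_mean = field[40:60, 40:60].mean()   # what a fine grid would report
cell_mean = field.mean()                 # what a single 1 km cell reports

print(f"city itself:  {city_mean:.2f} C")
print(f"1 km average: {cell_mean:.2f} C")  # urban signal diluted by surroundings
```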
I think it also depends on your model configuration and the accuracy of the initial data.
You can work with very high resolutions, but if your model cannot represent the small-scale physics, doing so is useless and a waste of computing time.
Try to balance resolution against model capability, and check whether you can adjust your model physics to a configuration more suitable for urban environments.
I totally agree with Colin. You should choose the grid size according to the resolution you need. Accuracy is strongly linked to the accuracy of the raw data. Assuming a Gaussian distribution of the differences between the real value and each measurement, you will get better accuracy by averaging: it should remove the noise, but not the bias. Never forget the physics of the measurement when establishing your error budget. Refer to metrology principles and definitions; the vocabulary (precision, accuracy, resolution) is often confused.
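A minimal sketch of that point, with made-up numbers for the error budget: averaging independent Gaussian measurements shrinks the random error roughly as 1/sqrt(N), while a systematic bias passes through untouched.

```python
import numpy as np

rng = np.random.default_rng(0)
true_value, bias, noise_std = 15.0, 0.8, 1.0  # assumed error budget

for n in (1, 10, 100, 1000):
    # n repeated measurements of the same quantity, each biased and noisy;
    # 5000 trials so we can see the statistics of the averaged result.
    trials = true_value + bias + rng.normal(0.0, noise_std, size=(5000, n))
    means = trials.mean(axis=1)
    print(f"N={n:4d}: std of mean = {means.std():.3f}, "
          f"residual error = {means.mean() - true_value:+.3f}")
# The std of the mean falls like noise_std / sqrt(N);
# the +0.8 systematic bias never averages out.
```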
Continuing from Colin and Patrice, you may structure your requirements based on an analysis of these primary parameters:
Raw data availability (resolution/grid size and precision): this is the bottom line for the accuracy of your models. The best accuracy the model can achieve is bounded by the precision of the raw data and its base grid size.
Area of analysis: consider your area of study, both the city and its surroundings, and take at most half the size of the smallest feature you need to observe as the resolution/grid size, if data availability allows. For models such as MM5 or WRF making predictions at scales of around 1 km or less, this would usually be under a kilometre.
Model or process: evaluate the process and the transformations applied to the base data along the processing chain. If resampling or rescaling of the data is involved, the output precision will definitely decrease, as the sketch after this list illustrates. The choice of grid size is also affected by this.
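One way to make the resampling point concrete, under the assumption of simple block-average downsampling (a sketch, not any specific model's pipeline): once a field has been aggregated to a coarser grid, the sub-grid detail cannot be recovered.

```python
import numpy as np

rng = np.random.default_rng(1)
fine = rng.normal(20.0, 2.0, size=(64, 64))   # synthetic fine-grid field

def block_mean(a, k):
    """Downsample a 2-D array by averaging k x k blocks."""
    h, w = a.shape
    return a.reshape(h // k, k, w // k, k).mean(axis=(1, 3))

for k in (2, 4, 8):
    coarse = block_mean(fine, k)
    back = np.kron(coarse, np.ones((k, k)))   # naive upsampling back to fine grid
    rms = np.sqrt(((fine - back) ** 2).mean())
    print(f"resample factor {k}: RMS detail lost = {rms:.2f}")
# Each resampling step discards sub-grid variability for good.
```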
I will answer this question from a sampling perspective, since resolution, or gridding in general, is a sampling problem. In digital signal processing, faithful sampling is governed by the Nyquist theorem (or Nyquist frequency). Practically, if the resolution captures the underlying environmental conditions, then 1 km or even 5 km will be more than good enough to deliver good accuracy (as indicated in the previous answers).

In classical surveying engineering we have the following rule for topographic measurement: "Regardless of the grid of your measurement scheme, you need to collect measurements at abrupt changes." In other words, if you have a mountain between two points, you need to take a measurement at the top of that mountain to get a faithful representation of your topography; otherwise your profile will not reflect the underlying physical reality. Similarly, in your case you need to consider the homogeneity/heterogeneity of your area to determine the correct resolution for your work.

One possible approach is to use a multi-scale or multi-resolution data set and evaluate the discrepancy between different scales as a metric of homogeneity/heterogeneity. Zero or small variability between two scales may suggest using the lower resolution (larger grid size), and the process continues until the last level of your resolution or scale. In other words, your problem may need to be embedded in a multi-scale or multi-resolution model following a coarse-to-fine strategy.

Based on the above discussion, you may end up with under-sampling or over-sampling. Mathematically, over-sampling is not an issue, because it leads to redundant observations, which is good, for example, from a least-squares parameter-estimation point of view, but it may violate economic principles in terms of cost/benefit. On the other hand, under-sampling will lead to incorrect or approximate results that may not be compatible with your objectives. Hopefully this answer addresses your question.
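A hedged sketch of that multi-scale idea (the synthetic field and the tolerance value are assumptions for illustration, not values from any standard): compare the field against block-averaged versions of itself and keep coarsening while the discrepancy stays small.

```python
import numpy as np

rng = np.random.default_rng(7)

def block_mean(a, k):
    h, w = a.shape
    return a.reshape(h // k, k, w // k, k).mean(axis=(1, 3))

# Synthetic 128x128 surface-temperature field: smooth trend + local detail.
y, x = np.mgrid[0:128, 0:128]
field = 15.0 + 0.02 * x + 0.5 * np.sin(y / 5.0) + rng.normal(0, 0.1, (128, 128))

tolerance = 0.2  # assumed acceptable between-scale discrepancy (deg C)
chosen = 1
for k in (2, 4, 8, 16):
    coarse = np.kron(block_mean(field, k), np.ones((k, k)))
    rms = np.sqrt(((field - coarse) ** 2).mean())
    print(f"aggregation x{k}: RMS discrepancy = {rms:.3f} C")
    if rms > tolerance:
        break          # heterogeneity now matters: stop coarsening here
    chosen = k         # still homogeneous enough at this grid size
print(f"coarsest acceptable aggregation factor: x{chosen}")
```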
This is indeed a sampling problem, well understood through the Nyquist-Shannon sampling theorem. If you are unable to ensure that you're sampling at abrupt changes in the signal, or sampling at multiple resolutions as Gamal has suggested, you want to at least meet the fundamental criterion of the sampling theorem: your regularly spaced samples must be at least twice the frequency of what you wish to detect. So, if you're detecting 5 m-wide houses, you want a geographic raster cell size of 2.5 m or smaller to be sure each house can actually be detected and does not go undetected between sample postings (i.e., pixels in this case). One rule of thumb (i.e., heuristic) that is often used is a sample spacing one-fifth the size of the smallest object you want to detect. Good discussions are provided by Waldo Tobler (citations at bottom).
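As a quick sketch of those two criteria (the helper function is mine, not from Tobler):

```python
def cell_size_for(feature_width_m: float) -> dict:
    """Grid cell sizes needed to detect a feature of the given width."""
    return {
        "nyquist_max_cell_m": feature_width_m / 2.0,   # sampling-theorem bound
        "rule_of_thumb_cell_m": feature_width_m / 5.0, # common heuristic
    }

print(cell_size_for(5.0))  # 5 m house -> 2.5 m (Nyquist) or 1.0 m (heuristic)
```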
I'll add only that over-sampling isn't entirely a non-problem, at least not yet. In theory one can find a macro pattern in very finely sampled data. In practice with geographic data, however, this isn't as easily done as one might think. Noise is frequently apparent in finer-sampled data, and generalizing that data so that a clear macro pattern is discernible is not a straightforward process. The entire field of generalization ("generalisation" if you're British ;) ) in cartography is focused on the set of problems that come up in that context.
-- citations:
Tobler, W. R. (1987, May 25-28). Measuring spatial resolution. Paper presented at the International Workshop on Geographic Information Systems, Beijing, China.
Tobler, W. R. (1988). Resolution, resampling, and all that. In H. Mounsey & R. F. Tomlinson (Eds.), Building Databases for Global Science: the proceedings of the first meeting of the International Geographical Union Global Database Planning Project (pp. 129-137). Hampshire, U.K.: Taylor and Francis.
Your question is about output and resolution, and output depends on input data and resolution. What are your input data? Contiguous surface-temperature measurements from sensors on a satellite, or extrapolation of point data from a few meteorological stations? And at what resolution? I look forward to your response.
Several answers assume that surface temperatures are measured from space or airborne platforms for the entire area of interest. However, in many environmental studies temperatures are extrapolated from point data to grids (e.g., WorldClim). In the latter case, the raster size in relation to the point density and the terrain configuration will affect accuracy and reliability.
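A rough sketch of that point-density relation, assuming stations scattered roughly uniformly over the study area (the half-spacing rule below is a common heuristic, not a fixed standard):

```python
import numpy as np

rng = np.random.default_rng(3)
area_km2, n_stations = 2500.0, 25          # hypothetical 50 x 50 km study area
pts = rng.uniform(0, np.sqrt(area_km2), size=(n_stations, 2))

# Mean nearest-neighbour spacing between stations.
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)
spacing = d.min(axis=1).mean()

print(f"mean station spacing: {spacing:.1f} km")
print(f"suggested raster cell: ~{spacing / 2:.1f} km")
# Cells much finer than this mostly interpolate, adding no new information;
# in rugged terrain, elevation covariates may still justify finer grids.
```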