When the resistance of a semiconductor decreases with temperature, it can become very small, say a few milliohms. I want to measure it with an ohmmeter whose most sensitive range is 200 ohms.
I think measuring such a small resistance this way will be of little use.
The instrument you use will itself have an internal and lead resistance of around 2-3 ohms, which is added in series to the resistance being measured and gives you a wrong reading.
Even the test leads alone have a resistance far greater than a few milliohms.
A milliohm-level resistance barely changes the current in the measuring circuit, and the voltage drop across it is negligible, so a simple two-wire ohmmeter cannot resolve it.
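As a rough illustration (taking 2.5 Ω for the combined lead and internal resistance and 1 mΩ for the sample, both just assumed figures):

$$R_\text{measured} = R_\text{leads} + R_\text{sample} \approx 2.5\,\Omega + 0.001\,\Omega = 2.501\,\Omega ,$$

so the sample contributes only about 0.04 % of the reading, far below what a 200 Ω range can resolve.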
I think you can "abuse" even a handheld multimeter to measure such small resistances.
Digital multimeters fundamentally measure voltage, while analog ones usually measure current. So drive a known current through the sample from a separate source and use the meter's smallest voltage range (or, for an analog meter, its smallest current range) to read the drop across it; the resistance then follows from Ohm's law, R = V/I.
Handheld digital multimeters (low class) have an input resistance of around 1 MΩ, and bench digital multimeters (middle class) around 10 MΩ. This has to be taken into account in some setups.
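A minimal sketch of that approach, assuming a hypothetical 1 A test current from a bench supply and an illustrative millivolt reading (none of these figures come from the original post):

```python
# Estimate a milliohm-level resistance from a forced current and the
# measured voltage drop across the sample (poor man's four-wire method).
# All numbers below are illustrative assumptions, not real measurements.

I_test = 1.0          # A, known current forced through the sample
V_drop = 0.0012       # V, reading on the meter's smallest voltage range
R_meter_input = 1e6   # ohm, typical input resistance of a handheld DMM

# The sample resistance follows from Ohm's law.
R_sample = V_drop / I_test
print(f"R_sample = {R_sample * 1e3:.2f} milliohm")

# The meter's input resistance sits in parallel with the sample;
# for a milliohm sample the current it diverts is utterly negligible.
I_into_meter = V_drop / R_meter_input
print(f"Current diverted into the meter: {I_into_meter * 1e6:.4f} microamp")
```

Because the voltage is sensed directly across the sample, the lead resistance of the current path drops out of the result, which is exactly what a dedicated four-wire milliohm meter does.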
If you use a bridge (Wheatstone or Carey Foster; see the Wikipedia pages), you can accurately measure resistances outside the manufacturer's specifications.
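For reference, with the unknown $R_x$ in one arm of a Wheatstone bridge and known resistors $R_1$, $R_2$, $R_3$ in the other arms, balance (zero galvanometer current) gives

$$\frac{R_1}{R_2} = \frac{R_3}{R_x} \quad\Rightarrow\quad R_x = R_3\,\frac{R_2}{R_1}.$$

The Carey Foster variant compares two nearly equal resistances through the shift of the balance point along a slide wire, which cancels the end and contact resistances and is what makes it usable down at the milliohm level.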
If you have two identical samples, heating or cooling one of them will be the most convenient approach, since the unchanged sample serves as a reference to compare against.
I know the question is old and probably not relevant any more.