A sensor is calibrated with 8-bit radiometric resolution, and this has been improved to 10 bits. My question is: how does this increase in bits help us get a better-quality image? What exactly does it mean when the bit depth is increased?
The number of bits (or bytes) in which a sensor signal is coded is not related to sensor accuracy. Typically, sensor accuracy is determined by the sensor's linearity, its slope (sensitivity), the signal-to-noise ratio and the dark current. A sensor has more or less fixed values for the radiometric variables just cited. When the sensor signal is digitized, accuracy does not increase with the number of bits (or grey levels) the signal is digitized into. A non-linear signal does not become linear when it is digitized into two bytes instead of one. The same is true for the signal-to-noise ratio: a signal's S/N ratio does not improve when it is digitized into two bytes instead of one. A low S/N ratio (and hence low radiometric accuracy) stays low no matter the number of bits used to code the signal.
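To make this concrete, here is a small, illustrative Python sketch (the 1% sensor-noise level is an assumed value, not something from the question): the same noisy signal is quantized to 8, 10 and 16 bits, and the measured S/N ratio barely changes, because it is limited by the sensor noise rather than by the number of quantization levels.

```python
import numpy as np

rng = np.random.default_rng(0)
true_signal = np.linspace(0.0, 1.0, 100_000)                     # ideal signal, 0..1 full scale
noisy = true_signal + rng.normal(0.0, 0.01, true_signal.size)    # assumed sensor noise, ~1% of full scale

def quantize(x, bits):
    levels = 2**bits - 1
    return np.round(np.clip(x, 0.0, 1.0) * levels) / levels

def snr_db(reference, measured):
    noise = measured - reference
    return 10 * np.log10(np.mean(reference**2) / np.mean(noise**2))

for bits in (8, 10, 16):
    q = quantize(noisy, bits)
    print(f"{bits:2d} bits: S/N = {snr_db(true_signal, q):.1f} dB")
# The S/N ratio stays at the level set by the sensor noise;
# extra quantization bits do not recover it.
```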
However, to represent a true-colour image one needs at least 3 bytes per pixel: a Red band (1 byte), a Green band (1 byte) and a Blue band (1 byte), which together give the typical 3 bytes per pixel of a true-colour RGB image.
Evidently, when one uses an imaging sensor, the higher the number of bits per colour, the larger the image will be in digital size, especially with professional cameras of 16 MPixels. Hyperspectral remote-sensing cameras typically produce imagery of very high data volume. Hence one has to be careful with the number of bits used per image pixel, or the storage capacity of your sensor system will soon be exceeded.
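As a rough, back-of-the-envelope illustration of the storage point (the frame sizes and band counts below are assumed examples, not figures from this discussion):

```python
# Image size versus bit depth; 10-bit samples are normally stored in 2 bytes.
def image_size_mb(pixels, bands, bits_per_sample):
    bytes_per_sample = -(-bits_per_sample // 8)   # round up to whole bytes
    return pixels * bands * bytes_per_sample / 1e6

print(image_size_mb(16e6, 3, 8))    # 16 MPixel RGB frame, 8 bits/band   -> ~48 MB
print(image_size_mb(16e6, 3, 10))   # same frame at 10 bits/band         -> ~96 MB
print(image_size_mb(1e6, 200, 16))  # 1 MPixel, 200-band hyperspectral cube -> ~400 MB
```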
I hope this broad outline is of some help with respect to your question.
In continuation of what Frank has stated, it is true that whether you have a grey-scale image or a colour (RGB) image, you get, say, one byte of data for each pixel (per band), and it is the S/N ratio that decides the signal quality.
But from your question it appears that when you calibrate the system with 8-bit resolution you might be losing some finer detail in terms of image contrast. On the other hand, when you increase the resolution to 10 bits you capture those finer details, which gives you enhanced image contrast. So you see a better image.
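A small sketch of that contrast argument (the gradient values below are made up for the example): a subtle brightness ramp spanning about 1% of full scale collapses to only a few distinct grey levels at 8 bits, but survives as roughly ten distinct levels at 10 bits.

```python
import numpy as np

gradient = np.linspace(0.500, 0.510, 50)   # subtle ramp, ~1% of full scale

for bits in (8, 10):
    levels = 2**bits - 1
    codes = np.round(gradient * levels).astype(int)
    print(f"{bits:2d} bits: {np.unique(codes).size} distinct grey levels in the ramp")
```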
Alternatively, it might be the case that your digitization system is capable of delivering more bits (>10 bits), and that in the earlier case the Effective Number of Bits (ENOB) was 8. When your S/N ratio improved, the ENOB also improved to 10 bits of resolution. This means you are getting more bits (of resolution) that carry meaningful information rather than noise. That is why you see an enhancement in image quality.
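For reference, the usual textbook relation between ENOB and the measured SINAD (signal-to-noise-and-distortion ratio, in dB) is ENOB = (SINAD - 1.76) / 6.02; the SINAD values in this small sketch are made-up examples.

```python
def enob(sinad_db):
    # ENOB from SINAD in dB: each effective bit is worth ~6.02 dB
    return (sinad_db - 1.76) / 6.02

print(enob(49.9))   # ~8.0 effective bits
print(enob(62.0))   # ~10.0 effective bits: better S/N -> more meaningful bits
```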
In terms of image quality, the dynamic range of the imaged scene matters a great deal. So it is not only important how many bits you have at your disposal, but also how large the dynamic range is over which those bits represent a linear response. And of course crosstalk between pixels (e.g. blooming) decreases radiometric imaging performance. When you want to systematically detect and measure something from images, calibrated radiometric imaging offers a large advantage: truly calibrated radiometric imaging provides repeatable and comparable images (of great importance in, e.g., diagnostic imaging).
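To illustrate the dynamic-range point with assumed numbers: the scene's brightest-to-darkest ratio sets a rough lower bound on the bit depth needed to cover it linearly without clipping highlights or crushing shadows, on the order of log2 of the dynamic range.

```python
import math

for dynamic_range in (256, 1000, 4000):   # assumed brightest/darkest ratios in a scene
    print(dynamic_range, "->", math.ceil(math.log2(dynamic_range)), "bits needed")
```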