Well, that's rather simple: you need an A/D converter (analog-to-digital converter). Most likely you will also need to scale down the analog voltage with a resistive divider.
You did not say what the application is or what sample frequency you need. 0.1 % of full scale is 1 part in 1000, so you need at least 10 bits of resolution; a 12-bit converter (1 part in 4096) gives comfortable margin for other error sources. If it's for instrumentation, the simplest approach is a microprocessor with a suitable integrated ADC. Check the web pages of TI, Infineon, ST, NXP, Maxim, Linear (semiconductor companies) and Mouser, Digikey (distributors) to find something suitable.
I agree with Henri: to convert a voltage greater than the maximum input of your A/D converter, you need to reduce it with a potential divider. Here you can divide your 10 V by 2, which effectively doubles the range of your A/D converter. To get the net result back after the division by 2, multiply the digital number by 2, which is the same as shifting it one bit to the left. As for the circuit, you can use two equal resistors chosen large enough to draw negligible current from your source; see the sketch below.
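A minimal numeric sketch of that scaling-back step (the 12-bit ADC and 5 V reference here are assumptions for illustration, not something stated in the question):

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* assumed setup: the 0-10 V input is halved to 0-5 V by two equal
           resistors and read by a 12-bit ADC with a 5 V reference         */
        uint16_t raw = 2048;                    /* example ADC reading     */

        double v_adc = raw * 5.0 / 4095.0;      /* voltage at the ADC pin  */
        double v_in  = v_adc * 2.0;             /* undo the divide-by-2    */

        /* in integer terms, multiplying by 2 is a shift left by one bit   */
        unsigned scaled = (unsigned)raw << 1;

        printf("ADC pin: %.3f V, input: %.3f V, raw<<1: %u\n",
               v_adc, v_in, scaled);
        return 0;
    }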
First select an MCU with an integrated ADC. The operating voltage will typically be 5 V or 3.3 V. Based on your selection, scale the input voltage down to at most the ADC's full-scale input (usually Vcc or an internal reference). If you don't intend to use an MCU, there are several standalone ADC ICs that can be interfaced through I2C.
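If you go the external-ADC route, the read-back in firmware typically looks something like this rough sketch; the device address, register layout and the i2c_read_bytes() helper are hypothetical placeholders, so check the actual part's datasheet:

    #include <stdint.h>

    /* hypothetical low-level I2C helper provided by your platform/HAL */
    extern int i2c_read_bytes(uint8_t dev_addr, uint8_t reg,
                              uint8_t *buf, uint8_t len);

    #define ADC_I2C_ADDR   0x48   /* example 7-bit address (assumption)   */
    #define ADC_RESULT_REG 0x00   /* example conversion-result register   */

    /* read one conversion from a hypothetical 12-bit I2C ADC;
       returns raw counts (0..4095) or -1 on bus error                    */
    int adc_read_counts(void)
    {
        uint8_t buf[2];

        if (i2c_read_bytes(ADC_I2C_ADDR, ADC_RESULT_REG, buf, 2) != 0)
            return -1;

        /* assume the result is left-justified in 16 bits: keep top 12 bits */
        return ((buf[0] << 8) | buf[1]) >> 4;
    }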
Circuit design:
The 1st stage is the design of the divider circuit. The input current required by the ADC is in the µA range, so the voltage divider resistors can be in the tens of kΩ or even hundreds of kΩ range; check the ADC datasheet for the input channel current and absolute maximum ratings. You can add an op-amp buffer as a 2nd stage or feed the divider output directly to the ADC. Provide filter capacitors (differential and common mode) at the op-amp input, or just a single filter capacitor if the signal goes directly to the ADC pin. For extra safety, you can add a protection diode from the sensed input to Vcc. A rough example of the component math is sketched below.
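To make the component choices concrete, here is a small hedged calculation; the 100 kΩ / 47 kΩ divider and 10 nF filter capacitor are example picks for a 3.3 V ADC, not requirements:

    #include <stdio.h>

    int main(void)
    {
        /* example divider scaling 10 V down for a 3.3 V ADC input        */
        double r1 = 100e3;              /* top resistor (ohms)             */
        double r2 = 47e3;               /* bottom resistor (ohms)          */
        double c  = 10e-9;              /* filter cap at the ADC pin (F)   */

        double ratio = r2 / (r1 + r2);          /* ~0.32                   */
        double v_out = 10.0 * ratio;            /* ~3.2 V, inside the rail */
        double r_src = (r1 * r2) / (r1 + r2);   /* impedance seen by ADC   */
        double f_cut = 1.0 / (2.0 * 3.14159265 * r_src * c);  /* RC corner */

        printf("ratio %.3f, Vout %.2f V, Rsrc %.1f kohm, fc %.0f Hz\n",
               ratio, v_out, r_src / 1e3, f_cut);
        return 0;
    }

Compare the resulting source impedance against the maximum the ADC datasheet recommends; if it is too high, extend the sampling time or add the op-amp buffer mentioned above.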
Converting to digital value:
You need to scale the result back up by the same factor that was used in the voltage divider.
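As a minimal firmware sketch of that scale-up (VREF, ADC_MAX, DIVIDER_RATIO and adc_read_raw() are assumed names, matching the example divider above rather than any particular part):

    #include <stdint.h>

    #define VREF          3.3                      /* assumed ADC reference (V) */
    #define ADC_MAX       4095.0                   /* assumed 12-bit converter  */
    #define DIVIDER_RATIO ((100e3 + 47e3) / 47e3)  /* (R1+R2)/R2 of the divider */

    /* hypothetical driver call returning the raw conversion result */
    extern uint16_t adc_read_raw(void);

    /* convert a raw ADC count back to the voltage at the divider input */
    double read_input_voltage(void)
    {
        double v_pin = (double)adc_read_raw() * VREF / ADC_MAX;  /* at ADC pin */
        return v_pin * DIVIDER_RATIO;                            /* undo divider */
    }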