Answer:
about 153 µV
Step-by-step explanation:
You want to know the minimum voltage change detectable by a measurement system that uses 16 bits to represent voltages in the range 0–10V.
Bins
The 16 bits allow the coding of 2^16 = 65536 different output values. If those are distributed uniformly over the 0–10V range, the 65536 codes are separated by 65535 steps, so each bin covers 10V/65535 ≈ 0.0001526 V.
The system therefore has a nominal resolution of about 153 µV.
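A quick sketch of the same arithmetic in Python (the variable names are just for illustration):

```python
full_scale = 10.0               # volts
bits = 16
codes = 2 ** bits               # 65536 distinct output codes
lsb = full_scale / (codes - 1)  # volts per step between adjacent codes
print(f"{lsb * 1e6:.1f} µV")    # -> 152.6 µV, i.e. about 153 µV
```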
__
Additional comment
Suppose a converter can produce 3 output values: {0, 1, 2}. If these cover the range 0–10V, we typically have 0 = 0V, 1 = 5V, 2 = 10V. That is, the voltage difference needed to change from one output value to the next is 10/(3 − 1) = 10/2 = 5V. Our converter has 65536 output codes, so the change required to move from one code to the next is 10/(65536 − 1) = 10/65535 V.
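The same formula covers both the toy 3-code converter and the 16-bit one; a minimal sketch:

```python
def step_size(full_scale_v, num_codes):
    """Voltage change needed to move from one output code to the next."""
    return full_scale_v / (num_codes - 1)

print(step_size(10.0, 3))        # -> 5.0 V (the toy 3-code converter)
print(step_size(10.0, 2 ** 16))  # -> ~0.000152590 V, about 153 µV
```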
In practice, analog-to-digital conversion often produces bin boundaries that are not uniformly spaced; it is not uncommon for some bins to be 2–5 times as wide as others. Thus, the minimum voltage change that alters the coded output may be somewhat larger or smaller than 153 µV, and may vary with the absolute voltage.
The specification that defines the possible deviations in step size is "linearity", usually expressed relative to full scale. For a 16-bit converter, a linearity specification of 0.001% of full scale allows each bin width to deviate by 10V × 10^-5 = 100 µV, which is ±65536 × 10^-5 ≈ ±0.66 times the nominal bin width. Some bins could be about 53 µV wide, while others could be 253 µV wide.
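To see where the 53 µV and 253 µV figures come from, here is the same computation sketched in Python (assuming the spec is interpreted as a ±0.001%-of-full-scale deviation per bin):

```python
full_scale = 10.0                         # volts
nominal_bin = full_scale / (2 ** 16 - 1)  # ~152.6 µV nominal bin width
linearity = 0.001 / 100                   # 0.001 % as a fraction of full scale
deviation = linearity * full_scale        # 100 µV allowed deviation per bin
print(f"min bin: {(nominal_bin - deviation) * 1e6:.0f} µV")    # -> 53 µV
print(f"max bin: {(nominal_bin + deviation) * 1e6:.0f} µV")    # -> 253 µV
print(f"fraction of nominal: ±{deviation / nominal_bin:.2f}")  # -> ±0.66
```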