Answer:
The minimum number of bits needed for quantization is 7.
Explanation:
Given the data in the question:
If the quantization step size is Δv, then the maximum quantization error is Δv/2.
Now split the whole range from -V to V evenly into L levels, so

Δv = 2V / L
Δv / 2 = V / L

Given that the error must be at most 1% of V:

1% × V = V / L
1/100 × V = V / L
L = 100
Now L must be a power of 2 to be binary encoded, so take the next power of 2 at or above 100:

L = 128
2ⁿ = 128
2ⁿ = 2⁷
n = 7
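The derivation above can be sketched as a short computation; `V` is an assumed peak amplitude (the bit count does not depend on its value):

```python
import math

V = 1.0                  # assumed peak amplitude; result is independent of V
max_error = 0.01 * V     # error must be at most 1% of V

# Maximum quantization error is Δv/2 = V/L, so we need L >= V / max_error
L_min = V / max_error                # 100 levels required
n = math.ceil(math.log2(L_min))      # smallest n with 2^n >= 100
L = 2 ** n

print(n, L)  # 7 128
```

Since 2⁶ = 64 < 100 ≤ 128 = 2⁷, rounding up to the next power of 2 gives n = 7 bits.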
Therefore, the minimum number of bits needed for quantization is 7