Answer:
First, ignore the decimal points and multiply the two numbers as whole numbers.
Then count the total number of decimal places in both numbers and place the decimal point that many places from the right in the result.
Explanation:
I. Take a simpler example first: multiply 104.3 by 10, ignoring the decimal point.
= 1043 * 10
= 10430
II. Count the decimal places in both numbers:
104.3 has 1 decimal place and 10 has 0, so the total is 1 decimal place.
= 1043.0, i.e. 1,043
Back to the question: why does 104.3 * 0.10 equal 10.43?
Step I => 1043 * 1 = 1043 {since 0.10 is the same as 0.1} [decimal places: 1 + 1 = 2]
Step II => shift the decimal point 2 places: 10.43
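The two steps above can be sketched in code. This is a minimal illustration, not production code; `multiply_decimals` is a hypothetical helper name, and it assumes the inputs are plain decimal strings like "104.3":

```python
def multiply_decimals(a: str, b: str) -> float:
    """Multiply two decimal strings using the count-the-decimal-places method."""
    # Count the total decimal places across both numbers
    places = sum(len(s.split(".")[1]) if "." in s else 0 for s in (a, b))
    # Step I: drop the decimal points and multiply as whole numbers
    product = int(a.replace(".", "")) * int(b.replace(".", ""))
    # Step II: shift the decimal point back by the total count
    return product / 10 ** places

print(multiply_decimals("104.3", "10"))    # 1043.0
print(multiply_decimals("104.3", "0.10"))  # 10.43
```

Note that "0.10" counts as 2 decimal places here, but the trailing zero cancels out in the division, so the result is the same as for "0.1".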
Hope that helps :)