When you multiply a number by a power of 10 (10, 100, 1000, etc.), you shift the decimal point to the right by the number of zeros in that power of 10.
Multiplying by 10 shifts the decimal point 1 place to the right because 10 has 1 zero, and multiplying by 100 shifts it 2 places because 100 has 2 zeros. For whole numbers, that looks the same as adding zeros to the end. Just for reference, 1 is the same as 1.0, and moving its decimal point 1 place to the right gives 10.
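For example, 7 * 10 = 70 and 7 * 100 = 700 — one extra zero for each place the decimal point moves.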
When it comes to decimals, you do the same thing. Here you have 5.743 * 100, and 100 is 10^2, so it has 2 zeros.
If you shift the decimal point of 5.743 two places to the right, you multiply it by 100, which gives you 574.3.
Therefore, 5.743 * 100 = 574.3.
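You can also check it one step at a time: 5.743 * 10 = 57.43, and 57.43 * 10 = 574.3 — the same answer.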
Hope this helps! :)