Why do you multiply when you are trying to find the percentage of a number?
Hello, I'm curious about something regarding percent problems.
So when you are given a problem such as: What is 25% of 20?
You multiply by first turning 25% into fraction form:
25/100 × 20/1 = 500/100 = 5
You can also use the decimal form: 0.25 × 20 = 5.
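For reference, here is the same calculation written out in general form as a small LaTeX sketch (the symbols x and N are placeholders I'm introducing for "any percent" and "any number"; they aren't from a specific source):

```latex
\documentclass{article}
\begin{document}
% General pattern (x and N are my own placeholders):
% "percent" means "per hundred", so x% is the fraction x/100.
\[
  x\% \;\text{of}\; N \;=\; \frac{x}{100} \times N
\]
% The worked instance from above, in both fraction and decimal form:
\[
  25\% \;\text{of}\; 20 \;=\; \frac{25}{100} \times \frac{20}{1}
  \;=\; 0.25 \times 20 \;=\; 5
\]
\end{document}
```

As far as I can tell, the fraction route and the decimal route are the same multiplication, just written differently.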
I understand how to carry out the steps; my confusion is about why we multiply in the first place.
So I'd like to ask: why exactly do you multiply when you are trying to find a percentage of a number?