1.6 Mr. Jordan bought a camera for $175 and sold it for 90% of the cost. For how much did he sell the camera?


1 Answer


Final answer:

Mr. Jordan sold the camera for 90% of its $175 cost, so the selling price is $175 × 0.9 = $157.50.

Step-by-step explanation:

Mr. Jordan sold the camera for 90% of its original cost of $175. To calculate the selling price, first convert the percentage to a decimal by dividing 90 by 100, which gives 0.9. Then multiply this decimal by the original cost of the camera:

$175 × 0.9 = $157.50

So, Mr. Jordan sold the camera for $157.50.
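If you want to check the arithmetic programmatically, here is a minimal Python sketch of the same calculation (the variable names are illustrative, not from the problem):

    # Original cost of the camera, in dollars
    cost = 175.00
    # Selling price expressed as a percentage of the cost
    percent_of_cost = 90

    # Convert the percentage to a decimal, then multiply by the cost
    selling_price = cost * (percent_of_cost / 100)
    print(f"Selling price: ${selling_price:.2f}")  # prints: Selling price: $157.50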
