Sofia wants to rent an apartment. She visits apartment A, which charges a $100 one-time deposit and $400 per month in rent. She visits apartment B, which charges a $500 deposit but only $350 per month in rent. Write and solve an inequality that shows how many months it would take for the total paid to be less for apartment B.


1 Answer


Given that the one-time deposit for apartment A = $100

Given that the monthly rent for apartment A = $400

Say the number of months = x

Then the total paid for apartment A = 100 + 400x


Given that the one-time deposit for apartment B = $500

Given that the monthly rent for apartment B = $350

Then the total paid for apartment B = 500 + 350x


Now we need an inequality for the number of months it would take for the total paid to be less for apartment B.

That means 500 + 350x must be less than 100 + 400x.

Hence the required inequality is 500 + 350x < 100 + 400x.


Now we solve this inequality.

500 - 100 < 400x - 350x

400 < 50x

Divide both sides by 50:

8 < x

Hence the final answer is that the number of months must be more than 8: starting with the 9th month, the total paid for apartment B is less than for apartment A.
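
As a quick numeric check, here is a small Python sketch (the helper names cost_a and cost_b are just illustrative, not part of the problem) that evaluates both totals for the first twelve months and shows where apartment B becomes the cheaper option.

def cost_a(months):
    # $100 one-time deposit plus $400 per month
    return 100 + 400 * months

def cost_b(months):
    # $500 one-time deposit plus $350 per month
    return 500 + 350 * months

for x in range(1, 13):
    cheaper = "B" if cost_b(x) < cost_a(x) else "A (or tie)"
    print(f"month {x:2d}: A = ${cost_a(x)}, B = ${cost_b(x)}, cheaper: {cheaper}")

The printout shows the two totals are equal at x = 8 ($3300 each) and apartment B is cheaper for every month after that, which matches the solution x > 8.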
