1 vote
Tavon has a gift card for $85 that loses $4 for each 30-day period it is not used. He has another gift card for $75 that loses $3.50 for each 30-day period it is not used.

asked by Jim Jin (6.7k points)

2 Answers

1 vote

30 × 4 = 120

30 × 3.5 = 105

He loses less money with the $75 gift card. I'm not sure if that is the question, though; you did not type the question asked of you.
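As a quick check of that arithmetic in Python (this assumes the 30 above is meant as a count of 30-day periods; the variable names are mine, not part of the question):

periods = 30                       # number of 30-day periods (assumed)
loss_85_card = periods * 4.00      # 30 x 4 = 120
loss_75_card = periods * 3.50      # 30 x 3.5 = 105
print(loss_85_card, loss_75_card)  # 120.0 105.0 -> the $75 card loses less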

answered by Zarenor (7.5k points)
2 votes

Final answer:

The question is a math problem about how the values of two gift cards decrease over time when the cards go unused. Each card loses value at a constant rate, so the decrease can be modeled with a simple linear function of the number of days that have passed.

Step-by-step explanation:

The question asked by the student involves calculating the decrease in value of two gift cards over time, based on a fixed rate of loss per 30-day period of non-use.

This is a mathematics problem that involves understanding linear functions and rates of change.

Tavon has one gift card that loses $4 every 30 days starting from $85, and another that loses $3.50 every 30 days starting from $75. The calculation would require setting up a function for each gift card to track its value as a function of time.

For example, for the first gift card we can set up the function Value1(t) = 85 - 4*(t/30), where t is the number of days since the card was last used. Similarly, for the second gift card: Value2(t) = 75 - 3.5*(t/30).
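Here is a small Python sketch of those two value functions, checking a few sample days (the helper name card_value and the chosen days are my own, and the sketch follows the answer's continuous linear model rather than dropping the value only at the end of each full 30-day period):

def card_value(start_value, loss_per_period, days, period_days=30):
    """Linear model from above: Value(t) = start - loss * (t / 30)."""
    return start_value - loss_per_period * (days / period_days)

# Tavon's two cards.
for days in (0, 30, 60, 90):
    v1 = card_value(85, 4.00, days)   # $85 card losing $4 per 30-day period
    v2 = card_value(75, 3.50, days)   # $75 card losing $3.50 per 30-day period
    print(f"day {days:3}: card 1 = ${v1:.2f}, card 2 = ${v2:.2f}")

If the value should only drop once each full 30-day period has passed, replacing days / period_days with days // period_days would model that instead.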

answered by Muhammad Razib (8.4k points)