A certain first-order reaction (A → products) has a rate constant of 9.90 × 10⁻³ s⁻¹ at 45 °C. How many minutes does it take for the concentration of the reactant, [A], to drop to 6.25% of the original concentration? Express your answer with the appropriate units.


1 Answer


Answer:

t ≈ 4.67 minutes

Step-by-step explanation:

Rate constant, k = 9.90 × 10⁻³ s⁻¹

Time = ?

Initial concentration, [A]o = 100 (taking the original concentration as 100%)

Final concentration, [A] = 6.25 (6.25% of the original; only the ratio [A]o/[A] matters, so percentages can be used directly)

The integrated rate law for a first-order reaction is:

ln[A] = ln[A]o − kt

kt = ln[A]o - ln[A]

t = ( ln[A]o - ln[A]) / k

t = [ln(100) − ln(6.25)] / (9.90 × 10⁻³ s⁻¹)

t = 2.7726 / (9.90 × 10⁻³) s

t = 280.06 s ≈ 280 seconds

t ≈ 4.67 minutes (converting from seconds to minutes by dividing by 60)
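For anyone who wants to verify the arithmetic, here is a small Python sketch (not part of the original solution; the variable names are my own) that reproduces the result and also checks it against the half-life shortcut, since 6.25% = 1/16 = (1/2)⁴, i.e. exactly four half-lives:

```python
import math

# Given values from the problem statement
k = 9.90e-3              # first-order rate constant, s^-1
fraction_left = 0.0625   # 6.25% of the original concentration remains

# Integrated first-order rate law solved for t:
# ln([A]) = ln([A]o) - k*t  ->  t = ln([A]o/[A]) / k
t_seconds = math.log(1 / fraction_left) / k
t_minutes = t_seconds / 60
print(f"t = {t_seconds:.1f} s = {t_minutes:.2f} min")   # t = 280.1 s = 4.67 min

# Sanity check: four half-lives, where t_half = ln(2)/k ≈ 70.0 s
print(f"4 half-lives = {4 * math.log(2) / k:.1f} s")    # ≈ 280.1 s, same result
```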
