A certain first-order reaction (A → products) has a rate constant of 8.10×10⁻³ s⁻¹ at 45 °C. How many minutes does it take for the concentration of the reactant, [A], to drop to 6.25% of the original concentration? Express your answer with the appropriate units.


1 Answer


Answer:

The answer is 5.7 minutes

Step-by-step explanation:

A first-order reaction follows the integrated rate law

ln[A] = -kt + ln[A]₀

where [A] is the concentration of the reactant at any time t of the reaction, [A]₀ is the concentration of the reactant at the beginning of the reaction, and k is the rate constant.

Dropping the concentration of the reactant to 6.25% means the concentration of A at the end of the reaction has to be [A] = (6.25/100)·[A]₀ = 0.0625 [A]₀, and the rate constant is k = 8.10×10⁻³ s⁻¹.

Substituting into the integrated rate law:

ln(0.0625 [A]₀) = -(8.10×10⁻³ s⁻¹)·t + ln[A]₀

Rearranging:

ln(0.0625 [A]₀) - ln[A]₀ = -(8.10×10⁻³ s⁻¹)·t

Considering the property of logarithms:

ln A - ln B = ln(A/B)

Using the property:

ln(0.0625 [A]₀ / [A]₀) = ln(0.0625) = -(8.10×10⁻³ s⁻¹)·t

Solving for t:

t = ln(0.0625) / (-8.10×10⁻³ s⁻¹) = 342.3 s

The result is in seconds, but every minute contains 60 seconds, so converting the units:

342.3 s × (1 min / 60 s) = 5.7 min
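
As a quick cross-check, here is a minimal Python sketch that reproduces the arithmetic (the variable names are my own, not part of the original solution). It also exploits the fact that 6.25% = (1/2)⁴, so the elapsed time should equal exactly four half-lives:

import math

k = 8.10e-3          # rate constant, s^-1 (given in the problem)
fraction = 0.0625    # [A]/[A]0 = 6.25%

# Integrated first-order rate law: ln([A]/[A]0) = -k*t
t_seconds = math.log(fraction) / -k
print(round(t_seconds, 1))        # 342.3 (seconds)
print(round(t_seconds / 60, 2))   # 5.7  (minutes)

# Sanity check: 6.25% = (1/2)^4, i.e. exactly four half-lives
t_half = math.log(2) / k          # ~85.6 s
print(round(4 * t_half, 1))       # 342.3, matching the result above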
