Final answer:
The fission of a uranium-235 nucleus into antimony-132 and niobium-101 nuclei releases approximately 2062.3 MeV of energy, calculated by finding the mass defect and converting it to energy using E = mc².
Step-by-step explanation:
To calculate the energy released when a uranium-235 nucleus undergoes fission, we compare the total mass of the reactants with the total mass of the products. You've given the atomic masses of 235U, 132Sb, and 101Nb as 235.044 u, 131.915 u, and 100.915 u, respectively. The mass defect is the mass of the reactants minus the mass of the products, and the energy released follows from E = (mass of reactants − mass of products) × c², where c is the speed of light.
First, the total mass of the reactant:
- Mass of 235U: 235.044 u
Then, calculate the total mass of the products:
- Mass of 132Sb: 131.915 u
- Mass of 101Nb: 100.915 u
- Note that 132 + 101 = 233 nucleons, so a real fission of 235U into these fragments would also release two neutrons; following the problem's assumption that no extra neutrons appear in the products, the mass defect is simply the mass of the reactant minus the sum of the two product masses.
Mass defect = Mass of 235U - (Mass of 132Sb + Mass of 101Nb)
Mass defect = 235.044 u - (131.915 u + 100.915 u)
Mass defect = 235.044 u - 232.83 u
Mass defect = 2.214 u
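As a quick check, here is a minimal Python sketch of the mass-defect step; the variable names are my own, and the masses are the values given above:

```python
# Mass defect for the assumed fission 235U -> 132Sb + 101Nb (no free neutrons),
# using the atomic masses given in the problem, in unified atomic mass units.
m_u235 = 235.044   # mass of 235U in u
m_sb132 = 131.915  # mass of 132Sb in u
m_nb101 = 100.915  # mass of 101Nb in u

mass_defect = m_u235 - (m_sb132 + m_nb101)
print(f"Mass defect: {mass_defect:.3f} u")  # prints 2.214 u
```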
Now convert the mass defect to energy using E = mc². Rather than converting the mass to kilograms and using c in meters per second, it is easier to apply the equivalence 1 u = 931.5 MeV/c², so the energy released in MeV is:
E = 2.214 u × 931.5 MeV/u ≈ 2062.3 MeV
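And a matching sketch of the conversion step, again with hypothetical variable names and the standard 931.5 MeV/u equivalence:

```python
# Convert the mass defect to energy using 1 u = 931.5 MeV/c^2,
# so each u of mass defect corresponds to 931.5 MeV of released energy.
U_TO_MEV = 931.5     # MeV per u
mass_defect = 2.214  # u, from the step above

energy = mass_defect * U_TO_MEV
print(f"Energy released: {energy:.1f} MeV")  # about 2062.3 MeV
```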
Thus, under the stated assumption, the fission of a uranium-235 nucleus into an antimony-132 nucleus and a niobium-101 nucleus releases approximately 2062.3 MeV of energy.