103k views
5 votes
A TV requires a potential difference of 11.5 V and a current of 10 A. It uses a transformer to produce the correct potential difference from the mains supply. If the mains supply is 230 V, what current does it provide to the transformer?

2 Answers

6 votes

Final answer:

To find the current the mains supplies to the transformer, we calculate the power required by the TV and then, treating the transformer as 100% efficient, use that power to find the input current: 0.5 A.

Step-by-step explanation:

The question asks for the current provided to a transformer by the mains supply, given the output voltage and current required by the TV. By conservation of energy, an ideal transformer (one with no heat losses) has input power equal to output power. The power (P) of an electrical device is calculated using the formula P = V × I, where V is the voltage and I is the current.

First, let's calculate the power required by the TV:

Power required by the TV (P_TV):

P_TV = V_TV × I_TV = 11.5 V × 10 A = 115 W

Assuming the transformer is 100% efficient and no power is lost, the input power must equal the output power:

P_input = P_TV = 115 W

We have the mains supply voltage (V_mains) and we need to calculate the current (I_mains) provided to the transformer:

Input current (I_mains):

I_mains = P_input / V_mains = 115 W / 230 V = 0.5 A

Therefore, the mains supply provides a current of 0.5 A to the transformer.
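
As a sanity check, here is a minimal Python sketch of the same calculation, assuming an ideal (lossless) transformer; the variable names are illustrative, not from the original problem:

```python
# Ideal-transformer calculation: input power equals output power.
V_tv = 11.5      # potential difference required by the TV, in volts
I_tv = 10.0      # current required by the TV, in amperes
V_mains = 230.0  # mains supply voltage, in volts

P_tv = V_tv * I_tv            # power drawn by the TV: 115 W
P_input = P_tv                # lossless transformer: input power = output power
I_mains = P_input / V_mains   # current drawn from the mains

print(f"TV power:      {P_tv} W")     # 115.0 W
print(f"Mains current: {I_mains} A")  # 0.5 A
```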

User Gene Merlin (8.0k points)
5 votes

Answer:

0.5 A

Step-by-step explanation:

For an ideal transformer, input power equals output power, so V_p × I_p = V_s × I_s. The TV (secondary) side needs 11.5 V × 10 A = 115 W, so the current drawn from the 230 V mains (primary) side is I_p = 115 W / 230 V = 0.5 A. Note that this is a step-down transformer (230 V down to 11.5 V), so the primary current is smaller than the secondary current.
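
A quick Python sketch of that ratio form, under the same ideal-transformer assumption (variable names are illustrative):

```python
# Ideal transformer: V_p * I_p = V_s * I_s, so I_p = (V_s * I_s) / V_p.
V_s, I_s = 11.5, 10.0  # secondary (TV) side: 11.5 V, 10 A
V_p = 230.0            # primary (mains) side: 230 V

I_p = (V_s * I_s) / V_p
print(f"Primary (mains) current: {I_p} A")  # 0.5 A
```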

User Martin Mogusu (7.7k points)