Why are transformers used in the National Grid?

The National Grid is the network that distributes electricity from suppliers to consumers (businesses and private households). Power stations generate power of the order of megawatts (millions of watts), and we want as much of this power as possible to be transferred efficiently (with minimal losses) to consumers. There are two equations which are important to consider:

  1. Power lost in the transmission line = current² × resistance of the wire (written as P_loss = I²R)

  2. Power transmitted = current × voltage (written as P = IV)

Looking at equation 1, to minimise the power lost in the line we need the current to be as small as possible (the resistance of the wires is fixed). Equation 2 tells us that, for the same transmitted power, carrying a lower current means using a higher voltage. Because the loss depends on the current squared, raising the voltage (and so lowering the current) by a factor of 10 cuts the loss by a factor of 100.
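
To make this concrete, here is a minimal Python sketch of the two equations above. The 100 MW of transmitted power and the 1 Ω cable resistance are made-up illustrative figures, not real National Grid values; the two voltages are simply a 'low' and a 'high' example in line with the figures quoted in the next paragraph.

    # Illustrative sketch only: the power and resistance figures are assumed, not real grid data.
    def line_loss(power_watts, voltage_volts, resistance_ohms):
        """Power lost in a line carrying power_watts at voltage_volts."""
        current = power_watts / voltage_volts        # from P = I x V, so I = P / V
        return current ** 2 * resistance_ohms        # P_loss = I^2 x R

    power = 100e6       # 100 MW sent down the line (assumed figure)
    resistance = 1.0    # total line resistance in ohms (assumed figure)

    for voltage in (20_000, 400_000):
        loss = line_loss(power, voltage, resistance)
        print(f"At {voltage:>7} V: current = {power / voltage:6.0f} A, "
              f"loss = {loss / 1e6:5.2f} MW ({100 * loss / power:.2f}% of the power sent)")

Stepping the voltage up by a factor of 20 cuts the current by a factor of 20, so the current-squared loss falls by a factor of 20² = 400 (from 25 MW down to about 0.06 MW in this example).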

Step-up and step-down transformers are used to increase and decrease the voltage respectively. Once electricity has been generated at the power station (at around 20,000 V), it is 'stepped up' to between 132,000 V and 400,000 V and carried along transmission lines towards consumers. Mains electricity (what we use in our homes) runs at 230 V, so step-down transformers are needed to bring the voltage back down to this safer level before it reaches us.
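
As a quick check on the 230 V figure, equation 2 can be rearranged to give the current a household appliance draws at mains voltage. The 2.3 kW appliance below is an assumed example (roughly the power of a kettle), not a figure from the answer above.

    # Illustrative sketch only: the appliance power is an assumed example value.
    appliance_power = 2300   # watts (assumed example, roughly a kettle)
    mains_voltage = 230      # volts, UK mains supply

    # Rearranging P = I x V gives I = P / V.
    current = appliance_power / mains_voltage
    print(f"A {appliance_power} W appliance at {mains_voltage} V draws {current:.0f} A")

At this low voltage the currents in our homes stay at a manageable level (10 A here), whereas the same voltage would be hopeless for carrying megawatts across the country, which is why step-up and step-down transformers are needed at either end of the transmission lines.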

Answered by Mike R. Physics tutor
