Why are transformers used in the National Grid?

The National Grid is the network that distributes electricity from suppliers to consumers (commercial premises or private households). Power stations produce power on the order of megawatts (millions of watts), and we want as much of this power as possible to be transferred efficiently (with minimal losses) to consumers. There are two equations which are important to consider:

  1. Power loss in a line = current² × resistance of the wire (written as P = I² × R)

  2. Power = current × voltage (written as P = I × V)

Looking at equation 1), we see that to minimise power loss in the line we need the current to be as small as possible (the resistance of the wires is fixed). Equation 2) tells us that to deliver the same power at a lower current, the voltage must be increased.
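
To make this concrete, here is a rough numerical sketch. The 100 MW output and 1 Ω line resistance are illustrative figures chosen for this example, not values from the answer itself; only the two voltages reflect the grid figures discussed below.

```python
# Illustrative comparison of transmission losses at two voltages.
# Assumed figures (not from the answer above): 100 MW of power sent
# down a line with 1 ohm of resistance.

power = 100e6       # power to transmit, in watts (100 MW)
resistance = 1.0    # resistance of the transmission line, in ohms

for voltage in (20_000, 400_000):      # power-station voltage vs grid voltage
    current = power / voltage          # from P = I × V  =>  I = P / V
    loss = current ** 2 * resistance   # from P_loss = I² × R
    print(f"At {voltage:,} V: current = {current:,.0f} A, "
          f"loss = {loss / 1e6:.2f} MW ({loss / power:.1%} of the power)")
```

Stepping the voltage up by a factor of 20 cuts the current by a factor of 20, and because the loss depends on the current squared, the wasted power falls by a factor of 400.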

Step-up and step-down transformers are used to increase and decrease the voltage respectively. Once electricity has been generated at the power station (~20,000 V), it is 'stepped up' to between 132,000 V and 400,000 V and distributed via cables to consumers. Mains electricity (what we use in our homes) runs at 230 V, so a step-down transformer is needed to reduce the voltage to this safer level.
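
As a quick sketch of those voltage changes, the snippet below uses the ideal-transformer turns-ratio relation (V_secondary / V_primary = N_secondary / N_primary), which is not quoted in the answer above but is the standard way a transformer steps voltage up or down; the voltages are the ones mentioned here.

```python
# Sketch of the voltage changes along the grid, assuming ideal transformers,
# for which V_secondary / V_primary = N_secondary / N_primary (the turns ratio).
# The 20,000 V, 400,000 V and 230 V figures are taken from the answer above.

stages = [
    ("power station output", 20_000),
    ("transmission lines (after step-up)", 400_000),
    ("mains supply in the home (after step-down)", 230),
]

for (name_a, v_a), (name_b, v_b) in zip(stages, stages[1:]):
    ratio = v_b / v_a   # equal to the turns ratio of an ideal transformer
    kind = "step-up" if ratio > 1 else "step-down"
    print(f"{name_a} ({v_a:,} V) -> {name_b} ({v_b:,} V): "
          f"{kind} transformer, turns ratio ~{ratio:g}")
```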

Answered by Mike R., Physics tutor
