Maintaining power quality for future energy
As the power system evolves, the challenges for maintaining power quality are changing. Dr Bill Drury and Dr Norman Macleod look at key issues and mitigation methods.
In the UK, the electricity supply system is extremely reliable, with supply interruptions being rare. The National Grid quotes an energy availability figure of 99.99987%, although this relates to the transmission system rather than the supply to end consumers. But the network is changing. Historically, generators produced a near-perfect sinusoidal voltage, and poor electricity supply quality was associated with large industrial loads on the network, such as steel plants and coal mines. However, increasing levels of Renewable Energy Sources (RES) are displacing conventional generating plant, while new types of loads, such as data centres and motor drives, displace industrial loads. These new energy sources and loads are set to introduce new power quality issues.
Dealing with the issues
Over-voltages are a key quality issue. The normal operating voltage of the UK grid varies depending on the voltage level, to ensure a safe level of voltage is available to users of the system. Typically, any voltage in excess of 110% of the nominal system level is considered to be an over-voltage. A temporary over-voltage may persist for many seconds or minutes and can lead to equipment damage on the system. Key methods to control over-voltages include Automatic Voltage Regulators (AVR) at the generating stations, tap-changer control on grid transformers, switched shunt reactors, Static Var Compensators (SVC) and Static Synchronous Compensators (STATCOM).
Voltage dips are a second quality issue.
A voltage dip is typically defined as any voltage between 10% and 90% of the nominal system level, and is normally caused by a system fault. Dips can lead to the tripping of sensitive loads, such as motor loads. There are specific requirements on the recovery voltage after fault clearance, to prevent induction motors from stalling.
A persistent voltage dip is sometimes referred to as a “brown-out”, and ultimately a failure to recover from a voltage dip can lead to a wide-area loss of supply to consumers, referred to as a “black-out”.
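To make these thresholds concrete, the short sketch below (in Python) classifies per-unit RMS voltage readings against the dip and over-voltage limits quoted above. The function and sample values are purely illustrative, not taken from any standard or library.

```python
# Minimal sketch: classify RMS voltage readings (in per unit of nominal)
# using the thresholds quoted in the article: below 90% is a dip, below 10%
# is treated here as an interruption, above 110% is an over-voltage.

def classify_voltage(v_pu: float) -> str:
    """Classify a single per-unit RMS voltage reading."""
    if v_pu < 0.1:
        return "interruption"
    if v_pu < 0.9:
        return "dip"
    if v_pu > 1.1:
        return "over-voltage"
    return "normal"

# Illustrative half-cycle RMS readings around a cleared network fault.
samples = [1.00, 0.98, 0.62, 0.58, 0.61, 0.95, 1.01]
print([classify_voltage(v) for v in samples])
# ['normal', 'normal', 'dip', 'dip', 'dip', 'normal', 'normal']
```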
A third key power quality issue is voltage unbalance. The level of unbalance between the phases of the transmission system is typically less than 1.5%, with exceptions up to 2%, in England and Wales, and 2% in Scotland. Unbalance may be caused by a lack of transposition on long transmission lines and by an uneven distribution of loads across the phases of the distribution system. Mitigation measures aim to improve the balance of the loads between phases, perhaps by active Demand Side Management (DSM). Voltage unbalance, defined as the ratio of the negative sequence voltage to the positive sequence voltage, also affects many loads, notably motors: a 2.5% unbalance typically requires a motor to be de-rated by 10%.
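As an illustration of the definition, the sketch below computes the unbalance of a set of three phase voltage phasors using the symmetrical-component (Fortescue) transform. The phasor values are invented for the example; a real assessment would use measured fundamental-frequency phasors.

```python
import cmath

# Minimal sketch: voltage unbalance as the ratio of the negative-sequence to
# the positive-sequence voltage, via the Fortescue transform.

a = cmath.exp(2j * cmath.pi / 3)  # 120-degree rotation operator

def unbalance_percent(va: complex, vb: complex, vc: complex) -> float:
    v_pos = (va + a * vb + a**2 * vc) / 3   # positive-sequence component
    v_neg = (va + a**2 * vb + a * vc) / 3   # negative-sequence component
    return 100 * abs(v_neg) / abs(v_pos)

# Illustrative phasors: one phase about 5% low.
va = cmath.rect(230.0, 0.0)
vb = cmath.rect(218.0, -2 * cmath.pi / 3)
vc = cmath.rect(232.0, 2 * cmath.pi / 3)
print(f"unbalance = {unbalance_percent(va, vb, vc):.2f}%")  # roughly 1.9% here
```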
Meanwhile, flicker is a further power quality issue. It originally related to changes in the illumination intensity of incandescent bulbs, caused by low frequency (5-20Hz) modulation of the system voltage level. As these bulbs have been withdrawn from the market, modern definitions of flicker relate to rapid voltage variation. This can be caused by industrial sources creating repetitive or random changes in load current, which in turn produce a voltage drop in the supply system; the size of the drop depends on the strength of the power supply to the disturbing load. Some RES, such as wind turbines, may create flicker effects due to pulsation of the generated power caused by blade tower-shadow or ground interference effects. Increasing the strength of the system, or applying dynamic reactive power compensation (STATCOM) at the connection to the grid, mitigates this problem.
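The dependence on system strength can be illustrated with a common first-order approximation: the relative voltage change at the point of connection is roughly the apparent-power swing of the disturbing load divided by the fault level there. The sketch below uses this rule of thumb with invented figures; it ignores the X/R ratio and load power factor, which a proper flicker assessment would include.

```python
# Minimal sketch: first-order estimate of relative voltage change caused by a
# fluctuating load, using dV/V ~ dS / S_sc (load swing over fault level).
# All numbers are illustrative only.

def relative_voltage_change(delta_s_mva: float, fault_level_mva: float) -> float:
    """Approximate per-unit voltage change for an apparent-power swing delta_s_mva."""
    return delta_s_mva / fault_level_mva

# A 5 MVA repetitive load swing on a weak, 250 MVA fault-level connection ...
weak = relative_voltage_change(5.0, 250.0)      # about 2% voltage change
# ... versus the same swing on a stronger, 1000 MVA connection.
strong = relative_voltage_change(5.0, 1000.0)   # about 0.5% voltage change
print(f"weak bus: {weak:.1%}, strong bus: {strong:.1%}")
```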
Harmonics and more
Harmonics, superimposed on the 50Hz fundamental voltage, may result in increased voltage stresses on insulation, higher thermal stress on equipment, and potential interference with communication systems. While harmonic distortion stems from industrial, commercial and domestic sources, it is noticeable that the background distortion in most developed countries, such as the UK, now follows a domestic rather than a commercial or industrial pattern. UK performance limits define compatibility (equipment withstand), immunity (performance without degradation), emission (the disturbance created by an item of equipment or a system) and planning (system performance) levels. System resonance effects, which may cause magnification or attenuation of distortion, must also be considered when evaluating performance and compatibility levels. Most standards consider harmonics up to the 50th order, but modern converters may generate distortion at much higher orders. New standards and system modelling techniques are being developed to deal with this greater bandwidth.
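As a simple illustration of how distortion is quantified, the sketch below computes the total harmonic distortion (THD) of a voltage from per-unit harmonic magnitudes, summing up to the 50th order as most standards do. The harmonic magnitudes are invented for the example.

```python
import math

# Minimal sketch: total harmonic distortion (THD) from harmonic magnitudes
# expressed as fractions of the 50Hz fundamental, summed up to the 50th order.

def thd_percent(harmonics_pu: dict) -> float:
    """harmonics_pu maps harmonic order (2-50) to magnitude relative to the fundamental."""
    return 100 * math.sqrt(sum(v**2 for order, v in harmonics_pu.items() if 2 <= order <= 50))

# Illustrative spectrum: 3% fifth, 2% seventh and 1% eleventh harmonic.
print(f"THD = {thd_percent({5: 0.03, 7: 0.02, 11: 0.01}):.2f}%")  # about 3.74%
```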
Mitigation techniques may involve the installation of harmonic filters, improved converter technologies that cancel harmonics, and improvements to the design of domestic power supplies for digital devices.
For frequency, the limits on deviation from the nominal 50Hz on the transmission system, as defined in the Grid Code, are ±0.5Hz; in practice the frequency in the UK is typically controlled within ±0.1Hz. The displacement of large synchronous generators, such as coal-fired and nuclear plant, by RES, which connect through front-end power electronic converters, is progressively reducing the inertia of the system. This potentially impacts the ability of the system to maintain the balance between generation and demand during major system perturbations (a rough numerical illustration is sketched below, after the discussion of faults). Although maintaining frequency stability is important for the overall security of the system, frequency perturbations in themselves have no overall adverse effects. Mitigation measures may include the development of “synthetic inertia” algorithms for converter controllers or the installation of a new generation of synchronous compensators. In small-scale systems, such as islanded networks, where frequency changes can be quite rapid - more than 1Hz/s - some loads can struggle to maintain synchronism.
Faults on the transmission networks can also be initiated by lightning strikes and switching surges, which can cause damage to many load types, as well as to transmission and distribution equipment.
These faults are normally cleared by rapid opening of the line circuit breakers, followed after a suitable delay by an auto-reclose to restore power flow with the minimum of interruption. If the fault is still present, a further open and re-close cycle, with a longer delay, is implemented to clear the fault. On low voltage networks, impulse voltages of up to 2kV can be caused by, say, the switching of loads, but these contain little energy and do not result in equipment damage.
Correct design and testing of the clearances and insulation systems on the network can minimise these effects, as can the deployment of surge arresters with coordinated protective levels.
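Returning to the inertia point above, the sketch below gives a rough feel for why falling inertia matters, using the classic swing-equation approximation for the initial rate of change of frequency after a sudden loss of generation. The inertia constants and power figures are illustrative only, not UK system data.

```python
# Minimal sketch: initial rate of change of frequency (RoCoF) after a sudden
# power imbalance, from df/dt ~ dP * f0 / (2 * H * S), where H is the aggregate
# inertia constant (seconds) and S the rated MVA of synchronised generation.
# All figures are illustrative.

def rocof_hz_per_s(delta_p_mw: float, inertia_h_s: float, rated_mva: float,
                   f0_hz: float = 50.0) -> float:
    return delta_p_mw * f0_hz / (2 * inertia_h_s * rated_mva)

# Losing 1000 MW with H = 5 s across 30,000 MVA of synchronous plant ...
print(f"higher inertia: {rocof_hz_per_s(1000, 5.0, 30000):.2f} Hz/s")  # ~0.17 Hz/s
# ... versus the same loss when converter-based RES halves the effective inertia.
print(f"lower inertia:  {rocof_hz_per_s(1000, 2.5, 30000):.2f} Hz/s")  # ~0.33 Hz/s
```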
Looking to the future, as the power system evolves the challenges of maintaining power quality will continue to change. The earlier model of generators creating a 50Hz voltage, supplied to passive loads such as heating and lighting and disturbed only by large industrial loads, is less valid today. Many RES generator designs, such as wind and solar power, bring their own power quality issues; domestic and commercial loads create adverse power quality; and the industrial load base is diminishing. Crucially, suitable mitigation techniques can be developed and deployed to maintain good power quality as whole-system approaches to these technologies are implemented.
Dr Bill Drury is a Visiting Industrial Professor at the University of Bristol and Independent Consultant.
Dr Norman Macleod is Technical Director, HVDC, Power Networks at Parsons Brinckerhoff.