Transformers are key components in every electrical distribution network. They are used in multiple locations throughout the systems of network operating companies (NOCs), and large energy users such as factories and hospitals frequently have one or more power transformers of their own, usually forming part of an on-site substation.
Given the ever-present and growing need for energy efficiency, the good news is that transformers are relatively efficient. In fact, with typical modern power transformers, efficiencies in excess of 97 per cent are routinely achieved, which on the face of it sounds perfectly satisfactory.
Looked at another way, however, this figure means that up to 3 per cent of all electrical power generated is wasted in transformer losses. Clearly these losses are far from negligible, and anything that can be done to reduce them has the potential to deliver huge savings, not just in monetary terms, but also in terms of reduced environmental impact.
And there is a perfectly good way of cutting these losses – the use of transformers with amorphous cores.
Conventional transformers have cores assembled from stacks of laminations that are made from silicon steel with an almost uniform crystalline structure. In transformers with amorphous cores, a ribbon of steel is wound, usually into a rectangular toroid shape, to form the core. Although the material used for the core is still a form of silicon steel, it is produced in such a way that it has no regular crystalline structure – hence the name amorphous, which means without structure.
The big benefit is that amorphous steel has lower hysteresis losses. Put in simple terms, this means that less energy is wasted in magnetising and demagnetising it during each cycle of the supply current. In addition, the construction of amorphous cores means that they have higher electrical resistance than conventional cores, so losses due to unwanted eddy currents in the core are also reduced.
These effects, known collectively as iron losses, occur whenever the transformer is energised, regardless of load, which makes them proportionally most significant in transformers that are lightly loaded. So just how important are they in practice? To answer this question, it is important to realise that transformers rarely operate at full load. In fact, because they have to be sized to handle the maximum anticipated load, most spend many hours a day very lightly loaded.
For example, a transformer supplying a factory may be, say, 70 per cent loaded during working hours, and only 10 per cent loaded during the evenings and at weekends. This variation in loading is usually expressed as a load factor that, in effect, represents the percentage of the transformer’s overall capacity to supply energy that is used over a given period – often a year.
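To make the definition concrete, here is a minimal sketch of how the factory example above averages out over a week. The 50-hour working week is an assumed figure chosen purely for illustration; the 70 per cent and 10 per cent loadings are taken from the example.

    # Illustrative weekly load factor for the factory example above.
    # The 50-hour working week is an assumption; the 70% and 10%
    # loadings come from the example in the text.
    HOURS_PER_WEEK = 168
    working_hours = 50
    off_hours = HOURS_PER_WEEK - working_hours

    # Energy actually supplied, as a fraction of the energy the
    # transformer could supply if fully loaded all week.
    load_factor = (working_hours * 0.70 + off_hours * 0.10) / HOURS_PER_WEEK
    print(f"Load factor: {load_factor:.0%}")  # about 28%

A figure of around 28 per cent sits comfortably between the load factors quoted below.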
On this basis, studies have shown that transformers used to supply factories typically have load factors around 40 per cent, while those used to supply offices, hospitals and similar premises often have load factors as low as 20 per cent.
With this in mind, let us look at some loss data for a 500 kVA transformer supplying an industrial installation with a load factor of 40 per cent. With a conventional transformer of modern design, the no-load losses were 665 W, while the on-load losses were 4400 W. At a 40 per cent load factor, this equates to total losses of 11 992 kWh per year. With an amorphous core transformer, the corresponding figures are 220 W, 3500 W and annual losses of just 6833 kWh.
This is a massive reduction of 5159 kWh which, at £0.08 per unit, corresponds to a cash saving in excess of £400. Over the typical 30-year life of the transformer, the saving is an impressive £12 000 at today’s prices. Even more impressive is the associated reduction in CO2 emissions, which equates to almost 3 tonnes per year.
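For readers who wish to check the arithmetic, the sketch below reproduces these figures using the usual simplifying convention that no-load (iron) losses are drawn continuously while on-load losses scale with the square of the load factor; this convention is an assumption on our part, as is the grid emission factor of roughly 0.5 kg of CO2 per kWh used for the emissions estimate.

    HOURS_PER_YEAR = 8760

    def annual_losses_kwh(no_load_w, on_load_w, load_factor):
        # No-load losses are constant; on-load losses scale with the
        # square of the load factor (assumed convention).
        watts = no_load_w + on_load_w * load_factor ** 2
        return watts * HOURS_PER_YEAR / 1000

    conventional = annual_losses_kwh(665, 4400, 0.40)  # ~11 992 kWh
    amorphous = annual_losses_kwh(220, 3500, 0.40)     # ~6833 kWh
    saving_kwh = conventional - amorphous              # ~5159 kWh

    PRICE_PER_KWH = 0.08    # £ per unit, as quoted above
    CO2_KG_PER_KWH = 0.5    # assumed grid emission factor

    print(f"Annual energy saving: {saving_kwh:.0f} kWh")
    print(f"Annual cash saving:   £{saving_kwh * PRICE_PER_KWH:.0f}")
    print(f"Saving over 30 years: £{saving_kwh * PRICE_PER_KWH * 30:.0f}")
    print(f"CO2 saved per year:   {saving_kwh * CO2_KG_PER_KWH / 1000:.1f} tonnes")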
Finally, it is worth noting that these calculations are based on an industrial installation with a 40 per cent load factor. In commercial and residential applications where the load factor is invariably lower, even greater savings will be achieved.
If power transformers with amorphous cores have so much to offer, the big question has to be why are they not more widely used? Before answering this question, it is important to put it in a global context. The demand for amorphous core transformers is, in fact, increasing rapidly in many countries, including Japan, China and India. The big exceptions are Europe and the USA.
In these relatively conservative markets, the usual objection is that amorphous core transformers are more expensive than their conventional counterparts. There is some truth in this but, in recent years, the silicon steel used in ordinary transformers has increased in price much more rapidly than the amorphous materials, so the price differential between the two types of transformer is now small.
Recent calculations have, in fact, shown that the payback period for the extra investment in an amorphous core transformer is usually in the region of three to five years. If, as seems likely, energy prices increase, this period will become even shorter.
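As a rough illustration of that payback calculation, the sketch below divides an assumed price premium by the annual saving from the 500 kVA example above. The £1500 premium is a hypothetical figure, not a quoted price, and the loop simply shows how rising energy prices shorten the payback.

    # Simple payback: assumed price premium divided by annual saving.
    price_premium = 1500.0   # hypothetical extra cost of the amorphous unit (£)
    annual_saving = 413.0    # £ per year, from the 500 kVA example above

    print(f"Payback: {price_premium / annual_saving:.1f} years")  # ~3.6 years

    # Rising energy prices shorten the payback proportionally.
    for uplift in (1.0, 1.25, 1.5):
        years = price_premium / (annual_saving * uplift)
        print(f"Energy price x{uplift:.2f}: payback {years:.1f} years")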
Regrettably, the problem remains that contracts are still placed on the basis of the lowest initial price, often with scant regard to lifetime costs. Growing environmental concern is, however, starting to force a change in this attitude, which hopefully means that the benefits of amorphous core transformers will, in future, be more carefully taken into account when contracts are placed.
Other objections raised in connection with amorphous core transformers are that they are physically larger than conventional types, and that they generate more noise. Once again there is an element of truth behind these assertions but, with the latest amorphous materials, these differences are becoming smaller and, in particular, the noise issue is almost completely solved.