HPC 2014-15 | Processors
Given that the HPC industry has always pushed the limits of what is possible, and adopted weird and wonderful technologies in order to gain a performance edge, the dominance of HPC systems by the x86 architecture is quite remarkable. The transition that started 25 years ago now seems to be complete – so is it game over for anyone other than Intel in the HPC processor industry? Absolutely not: the game is still very much alive, and the forces that have shaped the industry over the last 25 years – and other issues – will continue to drive change over the next 25 years.
One of the drivers for change that has led
to Intel’s strong position in the HPC market has been price – or, to be more precise, price/performance. Intel doesn’t just build chips for PCs or database servers and hope that these can be successful in HPC. Over the years, many technologies and features have been introduced to the x86 family to make these processors more in tune with the needs of the HPC market. Whereas early x86 processors did not even have a floating-point unit, a modern x86 processor looks much like a multiprocessor Cray supercomputer on a chip. A combination of the HPC advances made by Intel (and, to a lesser extent, by AMD), together with the low prices driven by high volume, has put Intel in the position it is in today in HPC.
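To make the ‘Cray supercomputer on a chip’ comparison concrete: among the HPC features added to x86 over the years are wide vector floating-point units, the style of data-parallel arithmetic that Cray’s vector machines were built around. The sketch below is a minimal illustration of that idea, not anything from Intel’s documentation; it assumes an AVX-capable x86 processor and a compiler flag such as gcc’s -mavx, and the array names and values are purely illustrative.

/* Minimal sketch: one AVX instruction operates on four doubles at
 * once, the vector-processing style Cray machines pioneered.
 * Illustrative only. Compile with, e.g.: gcc -mavx vec.c */
#include <immintrin.h>
#include <stdio.h>

int main(void)
{
    double a[4] = {1.0, 2.0, 3.0, 4.0};
    double b[4] = {10.0, 20.0, 30.0, 40.0};
    double c[4];

    __m256d va = _mm256_loadu_pd(a);    /* load four doubles */
    __m256d vb = _mm256_loadu_pd(b);
    __m256d vc = _mm256_add_pd(va, vb); /* four additions in one instruction */
    _mm256_storeu_pd(c, vc);

    for (int i = 0; i < 4; i++)
        printf("%.1f\n", c[i]);        /* 11.0 22.0 33.0 44.0 */
    return 0;
}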
The economics driven by high volume, which have been Intel’s friend in recent decades,
may turn out to be its enemy in the future, as the highest processor volumes today are in mobile devices – smartphones and tablets – a market segment dominated by ARM, not by Intel. An even larger market segment is embedded computing, where more exotic technology is often deployed, and where x86 and ARM compete with DSPs and FPGAs. Another driver for change that is now at the
top of everyone’s thinking (but was not even on anyone’s radar when the first TOP500 list was published) is power consumption. The first system on the list to provide power consumption information was the Earth Simulator in Japan, installed in 2002. It went straight in at the number one spot (by almost a factor of five over second place), stayed there for two years (a lifetime in the world of HPC), and used 3,200 kW of power. This was a particularly power-hungry system: the IBM BlueGene/L system that topped the list in June 2005 drew only 716 kW, while some systems in the top 10 positions on the list used
less than 100 kW of power. Fast forward to June 2014, and the top system, China’s Intel Xeon Phi-based Tianhe-2 (MilkyWay-2), requires 17,808 kW, with systems as far down as 78th position on the list drawing more than 1 MW (1,000 kW) of power. Bear in mind that a small town requires a 10 MW power supply, and it is clear that the trend of HPC systems using more and more power has to change.
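To put those power figures side by side, here is a small arithmetic sketch. All of the kW values are taken directly from the text above; the printed ratios are simple derivations from them – roughly a 5.6x rise from the Earth Simulator to Tianhe-2, nearly 25x from BlueGene/L, and a single system exceeding a small town’s 10 MW supply.

/* Back-of-the-envelope arithmetic on the power figures quoted above.
 * The kW constants come from the article; the ratios are derived. */
#include <stdio.h>

int main(void)
{
    const double earth_simulator_kw = 3200.0;   /* no. 1 system, 2002 */
    const double bluegene_l_kw      = 716.0;    /* no. 1 system, June 2005 */
    const double tianhe2_kw         = 17808.0;  /* no. 1 system, June 2014 */
    const double small_town_kw      = 10000.0;  /* 10 MW town supply */

    printf("Tianhe-2 vs Earth Simulator: %.1fx more power\n",
           tianhe2_kw / earth_simulator_kw);    /* about 5.6x */
    printf("Tianhe-2 vs BlueGene/L: %.1fx more power\n",
           tianhe2_kw / bluegene_l_kw);         /* about 24.9x */
    printf("Tianhe-2 as a share of a small town's supply: %.0f%%\n",
           100.0 * tianhe2_kw / small_town_kw); /* about 178%% */
    return 0;
}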
Couple the need to reduce power consumption with the mass-market economics of processors for mobile devices and embedded computing, and there is potential for future HPC systems to be driven by an evolution of technology that is today deployed outside of mainstream computing and HPC.
What next for Intel?
Intel is well aware of the changes going on in the HPC industry, and is working hard to ensure that it can maintain its strong position in the HPC market. Future generations of Intel’s Xeon processors
will continue to be major components in many HPC systems, but cannot be the only processing technology used if the HPC industry is to respond to the power consumption challenges it faces. Combining its own developments with
technology acquired from Cray and QLogic, Intel has announced its Omni Scale Fabric, an integrated network designed to meet the needs of the HPC community. Omni Scale