high-performance computing


new HPC architecture, based on X-Gene, the Applied Micro 64-bit ARM CPU, with support for four Nvidia Tesla K40 GPUs. We think that steps like these are important to reach exascale, although we are perfectly aware that we will need to take a lot of these steps, and at a rather fast pace. Another aspect of future systems will be


liquid cooling. Seven years ago we bet on direct-contact hot-water cooling and we will continue to develop this technology through stages of improvement. We have developed the second generation of Aurora direct liquid cooling – lighter, more compact and effective – allowing extraordinary densities. It is highly likely that future exascale


systems will be heterogeneous, combining different computation and storage components in one system, such as processors, accelerators, FPGAs, NVMs, etc. The Eurotech Aurora Bricks architecture moves in the direction of modular heterogeneity. Also, exascale systems will use so


many components that it will be almost impossible for the whole system to operate without faults. Resiliency – the ability to recover from faults – will be paramount. As a manufacturer, Eurotech aims to make systems as reliable as possible and also to provide applications with all the information they


need to prevent and manage faults. However, there are limits to what we can attain with hardware reliability alone. In the future, an integrated approach to fault management is likely to be adopted across the system stack, with a balance of




proactive prevention and reactive resilience.
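To make the reactive side of that balance concrete, below is a minimal application-level checkpoint/restart sketch in Python; the file name, the state layout and the work loop are invented for illustration, and production HPC codes would typically checkpoint large distributed state through parallel I/O libraries rather than a small JSON file.

    import json
    import os

    CHECKPOINT = "state.json"   # hypothetical checkpoint file name

    def load_checkpoint():
        # Resume from the last saved state if an earlier run was interrupted.
        if os.path.exists(CHECKPOINT):
            with open(CHECKPOINT) as f:
                return json.load(f)
        return {"step": 0, "value": 0.0}

    def save_checkpoint(state):
        # Write to a temporary file and rename, so a crash mid-write
        # cannot leave a corrupt checkpoint behind.
        tmp = CHECKPOINT + ".tmp"
        with open(tmp, "w") as f:
            json.dump(state, f)
        os.replace(tmp, CHECKPOINT)

    state = load_checkpoint()
    for step in range(state["step"], 1000):
        state["value"] += 0.001 * step   # stand-in for one unit of real work
        state["step"] = step + 1
        if step % 100 == 0:              # checkpoint periodically, not every step
            save_checkpoint(state)
    save_checkpoint(state)

After a failure, the job is simply resubmitted and resumes from the last saved step instead of starting over; proactive prevention would complement this by using the health information the system exposes to move work away from components that are predicted to fail.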


The future of scientific computing is not only exascale and extreme-scale systems. Exascale technologies will be used in different applications, including mid-range systems and solutions that are not conventional HPC. This will match the expansion of HPC usage driven by a demand for acceleration coming from SMEs and from new HPC application areas over and beyond scientific computing. OSINT (open source


intelligence), cybersecurity, computational finance, media and rendering, real-time situational awareness, and high-performance embedded computing will likely become areas in which HPC techniques and technologies will be more in demand. People who have specialised in scientific


computing and enabling technologies such as HPC are currently scarce, especially in Europe. This will surely trigger a reaction and the growth of training programmes, so one can expect a more structured education in HPC in the future, even if it will take a while to fill the current gap. Organisations or countries that are better equipped with those skills will be in a position of competitive advantage. These challenges have triggered a wealth of


attention and investment worldwide, laying the groundwork for technological discontinuities. At the same time, simulation is being adopted by more industries, and an increasing amount of data is being produced within science and beyond. All of this will drive an extraordinary evolution of scientific computing, at a time that is turbulent, exciting and promising.


Giampietro Tecchiolli is CTO of Eurotech


The best is yet to come


Although advances in HPC have been stellar, in Mark Seager’s view there is even more still to come


The single most important truth about high-performance computing (HPC) over the next decade is that it will have a more profound societal impact


with each passing year. The issues that HPC systems address are among the most important facing humanity: disease research and medical treatment; climate modelling; energy discovery; nutrition; new product design; and national security. In short, the pace of change and of enhancements in HPC performance – and its positive impact on our lives – will only grow. This phenomenon stems from what can


generally be called 'predictive scientific simulation', which has revolutionised the scientific method itself. Since Galileo first turned a telescope to the heavens 405 years ago and observed the moons of Jupiter, scientists have moved research




forward through theory and experiment. Now the pace of scientific discovery has been radically accelerated as a result of theory and experiment being augmented by predictive scientific simulations using parallel supercomputers. Scientific simulations inform theory


and experiment in three ways. The 'hero' simulations typically use the entire system for a single record-setting run and produce higher-fidelity results. The 'ensemble' simulations are groups of simulations run in throughput mode, typically using major fractions of the system; they show how sensitive the 'hero' results are to inputs and model parameters, allowing one to estimate confidence in those results (e.g. error bars). Lastly, the rapid analysis of large experimental data sets increases the usefulness




of these results. These procedures enable significant savings through informed decisions and actions in the real world.
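To make the error-bar idea concrete, here is a minimal sketch in Python of how an ensemble of runs brackets a 'hero' result; the hero value, the ensemble values and the two-standard-error band are invented for illustration, and a real workflow would pull these numbers from simulation output.

    import statistics

    # Quantity of interest from a single 'hero' run (value invented here;
    # a real workflow would read it from the simulation's output).
    hero_result = 3.12

    # The same quantity from an ensemble of runs with perturbed inputs
    # and model parameters (values invented for illustration).
    ensemble_results = [3.05, 3.18, 3.09, 3.21, 2.98, 3.15, 3.11, 3.07]

    mean = statistics.mean(ensemble_results)
    # Standard error of the mean: ensemble spread divided by sqrt(number of runs).
    std_err = statistics.stdev(ensemble_results) / len(ensemble_results) ** 0.5

    # A two-standard-error band is a simple error bar on the result.
    low, high = mean - 2 * std_err, mean + 2 * std_err
    print(f"ensemble mean = {mean:.3f} +/- {2 * std_err:.3f}")
    status = "lies within" if low <= hero_result <= high else "falls outside"
    print(f"hero result {hero_result:.3f} {status} the error band")

Each ensemble member perturbs inputs or model parameters, so the spread of the ensemble indicates how far the hero result could plausibly move under those perturbations.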


The effect of scientific simulations on the scientific method has been enabled by the astonishing increase in the computational power of supercomputer systems. Over the past 20 years – which is a heartbeat compared to the long span of the scientific method (and even shorter when compared to the evolution of biological systems or to the timescales of geological change) – supercomputers have been transformed in many ways. Here's a quick analysis of the Top500 list


of supercomputers, comparing Thinking Machines' CM-5/1024 of 1993 to China's


Tianhe-2 of 2013:
● Performance: Systems have exploded



