Challenging, exciting, promising times


Giampietro Tecchiolli expects the techniques of high-performance computing to become more pervasive, but worries about a shortage of skilled manpower


Computational science is now commonly considered a third mode of science and engineering, complementing and adding to field experimentation/observation and theory. Computer models and computer simulations have become an important part of the research repertoire, supplementing (and in some cases replacing) experimentation. Scientific computing spans different disciplines such as computer science, applied mathematics and science/engineering. It is a complex and diverse area, requiring domain expertise, mathematical modelling, numerical analysis, algorithm development, software implementation, program execution, analysis, validation, and visualisation of results.

The fundamental technological enabler of scientific computing is high performance computing (HPC). The linkage between scientific computing and HPC exists at different levels: some scientific and engineering problems need to scale over large computational resources; others are suited to thousands of simulations running in parallel; a third group deals with large amounts of data. These three areas share the same need for more computing. The key point is exactly in those two words, ‘more computing’, because there lies the very essence of the challenges HPC is facing – challenges that will shape the future of HPC and scientific computing.

The first, and very considerable, challenge is exascale, which in concrete terms is the ability to install and run a production supercomputer with a peak performance of at least 1000 PFlop/s and a power consumption limited to 20 MW.
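To put those two numbers together (a back-of-envelope figure of my own, not one quoted in the article), the target implies an energy efficiency of roughly

    1000 PFlop/s / 20 MW = 10^18 Flop/s / (2 × 10^7 W) ≈ 50 GFlop/s per watt

which is well beyond what even the most power-efficient petascale systems deliver today.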


Another challenge is to take the technologies for exascale to much smaller dimensions so that they become available to all users, bringing benefits such as fast and energy-efficient computation to many companies and institutions in a variety of markets.

[Image: Eurotech’s Tigon board shows the company’s cooling technology]

A third challenge is skills. With HPC systems becoming larger and more pervasive, and with the application of scientific computing in many additional areas, there is, and there will continue to be, a lack of specific competences, at least until proper education becomes more widespread and starts producing positive effects.

The future of scientific computing will be very much conditioned by exascale and its derivations. The current largest Top 500 supercomputers are petascale installations that already seem large enough in terms of energy consumption, operational cost, and space occupancy. But they are not large enough for the challenges science is facing in many fields: from high-energy physics to astronomy, climate modelling, chemistry, materials science, biology, and others.

Exascale supercomputers can enable progress in ‘old’ science such as climate modelling, molecular dynamics, aerodynamic design, and cosmology. In other words, there is a part of science and engineering for which it is already known that more computing will lead to faster and better results. But exascale could also open up possibilities of ‘new’ science and engineering: new applications that will lead to discoveries that we don’t yet imagine.

Take, for example, a project like the ‘Human Brain’, in which Eurotech is taking part. With current computational resources, we can map and simulate 5-10 per cent of the human brain, but not in real time. Extrapolation demonstrates that a real-time, complete human brain simulation will need from 1 to 10 exaflop/s and 4 PB of memory.
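The arithmetic behind that extrapolation is worth spelling out, with the caveat that the symbols and the linear-scaling assumption below are mine, not the project’s: if a fraction f of the brain can be simulated at a factor s slower than real time on a machine delivering P Flop/s, and cost scales roughly with the simulated fraction, then a complete real-time simulation needs on the order of

    P_full ≈ P × (s / f)

With f around 0.05–0.1, even a modest slowdown factor s pushes the requirement two to three orders of magnitude beyond current petascale systems, which is how one arrives in the exaflop range quoted above.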


While theoretical projections suggest the possibility of having an exascale system by 2018, reality tells us that a usable supercomputer of that size will require at least a few years into the next decade. Simply adopting the current approach – more of the same, but bigger and faster – will not work, due to constraints in power availability, power cost, reliability, and scalability of applications.

Exascale poses new constraints that will force the entire HPC community to think differently. Power (cost and availability) is the largest of these constraints. In the last decade, we realised a one-time gain in power efficiency by switching to accelerators and manycore processors. That is not a trend that can be sustained without a future technological discontinuity in combination with a new approach. New technologies such as silicon photonics, as well as improvements to existing technologies such as low-power processors, accelerators, liquid cooling and 3D torus interconnects, will be combined with a new way of using systems and new programming models (so, in essence, a new scientific computing) that will focus not only on performance but also on energy efficiency. For instance, since the majority of the energy consumed by today’s supercomputers is used to move data around the system, a lot of attention will be given to concurrency and locality.
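To make the point about locality concrete, here is a generic sketch in C (my illustration, not Eurotech code or any particular exascale programming model): blocking a matrix multiplication so that each tile of the operands stays resident in cache greatly reduces the traffic between memory and processor, and on large systems less data movement means less energy.

/* Cache-blocked (tiled) matrix multiply, C = C + A*B for n x n matrices.
 * A generic illustration of improving locality to cut data movement; the
 * tile size is a tunable assumption, and production codes would normally
 * call a vendor BLAS instead. C is assumed zero-initialised by the caller. */
#include <stddef.h>

#define TILE 64

void matmul_tiled(size_t n, const double *A, const double *B, double *C)
{
    for (size_t ii = 0; ii < n; ii += TILE)
        for (size_t kk = 0; kk < n; kk += TILE)
            for (size_t jj = 0; jj < n; jj += TILE)
                /* Work on one TILE x TILE block at a time, so the operands
                 * stay in cache instead of being streamed repeatedly from
                 * main memory. */
                for (size_t i = ii; i < ii + TILE && i < n; ++i)
                    for (size_t k = kk; k < kk + TILE && k < n; ++k) {
                        double a = A[i * n + k];
                        for (size_t j = jj; j < jj + TILE && j < n; ++j)
                            C[i * n + j] += a * B[k * n + j];
                    }
}

The underlying trade-off is the one the article describes: arithmetic is cheap, while moving bytes across the system is expensive, so programs that keep data local save both time and energy.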


As a manufacturer, Eurotech believes it can contribute to the exascale effort by working with research institutions (as we are doing in projects like QPACE2, DEEP, and Human Brain) and by using technology advances in systems that are usable and affordable.

What we envision is that the combination of novel, extremely energy-efficient architectures and liquid cooling should provide the ground on which to build exascale systems. We recently presented a



