HPC YEARBOOK 2021/22
ECMWF continues expansion with a new office and data centre

The European Centre for Medium-Range Weather Forecasts (ECMWF) has opened new offices in Bonn, Germany, and a new data centre in Bologna, Italy, to support the expansion of its compute capacity. In January 2020 ECMWF announced that Atos had been contracted to supply a new HPC system that would increase ECMWF's computing power by a factor of five. The new system was designed to enable ECMWF researchers to predict the occurrence and intensity of extreme weather events more reliably and significantly further ahead of time, which is essential to understanding and responding to the growing severity of climate and weather problems facing the world today.

At the time of the announcement, Dr Florence Rabier, director general at ECMWF, commented: 'Weather forecasting is computationally intensive and demands the best in high-performance computing power. This is one of the main reasons we chose Atos. We trust in its ability to supply and integrate the best technologies available, but also in its proven expertise to deliver effective solutions to the weather forecasting community across Europe. Thanks to this investment, we will now be able to run higher-resolution forecasts in less than an hour, meaning that better information will be shared with our member states even faster, enabling much-improved weather forecasts as they are able to combine this enhanced information with their own data and predictions.

'As governments and society continue to grapple with the impacts of increasingly severe weather, we are also proud to be relying on a supercomputer designed to maximise energy efficiency.'
DOE invests $13.7 million in research on data reduction for science

The US Department of Energy (DOE) announced $13.7 million in funding for nine research projects that will advance the state of the art in computer science and applied mathematics. The projects – led by five universities and five DOE National Laboratories across eight US states – will address the challenges of moving, storing, and processing the massive data sets produced by scientific observatories, experimental facilities, and supercomputers, accelerating the pace of scientific discoveries.

As scientific user facilities upgrade and expand, their capacity for generating unwieldy amounts of scientific data has started to exceed scientists' ability to stream, archive, and analyse that data. This has created an urgent need for new mathematical and computer-science techniques that shrink these data sets by removing trivial or repetitive data while preserving the important scientific information that can lead to discovery. While the need for data reduction techniques is clear, the scientists using those techniques must trust that they are not losing important scientific information, and this presents a key challenge. Research supported by this program must therefore address not only the efficiency and effectiveness of a data reduction technique but its trustworthiness as well.

Barb Helland, associate director for advanced scientific computing research at the DOE Office of Science, comments: 'Scientific user facilities across the nation, including the DOE Office of Science, are producing data that could lead to exciting and important scientific discoveries, but the size of that data is creating new challenges. Those discoveries can only be uncovered if the data is made manageable, and the techniques employed to do that are trusted by the scientists.'

Projects selected in the announcement cover a wide range of topics that promise important innovations in data-reduction techniques, including techniques using advanced machine learning, large-scale statistical calculations, and novel hardware accelerators. A sample of the projects includes:
• Methods to compress streaming data: Researchers at Oak Ridge National Laboratory will develop techniques to compress data coming directly from a scientific instrument or a computer model by taking advantage of its specific structure and integrating advanced machine-learning techniques, while allowing scientists to control certain features of the data;
• Methods to intelligently select and tune compression techniques: Researchers at Texas State University will develop techniques to search the vast space of potential data compression techniques and select the best method based on the user’s requirements for fidelity, speed, and memory usage;
• Compression methods for related groups of data sets: Researchers at the University of California, San Diego will develop scalable techniques for compressing multiple related streams of data, such as those from multiple sensors observing the same physical system, by taking advantage of the relationships between the data sets; and
• Methods for programming custom hardware accelerators for streaming compression: Researchers at Fermi National Accelerator Laboratory will develop techniques for encoding advanced compression and filtering, including those based on machine learning methods, as custom hardware accelerators for use in a wide array of experimental settings.
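A common thread in these projects is error-bounded lossy compression: reducing data volume while guaranteeing that no reconstructed value strays from the original by more than a user-chosen tolerance, which is what lets scientists trust the result. As a purely illustrative sketch (not any lab's actual method), uniform quantisation already provides such a guarantee: rounding each value to the nearest multiple of twice the error bound yields small, repetitive integer codes that downstream entropy coders can shrink dramatically.

```python
import numpy as np

def compress(data, err_bound):
    """Quantise each value to the nearest multiple of 2*err_bound.

    The resulting integer codes are small and repetitive, so they
    compress far better than raw floats, while reconstruction error
    stays within err_bound.
    """
    return np.round(data / (2.0 * err_bound)).astype(np.int64)

def decompress(codes, err_bound):
    # Map each integer code back to the centre of its quantisation bin.
    return codes * (2.0 * err_bound)

# A smooth synthetic "sensor" signal: a random walk.
rng = np.random.default_rng(0)
signal = np.cumsum(rng.normal(size=1000))

codes = compress(signal, err_bound=0.01)
recon = decompress(codes, err_bound=0.01)
# Maximum reconstruction error stays within the 0.01 bound
# (up to floating-point rounding).
print(np.max(np.abs(recon - signal)))
```

The names `compress`/`decompress` and the random-walk test signal are hypothetical; real streaming compressors add a predictor stage and an entropy coder on top of the quantised codes.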