Model outputs from CALIOPE and Dust, both developed by the BSC in collaboration with the weather research community (image: Barcelona Supercomputing Centre/Google Earth)
➤ kind of a failure within the storage infrastructure.' To overcome the technical and operational challenges of using such large systems, computer scientists must strive to increase the performance of their code, but usability of the software is just as important, according to Francisco Doblas Reyes, earth sciences department director at the Barcelona Supercomputing Centre.
Optimising run-times across different machines

Reyes explained that much of their work cannot be completed on the BSC supercomputer, because the simulations are just too big for the time that is available. This has led Reyes' group to look across Europe for more compute time on other top European HPC systems. 'The model and its underlying code must be optimised so that they can run efficiently on these platforms. The code must be optimised not only for the MareNostrum system, therefore; it must run efficiently on any cluster with sufficient computational power,' said Reyes.

Reyes continued: 'The fact is, if we want to run these kinds of experiments then we have to be ready to run them on different platforms, and to port our system as quickly and efficiently as possible. So the scene is very different to the one ten years ago, when each institution would have its own computer and people would optimise their model for that specific platform.'

In order to speed up this process, the
BSC team has developed a custom tool that automatically allocates system resources to attain optimal performance of the simulation. Oriol Mula-Valls, Computational Earth Sciences group leader, said: 'We have developed a custom tool called Autosubmit (version 3.0.6) that allows us to make the best use of the computing resources that we have available.'

Mula-Valls continued: 'Previously we were just trying to deploy the model and perform a small performance test in order to assess the number of processors needed for each component, so that we could determine the best performance.'

He went on to explain that the research has progressed to the point that they are now using the BSC performance tools to analyse the behaviour of the model itself: 'We try to tune the model, even modifying the code and reporting back to the developers,' Mula-Valls concluded.
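The kind of component-sizing test Mula-Valls describes can be pictured with a short script. The sketch below is not BSC code: the executable names and the inline use of mpirun are hypothetical stand-ins (a real test would go through the batch scheduler), but it shows the basic loop of timing each component at several core counts and keeping the largest count that still runs efficiently.

import subprocess
import time

def run_component(component, nprocs, steps=24):
    # Hypothetical launcher: times one model component on nprocs MPI
    # ranks; a real test would submit through the batch scheduler.
    cmd = ["mpirun", "-n", str(nprocs), f"./{component}", "--steps", str(steps)]
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

def best_core_count(component, candidates, efficiency_floor=0.7):
    # Keep the largest core count whose parallel efficiency, relative
    # to the smallest run, stays above the chosen floor.
    base = candidates[0]
    base_time = run_component(component, base)
    best = base
    for nprocs in candidates[1:]:
        speedup = base_time / run_component(component, nprocs)
        if speedup / (nprocs / base) >= efficiency_floor:
            best = nprocs
    return best

# Size each coupled component independently before a full experiment.
for component in ("atmosphere", "ocean", "sea_ice"):
    print(component, "->", best_core_count(component, [64, 128, 256, 512]))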
Reyes said: 'This tool that Oriol was referring to, Autosubmit, is trying to make life a bit easier for the scientists. It offers an interface to the user which is uniform; it does not depend on the platform on which they are going to run their experiments. This really simplifies life a lot in this context, where you have resources distributed across many different HPC centres.'
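One way to picture that uniform interface is as a thin layer that hides each centre's batch scheduler behind a single experiment description. The sketch below illustrates the idea only; it is not Autosubmit's actual configuration format or API, and the platform entries and field names are invented.

# Illustration only: hiding scheduler differences behind one call,
# in the spirit of Autosubmit. Platform entries here are invented.
PLATFORMS = {
    "marenostrum": {"scheduler": "slurm", "queue": "class_a"},
    "ecmwf":       {"scheduler": "pbs",   "queue": "np"},
}

def submission_command(platform, script, cores):
    # Render a scheduler-specific submission command so the scientist
    # never writes SLURM or PBS syntax by hand.
    p = PLATFORMS[platform]
    if p["scheduler"] == "slurm":
        return ["sbatch", "-p", p["queue"], "-n", str(cores), script]
    if p["scheduler"] == "pbs":
        return ["qsub", "-q", p["queue"], "-l", f"select={cores}", script]
    raise ValueError(f"unknown scheduler: {p['scheduler']}")

# The same experiment can be dispatched to any configured centre.
print(submission_command("marenostrum", "run_ocean.sh", 256))
print(submission_command("ecmwf", "run_ocean.sh", 256))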
Reyes added: 'This is why we have the Computational Earth Sciences group. It provides computational and data support to all the other groups, and tries to provide solutions to some of the problems that are slowing down progress; not just in the department but also in the whole community working on weather, air quality, and climate research.'

Usability is a key factor for weather simulation
and HPC in general, because software must be able to scale effectively for simulations of this size. The more developers can do to alleviate these challenges through performance tools, the quicker a simulation can be undertaken.

Usability was a key aspect of the Met Office requirements for its new system from Cray, as Nyberg explained: 'From a usability perspective, scalability is absolutely critical, and one of the things that they need to be doing is implementing new science. The faster they can implement new science, the more immediate the return on investment.'
Nyberg continued: 'Ultimately the benefit is improved forecast accuracy. One of the big trends, especially at the larger weather services – and the Met Office is a leader in this – is this move towards a suite of seamless forecasting services: everything from emergency response, all the way through to mitigation policy and climate vulnerability assessments.'
Predicting storm surges

Regular forecasting is massively reliant on the constant availability of systems and the usability of software, but it is equally important to have easy-to-use, scalable software in disaster prevention, when a forecast must be prepared in advance to warn local communities.

A team of researchers at Louisiana State University focuses on a very specific aspect of weather – simulating storm surges – but they face the same sorts of challenges associated with delivering more general weather forecasts: a short space of time and a very high resolution. Dr Jim Chen, professor in coastal engineering
from the department of civil and environmental engineering, said: 'Here in Louisiana, we have seen a lot of hurricanes, including Hurricane Katrina. What we have been doing here at LSU is using models to forecast storm surges, but also the waves on top of the surge that impact coastal structures and coastal communities.'

The primary models they use are the Advanced Circulation model (ADCIRC) and another called SWAN, a wave model developed by the Delft University of Technology. Chen explained that these are open-source models, which is crucial to LSU's studies because it allows the team to integrate new physics and to improve the models by interacting with the community. Chen said: 'A crucial point here is that we use those models, but we also improve those models because they are open source – that is the reason that we use them.'
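Running 'waves on top of the surge' means the two models exchange fields as they march forward in time: the surge model supplies water levels to the wave model, and the wave model hands back a forcing term. The loop below is a toy sketch of that exchange only; the classes, coefficients, and field names are invented and bear no relation to the real ADCIRC/SWAN coupling.

# Toy two-way surge/wave coupling loop; every name and coefficient
# here is invented, not an ADCIRC or SWAN interface.
class SurgeModel:
    def __init__(self):
        self.water_level = 0.0            # metres above datum

    def step(self, wave_stress, dt):
        # Wave-driven stress pushes water towards the coast (toy physics).
        self.water_level += wave_stress * dt

class WaveModel:
    def __init__(self):
        self.wave_height = 1.0            # significant wave height, metres

    def step(self, water_level, dt):
        # Deeper water lets larger waves reach the coast (toy physics).
        self.wave_height = 1.0 + 0.1 * water_level
        return 2e-5 * self.wave_height    # stress returned to the surge model

surge, waves = SurgeModel(), WaveModel()
dt, stress = 600.0, 0.0                   # 10-minute coupling interval
for n in range(1, 7):                     # one simulated hour
    surge.step(stress, dt)
    stress = waves.step(surge.water_level, dt)
    print(f"t={n * dt:5.0f} s  surge={surge.water_level:.3f} m  Hs={waves.wave_height:.2f} m")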
'These models are very computationally intensive; it requires HPC resources to complete ➤