MODELLING AND SIMULATION


Large power transformers are particularly vulnerable to extreme weather conditions exacerbated by climate change (Image: Iris Shreve Garrott/Flickr)




resolutions of 10km or so,’ he added. Such medium-range forecasts require a high level of expertise and computational power, as Bauer explained: ‘It is a challenge to enhance the physical realism of the system to produce better forecasts, because many processes are either not well understood or difficult to represent in a computer model. Also, the more sophisticated the system becomes, the more computing resources it requires. Computing cost is a limitation in terms of both having sufficiently large computers and supplying the electrical power for running the machine and cooling it.’

The IFS computer model uses a representation of Earth system physics that includes the atmosphere, oceans, sea-ice and land surfaces. It also produces the initial conditions that describe the starting point of each forecast, a step that draws on some 40 million observations per day. Every day, the IFS then simulates the weather for the coming weeks in slightly different configurations: as a single 9km forecast up to 10 days ahead, and as an 18km ensemble up to 15 days ahead.
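To see why resolution and ensembles dominate the computing bill Bauer describes, a rough back-of-the-envelope scaling helps. This is an illustrative estimate, not an ECMWF figure; real costs also depend on the numerical scheme, vertical resolution, data assimilation and output volume. Halving the horizontal grid spacing roughly quadruples the number of grid columns and, through the timestep constraint, roughly doubles the number of steps, while an ensemble multiplies the cost of a single run by its member count.

```python
# Back-of-the-envelope cost scaling for grid refinement and ensembles.
# Illustrative only, not an ECMWF figure: real costs also depend on the
# numerical scheme, vertical levels, data assimilation and I/O.

def refinement_cost_ratio(coarse_km: float, fine_km: float) -> float:
    """Approximate cost ratio of refining the horizontal grid from coarse_km to fine_km.

    Grid columns scale with the square of the refinement factor, and the timestep
    usually has to shrink roughly in proportion to the grid spacing (CFL condition),
    contributing one further factor.
    """
    refinement = coarse_km / fine_km
    return refinement ** 3

single_9km = refinement_cost_ratio(18.0, 9.0)   # one 9km forecast vs one 18km forecast
ensemble_members = 51                           # the IFS ensemble size quoted later in the article

print(f"one 9km run costs roughly {single_9km:.0f}x an 18km run")
print(f"a {ensemble_members}-member 18km ensemble costs roughly {ensemble_members}x an 18km run")
```

This is why the single high-resolution forecast and the ensemble sit at different resolutions: the ensemble pays its cost in members rather than in grid points.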


The IFS undergoes a major upgrade once or twice every year. The next upgrade (or cycle) is due in the summer of this year and will significantly improve the physical representation of the transfer of solar and thermal radiation as it passes through, and is reflected within, the Earth’s atmosphere. Radiation transfer is a continuous and complex process to represent, which makes it too computationally expensive for the IFS to update frequently. Florence Rabier, director-general at the ECMWF, explained: ‘In the next cycle, the IFS will be able to correct its representation of the radiation in the atmosphere every hour for the ensemble prediction, instead of every three hours, because we have significantly improved the efficiency of the code.’
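The trade-off Rabier describes is a common pattern in numerical models: an expensive physics component such as radiative transfer is refreshed on a longer interval than the main model timestep, and its last result is reused in between. The sketch below is a generic illustration of that pattern, not the IFS’s actual radiation code; the timestep lengths and the toy ‘radiation’ function are assumptions.

```python
import math

MODEL_DT_MIN = 10          # main model timestep in minutes (illustrative)
RADIATION_DT_MIN = 60      # how often the expensive radiation scheme is refreshed
                           # (hourly in the new cycle, every three hours before, per Rabier)

def expensive_radiation(t_min: float) -> float:
    """Stand-in for a full radiative transfer calculation (here: a toy diurnal cycle)."""
    return max(0.0, math.sin(2 * math.pi * t_min / (24 * 60)))

def run(hours: int = 24) -> int:
    """Step a toy model for 'hours', refreshing radiation only on its own interval."""
    cached_heating = expensive_radiation(0.0)
    radiation_calls = 0
    for step in range(int(hours * 60 / MODEL_DT_MIN)):
        t = step * MODEL_DT_MIN
        if t % RADIATION_DT_MIN == 0:          # refresh only on the radiation interval
            cached_heating = expensive_radiation(t)
            radiation_calls += 1
        # ...dynamics and other physics would use cached_heating on every step...
    return radiation_calls

print("radiation calls per simulated day at a 60-minute interval:", run())
```

At a 180-minute interval the same simulated day needs only eight radiation calls, which is the cheaper setting the ensemble used before; the efficiency gains in the new cycle are what make the hourly refresh affordable.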




Rabier added: ‘We will also increase the number of observations we use to improve the description we have of the atmosphere, which will improve the accuracy of this physical representation.’

This cycle will also introduce a number of changes to how other weather phenomena are modelled in the IFS. Any such change requires extensive testing. While it is relatively easy to check whether an individual change improves the accuracy of the system, compromises may have to be made when looking at the whole Earth system. Rabier explained: ‘All of the changes may not interact positively when we test them together. Then, it’s a judgement call, based on which parameters will degrade. This is based on the thinking that some parameters are due for improvements in a later cycle, or they may be more significant than other parameters.’


“Any forecast beyond a few days requires global simulations, because all processes are interconnected. Short-range forecasting systems can be limited to regions”

The ECMWF also uses probability forecasting to add prediction uncertainty to its forecasts. This is done by running 51 scenarios for each forecast to account for the chaotic nature of the Earth system and modelling uncertainties. Bauer explained: ‘Each forecast can be judged based on the associated uncertainty. Sources of uncertainty are the natural predictability of a given weather pattern (for example, local summer storms are less predictable than stable winter high-pressure situations) but also the imperfection of the initial conditions and the model itself.’ The ECMWF is starting to investigate machine learning techniques ‘for a smarter way of exploiting observational information, for reducing the computational cost of the forecast model, and for more optimal exploitation of the information that is generated by the model,’ according to Bauer.
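The idea behind those 51 scenarios can be illustrated with a toy chaotic system. The sketch below is not IFS code: it perturbs the initial conditions of the Lorenz-63 model (a stand-in for the real atmosphere), integrates 51 copies forward with a crude Euler scheme, and uses the spread of the ensemble as a measure of forecast uncertainty. The perturbation size, timestep and integration scheme are illustrative choices; only the 51-member count comes from the article.

```python
import numpy as np

def lorenz63_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system (a toy chaotic 'atmosphere')."""
    x, y, z = state
    deriv = np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    return state + dt * deriv

def run_member(initial_state, n_steps=1500):
    """Integrate a single ensemble member and record its trajectory."""
    traj = np.empty((n_steps, 3))
    state = initial_state.copy()
    for i in range(n_steps):
        state = lorenz63_step(state)
        traj[i] = state
    return traj

rng = np.random.default_rng(0)
best_guess = np.array([1.0, 1.0, 1.0])   # 'analysis': best estimate of the initial state
n_members = 51                           # same ensemble size the ECMWF quotes for the IFS

# Each member starts from a slightly perturbed initial condition,
# mimicking uncertainty in the observations used to initialise the forecast.
members = np.stack([
    run_member(best_guess + 1e-3 * rng.standard_normal(3))
    for _ in range(n_members)
])

# Ensemble mean and spread: the spread grows with lead time,
# which is how a probabilistic forecast expresses its own uncertainty.
mean = members.mean(axis=0)
spread = members.std(axis=0)
print("spread of x at step 100:", spread[100, 0])
print("spread of x at step 1400:", spread[1400, 0])
```

In the real system the perturbations are constructed far more carefully, from analysis uncertainty and stochastic physics, but the principle is the same: the spread of the ensemble, not a single run, tells the forecaster how much confidence to place in a given prediction.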


Deep learning

Machine learning techniques are now being developed to work with the world’s weather and climate prediction systems. For example, a team of researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (LBNL) is developing a deep learning system, called ClimateNet, to understand how extreme weather events are affected by our changing climate.

Deep learning is a subset of machine learning, in which useful information is extracted from raw datasets for pattern detection at multiple levels of abstraction. For this project, a database of 10,000 to 100,000 curated images will be created, in which climate experts have labelled the images to tell the computer what it is looking at.
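The article does not describe ClimateNet’s data format, so the following is only a sketch of what an expert-labelled pattern database could look like in code: one directory per class, one array file per labelled image. The directory names, file format and label set are hypothetical.

```python
from pathlib import Path

import numpy as np
import torch
from torch.utils.data import Dataset

class LabelledPatternDataset(Dataset):
    """Expert-labelled climate images stored as .npy arrays, one sub-directory per class.

    Hypothetical layout (not ClimateNet's real format):
        root/tropical_cyclone/0001.npy
        root/atmospheric_river/0002.npy
        ...
    """

    def __init__(self, root: str):
        class_dirs = sorted(p for p in Path(root).iterdir() if p.is_dir())
        self.class_names = [p.name for p in class_dirs]
        # Each sample is (path to image array, integer class label assigned by an expert).
        self.samples = [
            (f, label)
            for label, class_dir in enumerate(class_dirs)
            for f in sorted(class_dir.glob("*.npy"))
        ]

    def __len__(self) -> int:
        return len(self.samples)

    def __getitem__(self, idx: int):
        path, label = self.samples[idx]
        image = np.load(path).astype(np.float32)   # channels x height x width
        return torch.from_numpy(image), label

# Usage with a real directory of labelled arrays (the path here is hypothetical):
# dataset = LabelledPatternDataset("/data/climatenet_labels")
```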


This database will then be used to train machine learning models to more quickly and accurately identify approximately 10 classes of distinct weather and climate patterns, to help understand and predict how extreme events are changing under global warming.

The project hopes to address several shortfalls of current pattern detection schemes used by the climate science community. Karthik Kashinath, climate informatics and AI specialist at the National Energy Research Scientific Computing Center (NERSC) at the LBNL, explained: ‘Existing pattern detection schemes and heuristics are not usable on a global scale because the algorithms tend to be designed for specific regions and climate scenarios. However, for the ClimateNet project, we will create a unified machine learning-based global pattern detection model to address these challenges and improve the range of machine learning-based applications.’

Kashinath added: ‘Deep learning is also a highly scalable technique for large data sets, which performs better with larger amounts of labelled training data and larger computing systems. Hence, climate science is well positioned to utilise the true power of deep learning, provided we can develop high-quality labelled data for training, and that is exactly what the goal of ClimateNet is.’

As a result, the ClimateNet project could dramatically accelerate the pace of climate research that requires complex patterns to be recognised in large datasets. Kashinath explained: ‘When it’s up and running, ClimateNet will pull out the interesting patterns, and not have to use the whole dataset to predict the evolution of specific weather phenomena. This will vastly reduce the time it takes climate scientists to test out their hypotheses.’ He added: ‘If we can accelerate this process, then it will give scientists the time and space to think about the harder problems they need to resolve when tackling climate change and making weather predictions.’
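As a sketch of the kind of model such a labelled database could train, a small convolutional classifier for roughly 10 pattern classes might look like the following. The article does not give ClimateNet’s architecture, so the network, channel count, image size and training loop below are illustrative assumptions rather than the project’s actual design.

```python
import torch
import torch.nn as nn

NUM_CLASSES = 10  # e.g. tropical cyclone, atmospheric river, front, ... (illustrative labels)

class PatternClassifier(nn.Module):
    """Small CNN mapping a multi-channel climate image to one of NUM_CLASSES patterns."""

    def __init__(self, in_channels: int = 4, num_classes: int = NUM_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),            # works for any input resolution
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)
        return self.classifier(h)

# Minimal training step on synthetic stand-in data (real training would read the
# expert-labelled database and use many epochs, augmentation and validation splits).
model = PatternClassifier()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

images = torch.randn(8, 4, 128, 128)            # batch of 8 fake 4-channel climate 'images'
labels = torch.randint(0, NUM_CLASSES, (8,))    # fake expert labels

logits = model(images)
loss = loss_fn(logits, labels)
loss.backward()
optimiser.step()
print("training loss on the synthetic batch:", float(loss))
```

A real run would read the expert-labelled database rather than synthetic tensors and would almost certainly use a larger network, but the mapping from labelled images to a multi-class pattern classifier is the core of the approach Kashinath describes.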





