MODELLING AND SIMULATION

The imperfect storm


GEMMA CHURCH UNCOVERS THE VAST RANGE OF MODELLING AND SIMULATION TOOLS REQUIRED FOR ACCURATE WEATHER PREDICTION


Weather prediction seems simple nowadays. You just have to summon Siri and ask if you’ll need an umbrella or a sunhat. But behind the scenes, the world of weather prediction is far more sophisticated and diverse. Modelling and simulation tools have appeared in abundance to help deal with the complexity of making predictions for a vast range of weather systems.

Adam Clark, a research scientist from the National Severe Storms Laboratory (NSSL) in the US, explained: ‘Weather is complex. There is no “silver bullet” that can give you a perfect forecast. Perfect weather predictions would require accurately observing every square inch of the Earth’s atmosphere.’

As a result, billions of observations must be incorporated into prediction models. But those observations can be affected by instrumental errors and, to complicate matters further, the prediction models are themselves prone to error, because they are not exact solutions to the set of differential equations that describe atmospheric motion.

Clark explained: ‘These multiple error sources have a compounding effect and can grow exponentially. Therefore, forecasters have to use every source of information that is readily available, and then use their own knowledge and intuition, based on years of experience, to make the best possible prediction.’
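Clark’s point about compounding, exponentially growing errors can be illustrated with a toy system. The sketch below is not an NSSL model; it integrates the classic Lorenz-63 equations, a standard stand-in for atmospheric chaos, from two initial conditions that differ by one part in a million, and prints how quickly the trajectories diverge.

```python
import numpy as np

def lorenz63(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz-63 system, a toy model of convection."""
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def integrate(s, dt=0.01, steps=2500):
    """Fixed-step fourth-order Runge-Kutta integration."""
    out = [s]
    for _ in range(steps):
        k1 = lorenz63(s)
        k2 = lorenz63(s + 0.5 * dt * k1)
        k3 = lorenz63(s + 0.5 * dt * k2)
        k4 = lorenz63(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        out.append(s)
    return np.array(out)

a = integrate(np.array([1.0, 1.0, 1.0]))
b = integrate(np.array([1.0 + 1e-6, 1.0, 1.0]))  # perturbed by one part in a million

# The separation grows roughly exponentially until it saturates at the size
# of the attractor, which is why small errors eventually ruin a forecast.
for step in (0, 500, 1000, 1500, 2000, 2500):
    print(f"t = {step * 0.01:5.2f}  separation = {np.linalg.norm(a[step] - b[step]):.2e}")
```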


Observations are an important source of information for specific short-term forecasting tasks, such as issuing tornado warnings. But as we move to longer timeframes, models play an increasingly important role.

Convection-allowing models (CAMs) are the NSSL’s primary simulation tool for weather prediction. NSSL’s CAMs are weather models that typically run only over the US and have high enough resolution to depict the storms and storm complexes that lead to hazardous weather. A supercell thunderstorm, for example, a type of thunderstorm with a deep rotating updraft, is one of the phenomena modelled using CAMs.


Clark explained: ‘Using a model with 32km horizontal grid-spacing, which was typical for National Weather Service (NWS) models until 10 to 20 years ago, a supercell thunderstorm was only sampled by a single grid-point. However, using 4km grid-spacing, which is needed for CAM applications, supercells can be adequately depicted.’
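Clark’s numbers invite a quick back-of-the-envelope check. The sketch below assumes an illustrative storm diameter of 20km (real supercells vary) and the common rule of thumb that a model needs at least four grid points across a feature before it can depict it at all; neither figure is an NSSL specification.

```python
# Rough arithmetic behind Clark's grid-spacing point. The 20 km storm
# diameter and the four-points-to-resolve rule of thumb are illustrative
# assumptions, not NSSL specifications.
STORM_DIAMETER_KM = 20.0
MIN_POINTS_TO_RESOLVE = 4

for spacing_km in (32.0, 12.0, 4.0, 3.0):
    points_across = STORM_DIAMETER_KM / spacing_km
    verdict = "depicted" if points_across >= MIN_POINTS_TO_RESOLVE else "effectively invisible"
    print(f"{spacing_km:4.0f} km grid: {points_across:3.1f} points across the storm -> {verdict}")
```

At 32km spacing the storm falls between grid points, matching Clark’s ‘single grid-point’ description, while 4km spacing places several points across it.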


CAMs were first tested as a forecasting tool during collaborative forecasting experiments run by NSSL and its sister US government body, the Storm Prediction Center, in the mid-to-late 2000s. Based on their success, the National Weather Service operationalised CAMs in 2014 with the implementation of its High-Resolution Rapid Refresh (HRRR) model. The HRRR is a real-time, 3km resolution, hourly updated, cloud-resolving, convection-allowing atmospheric model that assimilates radar data every 15 minutes over a one-hour period.

‘Nowadays, we have enough computer resources to run CAM ensembles, which are groups of several CAMs, typically 10 to 20. In the ensemble, each prediction is obtained with slightly different inputs and/or model parameters, which gives a range of forecast solutions. The different solutions can be used to determine which is most likely, and the possible range of outcomes,’ Clark added.

Data assimilation is a vital but often challenging aspect of the work done at the NSSL, because the researchers have to provide high-resolution forecasts at short lead times. Clark explained: ‘It basically requires very sophisticated algorithms to stitch together very different observational data sources into an accurate and balanced state that can be input into a model and give an accurate forecast. Data assimilation is one of the big challenges associated with a large initiative at NSSL called Warn-on-Forecast.’
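Clark describes assimilation as stitching disparate observations into a balanced model state. A minimal sketch of the idea underlying many assimilation schemes, from optimal interpolation to the ensemble Kalman filters used in convective-scale research, is to weight the model’s prior estimate and each observation by their error variances. The numbers below are invented for illustration; operational systems apply this logic, vastly generalised, to millions of interdependent variables.

```python
def assimilate(background, obs, var_bg, var_obs):
    """Blend a model background value with an observation, weighting each
    by the inverse of its error variance (a scalar analysis update)."""
    gain = var_bg / (var_bg + var_obs)      # how much to trust the observation
    analysis = background + gain * (obs - background)
    var_analysis = (1.0 - gain) * var_bg    # the analysis is more certain than either input
    return analysis, var_analysis

# Invented example: a background temperature of 292 K with 4 K^2 error
# variance, corrected by a 290 K observation with 1 K^2 error variance.
analysis, var_analysis = assimilate(292.0, 290.0, 4.0, 1.0)
print(f"analysis = {analysis:.1f} K, error variance = {var_analysis:.1f} K^2")
# -> analysis = 290.4 K: the result sits closest to the more trusted input.
```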


The Warn-on-Forecast (WoF) research programme is currently tasked with increasing tornado, severe thunderstorm and flash-flood warning lead times. Clark added: ‘In general, one of the biggest problems right now for CAM ensemble systems is that they often do not depict the full range of future outcomes very well. In other words, the weather that actually occurs is too often not forecast by any of the CAM predictions.’

This problem is called ‘under-dispersion’, according to Clark, who added: ‘To fix under-dispersion, there are many areas of ongoing research that involve figuring out how to properly account for model and observational errors, and how best to assimilate different data sources into the model.’
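Under-dispersion has a standard diagnostic, the rank histogram: over many past cases, record where the verifying observation ranks within the sorted ensemble. A well-calibrated ensemble gives a flat histogram; an under-dispersive one piles cases into the outermost bins, because reality keeps escaping the ensemble envelope. The sketch below demonstrates the effect on synthetic data (the distributions are invented; this illustrates the diagnostic, not NSSL code).

```python
import numpy as np

rng = np.random.default_rng(0)
n_cases, n_members = 5000, 20

# Synthetic truth, plus an ensemble whose spread is deliberately too
# small (60% of the true variability) to mimic under-dispersion.
truth = rng.normal(0.0, 1.0, n_cases)
ensemble = rng.normal(0.0, 0.6, (n_cases, n_members))

# Rank of the verifying value within each sorted ensemble (0..n_members).
ranks = (ensemble < truth[:, None]).sum(axis=1)
counts = np.bincount(ranks, minlength=n_members + 1)

# A flat histogram would put ~1/(n_members + 1) of the cases in every
# bin; under-dispersion shows up as overloaded outermost bins instead.
print(f"expected share per bin: {1 / (n_members + 1):.3f}")
print(f"share in lowest bin:    {counts[0] / n_cases:.3f}")
print(f"share in highest bin:   {counts[-1] / n_cases:.3f}")
```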


However, initial results from the WoF programme are promising. In 2017, output from the WoF system helped predict a tornado in Elk City, Oklahoma, allowing forecasters to alert the public.

MIT also conducts a range of modelling and simulation research in weather prediction. This work primarily focuses on predicting changes in the occurrence of extreme and damaging weather events that result from the slowly evolving (over the coming decades) continental-to-global scale

