Simulating the stars

Data volume is a growing problem in many scientific disciplines, as new techniques are developed for gathering information – but in the astrophysics world the data volumes are, put simply, astronomical. Gillian Law explains


The Panoramic Survey Telescope and Rapid Response System (Pan-STARRS), a planned array of astronomical cameras, telescopes and a computing facility, is expected to gather on the order of 0.4 petabytes of data per year. The 8.4-metre Large Synoptic Survey Telescope (LSST) will survey the entire visible sky every week and will reach six petabytes per year – and when the world's largest radio telescope, the Square Kilometre Array, is commissioned in around 2020, it is estimated that data transport from the dishes will produce between 10 and 100 times the current global internet traffic.

Data management is therefore one of the biggest challenges the astrophysics community faces, comments Dr Norman Gray, research fellow in the University of Glasgow's School of Physics and Astronomy. He believes that the boundaries being pushed in astronomy are in the data volumes rather than the analysis. Astronomers tend to run huge, long-term simulations for four or five years – the complexity is not so much in the compute being done, but in the sheer amount of data to be churned through.

Gray is working with the International Virtual Observatory Alliance, a body set up in 2002 to push for standards in astronomical data archives and interfaces. It aims to facilitate the sharing of that data between scientists and allow it to be used in one place, rather than moved around the world. Moving the query to the database is much easier than moving the database to the query when you're dealing with data volumes this large, he says.
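The principle Gray describes can be illustrated with any ordinary relational database: instead of shipping an entire catalogue to the scientist and filtering it locally, the selection is expressed as a query and only the matching rows travel. A minimal sketch in Python with an in-memory SQLite table (the catalogue schema and values here are invented for illustration, not taken from any real archive):

```python
import sqlite3

# Build a toy source catalogue in memory; a real archive would hold
# billions of rows, which is exactly why it cannot be shipped whole.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sources (ra REAL, dec REAL, mag REAL)")
conn.executemany(
    "INSERT INTO sources VALUES (?, ?, ?)",
    [(10.1, -5.2, 14.3), (10.4, -5.1, 21.7), (180.0, 45.0, 9.8)],
)

# Move the query to the data: only sources brighter than magnitude 15
# in a small patch of sky are returned to the client.
bright = conn.execute(
    "SELECT ra, dec, mag FROM sources "
    "WHERE mag < 15 AND ra BETWEEN 10 AND 11"
).fetchall()
print(bright)  # a single matching row crosses the 'network'
```

In practice the IVOA standardises this idea for astronomy, so that the same kind of selection can be sent to archives around the world in a common query language.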

Seeing is believing

The volume of data available can also create problems in terms of the results of those simulations, in that the output is of a far higher quality than anything that can actually be observed. 'You can often do things that you can't observe,' says Dr Ken Rice of the University of Edinburgh's Institute for Astronomy. 'You can [achieve] much higher resolution than a telescope and you do have to try to get something out that can be confirmed in some way. So you're running simulations and trying to relate them to some sort of observation that could suggest there's something here that's valid,' he comments.

For example, in Rice's own work – researching the evolution of accretion discs and how this relates to the formation of stars and planets – a simulated density map may show 'what it "should" look like, but you might never see that in reality.'

Simulations are not just 'nice movies', adds Rice, but ways of probing the science, and so researchers need to look at how the results relate to the real world, as far as it can be observed. That can mean adjusting the parameters of a simulation: 'Taking all this high-resolution data and putting it together to get something [with] much worse resolution,
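Rice's point about turning high-resolution output into something comparable with a telescope image can be sketched numerically as block-averaging a simulated density map down to a coarse grid. The map, block size and 'clump' below are invented for illustration; a real pipeline would also convolve with the instrument's point-spread function:

```python
import numpy as np

def degrade(density_map, block):
    """Average block x block pixels of a simulated map into one coarse
    pixel, mimicking the far lower resolution a telescope would see."""
    h, w = density_map.shape
    return (
        density_map
        .reshape(h // block, block, w // block, block)
        .mean(axis=(1, 3))
    )

# A synthetic 512x512 'simulation' with one bright clump in it.
rng = np.random.default_rng(0)
sim = rng.random((512, 512))
sim[200:216, 300:316] += 10.0  # a feature the simulation resolves

obs = degrade(sim, 32)  # 16x16: the clump survives, the detail does not
print(obs.shape)
```

The coarse map still shows a bright pixel where the clump sits, which is the kind of feature that could plausibly be matched against an observation, while the fine structure the simulation resolved is gone.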

