astrophysics processor,’ explains Klaus. He also had to establish the interface between the Kepler SOC and the supercomputer. ‘We wanted to automate that process, obviously, because it’s a lot of data. The pipeline automatically transfers the data to the supercomputer, spreads it out across multiple processors… and then brings the results back to the SOC,’ he adds. Klaus says that he couldn’t
have done this without the help of Chris Henze, lead of the Visualization Group in the NASA Advanced Supercomputing (NAS) Division: ‘Chris was a huge help in teaching me which APIs to use, and basically how to write the code – I’d never done a project of that size on hardware that big.’ Henze explains that this is
the way NASA’s supercomputing works – there is a team of computer scientists in place to help users work more effectively, optimising code and parallelising it to run on NASA’s specific architecture, and then there are ‘people like me who help [users] analyse and manage the data that are produced by simulations. That largely involves a lot of data management just because the data volumes are so large,’ he adds.
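Klaus does not spell out the code itself, but the scatter-and-gather flow he describes can be sketched in a few lines. The snippet below is a minimal illustration of that pattern using mpi4py and NumPy, not the Kepler SOC pipeline: the random ‘light curves’ and the trivial per-star calculation are stand-ins, and only the data flow – split at the root, process in parallel, gather the results – reflects what is described above.

from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

if rank == 0:
    # Root process stands in for the SOC side: stage the data and cut it into
    # one chunk of light curves per available processor
    np.random.seed(0)
    light_curves = [np.random.normal(1.0, 1e-4, 4000) for _ in range(8 * size)]
    chunks = [light_curves[i::size] for i in range(size)]
else:
    chunks = None

# Spread the data out across the processors
my_chunk = comm.scatter(chunks, root=0)

# Each processor works through its own share of stars; a real pipeline would run
# the transit search here rather than a placeholder statistic
my_results = [float(lc.mean()) for lc in my_chunk]

# Bring the per-processor results back to the root for return to the SOC
all_results = comm.gather(my_results, root=0)

if rank == 0:
    print('collected', sum(len(r) for r in all_results), 'results')

Run under MPI (for example, mpiexec -n 16 python pipeline_sketch.py), each rank receives its own share of the stars and only the root talks to the outside world, mirroring the automated transfer-and-return step Klaus describes.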
Continuing mission
Kepler was originally planned as a three-and-a-half-year mission, but the team is now applying to continue for up to eight years. If it goes ahead, that will take data volumes to a whole new level. ‘We’re getting just incredible
science from this mission,’ says Klaus. ‘And because Kepler is a statistical mission, the data builds over time. The more data we have, the more transits we see for a particular planet, the better we can estimate things like planet size. And our data is all related, which makes the computational challenge that much bigger, because we’re not just processing little chunks of data, but have to look at all data over the course of the mission every time we run the transit search.’ The scale of data means that improved codes will be essential to success, he adds. ‘One of the things about the algorithm at the moment is that the amount of computation time scales as the square of the amount of data points. So when you add 30 days’ worth of data, the time to process can go up by six times. We’re projecting right now that, with the current algorithm, it would take three months on the supercomputer to run the whole mission data. So we’re definitely looking at ways of refining the algorithm… to make it more efficient,’ comments Klaus.
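The effect of that quadratic scaling is easy to make concrete. The short sketch below is a back-of-the-envelope illustration only: the baseline data spans and the one-hour reference runtime are assumptions made up for the example, and the square law is the only ingredient taken from Klaus’s description.

# Illustrative only: if the transit-search runtime grows as the square of the
# number of data points, a modest extension of the baseline multiplies the cost
# disproportionately. The baselines and reference runtime are assumed figures.

def runtime_factor(old_days, added_days):
    """Relative runtime if cost scales as (number of data points) squared."""
    return ((old_days + added_days) / old_days) ** 2

for baseline in (30, 90, 365, 3 * 365):
    factor = runtime_factor(baseline, 30)
    print(f"{baseline:4d} days + 30 days -> runtime x {factor:.2f}")

# With a hypothetical one-hour search on 90 days of data, the same quadratic law
# projects the cost of reprocessing the full data set as the mission extends:
reference_days, reference_hours = 90, 1.0
for total in (365, 2 * 365, 3 * 365):
    hours = reference_hours * (total / reference_days) ** 2
    print(f"full reprocess of {total:4d} days -> about {hours:.0f} hours")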
Back in the UK, Tom Kitching has been working with Edinburgh’s EPCC (Edinburgh Parallel Computing Centre) on how to handle the vast databases that are being produced. ‘In your typical astronomy department, most have clusters of a few hundred CPUs. But in order to run big simulations you need thousands,’ he says. The HECToR (High End Computing Terascale Resource)
supercomputer in Edinburgh, run by EPCC, is ‘the ideal kind of architecture on which to run this type of simulation,’ says Kitching, but even there researchers are forced to make difficult choices, he adds, because the resource is still finite. ‘If you have a few months of CPU space on HECToR, for example, then you can either use that to simulate the entire universe to fairly low resolution, or you can simulate, say, a single galaxy at very high resolution. We actually need both, so it’s an interesting trade-off that depends on what scientific question you’re trying to answer,’ explains Kitching.
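One way to see the shape of that trade-off is to note that a fixed CPU allocation ultimately buys a roughly fixed number of simulation particles, so the mass carried by each particle grows with the volume being modelled. The toy calculation below illustrates this with assumed, order-of-magnitude numbers; the density, particle budget and box sizes are illustrative, not figures from the article.

# Toy illustration of the volume-versus-resolution trade-off for a fixed
# particle budget. All numbers are assumptions chosen for the example.

MATTER_DENSITY = 4e10   # assumed mean matter density, solar masses per cubic Mpc
PARTICLE_BUDGET = 1e10  # assumed number of particles affordable in the allocation

def mass_resolution(box_size_mpc, n_particles=PARTICLE_BUDGET):
    """Particle mass (solar masses) when n_particles fill a cube of the given side."""
    volume = box_size_mpc ** 3
    return MATTER_DENSITY * volume / n_particles

# Same particle budget, very different science: a large cosmological box versus
# a box small enough to resolve a single galaxy in detail
for box in (1000.0, 100.0, 1.0):
    print(f"{box:7.1f} Mpc box -> particle mass ~ {mass_resolution(box):.2e} solar masses")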
An astronomical challenge
Astronomical observation is about to improve, bringing yet more data. ‘We’re going through a big transition in cosmology at the moment,’ says Kitching. ‘The whole sky is 40,000 square
degrees in total and up until now we’ve observed about 100 square degrees of that – quite a small fraction. Over the next five to 10 years we’re going to multiply that by orders of magnitude – so in the next five years we would have observed about a thousand square degrees, and over the next 10 to 15 years we’ll observe the entire sky, with Hubble (Space Telescope) quality resolution.’ Ken Rice adds that there are big data sets coming in now, and a lot of the big new projects, like the Square Kilometre Array, all need supercomputers just to handle the data. ‘The next generation [of data] will just be immense,’ he says. Dealing with this level of information is going to require serious thought by both astronomers and their colleagues in computer science. The data is there and the potential is enormous – now the challenge is to harness and use it well.
Further information:
University of Glasgow’s School of Physics and Astronomy
www.gla.ac.uk/schools/physics
International Virtual Observatory Alliance
www.ivoa.net
University of Edinburgh’s Institute for Astronomy
www.roe.ac.uk/ifa
Kepler Science Operations Centre (SOC)
kepler.nasa.gov/Mission/team/soc
NASA Advanced Supercomputing (NAS) Division
www.nas.nasa.gov/
Edinburgh Parallel Computing Centre (EPCC)
www.epcc.ed.ac.uk
HECToR (High End Computing Terascale Resource)
www.hector.ac.uk