HIGH PERFORMANCE COMPUTING


“This is real-time, responsive high-performance data analytics and to make it harder we need to operate day in, day out”


Artist’s impression of SKA antennas

A square kilometre of collecting area
The SKA project is extremely ambitious in scope: its aim is to create a radio telescope array with a collecting area of one square kilometre. This will make the SKA the largest radio telescope array ever constructed, by some margin. To achieve this, the SKA will use thousands of dishes (high frequency) and many more low-frequency and mid-frequency aperture array telescopes. Rather than being clustered only in the central core regions, the telescopes will be arranged in multiple spiral-arm configurations, with the dishes extending vast distances from the central cores.

In this array the physical distance between the telescopes is calculated precisely using the time difference between the arrival of radio signals at each receiver. Computers can then calculate how to combine these signals to synthesise the equivalent of a single dish as wide as the distance between the two telescopes. Using interferometry techniques, the researchers can emulate a telescope with a size equal to the maximum separation between the telescopes in the array, or some smaller configuration based on a section of the full array.

Ultimately this means that, rather than building one gigantic dish, the capabilities of one huge dish can be surpassed by the flexibility that this interferometry configuration brings. The system can act either as one gigantic telescope or as a combination of multiple smaller telescopes.

Combining all of these signals is a huge task, and this is where HPC hardware is needed to process and analyse the huge amounts of data that the SKA will create once it becomes fully operational. Bolton explained: ‘A typical SKA map is going to contain hundreds of thousands of radio sources. Our iterative calibration and imaging process will use dataflow programming with 400 million points on the graph.’

This huge amount of data requires large-scale computing infrastructure. ‘In total, the processing power we need in the SKA science and data processors is about 250 petaflops peak,’ said Bolton. The systems will also include around 80 petabytes of storage and require 0.5-1 terabyte per second of sustained write to storage, with sustained read rates approximately 10 times higher. ‘The incoming data sets are about 10 petabytes and our output 3-D images are 50,000 pixels on each axis. That is 1,000 desktop hard drives, one petabyte, per 3-D image. This is real-time, responsive high-performance data analytics and to make it harder we need to operate day in, day out,’ concluded Bolton.

Once the project is fully operational, the combined power of the array underpinning the SKA project will be able ‘to detect TV signals, if they exist, from the nearest tens, maybe 100, stars and will be able to detect the airport radars across the entire galaxy,’ Bolton said in reply to a question at the end of the keynote presentation.
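The data volumes Bolton quotes can be sanity-checked with some quick arithmetic. The 8 bytes per voxel below is an assumption made here because it makes the quoted figures line up; it is not stated in the article.

```python
# Sanity check: a 3-D image 50,000 pixels on each axis, quoted as
# one petabyte, i.e. roughly 1,000 desktop (1 TB) hard drives.
pixels_per_axis = 50_000
bytes_per_voxel = 8  # assumed (e.g. double precision), not from the article

voxels = pixels_per_axis ** 3              # 1.25e14 voxels per image
image_bytes = voxels * bytes_per_voxel     # 1e15 bytes = 1 petabyte
drives = image_bytes / 1e12                # 1,000 one-terabyte drives

# At the quoted 1 TB/s sustained write rate, one such image takes
# about 1,000 seconds (roughly 17 minutes) to land on storage.
write_seconds = image_bytes / 1e12

print(voxels, image_bytes, drives, write_seconds)
```

At that voxel size the quoted numbers are self-consistent: one image is a petabyte, and even at the top of the 0.5-1 TB/s write band it occupies the storage system for a quarter of an hour.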


Data for new scientific breakthroughs
The SKA is a global science project that could help scientists and researchers answer key questions about the nature and history of the universe and the formation of galaxies and planets, and even explain the basis for magnetism, dark matter and dark energy. The SKA will also enable astronomers to produce 3-D maps of the universe on an unprecedented scale.

‘Magnetic fields play an important role throughout the universe on scales as small as centimetres and as big as a billion lightyears. With the SKA we hope to address the challenge of how and when magnetic fields arose and grew to their current strength, and we will produce the first three-dimensional magnetic map of the universe,’ commented Bolton. ‘Dark matter and dark energy are ongoing huge mysteries, and we plan to play a role in tackling them by studying galaxy evolution. Even in its deployment phase the SKA will be able to map 10 million galaxies spanning eight billion years of evolution. Once the SKA is fully deployed we will conduct the biggest galaxy census ever contemplated, in 3-D, encompassing up to a billion individual galaxies and covering 12.5 billion years of cosmic history,’ Bolton added.

From this data astronomers will be able to make the most precise determination yet of the properties of the dark energy driving the expansion of the universe. Without a project on the scale of the SKA this research would not be possible, as it allows astronomers to map galaxies much further away than was previously possible.
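The interferometry principle described earlier, synthesising a dish as wide as the maximum separation between telescopes, is what sets the angular resolution that makes such distant galaxies resolvable. A back-of-envelope sketch, using an illustrative 150 km baseline and 1.4 GHz observing frequency that are assumptions for this example rather than figures from the article:

```python
import math

# Interferometer resolution: theta ~ lambda / B radians, where B is
# the baseline (maximum separation between telescopes).
C = 299_792_458.0        # speed of light, m/s
freq_hz = 1.4e9          # assumed observing frequency
baseline_m = 150_000.0   # assumed maximum telescope separation

wavelength_m = C / freq_hz                 # ~0.21 m
theta_rad = wavelength_m / baseline_m      # synthesised-beam resolution
theta_arcsec = math.degrees(theta_rad) * 3600

print(f"{theta_arcsec:.2f} arcsec")        # sub-arcsecond resolution
```

Under these assumed numbers the synthesised beam is a fraction of an arcsecond, far sharper than any single dish of buildable size could achieve, which is why the spiral arms of widely separated telescopes matter as much as the total collecting area.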


December 2017/January 2018 Scientific Computing World 9

