HPC NEWS


OSC resources shed light on galactic evolution


The Ohio Supercomputer Center (OSC) has revealed that Ohio State University (OSU) astronomers are using the centre's resources to unlock some of the mysteries surrounding the formation of galaxies and the evolution of massive black holes. Two research teams led by Stelios Kazantzidis, a long-term fellow at the Center for Cosmology and Astro-Particle Physics (CCAPP) at OSU, have used 1,000 processor hours per day on the parallel high-performance computing systems for more than two years. To develop their detailed models and resulting simulations, Kazantzidis and his colleagues tapped OSC's flagship system, the Glenn IBM Cluster 1350, which features more than 9,600 Opteron cores and 24 terabytes of memory.


Kazantzidis and University of Zurich student Simone Callegari recently authored a paper, ‘Growing massive black hole pairs in minor mergers of disk galaxies,’ and submitted it for publication in the Astrophysical Journal. Their study involved a suite of high-resolution, smoothed-particle hydrodynamics simulations of merging disk galaxies with supermassive black holes (SMBHs). These simulations include the effects of star formation and growth of the SMBHs, as well as feedback from both processes.
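In smoothed-particle hydrodynamics, gas is represented by particles whose properties are smoothed over neighbours with a kernel function. As a minimal illustration of the idea (not the teams' production code, which models star formation, black-hole growth and feedback on thousands of cores), the basic SPH density estimate can be sketched as:

```python
import numpy as np

def w_cubic_spline(r, h):
    """Standard 3D cubic-spline SPH kernel with support r < h."""
    q = r / h
    sigma = 8.0 / (np.pi * h**3)  # 3D normalisation constant
    w = np.where(q < 0.5,
                 1.0 - 6.0 * q**2 + 6.0 * q**3,
                 np.where(q < 1.0, 2.0 * (1.0 - q)**3, 0.0))
    return sigma * w

def sph_density(positions, masses, h):
    """Density at each particle: rho_i = sum_j m_j * W(|r_i - r_j|, h)."""
    diff = positions[:, None, :] - positions[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    return (masses[None, :] * w_cubic_spline(dist, h)).sum(axis=1)

# Toy particle cloud; real simulations use millions of particles
# and neighbour searches rather than this O(N^2) pairwise sum.
rng = np.random.default_rng(1)
pos = rng.uniform(0.0, 1.0, size=(200, 3))
m = np.full(200, 1.0 / 200)   # equal-mass particles
rho = sph_density(pos, m, h=0.2)
```

Production codes replace the brute-force pairwise sum with tree- or grid-based neighbour finding, which is what makes runs at the scale quoted above feasible.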


The astronomers found that the mass ratios of SMBH pairs in the centres of merged galaxies do not necessarily relate directly to the mass ratios of their original host galaxies, but are 'a consequence of the complex interplay between accretion of matter (stars and gas) onto them and the dynamics of the merger process.' As a result, one of the two SMBHs can grow in mass much faster than the other.


Kazantzidis and his colleagues also recently developed sophisticated computer models to simulate the formation of dwarf spheroidal galaxies, which are satellites of our own galaxy, the Milky Way. The study concluded that, in a majority of cases, disk-like dwarf galaxies – known in the field as disky dwarfs – experience significant loss of mass as they orbit inside their massive hosts, and their stellar distributions undergo a dramatic morphological, as well as dynamical, transformation from disks to spheroidal systems. The goal of Kazantzidis' team is to develop representations of galaxies that are as accurate as possible. Access to the Glenn Cluster increases the number of objects (or simulation particles) that can be depicted in the model, enhancing the team's ability to perform accurate and meaningful calculations. These projects were funded by CCAPP, the Swiss National Science Foundation, the Polish Ministry of Science and Higher Education, and by an allocation of computing time from OSC.


Platform Computing forms partnership with Sophis


Platform Computing has announced a partnership with Sophis, a provider of risk management software for the financial services industry. Sophis now offers an integrated solution with Platform Symphony, allowing users in the banking, insurance and investment management sectors to distribute resource-hungry calculations such as P&L and sensitivity calculations, instrument pricing, risk simulations and value at risk (VaR). Sophis and Platform carried out a benchmark test of their joint solution at IBM's Product and Solution Support Centre (PSSC) in Montpellier, France, in July 2010, using IBM hardware: an IBM Power 750, IBM XIV Storage System, IBM System x3850 M2 and IBM BladeCenter HS22. Using OTC structured products, the test ran a historical VaR calculation on a multi-asset portfolio with 32,000 positions, representative of a true portfolio. The test, computing VaR with 270 historical scenarios, took four hours using fewer than 300 nodes and less than 90 minutes using slightly more than 800 nodes, with no apparent limitations and near-linear scalability.


Samer Ballouk, head of product management and business development at Sophis, said: 'The results of this benchmark with Platform Computing are very good news for our customers, who have increasingly demanding risk management and portfolio valuation requirements. By speeding up calculations using a grid approach, they can introduce an intra-day VaR calculation, for example, and comply with the latest guidance on risk management and reporting.'
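A historical VaR calculation of the kind used in the Sophis benchmark can be sketched in a few lines: revalue the portfolio under each historical scenario, then read the loss at the chosen percentile. The data below are random placeholders, not figures from the test; only the portfolio and scenario counts are taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative inputs (not the benchmark's actual data):
# scenario_returns[s, i] = return of instrument i under historical scenario s
n_scenarios, n_positions = 270, 32_000      # sizes quoted in the benchmark
scenario_returns = rng.normal(0.0, 0.02, size=(n_scenarios, n_positions))
position_values = rng.uniform(1e4, 1e6, size=n_positions)  # current mark-to-market

# Revalue the portfolio under every historical scenario: one P&L per scenario.
pnl = scenario_returns @ position_values

# 99% historical VaR: the loss exceeded in only 1% of scenarios.
var_99 = -np.percentile(pnl, 1)
print(f"99% historical VaR: {var_99:,.0f}")
```

Real OTC structured products require full repricing per scenario rather than a linear return approximation, which is why the calculation is expensive enough to benefit from distributing the independent scenario revaluations across a grid.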


ffA and Statoil continue collaboration


Foster Findlay Associates (ffA) and Statoil have announced a continuation of their ongoing research and development collaboration. This is the sixth phase of the collaboration, which has allowed the companies to work together to produce state-of-the-art 3D seismic analysis capabilities. These capabilities are deployed within AVI (Advanced Volume Interpretation), the Statoil software application that has resulted from the collaboration. The companies' joint work to date has produced a range of new techniques that have dramatically increased the clarity and detail with which geological features can be imaged and delineated from 3D seismic datasets. This has been shown to add significant value to seismic interpretation in a range of settings, including carbonate environments.


Within this phase of the collaboration, best-practice workflows to aid interpretation, imaging and modelling using the range of technologies within AVI will be defined. To further enhance AVI, the new phase of the R&D collaboration will develop, and give Statoil preferential access to, new capabilities aimed at calibrating analyses derived from 3D seismic data, enabling easy integration into modern interpretation workflows that also leverage rock-physics capabilities. An important aspect of the collaboration is the definition of Statoil-specific best-practice workflows designed to give Statoil a competitive advantage through the use of these new techniques.


MORE NEWS AVAILABLE ONLINE www.hpcprojects.com


SCIENTIFIC COMPUTING WORLD DECEMBER 2010/JANUARY 2011 15

