HIGH PERFORMANCE COMPUTING


Managing HPC resources

As HPC systems get larger and more complex, companies are developing services and tools to help users manage their resources effectively, writes Robert Roe


As the complexity of both HPC hardware and the challenges that scientists are trying to solve using HPC systems increases, ensuring a system is utilised efficiently is essential to delivering a computing service with value for money and a high research output. This requires HPC centres to focus on making the best use of available resources, through software and services designed to get applications running on an HPC system and to increase their performance.

Andrew Jones, vice-president of strategic HPC services and consulting at The Numerical Algorithms Group (NAG), said that one of the most common pitfalls hindering HPC performance is a focus on hardware, rather than on the overall service and on investment in application development.

'The single biggest thing that people do is focus too much on the hardware, and forget about the rest of the stuff. The procurement process is driven by the compute system. It doesn't necessarily take into account that you are going to need people and software to deliver value,' said Jones. 'It doesn't necessarily look at total cost of ownership (TCO). Hardware is generally less than half of the TCO over the service life of an HPC system. The same is true in operational practice: people tend to start thinking around the hardware and then add things on top of that, in terms of people, support services, software and so on. Now that's not to say they are an afterthought, but there is a hardware-centric focus around HPC,' he added.
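As a rough illustration of that TCO point (every figure below is invented for this sketch and does not come from NAG or the article), a simple breakdown over a notional five-year service life shows how the hardware line can sit well under half of the total once power, people, software and facilities are added in:

/* Hypothetical TCO breakdown over a notional five-year service life.
 * All figures are invented for illustration only; the point is just
 * that the hardware line ends up well under half of the total. */
#include <stdio.h>

int main(void)
{
    double hardware = 10.0; /* capital cost of the compute system (millions) */
    double power    =  4.5; /* electricity and cooling over five years       */
    double staff    =  6.0; /* operations and application-support people     */
    double software =  2.0; /* licences, tools and code-development effort   */
    double facility =  1.5; /* machine room, maintenance and insurance       */

    double tco = hardware + power + staff + software + facility;
    printf("TCO: %.1f; hardware share: %.0f%%\n", tco, 100.0 * hardware / tco);
    return 0;
}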


Jones also noted that the second most common mistake was the assumption that buying and setting up a system is easy. 'In reality it is a complex research facility that requires expertise in the facility, just as much as it requires expertise in the science or other aspects. It's not an IT facility, it's a research facility that happens to have been built with IT components,' said Jones. 'It's the software, people and applications, the thought about the process: how are you going to actually use this HPC system?'


Encouraging HPC performance

NAG helps customers evaluate and procure the correct system for their workloads. Once a system is up and running, it also helps users with porting applications and with technology benchmarking. These services all help users to make better use of their existing hardware, delivering stronger application performance by managing their available HPC resources more effectively.

“It's not an IT facility, it's a research facility that happens to have been built with IT components”

Jones notes that it does not take a massive change in computing hardware to make code tuning necessary. 'Even just moving to the next generation of CPU – your code will probably run, but you need to put effort into tuning it to get the best performance. Let's say you have just upgraded from the previous generation to the new Xeon processor, or to the latest EPYC processor on the AMD side, for example,' said Jones. 'You are still going to have to do some tuning to get effective performance. Clearly, that step is bigger if you are moving to a GPU; there is potentially more code work to be done.'
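As a purely illustrative sketch (the article contains no code, and the compiler targets below are assumptions rather than NAG recommendations), the kind of kernel shown here will usually 'probably run' unchanged on a newer processor, while much of the tuning effort Jones describes goes into how it is compiled and vectorised for the new architecture:

/* Illustrative only: a simple kernel that typically runs unchanged on a
 * newer CPU, but whose performance depends on how it is built and tuned.
 *
 * Hypothetical build lines (flags chosen for illustration):
 *   Newer Intel Xeon:  gcc -O3 -march=skylake-avx512 -fopenmp kernel.c
 *   AMD EPYC (Zen 2):  gcc -O3 -march=znver2 -fopenmp kernel.c
 * Moving the same loop to a GPU would mean rewriting it with, for example,
 * OpenMP target offload or CUDA, the 'more code work' mentioned above. */
#include <stdio.h>
#include <stdlib.h>

/* y = a*x + y: the pragma asks the compiler to vectorise the loop;
 * how well it does so depends on the -march target chosen at build time. */
static void axpy(size_t n, double a, const double *restrict x, double *restrict y)
{
    #pragma omp simd
    for (size_t i = 0; i < n; ++i)
        y[i] = a * x[i] + y[i];
}

int main(void)
{
    size_t n = 1u << 20;
    double *x = malloc(n * sizeof *x);
    double *y = malloc(n * sizeof *y);
    for (size_t i = 0; i < n; ++i) { x[i] = 1.0; y[i] = 2.0; }

    axpy(n, 3.0, x, y);
    printf("y[0] = %f\n", y[0]); /* expect 5.0 */

    free(x);
    free(y);
    return 0;
}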


Jones stressed that it is not always users adopting a new technology; it can be the move to scale up an existing workflow. 'They have an application that runs at a certain resolution to simulate the properties of an aircraft wing, running on a hundred cores, for example. Now, they

