

There are other efforts to make it easier to access the cloud, and to find a way through the competing claims of different cloud providers. One commercial initiative offering help and support to potential users in finding the right cloud service is Ascamso, a German start-up. In an interview, its co-founder, Jan Tielscher, pointed out that it was conducting thousands of tests on cloud providers, to study their capabilities and provide insight into which one was appropriate for a customer's needs. In addition to assessing capabilities and performance, it also looks at pricing, in order to deliver an overall service-provider rating and clear price comparisons. In an echo of Weber's point about engineering workflows, Tielscher stressed that scientific and engineering use cases differ from those of commercial and business users of the cloud, and that Ascamso's assessment process takes this into account.


A prototype driverless minibus being tested for the streets of Leon, Spain. Driverless vehicles need big data, and HPC to process it (image: Sigur/Shutterstock.com)





integrate the results of the project into the Cloud28+ initiative that it is sponsoring, Martinelli said. This is an attempt to provide a catalogue of services and service providers available in Europe, and thus to accelerate the adoption of cloud technologies. It is notable that the Cloud28+ website prominently displays a statement on the importance of having a 'strong focus on compliance with the European rules on data privacy and security'.


Engineering workflow in the cloud
But considerations of data protection aside, existing cloud services tend not to reflect the workflow in engineering companies, Daniel Weber, deputy head of the department of Interactive Engineering Technologies at the Fraunhofer Institute for Computer Graphics in Darmstadt, told the meeting. Engineering workflows are complex, he continued, using different techniques such as CFD and simulation, and hence a variety of software packages that needed to talk to each other. In contrast, most cloud solutions for high-performance computing provided access to the cloud for only a single application at a time. 'Our idea is to go beyond isolated use to seamless engineering workflows in the cloud,' he said.

He is working on another EU-funded project, called CloudFlow, which aims to make it easy to pass data from one simulation or software package to another. The challenge is to ensure that the data exchange is based on accepted standards. (There is an unfortunate clash of names: the European CloudFlow project is an open platform at http://www.eu-cloudflow.eu/, while a US commercial venture at http://www.cloudflow.net/ also offers cloud-related services, in a very different, non-engineering context.)

The cloud-related components are based on OpenStack, but there are known issues with using OpenStack for HPC, so the project had to create an 'abstraction layer' to get round the problem, which may mean that each HPC centre has to tailor its own implementation.


Weber cited the example of Stellba Hydro, a small German engineering company that maintains water turbines but wanted access to HPC to design and simulate flow systems, not least to check the safety of water-powered electricity generating stations that had been installed decades earlier.

Weber believes that implementing a cloud-based engineering workflow will not only open up HPC to small and medium-sized companies, it will also broaden the range of customers that independent software vendors (ISVs) can attract. However, he continued, in his view the ISVs 'are really struggling' with the issue of pricing on a pay-per-use basis. 'ISV licensing is complex and difficult,' he said. 'I look forward to the transition to pay-per-use.'


Convergence of cloud, big data and HPC
In the view of Stephan Gillich, Intel's director of technical computing for EMEA, the boundaries between the cloud, big data, and high-performance computing are dissolving, and the technologies are converging on each other. There is an overlap between simulation and analytics, he said, such that modern life is characterised by 'pervasive analytics', especially in transport, the life sciences, and manufacturing. However, the problem was that a distinction between data and simulation was built in at the systems level: simulation uses Fortran or C++ as the programming language, whereas data analytics uses Java or Hadoop; the file-system software is different, as is the resource-management software. Simulation is compute- and memory-focused, whereas data analytics is storage-focused, he said.
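As a rough illustration of what dissolving that boundary can look like at the storage level, the hypothetical Java sketch below configures a Hadoop client to read data straight from a shared parallel-file-system mount rather than from a separate HDFS installation. The mount path and class name are invented for illustration, and a real deployment would typically use a vendor-supplied Lustre adaptor rather than Hadoop's plain file:// scheme.

```java
// Illustrative sketch only: the class name and mount path are hypothetical.
// It shows the kind of configuration a converged cluster implies: an
// analytics job reading simulation output in place from a shared scratch
// file system, with no staging copy into HDFS.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class SharedScratchCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point Hadoop's default file system at the cluster-wide scratch
        // mount that the simulation codes already write to.
        conf.set("fs.defaultFS", "file:///lustre/scratch");

        FileSystem fs = FileSystem.get(conf);
        // Check for simulation output where the HPC jobs left it.
        Path results = new Path("/lustre/scratch/simulation-output");
        System.out.println("Output present: " + fs.exists(results));
    }
}
```

The point of the sketch is only that, once analytics jobs and simulation codes see the same storage, the hand-off between the two worlds disappears.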


Intel was pursuing this issue of a converged architecture for HPC and big data, and he envisaged a future in which a cluster's resource manager was both HPC- and big-data-aware, storage was Lustre with a Hadoop adaptor, and the hardware was capable of both compute and big data. The future would lie with 'in-memory computing', he said, and memory technology would be a key enabler of this converged computing world. He stressed that Intel's range of products encompassed much more than just processors, and pointed to the new 3D XPoint technology, announced by Intel and Micron at the end of July. In the original launch announcement, the two companies called it the first new mainstream memory chip to come to market in 25 years. According to Gillich, they sit in between SSD



