high-performance computing

The installation of Mira, at the Argonne National Laboratory, illustrates some of the complexity of the current generation of supercomputers

performance of all 500 systems is noticeably influenced by the very large systems at the top of the list. Recent installations of very large systems – up to June 2013 – have counteracted the reduced growth rate at the bottom of the list, but with few new systems at the top of the past few lists, the overall growth rate is now slowing. In the USA, the joint Collaboration

of Oak Ridge, Argonne, and Lawrence Livermore (Coral) was established in late 2013 to leverage supercomputing investments, streamline procurement processes and reduce costs in developing supercomputers. In addition to the $325 million in procurement of new machines, the Department of Energy also announced approximately $100 million to further develop extreme scale supercomputing technologies as part of a research and development programme titled FastForward 2. On the Scientific Computing World

website in July this year, Tom Wilkie discussed how the Coral programme was a classic example of how the US Government can push technological development in the direction it desires by means of its

procurement policy. There are (at least) three ways in which governments can force the pace of technological development, his article suggested. One is by international research cooperation – usually on projects that do not have an immediate commercial product as their end-goal. A second is by funding commercial companies to


conduct technological research – and thus subsidising, at taxpayers’ expense, the creation or strengthening of technical expertise within commercial companies. The third route is subsidy by the

back door, through military and civil procurement contracts. Use of procurement policies to push technology development in a particular direction has been a consistent – and very successful – strand in US Government policy since the end of the Second World War. Nearly two


decades ago, in his book Knowing Machines, Donald MacKenzie, the distinguished sociologist of science based at Edinburgh University, showed how the very idea of a supercomputer had been shaped by US Government policy. He concluded: ‘Without the [US National]

weapons laboratories there would have been significantly less emphasis on floating-point-arithmetic speed as a criterion of computer performance.’ Had it been left solely to the market, vendors would have been more interested in catering to the requirements of business users (and other US agencies such as the cryptanalysts at the US National Security Agency) who were much less interested in sheer speed as measured by Flops, and this would have led to a ‘subtly different’ definition of a supercomputer, he pointed out. The purchasing power of the laboratories

was critical, he argued, in shaping the direction of supercomputer development: ‘There were other people – particularly weather forecasters and some engineers and academic scientists – for whom floating point speed was key, but they lacked the sheer concentrated purchasing clout.’


Argonne National Laboratory
