
The timescales required meant that analysing multiple design variations was impractical. This was the 'first age' of CFD. Getting a simulation result at all was difficult. CFD was usually deployed at the end of the design process, as a final verification, or for troubleshooting purposes, when everything else had failed.

The arrival of cheap Linux computers reduced parallel licensing costs, and the continually improving simulation technology opened up the 'second age of CFD', in which CFD engineers could reliably provide simulation results within reasonable timescales. Consequently, engineering simulation began to establish itself as a core part of the design process, occurring earlier and earlier and providing a constant stream of simulation data that could be used to drive design decisions. Increasingly, simulation began to displace experimentation as a way of verifying designs.

CFD analysis of an Airbus A380, modelling engines and landing gear

The problems that we could solve expanded beyond the core CFD disciplines of fluid mechanics and heat transfer, as we began to consider problems that involved 'fluid-structure interaction', multiphase flow, and chemical reaction. With a little engineering ingenuity, there were very few problems that engineering simulation couldn't offer some insight into.

Which brings us to today, and the dawn of the 'third age of CFD', where the lines between CFD and structural mechanics are becoming so blurred that it makes little sense calling it 'CFD' at all. An uncomfortable truth about


Behind the technology trends

Jack Little looks at the long-term transformations that will change how scientists and engineers solve their complex design and analysis problems

Looking around at the technology landscape today, we see exciting trends that are getting a lot of attention: the Internet of Things; MOOCs; Big Data; Industry 4.0; and so on. But if we were to compose a list from just five years ago, we might have seen quite a different one, including social computing, SaaS, and the Executable Internet. Trends, after all, come and go. However, there are several long-term, fundamental transformations that are driving many of these technology trends.


Four such transformations are:
- Algorithms in everything;
- Hardware in more and more specialised form factors;
- Connected chips, devices, and systems;
- People computing anywhere.


Each of these transformations presents both challenges and opportunities for the scientific and engineering community, and for society as a whole as it adopts the new ideas, new products, and new ways of living that they enable. These transformations don't act in isolation – they fuel and feed off each other, together driving a dizzying pace of progress. To understand the whole picture, let's look at each transformation and the response it demands:


Algorithms in everything

One proxy for increased algorithmic content is the explosive growth of transistors. There are predictions that, in the next year or so, there will be 170 billion transistors per person on the planet, an astonishing number. These transistors, and the algorithms that they execute, are in billions of devices and systems that, as a result, are smarter, more robust, and more capable than ever.

This transformation requires tools for conceiving algorithms, as well as ways for someone to find and apply algorithms that someone else has created.


Increasing diversity and availability of hardware platforms

The hardware that executes those algorithms is available in increasingly diverse platforms and form factors for different purposes. In addition to laptops and desktop computers, there are smartphones and tablets at the low end, and HPC and cloud-based systems at the high end. Microcontrollers, GPUs, FPGAs, and ARM cores live side-by-side in systems. Meanwhile, low-cost open-source hardware such as Arduino and Raspberry Pi is exploding, bringing programmable processing power to the masses. Three years ago, the Raspberry Pi did not exist. Now, over 2.5 million have been shipped around the world.

To take advantage of this broad range of hardware, tools need to provide an abstraction from the hardware-specific details, while automatic code generation provides the means to run algorithms on the hardware quickly and efficiently.
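As a loose sketch of that idea (not any particular vendor's toolchain), the C example below keeps an algorithm free of hardware-specific calls and confines target details behind a thin, hypothetical platform_* interface; retargeting from a desktop test harness to a microcontroller would then mean swapping, or automatically generating, only the platform layer rather than rewriting the algorithm.

```c
/* Hypothetical sketch: a hardware-agnostic algorithm behind a thin platform layer.
 * The algorithm never touches registers or OS APIs; only the platform_* functions
 * would change (or be auto-generated) per target. */
#include <stdio.h>
#include <stddef.h>

/* ---- platform interface (would normally live in its own header) ---- */
double platform_read_sensor(void);          /* e.g. ADC read on an MCU, test data on a PC */
void   platform_write_output(double value); /* e.g. PWM duty cycle on an MCU, printf on a PC */

/* ---- portable algorithm: a simple moving-average filter ---- */
#define WINDOW 4

static double history[WINDOW];
static size_t next_slot;

double moving_average(double sample)
{
    history[next_slot] = sample;
    next_slot = (next_slot + 1) % WINDOW;

    double sum = 0.0;
    for (size_t i = 0; i < WINDOW; ++i)
        sum += history[i];
    return sum / WINDOW;
}

/* ---- desktop implementation of the platform layer, for testing ---- */
double platform_read_sensor(void)
{
    static const double fake_samples[] = {1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0};
    static size_t i;
    return fake_samples[i++ % 8];
}

void platform_write_output(double value)
{
    printf("filtered output: %.2f\n", value);
}

int main(void)
{
    /* On an embedded target this loop might be an interrupt-driven task;
     * moving_average() itself would be unchanged. */
    for (int k = 0; k < 8; ++k)
        platform_write_output(moving_average(platform_read_sensor()));
    return 0;
}
```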

