aerospace


Multi-disciplinary optimisation of a landing gear lug (image: Altair Engineering)


…time to calculate and not enough disk space. ‘Typically,’ he says, ‘data analysis takes place once the simulation data has been collected and compiled. But for large-scale simulation runs involving hundreds of thousands of processors and processor hours, a scientist or engineer needs to analyse a data subset at several points along the way in order to decide how to proceed. This process has become a major challenge.’

Papka continues by stating that there are two data analysis approaches at the forefront of research: in situ analysis, where the analysis runs on the same compute resources as the simulation, and co-analysis, where the simulation data is moved to a separate resource for analysis. ‘Whereas the in situ approach requires the researchers to know what questions they want answered beforehand, co-analysis allows far more freedom and flexibility to analyse data on the fly,’ he explains.


The final frontier


David Whittle, sector director for Space, Government and Defence at Tessella, discusses why spacecraft design brings its own set of challenges


With some aircraft systems, flight data can be gathered in advance and modifications made during the process, but with a spacecraft, once it launches it has to work. There really isn’t much scope for tweaking things at a later date, other than where the software is concerned, and we therefore need to be very sure that the system can be put into safe mode should anything go wrong, and that this mode is going to be very robust.

Working with organisations like the European Space Agency, we use the MathWorks toolset – and specifically Matlab Simulink – to create models and run simulations of engineering situations. One of the problems commonly faced within the industry is that people often design by simulation. In our experience, this is a mistake: it is all too easy to jump in, use the tools and start putting things together without taking a step back. These toolsets are great to use, but people need to begin with an engineering knowledge of the problem and then use the simulation modelling to help solve whatever difficulty is being faced.




Long before diving into the toolset, design engineers should take a blank sheet of paper and truly understand what it is they are trying to achieve. The design can then be verified, or further extrapolated, from the models.

The biggest challenge is ensuring that everything is accurate. The testing and simulation campaigns serve to prove the envelope of all the possible combinations of events, because we have to verify, and be seen to be able to verify, that the system is robust in any situation that might occur; there is little chance of recovery otherwise.

Another crucial challenge is finding the right staff: people who can work with both the simulation and modelling packages, but who also have the engineering understanding of the issues to enable them to design robustly in the first place.
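One way to picture such a campaign is as a sweep over every combination of fault events, with each combination required to end in a recoverable state. The sketch below is a toy illustration under invented names – the events and the pass criterion are assumptions, not Tessella’s actual process:

```python
# Toy illustration of 'proving the envelope': exercise every combination
# of fault events and require each one to reach safe mode. The event list
# and the check function are invented for illustration.
from itertools import product

EVENTS = ['sensor_dropout', 'thruster_stuck', 'comms_loss']

def reaches_safe_mode(active_faults):
    """Stand-in for running the full simulation with these faults injected."""
    return True  # a real campaign would run the spacecraft model here

failures = []
for combo in product([False, True], repeat=len(EVENTS)):
    faults = [e for e, on in zip(EVENTS, combo) if on]
    if not reaches_safe_mode(faults):
        failures.append(faults)

# The campaign passes only if no combination of events defeats safe mode.
assert not failures, f'safe mode not reached for: {failures}'
```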


In a space mission, we can’t conduct ground tests for every eventuality. While in orbit, for example, a spacecraft will have to perform various manoeuvres, such as spins and twists, that simply cannot be duplicated in a lab using the real hardware. Instead, we try to simulate the space environment, so that when we’re designing a control system that can cope with the firing of rocket motors, we have a means of testing how robust the algorithms will be in any given situation. In terms of the environment, we model the position of the sun and stars, the orbit dynamics and the effect of gravity, as well as characteristics such as the actuators that are going to move the spacecraft around, the mechanical systems, the movement of the fuel in the tanks, and the flexible modes of the arrays.
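As a rough feel for what such an environment model contains, the toy sketch below integrates a single-axis attitude loop with a simplified gravity-gradient disturbance. Every constant and the controller itself are invented for illustration; Tessella’s actual Simulink models are far richer than this.

```python
# Toy single-axis attitude simulation in the spirit of the environment
# models described above; all constants and the controller are invented.
import math

INERTIA = 1200.0      # moment of inertia about one axis, kg m^2
ORBIT_RATE = 1.1e-3   # orbital angular rate for low Earth orbit, rad/s
DT = 0.1              # integration step, s

def gravity_gradient_torque(theta):
    """Simplified single-axis gravity-gradient disturbance torque."""
    return -3.0 * ORBIT_RATE**2 * INERTIA * math.sin(theta) * math.cos(theta)

def pd_controller(theta, omega):
    """Invented PD attitude controller commanding the actuators."""
    return -(4.0 * theta + 60.0 * omega)

theta, omega = 0.2, 0.0  # initial pointing error (rad) and body rate (rad/s)
for _ in range(int(600 / DT)):  # ten simulated minutes
    torque = pd_controller(theta, omega) + gravity_gradient_torque(theta)
    omega += (torque / INERTIA) * DT  # Euler integration of the dynamics
    theta += omega * DT
print(f'final pointing error: {theta:.4f} rad')
```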


To a large degree, the MathWorks products are viewed as an industry standard in this area. We’ve been using the tools for many years, and when we have fed back problems or issues we’ve encountered with the software, the company has been quick to respond. There were times when we had to write our own add-ins, but we haven’t had to do that for quite some time. MathWorks also offers the option of selling packages on another company’s behalf within a partner programme.

Looking forward, within the testing environment the trend is going to move further towards the use of modelling and simulation. In a missile test, for example, the design engineer should model the expected behaviours and explore the envelope as far as possible before using the mechanical test to verify the models. As costs continue to be a deciding factor, I believe that as an industry we will run far more verifications through modelling, even in environments where it could be done in other ways.




According to Papka, co-analysis is an active research effort at the ALCF, and it is among the set of emerging tools and methods now enabling scientists to use supercomputers as tools for inquiry. He comments that there are still other issues to solve, such as latency, interaction and time-to-solution, but is confident that these challenges will be overcome. He adds that if scientists follow the in situ approach and don’t store the data, they run the risk of missing something.
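The contrast between the two approaches is easy to sketch in code. The toy example below is purely illustrative – the function names and the chosen reduction are assumptions, not the ALCF’s actual infrastructure – but it shows why in situ analysis demands the question up front, while co-analysis keeps the raw data available for questions asked on the fly.

```python
# Purely illustrative sketch of in situ analysis versus co-analysis.
# All names here are invented; this is not the ALCF's infrastructure.
import numpy as np

def run_timestep(field):
    """Stand-in for one step of a real simulation code."""
    return field + 0.01 * np.random.standard_normal(field.shape)

def in_situ_run(steps):
    # In situ: the analysis runs on the same resources as the simulation,
    # so the question (here, a single mean-magnitude reduction) must be
    # chosen beforehand; the raw field is discarded every step.
    field = np.zeros((64, 64))
    history = []
    for _ in range(steps):
        field = run_timestep(field)
        history.append(float(np.abs(field).mean()))
    return history

def co_analysis_run(steps, send_to_analysis_resource):
    # Co-analysis: snapshots move to a separate resource, where an analyst
    # can pose new questions on the fly while the simulation continues.
    field = np.zeros((64, 64))
    for step in range(steps):
        field = run_timestep(field)
        if step % 10 == 0:  # subsample to limit data movement
            send_to_analysis_resource(step, field.copy())
```

The trade-off Papka describes is visible even at this scale: the in situ loop stores almost nothing but can never answer a question it did not anticipate, while co-analysis preserves that freedom at the cost of moving data.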


‘Groups at other Department of Energy (DOE) national laboratories and at several universities are also working on ways to reduce the amount of data being written to storage and methods to provide faster insight,’ he says. ‘All of us working in the field are keenly aware that the days of writing everything to disk are gone, especially as we move toward exascale. Exascale will radically change how scientists and engineers make major discoveries, and the DOE is leading the charge to fund novel tools and methods that will allow them to optimise every step in the simulation and analysis pipeline.’


At Argonne, Papka’s research team worked with Ken Jansen, Professor of Aerospace Engineering Sciences, and his team from the University of Colorado to modify Jansen’s simulation code. Papka explains that the modifications were made to connect a standard analysis tool at various points in time, to query and visualise the data, and to home in on certain phenomena – such as what was happening with the airflow. This enabled Jansen to quickly identify regions of interest; moving those regions out to a separate resource for fine-grained analysis proved extremely useful for his application.


‘We demonstrated this capability at the 2011 Supercomputing conference, when we ran Ken’s application on Intrepid, our Blue Gene/P machine, while he queried and analysed the data,’ says Papka. ‘Our infrastructure allows for “hooks” in the simulation code to plug in tools and easily visualise information in real time.’


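Papka’s mention of ‘hooks’ suggests a callback-style interface. The sketch below is a speculative illustration of that general pattern – the class and method names are invented, and the real ALCF tooling is not shown – in which external analysis or visualisation tools register functions that receive live data at fixed points in the simulation.

```python
# Speculative sketch of a simulation 'hook' mechanism; the names are
# invented and this is not the actual ALCF interface.
import numpy as np

class Simulation:
    def __init__(self):
        self.field = np.zeros((32, 32))
        self._hooks = []

    def register_hook(self, hook):
        """Plug in an external tool; it is called with (step, field)."""
        self._hooks.append(hook)

    def run(self, steps):
        for step in range(steps):
            self.field += 0.01 * np.random.standard_normal(self.field.shape)
            for hook in self._hooks:  # hand live data to each plugged-in tool
                hook(step, self.field)

sim = Simulation()
# A toy 'visualiser' that watches a region of interest in real time.
sim.register_hook(lambda step, field: print(step, field[:8, :8].max()))
sim.run(30)
```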



