HPC YEARBOOK 2021/22
Deluge of data
Hsueh-Li Wang discusses HPC trends, including data analysis and virtualisation, and how hardware developers are looking to solve the challenges of HPC performance
What are the primary factors driving change in HPC?

If we look at HPC, we can split the market into scientific simulation, scientific research, artificial intelligence (AI), and virtualisation. Beyond those, there is other generic general-purpose compute, on either CPU or GPU.
Current research areas in HPC are focused on medical research, biosciences, geoscience, weather forecasting, demographic data analysis and so on. There has been a rise in the use of demographic data analysis due to the increasing use of big data, and also because computing technologies are gradually becoming more miniaturised. Today you can deploy very high-performance, high-power compute nodes at multiple locations, even remote locations where there is no data centre. The computing capability of a single device is now so strong that just one small device can do almost any kind of analysis if you set it up correctly. For example, a smart edge device the size of a smartphone can be installed on any telephone post or power pole, outdoors or indoors, even in a supermarket. That node can be used to collect and analyse data for facial recognition, object detection, object recognition, traffic flow, human traffic flow and so on. The rise in demographic data analysis is due to all these end-use cases, and also the fact that data can now be collected at a more granular level.
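As a concrete illustration of the kind of analysis loop such an edge node might run, here is a minimal Python sketch using OpenCV's bundled Haar-cascade face detector as a stand-in for the heavier recognition and detection models mentioned above. The camera index, the frame budget and the idea of shipping only aggregates off the device are assumptions for the example, not details from the interview.

```python
import cv2

# Illustrative sketch: OpenCV's stock Haar cascade stands in for a real
# detection model on the edge device.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)     # camera attached to the edge node (assumed index 0)
counts = []                   # per-frame face counts, a proxy for foot traffic

for _ in range(300):          # bounded frame budget for the sketch
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    counts.append(len(faces))  # analyse locally; only aggregates leave the node

cap.release()
print(f"mean faces per frame: {sum(counts) / max(len(counts), 1):.2f}")
```

The point of the pattern is that raw frames never have to leave the device; the node does the analysis in place and reports only the aggregated result.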
By more granular, I mean that in the past, when you collected data, it was straightforward in terms of data structure. Data collection was focused on just a few simple data points, for example gender, race and maybe skin colour. Now almost all the end-use cases are moving to 3D or multi-dimensional structures. This more complex data collection means that a set of demographic data can include facial features, race, ethnicity, skin colour, emotions, attire, age, height, weight and even the speed at which a person is walking.
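To make the contrast concrete, a single observation in such a pipeline might look like the hypothetical Python record below; every field name is illustrative rather than taken from any specific system mentioned in the interview.

```python
from dataclasses import dataclass

# Hypothetical record type: one detection now carries many dimensions,
# instead of the two or three attributes older pipelines captured.
@dataclass
class DemographicObservation:
    estimated_age: int
    estimated_height_cm: float
    estimated_weight_kg: float
    emotion: str              # e.g. "neutral", "happy"
    attire: str               # e.g. "business", "casual"
    walking_speed_mps: float  # metres per second
    face_embedding: list      # high-dimensional facial-feature vector

obs = DemographicObservation(34, 172.0, 68.0, "neutral", "casual", 1.3,
                             [0.12, -0.40, 0.88])  # embedding truncated
```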
AI and HPC end-use cases are seeing an increasing trend towards demographic data, because we can now collect more data, the data is becoming more complex, and compute nodes are getting more and more powerful. The fact that data is becoming more complex also applies to other areas of HPC research. When you look at HPC as a whole, in terms of scientific research and scientific simulation, all the trends have one thing in common: data is becoming more complex. We now have the databases and data systems that can host that data, and we have the compute units, such as CPUs and GPUs, with the right software to analyse it.
How do virtualisation technologies fit into HPC research?

Virtualisation is interrelated with HPC in the sense that HPC used to be based on a monolithic compute architecture: you deploy heavy analysis machines, use them purely as compute nodes and, without much virtualisation, launch the workload on the entire cluster. This is my own opinion and doesn't represent Gigabyte, but I think you will see more and more granular hardware designs in GPU technologies that allow for virtualisation and also physical isolation of GPU compute units within the GPU device. Take a GPU card as an example: when we used to virtualise a GPU card, it was virtualised as an entire device. Because the GPU architecture is the same for all the compute units inside, you can run mostly the same kinds of workloads; it doesn't matter which virtual machine you use or which piece of hardware you get, because the GPU hardware, and the compute units inside it, are the same. What we are going to see now is a GPU architecture trend towards different compute units, different sizes and different types of memory management within the same GPU card. This means you can isolate the GPU hardware, allocating different parts of the same GPU to different kinds of users running different kinds of data analysis.
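One shipping example of this kind of partitioning is NVIDIA's Multi-Instance GPU (MIG) feature, which splits a single card into hardware-isolated instances, each with its own compute units and memory. The sketch below drives it from Python through the nvidia-smi command-line tool. It assumes an A100-class GPU and root privileges, and the profile ID 19 (the 1g.5gb slice on an A100-40GB) is hardware-dependent, so treat this as an illustration of the workflow rather than a recipe.

```python
import subprocess

def run(cmd: str) -> str:
    """Run a shell command, raising on failure, and return its stdout."""
    return subprocess.run(cmd, shell=True, check=True,
                          capture_output=True, text=True).stdout

# Enable MIG mode on GPU 0 (requires root; the GPU may need a reset).
run("nvidia-smi -i 0 -mig 1")

# List the GPU instance profiles this card supports; sizes vary by model.
print(run("nvidia-smi mig -i 0 -lgip"))

# Create two GPU instances from profile 19 ("1g.5gb" on an A100-40GB,
# hardware-dependent) and a default compute instance on each (-C). Each
# instance is physically isolated and can go to a different user or VM.
run("nvidia-smi mig -i 0 -cgi 19,19 -C")

# Confirm the partitions; each appears to schedulers as its own device.
print(run("nvidia-smi mig -i 0 -lgi"))
```

Each instance then shows up as a separate device, which is exactly the "different parts of the same GPU for different users" pattern described above.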
What are the challenges facing HPC and AI hardware development?

There are really two aspects to this question. The first aspect is the physical limitation. As you know, most of our semiconductors today are based on silicon, so in terms of physics, the existing semiconductor materials