TEST, SAFETY & SYSTEMS
Above: Mcity is a facility at the University of Michigan purpose-built for testing connected and automated vehicles and technologies
Left: Creating training datasets involves the production of highly detailed ground truth
grounds, public roads, different types of racetracks and testing circuits, as well as all different types of specific testing locations and challenging environments. So far, the company has built 15 private proving grounds for 11 different OEM and Tier One companies across the world, having scanned and digitally built their own in-house testing locations.

SYNTHETIC TRAINING DATA
"We help to connect all the different parts of a simulation together," Daley explains. "In the past, it was humans driving in a simulation, connected to vehicle models. But now we have sensors. Our job is to develop highly detailed and accurately engineered inputs into virtual sensors to create a complex environment within which we can interact, involving traffic, other vehicles and road users and so on. RFpro is the central simulation that enables all of those things to be connected together, and we do this for many different types of use cases." Many systems are now AI-based, requiring huge volumes of training data to work effectively. This involves not only virtual simulation but also the production of highly detailed ground truth: semantic segmentations and depth buffers that must be joined together to make a dataset on which to train the AI.

"So what are the key things we need when making synthetic training data?" asks Daley. "We need locations, we need virtual models of physical-world places, and we then need to fill them with lots of different types of traffic. And that's not just your simple passenger cars, but also emergency vehicles, utility vehicles, things that shine and reflect. We need all of the complexities that drivers and vehicles will experience on real roads in the simulation. The next stage is to fill the virtual model with different objects, like pedestrians, flashing lights, different roadsides and so on. Then we have to make all of this into something useful, so we build scenarios. You can do that manually, by placing cones, for instance, or you can write Python scripts to automatically change the size and shape of locations, rotate objects or change their colours. Doing this, you can start to create the thousands upon thousands of different tests and variations of synthetic training data required for your AI models."
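The scripted variation Daley describes can be sketched in a few lines of Python. This is our own illustration, not RFpro's actual scripting API: the scenario structure, field names and variation ranges below are all assumptions, chosen only to show how one base scenario can be fanned out into thousands of randomised variants.

```python
import random

# Hypothetical scenario description: RFpro's real API is not shown in the
# article, so plain dictionaries stand in for its scenario objects.
COLOURS = ["red", "white", "yellow"]
ROTATIONS_DEG = [0, 90, 180, 270]
SCALES = [0.8, 1.0, 1.2]

def generate_variants(base_scenario, n, seed=0):
    """Produce n randomised variants of a base scenario description."""
    rng = random.Random(seed)  # seeded so the variant set is reproducible
    variants = []
    for i in range(n):
        variant = dict(base_scenario)
        variant["id"] = f"{base_scenario['id']}_v{i:04d}"
        # Re-randomise each placed object's rotation, scale and colour.
        variant["objects"] = [
            {
                **obj,
                "rotation_deg": rng.choice(ROTATIONS_DEG),
                "scale": rng.choice(SCALES),
                "colour": rng.choice(COLOURS),
            }
            for obj in base_scenario["objects"]
        ]
        variants.append(variant)
    return variants

base = {
    "id": "roadworks",
    "location": "proving_ground_01",
    "objects": [{"type": "cone"}, {"type": "utility_vehicle"}],
}

variants = generate_variants(base, n=1000)
print(len(variants), variants[0]["id"])
```

Each variant keeps the base layout but re-rolls the appearance parameters, which is how a single hand-built scenario becomes the "thousands upon thousands" of test cases mentioned above.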
SUPERIOR SENSORS
Within this synthetic training environment, you not only need cameras but also lidar and radar models too. All of these sensors need to be physically modelled, including the lens, colour filters, arrays, CMOS chips, ISP and so on. Building these up produces highly reactive scenes capable of full HDR simulation, where you can input the multiple sampling frames going into each part of a camera simulation and physically model the way the sensor opens over time. This allows the addition of motion blur, rolling-shutter effects, LED flicker, exposures and other effects that can influence the sensor's identification and perception capabilities.

"Once you have developed the sensors you need, you can start to develop the training data to sit alongside them," Daley adds. "It all works together: automatic 2D and 3D bounding boxes, semantic segmentation, all of the objects identified in your training data. This is why simulation is a more powerful tool than relying on physical testing alone. Synthetic training data has taken off massively in the past few years, and is now at an engineering level where it can make a real difference to the perception testing of your systems."
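The rolling-shutter effect mentioned above comes from modelling the sensor's readout timing. The toy sketch below is an assumption of ours, not RFpro's sensor model: it only shows the geometric core of the effect, namely that each image row is read out slightly later than the one above it, so a horizontally moving object is recorded sheared.

```python
# Toy rolling-shutter model (illustrative values, not from the article).
ROWS = 8
ROW_READOUT_S = 0.001      # 1 ms between successive row readouts
SPEED_PX_PER_S = 2000.0    # object moving 2000 px/s to the right

def captured_column(start_col, row):
    """Column where a moving vertical edge lands on a given row."""
    t = row * ROW_READOUT_S          # this row's readout time
    return start_col + SPEED_PX_PER_S * t

cols = [captured_column(100.0, r) for r in range(ROWS)]
# The edge drifts by about 2 px per row, turning a vertical edge into a slant.
print(cols)
```

A full sensor model would layer exposure windows, motion blur and LED flicker on top of the same per-row timing idea.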
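The "automatic 2D bounding boxes" Daley mentions fall straight out of the segmentation ground truth: once every pixel carries a class label, a box is just the extent of each label. The sketch below is our own minimal illustration of that idea, not RFpro's pipeline, using a tiny hand-written mask in place of a rendered frame.

```python
def bounding_boxes(mask):
    """Derive 2D boxes from a per-pixel class-label mask.

    mask: 2D list of class ids (0 = background).
    Returns {class_id: (min_row, min_col, max_row, max_col)}.
    """
    boxes = {}
    for r, row in enumerate(mask):
        for c, cls in enumerate(row):
            if cls == 0:
                continue  # background pixels get no box
            if cls not in boxes:
                boxes[cls] = (r, c, r, c)
            else:
                r0, c0, r1, c1 = boxes[cls]
                boxes[cls] = (min(r0, r), min(c0, c), max(r1, r), max(c1, c))
    return boxes

# A 4x5 mask with two labelled objects (1 and 2) on background 0.
mask = [
    [0, 0, 1, 1, 0],
    [0, 0, 1, 1, 0],
    [2, 0, 0, 0, 0],
    [2, 2, 0, 0, 0],
]
print(bounding_boxes(mask))
```

Because the simulator already knows every object's identity per pixel, labels like these come out exact and free, which is the advantage over hand-annotating physical test footage.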
For more information, visit
www.rfpro.com
www.engineerlive.com