Mil Tech Trends: Controlling the UAV overhead (Part 1 of 2)
Autonomous ISR software reduces operator workload
An interview with executives from Kitware
EDITOR’S NOTE
Interviewees: Dr. Will Schroeder and Dr. Anthony Hoogs
European citizens are familiar with the ubiquity of cameras monitoring high-traffic public places and roadways. In most cases, video feeds back to operations centers where operators look for specific events. On today’s digital battlefield, ISR sensor data and video from UAS platforms, satellites, aircraft, and ground forces feed into command centers and tactical operations centers (TOCs), where military analysts perform functions similar to those of their civilian counterparts, but with deadly seriousness. One missed event could mean death by IED to a warfighter. Yet how can a human watch and correctly identify threats within hundreds of hours of reconnaissance data? Open source software company Kitware, with some help from DARPA, has the answer. I sat down with executives from Kitware to learn about leading-edge autonomous software. Edited excerpts follow. – Chris A. Ciufo, Editor
›Let’s start by talking about Kitware.
SCHROEDER: We’ve been around since 1998. We’re an open source company, and we develop technologies in the scientific computing arena. In most cases, we give that technology away in the form of open source code available from our website.
›How do you license it, and how do you make any money?
SCHROEDER: It’s a very permissive BSD licensing agreement, which allows people to use it without necessarily giving anything back to us. We make money as a service company: because we give this stuff away, people start using it, and then they need help using it and integrating it into their systems.
›Does your company maintain any intellectual property?
SCHROEDER: It’s minimal. The majority of our “intellectual property” is customer relationships and customer contact lists. We have a small number of patents and some patents pending. There are proprietary software systems we’ve developed for our customers, but we can’t necessarily sell those. We have a couple proprietary products, too. The value in our company is the employees and the knowledge that travels the open source world.
›Which software technology areas do you focus on?
SCHROEDER: We have five key areas. We’re a software company, so we focus a lot on software quality and software process. Around that software core, we have four technology areas. One is computer vision, which Anthony [Hoogs] can talk more about.
HOOGS: Sure. Most of our funding comes from DARPA, which is mostly interested in aerial video. So primarily what we do is have a computer examine the video and determine whether the video contains anything of interest to a video analyst. The overall goal is really to make this enormous amount of video being collected by the military indexable and accessible, so that you can know what’s in it without having humans look at it – until there’s an event that requires operator intervention.

›Let’s set the stage. Are these super high-res cameras? Are there ever image quality issues?

HOOGS: The military uses consumer cameras that are high resolution and create lots of data. Image quality issues can arise, but not because of poor cameras; rather, the scene is miles away from the UAV, such that atmospheric disturbances and imaging conditions have an effect. Additionally, if the camera’s on a moving sensor or platform, … you have issues of parallax: 3D objects in the scene, like a building rising above the ground plane, show apparent motion.

›So the intention is to make this poor-quality video into high-quality video while discerning people and events.

HOOGS: Right. [A computer-flagged event] might be someone implanting IEDs; the computer flags things like that. For example, there might be a Predator sweeping along a road because the analyst is following a vehicle of interest. The camera might sweep right over somebody digging a hole off the side of the road. If the analyst is watching that vehicle, he may not even notice that person digging a hole. The computer, however, doesn’t get distracted by watching the vehicle at the expense of something on the side of the road.

›Is a second or two of video enough to save that portion of the image?

HOOGS: The amount of time required to recognize a certain type of event or action depends on the event. It does vary quite a bit, but typically it’s just a few seconds. We don’t have any kind of control feedback in the systems now, but that’s certainly conceivable.

March/April 2011 MILITARY EMBEDDED SYSTEMS
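Kitware’s actual algorithms are not described in this interview. As a rough illustration of the kind of unattended motion flagging Hoogs alludes to – a computer noticing activity an analyst might miss – the following toy sketch uses simple frame differencing; the function name, thresholds, and synthetic data are all hypothetical, not Kitware’s implementation.

```python
import numpy as np

def flag_activity(frames, diff_threshold=30, min_changed_pixels=50):
    """Return indices of frames whose per-pixel change versus the
    previous frame exceeds a threshold -- a crude stand-in for
    automated event detection in a video stream.

    frames: list of equally shaped 2D uint8 arrays (grayscale frames).
    """
    flagged = []
    for i in range(1, len(frames)):
        # Absolute per-pixel difference between consecutive frames
        diff = np.abs(frames[i].astype(np.int16) - frames[i - 1].astype(np.int16))
        changed = int(np.count_nonzero(diff > diff_threshold))
        if changed >= min_changed_pixels:
            flagged.append(i)  # something moved; queue for the analyst
    return flagged

# Synthetic clip: a static 64x64 scene, with a bright blob (the
# "person digging" in the example above) appearing in frame 3.
frames = [np.zeros((64, 64), dtype=np.uint8) for _ in range(5)]
frames[3][10:20, 10:20] = 255  # 100 pixels change abruptly
print(flag_activity(frames))   # flags frame 3 (blob appears) and 4 (blob vanishes)
```

In real aerial video the platform itself moves, so frames would first need registration and stabilization before differencing could isolate scene motion – precisely the parallax problem Hoogs raises above.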