Case Study


sensors can increase motion blur, but you can compensate for that later in software, and if users will be recording video in low-light situations, you’ll need image sensors that can better capture light.


• Motion sensors. The best video stabilization performance requires accurate data from a true, hardware-based gyroscope. So-called “pseudo gyros,” which attempt to emulate mechanical sensors in software, aren’t exact enough for reliable stabilization.


2. Collect the necessary metadata

With your hardware platform chosen, it’s time to integrate the video stabilization software and check that you have the metadata required for effective stabilization. Metadata is crucial to ensuring the software algorithms are in sync with the hardware.


If your metadata is accurate from the start, calibration will be easier. Inaccuracies in everything from sensors and lenses to processors can cause stabilization errors down the line. For example, your product’s ability to generate accurate metadata about camera movement and zoom can make or break video stabilization performance. Moreover, metadata about parameters such as frame exposure and rolling shutter times can speed up the calibration phase.

In general, make sure you’ve got metadata for sensor and processor specifications, frame exposure, rolling shutter time, camera movements (for instance, due to rotation), lens position (autofocus and optical image stabilization), alignment relative to the horizon, and zoom (both optical and digital). And ensure the system generates that metadata frequently enough to be useful. For example, getting metadata about lens position once a second (a frequency of 1 Hz) isn’t good enough; depending on the use case, accurate lens position metadata needs updating at a rate of 100 to 400 Hz.
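
To make those rate requirements concrete, here is a minimal C sketch of a per-frame metadata record and a check that lens-position samples arrive at 100 Hz or more. The record layout, field names, and function names are illustrative assumptions, not the API of any particular stabilization SDK.

```c
/*
 * Hypothetical per-frame stabilization metadata and a sample-rate check.
 * The 100 Hz lens-position threshold follows the figure quoted in the text.
 */
#include <stdint.h>
#include <stdbool.h>
#include <stddef.h>

typedef struct {
    uint64_t timestamp_us;        /* capture time of this sample              */
    float    gyro_dps[3];         /* angular rate (deg/s) from a hardware gyro */
    float    lens_position;       /* AF/OIS lens position, normalized 0..1     */
    uint32_t exposure_us;         /* frame exposure time                       */
    uint32_t rolling_shutter_us;  /* rolling shutter (readout) time            */
    float    zoom_ratio;          /* combined optical + digital zoom           */
} frame_metadata_t;

/* Estimate the average update rate (Hz) of a stream of metadata samples. */
static float metadata_rate_hz(const frame_metadata_t *samples, size_t count)
{
    if (count < 2)
        return 0.0f;
    uint64_t span_us = samples[count - 1].timestamp_us - samples[0].timestamp_us;
    if (span_us == 0)
        return 0.0f;
    return (float)(count - 1) * 1e6f / (float)span_us;
}

/* Lens-position metadata at 1 Hz is not enough; 100-400 Hz is the target. */
static bool lens_metadata_fast_enough(const frame_metadata_t *samples, size_t count)
{
    return metadata_rate_hz(samples, count) >= 100.0f;
}
```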


3. Calibrate your camera system

When it comes to video stabilization, calibration is essential. This is the time to check that the metadata and hardware are in sync and to correct any performance issues. Calibration can go a long way toward improving video quality in general, and video stabilization performance specifically. For example, the metadata from your gyro sensor must match that of the captured image, or video stabilization won’t work properly.




There are several calibration tests you’ll want to run, starting with the gyro sensor. Is it working and generating the right data? Is its zero-rate offset low, meaning that the rotation rate it reports while lying still is close to zero? You’ll also want to test that the hardware is running the algorithms fast enough to support real-time processing, if you’ve chosen that route. And check that time stamp and exposure time metadata are correct.
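
A zero-rate offset check like the one described above could be sketched as follows, assuming gyro samples captured while the device lies completely still. The function names and the pass/fail limit passed in degrees per second are illustrative; the real limit should come from the gyro’s datasheet and your stabilization requirements.

```c
/* Sketch of a zero-rate-offset check for the gyro calibration test. */
#include <math.h>
#include <stdbool.h>
#include <stddef.h>

/* Average the gyro readings (deg/s) captured while the device lies still. */
static void gyro_zero_rate_offset(const float (*samples_dps)[3], size_t count,
                                  float offset_dps[3])
{
    offset_dps[0] = offset_dps[1] = offset_dps[2] = 0.0f;
    if (count == 0)
        return;
    for (size_t i = 0; i < count; i++)
        for (int axis = 0; axis < 3; axis++)
            offset_dps[axis] += samples_dps[i][axis];
    for (int axis = 0; axis < 3; axis++)
        offset_dps[axis] /= (float)count;
}

/* Pass if the rotation rate reported at rest stays below the chosen limit. */
static bool zero_rate_offset_ok(const float offset_dps[3], float limit_dps)
{
    float magnitude = sqrtf(offset_dps[0] * offset_dps[0] +
                            offset_dps[1] * offset_dps[1] +
                            offset_dps[2] * offset_dps[2]);
    return magnitude < limit_dps;
}
```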


4. Tune the system to your particular use case


If calibration involves general testing to see whether a camera product’s software and hardware are in sync, tuning involves adjusting parameters to suit specific needs. In doing so, tuning requires production teams to set priorities and make compromises depending on how they expect customers to use their products. For example, video stabilization reduces resolution. Using a higher-resolution sensor could help compensate, but it might also drain battery power faster, so the team needs to decide which is the higher priority for the use case.

Other common video stabilization-related tuning considerations include:

• Battery life vs. video stabilization performance. Real-time processing can take a toll on battery life, which may not be an issue for brief video recording but could impact long-range drone surveillance or lengthy video calls.


• Motion blur vs. image noise. Reducing motion blur can generate image artifacts, especially in low light.


• Video stabilization vs. field of view. Video stabilization reduces a camera’s field of view, so if the use case requires a wider field of view, you may need to tune down video stabilization; the sketch below gives a feel for how the crop margin translates into lost field of view.
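
As a rough illustration of that field-of-view cost, the sketch below assumes a simple rectilinear lens model and computes the effective field of view left after reserving a stabilization crop margin. The 120-degree lens and 10 per cent margin in the example are illustrative values, not figures from the article.

```c
#include <math.h>
#include <stdio.h>

/* Effective horizontal FOV (degrees) after reserving a stabilization crop
 * margin on each side of the frame, assuming a simple rectilinear lens. */
static double stabilized_fov_deg(double full_fov_deg, double margin_per_side)
{
    const double pi = acos(-1.0);
    double half_angle_rad = full_fov_deg * pi / 360.0;   /* half of the full FOV   */
    double usable_width   = 1.0 - 2.0 * margin_per_side; /* fraction of frame kept */
    return 2.0 * atan(usable_width * tan(half_angle_rad)) * 180.0 / pi;
}

int main(void)
{
    /* Illustrative values: a 120-degree lens with a 10% crop margin per side
     * keeps roughly 108 degrees of the original field of view. */
    printf("Stabilized FOV: %.1f degrees\n", stabilized_fov_deg(120.0, 0.10));
    return 0;
}
```
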
In the end, incorporating video stabilization software into a camera product takes work and planning, but it doesn’t have to be difficult. The right tools, including a robust software development kit, can make integrating video stabilization into the production process much easier. Done right, video stabilization results in smoother, more fit-to-purpose video output for a wider variety of end users.

For more information on achieving better video stabilization, go to https://content.weareimint.com/video-stabilization-in-product-development


Components in Electronics, December/January 2021

