22 TVBEurope MAM Forum 2013


The MAM is just the beginning


FOR BROADCASTERS, the volume of digital media in their libraries never stops growing. Broadcasters are producing new digital content every day. And those that have progressed to the point of digitising their legacy audio and video assets, sometimes decades’ worth, could be dealing with hundreds of thousands of hours of content, and counting. That’s a daunting amount of data to organise and manage. It becomes even more daunting when you consider that broadcasters who don’t organise and manage their data well could be leaving money on the table. Why? Because the point, of course, is to repurpose the assets. Whether the aim is to reach new audiences, serve the second (or third) screen, create completely new programming, prove compliance, or some combination of those, the ultimate goal for many broadcasters is to monetise assets that would otherwise languish in their media libraries. To do it, many broadcasters have chosen to use a media asset management (MAM) system.

After all, if you can’t find it, then you don’t have it, and the purpose of a MAM system is to make sure you can always find it. There are many MAM systems to choose from, at a range of price points and with varying features, and choosing the right one can take a lot of research and consideration. Once the MAM application is in place, it’s tempting to think that the process is over and that discovery problems are solved, but for many media managers, the MAM is just the beginning. A good MAM system is a critical part of any file-based media operation, to be sure, but it’s only as good as the metadata that goes into it. Without rich, descriptive metadata, a MAM system can be a black hole where media goes in and never comes out. How do you get your assets into the MAM in a way that makes them findable and, ultimately, monetisable?


www.tvbeurope.com December 2013






Two Nexidia products, QC and Dialogue Search, were chosen for our Best Of IBC2013. KJ Kandell, senior director of Nexidia’s media and entertainment division, explains why a poorly managed MAM system is as good as no MAM system at all.


More metadata is better metadata

Assets usually get ingested into a MAM system with basic information such as filename, file type, date, timecode, and duration. It might also include ratings, descriptions, or relevant keywords. Unfortunately for many broadcasters, that’s where the metadata stops… and the search problems begin. When it comes to metadata, the more you have, the better your chances of finding exactly what you’re looking for, but it’s hard to find what you’re looking for based on simple file attributes alone. It takes additional metadata that describes the content within a given media file, which usually must be entered manually using a logging application. It’s a laborious process that requires someone to watch the video and make notes about it in the logging application, and most media operations simply don’t have the appetite to spend the money and resources it takes to do it regularly and thoroughly. The result: the files in the MAM system often don’t contain enough descriptive metadata for the MAM to be useful. And so they sit unused and, sometimes, forgotten.

“If you can’t find it, then you don’t have it, and the purpose of a MAM system is to make sure you can always find it” KJ Kandell, Nexidia

Nexidia’s Dialogue Search allows for phonetic searching of dialogue across a library

Nexidia’s KJ Kandell: “Without rich, descriptive metadata, a MAM system can be a black hole where media goes in and never comes out”
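To see why file attributes alone fall short, consider a minimal sketch of a MAM-style asset record and keyword search. The field names, the `search` helper, and the sample library are hypothetical illustrations, not how any particular MAM system stores or queries records.

```python
# Hypothetical sketch: an asset record holding only basic ingest metadata
# plus optional descriptive keywords. Not modelled on any real MAM product.
from dataclasses import dataclass, field

@dataclass
class Asset:
    filename: str
    file_type: str
    date: str
    timecode: str
    duration_s: int
    keywords: list[str] = field(default_factory=list)  # descriptive metadata

def search(assets: list[Asset], term: str) -> list[Asset]:
    """Match a term against the filename and any descriptive keywords."""
    term = term.lower()
    return [a for a in assets
            if term in a.filename.lower()
            or any(term in k.lower() for k in a.keywords)]

library = [
    Asset("news_0412.mxf", "MXF", "2013-04-12", "10:00:00:00", 1800,
          keywords=["election", "interview"]),
    # No keywords logged: findable only by its file attributes.
    Asset("doc_ep01.mxf", "MXF", "2013-02-01", "10:00:00:00", 2700),
]
```

With keywords logged, a content query such as “election” finds the news clip; the unlogged documentary can only ever be found by its filename, which is exactly the discovery gap the article describes.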


Finding the right words

Nexidia’s Dialogue Search is based on patented technology that doesn’t rely on descriptive metadata. Instead, the software creates a phonetic index of a media library: a searchable index based not on the information that has been typed into the metadata fields, but on what is actually spoken on the audio tracks. When an asset or library is selected for indexing, Dialogue Search analyses the audio tracks and creates a searchable index of the dialogue. Once that index is created, descriptive metadata is no longer required to search for an asset, and searches are almost instantaneous.
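Nexidia’s indexing technology is proprietary, but the general idea of phonetic indexing can be sketched in a few lines. The toy grapheme-to-phoneme table below stands in for real acoustic analysis (a production system derives phonemes from the audio itself, not from text), and the trigram index is a deliberately simple illustration of why lookups are fast once the index exists.

```python
# Toy sketch of phonetic indexing. The G2P table and PhoneticIndex class are
# hypothetical illustrations, not Nexidia's actual (proprietary) technology.

# Tiny hand-written grapheme-to-phoneme table; real systems extract phonemes
# directly from the audio tracks during indexing.
G2P = {
    "media": ["M", "IY", "D", "IY", "AH"],
    "asset": ["AE", "S", "EH", "T"],
    "management": ["M", "AE", "N", "IH", "JH", "M", "AH", "N", "T"],
}

def to_phonemes(text: str) -> list[str]:
    phones: list[str] = []
    for word in text.lower().split():
        phones.extend(G2P.get(word, []))
    return phones

class PhoneticIndex:
    """Map phoneme trigrams -> (clip_id, position) for fast lookup."""
    def __init__(self, n: int = 3):
        self.n = n
        self.index: dict[tuple, list] = {}

    def add_clip(self, clip_id: str, phones: list[str]) -> None:
        # One-off cost at indexing time: record where each trigram occurs.
        for i in range(len(phones) - self.n + 1):
            gram = tuple(phones[i:i + self.n])
            self.index.setdefault(gram, []).append((clip_id, i))

    def search(self, query: str) -> set:
        # Searches are dictionary lookups, so they stay fast at scale.
        phones = to_phonemes(query)
        hits = set()
        for i in range(len(phones) - self.n + 1):
            for clip_id, _pos in self.index.get(tuple(phones[i:i + self.n]), []):
                hits.add(clip_id)
        return hits
```

Once clips are indexed, a query is converted to the same phoneme representation and matched against the index, so no typed-in metadata is involved at any point.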


Dialogue Search finds any spoken word or phrase across massive media libraries in seconds. It uncovers assets that basic, file-based metadata, even keywords, could never expose. Users type any combination of words or phrases into the Dialogue Search interface, and the application finds any media clip in the system where those words or phrases are spoken. Users can preview results in a video player without having to scroll through numerous clips to find a specific sound bite. After factoring in the time and cost of properly logging assets to make them searchable, and the potential for assets to go unmonetised without the right metadata, the value of a language-based solution becomes clear. It also explains why the MAM is only the beginning of an integrated workflow that helps users to be more creative and, at the same time, more efficient.

www.nexidia.com

