


One of the more interesting consequences of the dramatic increases in computing power over recent years, specifically the increase in power of graphics processing units (GPUs), has been the arrival of photorealistic, real-time CG.

The industry has followed a stratospheric curve as a result, one that has its roots in clunky rendering times for even primitive polygons and ends, for now, with real-time raytracing and the ability to generate utterly convincing scenes, animations and virtual sets on the fly. But while the 3D output can now be generated in real-time, the tools we use to create these animations are lagging, still defined by mouse-based, 2D production pipelines. That is partly why the co-Champions from one of 2020's most popular Accelerator projects, CG Animation Production: New Immersive & Real-Time Workflows, return this year with a new challenge looking at creating transmedia content using cutting-edge XR tools and technology with genuine, real-time workflows.

The POC will aim to bring RT-3D – real-time 3D – to life using a combination of immersive XR software and traditional production tools, with a focus on animation or photorealistic live-action output. And it has scored a bit of a coup: the major competing real-time game engines on the market, Unity and Unreal, are working together again in this Accelerator, contributing content feeds that will look to show how output can be optimised, with time and budget efficiencies that scale over 'traditional' methods by using an XR-based pipeline. "I'm pretty sure we're the only public project on the planet that has those two folks in the same room together," comments project lead Matthew McCartney, head of immersive technology at Sky.


"A couple of years ago the only way you could create a 3D asset was sitting down on a laptop with your mouse and you draw it, then you have to manipulate it – it's quite fiddly," says McCartney. "Using software like Masterpiece Studio or Tvori, who are both on board as Participant partners, you put on your VR rig and you're suddenly sculpting – this ability to create a 3D asset in 3D for 3D output is very new." One of the main tasks of the Accelerator is precisely assessing the viability of that pipeline. The effort is split across four creative teams, with the plan being that each will create two pieces of content for IBC: a production diary and the content itself.

These will showcase the variety that can be achieved with this new breed of XR production tool as well as reveal potential synergies between the different workflows (which, McCartney points out, tend to share a common pipeline until they hit a games engine and start being purposed for a specific use). RT-3D workflows bring a range of potential benefits to users. Speed and efficiency are two of the most obvious, and they also allow artists to literally walk around their creations and identify any problems more easily in the process. They are also far more naturalistic to use, which minimises the barriers to entry for new users, whereas the learning curve for traditional digital content creation (DCC) tools remains high. "There's a million of those little, tiny barriers that you have to learn to get even a simple thing created in a DCC tool," says Pixar Animation Studios' Dylan Sisson. "Using VR, you are able to move the camera with your head. That actually lowers the barrier for a lot of people to get into working in digital art; an illustrator can now be part of a digital pipeline when they couldn't before." This is evolving all the time. New for this year is a test workflow that integrates motion capture via Noitom, another new Participant for 2021. This technology allows artists to quickly tie motion into their models for animation purposes: they move their arm, their CG model does the same, and so on.


One of the key tasks for the Accelerator is assessing the advantages this confers and putting actual numbers to the benefits an XR workflow can bring. Grace Dinan, a Viz Artist at Irish broadcaster RTÉ, explains how she is teaming up with Participant Pink Kong Studios, as well as fellow Champion Trinity College Dublin, to conduct a formal user study comparing the costs, time and resources used in XR workflows with traditional DCC-driven ones.

RT-3D Interactive Content Creation for Multi-platform Distribution
Champions: Sky (Project Lead), Pixar, Cartoon Network/Warner Media, Unity Technologies, Unreal/Epic Games, RTE, Trinity College Dublin, Fox Sports, Facebook Reality Labs
Participants: Anchorpoint, Noitom, Pink Kong Studios, Trick3D, Masterpiece Studios

"We're going to do some studies with novices new to animation," she explains. "We will give them a simple task, such as creating a snake moving across the floor, adding textures and lights and rendering it, and measure how easy that is to learn and do. I think that's where we're going to get some really interesting results with these XR tools. We're also going to do a user study with experienced professional animators, assigning them a task such as facial or hand animation and seeing how well they perform it with both methods."

Alongside this collation of empirical evidence, the Accelerator is also going to look at the concept of fidelity. The new breed of XR tools are more capable than ever – and extending their functionality with every point release – but as yet they are not accomplished enough to deliver the very highest quality of CG on their own. McCartney reckons they are between 5% and 10% shy of being the complete article. "Hair, for example – it's very difficult to get hair into strands using XR tools," he says, "so we're still going to need integration with more traditional tools to get things over the line. The Accelerator will provide valuable feedback to developers, indicating where they need to focus their efforts or improve those integrations."

For more information on the IBC Accelerator Media Innovation Programme, supported by Nvidia, visit ibc-accelerator-media-innovation-programme
