PRODUCTION/POST


VIRTUAL PRODUCTION 2.0




“This should allow a DoP or technician to run that process themselves, make those LUTs and get them into their system without having to send a file off somewhere to be converted and brought back. That’s a massive efficiency in the hands of a lot more people.”


TRACKS WITH NO TEARS

Also at the Sony VP space, Ncam tracking hardware locks what the camera sees and captures to the content rendered by Unreal Engine on the LED screen. Andy Newham, Sales Director EMEA at Ncam Technologies, says the optical tracking company also takes on the challenge of VP in more demanding environments: “If you’re going on location, you don’t necessarily have the luxury of large lighting racks, blackout screens, and lots of points where you can use infrared and controlled environments. We can now provide optical tracking from just the natural features, as we do for sports broadcasting. On a snow-covered mountain, for example, there aren’t a lot of reference points to track from; instead, we crank up the gamma to create as much contrast as possible, literally tracking off the shadows in the snow. We can apply that same methodology for cinema; you can shoot on location and provide that absolute pinpoint accuracy on the tracking in previz, which then gets taken into post for the overlay of more advanced graphics.”
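Ncam doesn’t publish the details of that pipeline, but the general idea — lifting gamma so a feature tracker can lock onto faint detail in a low-contrast scene — can be sketched in a few lines. The OpenCV calls below are real; the gamma value and detector parameters are illustrative assumptions, not Ncam’s settings.

```python
import cv2
import numpy as np

def gamma_lut(gamma: float) -> np.ndarray:
    """256-entry lookup table: out = (in/255) ** (1/gamma) * 255.

    gamma > 1 lifts the shadows, spreading faint detail (shadows
    on snow, say) across more of the tonal range before tracking.
    """
    inv = 1.0 / gamma
    table = ((np.arange(256) / 255.0) ** inv) * 255.0
    return table.astype(np.uint8)

def find_natural_features(frame_bgr, gamma=2.2, max_corners=400):
    # Greyscale, then contrast boost via the gamma LUT.
    grey = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    boosted = cv2.LUT(grey, gamma_lut(gamma))
    # Shi-Tomasi corner detection picks trackable natural features.
    pts = cv2.goodFeaturesToTrack(boosted, maxCorners=max_corners,
                                  qualityLevel=0.01, minDistance=7)
    return boosted, pts

def track(prev_grey, next_grey, prev_pts):
    # Pyramidal Lucas-Kanade optical flow follows the features from
    # frame to frame; 'status' flags the points it managed to keep.
    next_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_grey, next_grey, prev_pts, None)
    return next_pts[status.ravel() == 1]
```

A production camera tracker solves for full 6-DoF camera pose from correspondences like these; the sketch stops at the feature stage the quote describes.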


The latest initiative from Mo-Sys is the VP Pro XR suite. “It runs inside the Unreal Editor and leverages Epic’s nDisplay to generate the best possible quality content on the LED wall at the lowest latency,” says Mo-Sys Technical Director James Uren. “Tools include Cinematic XR Focus, a hardware integration with lens motors to enable focus pulls from real to virtual; MoView, switched multi-camera XR with a director’s preview for live broadcast; set extensions with auto-colour matching and AR objects; and tracking and lens data recording to enable further VFX in post or Mo-Sys NearTime rendering.”


The company has also redesigned its StarTracker tool specifically to support narrative filming workflows.


“StarTracker Max delivers even greater accuracy and [offers a] smaller and lighter on-camera unit, which makes it incredibly convenient for use with Steadicam and handheld cameras,” says Uren, who adds that the company may introduce a web interface to further enhance StarTracker Max’s capabilities.


Zero Density has released the Traxis Camera Tracker, a central hub that manages tracking and lens data from multiple vendors and provides fast and accurate lens calibration. “This means there’s no need for time-consuming setups to get a photoreal result — even when tracking challenging close-ups, quick camera movements, and crane rigs,” says Zero Density Chief Marketing Officer Ralf van Vegten.


Also from Zero Density is the upgraded Traxis Talent Tracker, which uses AI to identify individuals within a 3D environment. “This helps to track multiple people without any wearables, easily generating accurate reflections, refractions, and virtual shadows in real time,” says van Vegten. “The Talent Tracker makes it easy to blend the real and virtual worlds together seamlessly.”


ILLUMINATING IDEAS

“Traditionally, VFX, lighting and set were three completely different disciplines. What we’re seeing now is integration,” says Simon Evans of Sumolight. “You need a portfolio of tools to make it work really efficiently. [The section of the set] that the camera is actually seeing needs to be a high-definition video wall, but the rest of it is all about textures, textured infill, and reflected surfaces, which our Sumo Sky and Sumo Max do brilliantly. They can also be pixel-mapped, so people can then run effects through them. It becomes a lighting VFX element on set.”
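Pixel-mapping here simply means treating each LED in the rig as one pixel of a very low-resolution display and sampling an effect frame at each fixture’s position. Sumolight’s actual control pipeline isn’t described in the article; the layout, names and dimensions below are hypothetical, a minimal sketch of the sampling step only.

```python
import numpy as np

# Hypothetical rig: 8 vertical LED bars of 64 pixels each, spread
# across a wall. Positions are normalised (0..1) wall coordinates.
BARS, PIXELS = 8, 64
bar_x = np.linspace(0.05, 0.95, BARS)
pix_y = np.linspace(0.0, 1.0, PIXELS)

def pixel_map(effect_frame: np.ndarray) -> np.ndarray:
    """Sample an effect frame (H x W x 3, uint8) at every LED position.

    Returns a (BARS, PIXELS, 3) array of RGB values ready to push to
    the fixtures over whatever control protocol the rig speaks.
    """
    h, w, _ = effect_frame.shape
    cols = (bar_x * (w - 1)).astype(int)   # one image column per bar
    rows = (pix_y * (h - 1)).astype(int)   # one image row per LED
    # Fancy-index rows x cols, then reorder to bar-major layout.
    return effect_frame[np.ix_(rows, cols)].transpose(1, 0, 2)
```

Moving the bars closer together or further apart, as Evans describes, just changes how densely the same effect frame is sampled.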


The Sumo Sky is a flexible array of LEDs, specially created to work with virtual production. “You get a huge amount of output, really good high colour quality, [quickly configured] in so many different ways,” says Evans. “You can squish the bars together and have it as relatively high resolution, or you can space them apart and have relatively low resolution. You can have a



