PREVISUALISATION

and about 70% of postvis (where assets are integrated with shot footage for final decisions). According to Creative Director and Senior Supervisor Vincent Aupetit, almost 80 people at FPS worked for more than a year and a half on the series’ 1400 VFX shots. “We use information that’s as accurate as possible from LIDAR scans, location scout photos, video references and so on. When we get to techvis we take this even further; we use real dimensions for the sets, and for the equipment like cranes or motion-control,” he continues. “As it’s real-time, you can have a back and forth between the previs version of the shot and the techvis, with all changes auto-updating.”


Top: Previs frame created in Unreal Engine of Stormtroopers in Season 1 of Obi-Wan Kenobi. Image copyright Lucasfilm and courtesy of The Third Floor


Everything that FPS does in Unreal sessions is tracked and recorded; the company is working on integration between Shotgun and Unreal to hone this. “You want to be able to refer to and act upon all the decisions the director takes during the sessions,” says Aupetit. As they are on a shared network, FPS and the VFX supervisors at Framestore can keep up with each other’s progress. On Moon Knight, VFX supervisors could see previs environments evolve while VFX animation tests on creatures could be integrated into previs. Later, changes to the final model from VFX could be updated directly in the postvis to keep everything consistent.
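The article does not say how FPS’s Shotgun–Unreal bridge is built, but the underlying idea of making review-session decisions referenceable can be sketched with the publicly available shotgun_api3 Python client (Shotgun is now Autodesk’s Flow Production Tracking). The site URL, credentials, project ID and shot code below are placeholders, and this is an illustrative sketch rather than FPS’s pipeline:

```python
import shotgun_api3  # official ShotGrid/Shotgun Python API client

# Placeholder site and credentials - a real studio would use its own
# ShotGrid site and a script key managed by the pipeline team.
sg = shotgun_api3.Shotgun(
    "https://yourstudio.shotgunstudio.com",
    script_name="previs_session_logger",
    api_key="REPLACE_ME",
)

def log_session_decision(project_id: int, shot_code: str, decision: str) -> dict:
    """Record a director's decision from an Unreal previs session as a Note on a shot."""
    shot = sg.find_one(
        "Shot",
        [["project", "is", {"type": "Project", "id": project_id}],
         ["code", "is", shot_code]],
    )
    if shot is None:
        raise ValueError(f"Shot {shot_code!r} not found in project {project_id}")
    return sg.create("Note", {
        "project": {"type": "Project", "id": project_id},
        "subject": f"Previs session decision - {shot_code}",
        "content": decision,
        "note_links": [shot],  # attach the note to the shot it concerns
    })

# Hypothetical usage after a review session:
# log_session_decision(123, "SHOT_0450", "Push the camera 2m closer; crane move starts on the cut.")
```

Logging each decision against the shot entity is one simple way to make “refer to and act upon” possible later, since notes then surface wherever that shot is reviewed.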


Neville Kidd ASC, cinematographer on many VFX-heavy TV shows, uses previs for anything involving a virtual build and describes it as “a game-changer” for all departments, both in how the shot is thought through and for budgets. “You don’t want to find out on the day that the shot physically won’t work because you can’t get the camera through a passageway. It’s also a great indicator of when a shot is too long, your angles are wrong, or when the story is not being told. Fixing that becomes very expensive and very time-consuming afterwards.”

“If you know how everything is going to be structured, you don’t have to show up with an entire truck’s worth of redundant equipment”


Kidd has lensed 16 episodes of Netflix’s The Umbrella Academy, and “prevised it to within an inch of its life”. “Working with Spin FX for season two, we had ten shots to join together into one big seamless sequence,” he explains. “Previs was absolutely essential. We constantly fine-tuned it, so that it was an accurate representation of what was on the screen at the end of the day.”

For season three, DigitalFilm Tree used previs tools from Unity to build virtual sets, dropping in virtual camera models and rigs for the shots to be planned out in finer detail. “Even if you haven’t decided what your cameras are [before previs], it just takes the click of a few buttons in software to change from ‘shooting’ on an Alexa 65 to shooting on an Arri LF,” Kidd says.
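Swapping virtual camera bodies in previs is largely a change of sensor geometry: for the same lens, a wider gate frames a wider angle of view. A minimal sketch of that relationship, using approximate published open-gate sensor widths (worth confirming against ARRI’s spec sheets) and a hypothetical 35 mm lens:

```python
import math

# Approximate open-gate sensor widths in mm (nominal published figures;
# check the manufacturer's spec sheets before real shot planning).
SENSOR_WIDTH_MM = {
    "Alexa 65": 54.12,
    "Alexa LF": 36.70,
}

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal angle of view for a given sensor width and lens focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

focal = 35.0  # hypothetical 35 mm prime
for body, width in SENSOR_WIDTH_MM.items():
    print(f"{body}: {horizontal_fov_deg(width, focal):.1f} deg horizontal FOV at {focal:.0f} mm")
```

At the same focal length the larger format frames noticeably wider, which is exactly the sort of difference a few-button camera swap in the previs scene exposes long before anyone is on set.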


VIRTUAL VOLUMES

Previs can be “incredibly useful” in helping decide how you’re going to shoot things in advance, especially when virtual production could be in the mix, says Kidd: “You realise very quickly what needs to be shot for real in camera, what needs to be shot on green screen or blue screen, and then what needs to be shot within an LED volume. That’s all super helpful at the very early stage of production.”

Using virtual production during previs sees “the process of post-production and content finalisation accelerated, saving time and ultimately money”, according to Sebastian Leske, Product Manager, Cinema, XDCAM, Virtual Production, Sony Europe. “Colour matching and accuracy can be done between the scene elements, the LED walls and the cameras, especially when using the Sony Venice with Sony CLEDs. This reduces the heavy lifting in colour matching and compensation normally done in post.”
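Leske does not detail how the Venice-to-CLED matching is implemented, but the general idea behind this kind of colour matching can be illustrated with a toy example: photograph a chart displayed on the wall, then fit a 3x3 matrix mapping the captured patch colours onto their reference values. The patch data below is invented purely for illustration:

```python
import numpy as np

# Hypothetical linear RGB measurements of chart patches shown on the LED wall,
# as captured by the cinema camera (one row per patch).
captured = np.array([
    [0.18, 0.17, 0.16],
    [0.45, 0.20, 0.19],
    [0.21, 0.48, 0.22],
    [0.19, 0.21, 0.52],
    [0.70, 0.68, 0.65],
    [0.05, 0.05, 0.06],
])

# Reference linear RGB values the same patches should have in the final image.
reference = np.array([
    [0.18, 0.18, 0.18],
    [0.50, 0.20, 0.18],
    [0.20, 0.50, 0.20],
    [0.18, 0.20, 0.55],
    [0.72, 0.72, 0.72],
    [0.05, 0.05, 0.05],
])

# Least-squares fit of a 3x3 matrix M so that captured @ M approximates reference.
M, *_ = np.linalg.lstsq(captured, reference, rcond=None)

def correct(rgb: np.ndarray) -> np.ndarray:
    """Apply the fitted correction to an (N, 3) array of linear RGB values."""
    return rgb @ M

print(np.round(correct(captured) - reference, 3))  # residual error per patch
```

Getting that correction right on set, rather than as a compositing fix, is the “heavy lifting” the quote says is being moved out of post.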



