VFX: Making the Unreal into a reality

Last month, Epic Games took the covers off an early iteration of its latest game engine, Unreal Engine 5, and pretty much broke the parts of the internet that are interested in gaming. Running on a developer version of the PlayStation 5 hardware, the results uploaded to Vimeo looked fabulous enough to begin with, but when you realised that this was real-time gameplay footage they were genuinely jaw-dropping.

The visual quality is pretty much unmatched by anything apart from current high-end, pre-rendered VFX work, and it showcases incredible levels of detail alongside genuinely photorealistic lighting. And, of course, what's even better is that news of UE5 also lit up the parts of the internet that deal with broadcast graphics, film effects, virtual studios and pretty much everywhere else concerned with top-quality visuals.

While other graphics engines such as Unity are, of course, available, Epic has pulled off the neat trick of combining a leading feature set with ease of use and a business model that encourages UE's use in other software. For instance, any game or software maker that uses Unreal for commercial purposes doesn't pay any licence fees until the software's gross revenue hits a $1m barrier (recently raised from $50k), and this has helped drive its widespread integration into a whole host of technologies.

Its use throughout the graphics stack is analogous to the way that hooking up to the IT industry as a whole has accelerated development across the broadcast sector; it allows a comparatively small industry to piggyback on the development efforts of a much larger one, and the speed of change we're seeing as a result is impressive.

"When the new Unreal Engine 5 comes out you won't be able to tell what's real and what's not," comments Phil Ventre, VP of sports and broadcast at Ncam Technologies. "Games engine integration is democratising the way that companies use AR and VR, and it's not just going to be a technology for the Tier One broadcasters in the future."

Changing the game

While the new demo was partly created to show off some of the very clever new technology in the forthcoming PS5, such as its literally game-changing M.2 solid-state drive, UE5 is showcasing some new technologies that dramatically move the goalposts for real-time CG work. There are two in particular worth mentioning: Nanite and Lumen.

Nanite is a new virtualized micropolygon geometry system that essentially lets artists create as much geometric detail as they want. It is streamed and scaled in real time, an important consideration when you're planning an engine that will run on everything down to a smartphone.
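To make the idea concrete, here is a minimal, self-contained sketch of the principle behind virtualized geometry: stream only as much triangle detail as the camera can actually resolve. Everything in it (the ClusterLOD struct, the one-pixel error target, the numbers) is an illustrative assumption, not Epic's actual Nanite implementation.

```cpp
// Illustrative sketch only: pick the coarsest level of detail whose projected
// screen-space error stays under roughly one pixel, so a distant asset streams
// far fewer triangles than the same asset filling the frame.
#include <cmath>
#include <cstdio>
#include <vector>

struct ClusterLOD {
    double geometricErrorWorldUnits; // simplification error of this LOD, in world units
    long long triangleCount;
};

// Project a world-space error to pixels for a simple pinhole camera model.
double ScreenSpaceErrorPixels(double worldError, double distance,
                              double verticalFovRadians, double screenHeightPixels) {
    double pixelsPerWorldUnit =
        screenHeightPixels / (2.0 * distance * std::tan(verticalFovRadians / 2.0));
    return worldError * pixelsPerWorldUnit;
}

// Choose the coarsest LOD (lods are ordered coarse -> fine) that meets the error target.
const ClusterLOD& SelectLOD(const std::vector<ClusterLOD>& lods, double distance,
                            double fov, double screenHeight, double targetErrorPixels = 1.0) {
    for (const ClusterLOD& lod : lods) {
        if (ScreenSpaceErrorPixels(lod.geometricErrorWorldUnits, distance, fov, screenHeight)
            <= targetErrorPixels)
            return lod;
    }
    return lods.back(); // fall back to the finest detail available
}

int main() {
    std::vector<ClusterLOD> lods = {{0.50, 10'000}, {0.05, 1'000'000}, {0.005, 100'000'000}};
    for (double distance : {2.0, 20.0, 200.0}) {
        const ClusterLOD& chosen = SelectLOD(lods, distance, 1.0 /*rad FOV*/, 2160.0);
        std::printf("distance %6.1f -> %lld triangles\n", distance, chosen.triangleCount);
    }
}
```

The same selection logic runs continuously as the camera moves, which is why the source asset's full polygon count stops being the limiting factor.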

Then there's Lumen. This is a fully dynamic global illumination system that immediately reacts to scene and light changes, and is part of the secret of making game graphics look so good running in-console. It's capable of rendering diffuse inter-reflection with infinite bounces and indirect specular reflections in what Epic calls "huge, detailed environments", at scales ranging from kilometres to millimetres. It adapts too: blow a hole in a wall and the scene will change to accommodate the light coming through the hole.
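Again, a hedged sketch rather than how Lumen actually works internally: the defining property of fully dynamic global illumination is that indirect light is re-solved from the live scene every frame instead of being baked offline. The two-patch scene, albedo values and form factor below are invented purely for illustration.

```cpp
// Illustrative sketch only (not Epic's Lumen): multi-bounce diffuse inter-reflection
// solved iteratively from the current scene state. Repeating the bounce pass
// approaches the "infinite bounces" result, and changing any patch (say, removing
// a wall) changes the answer on the very next frame - nothing is precomputed.
#include <cstdio>
#include <vector>

struct Patch {
    double albedo;   // fraction of incoming light the surface re-emits diffusely
    double emitted;  // direct light arriving this frame from the current lights
    double radiance; // total light leaving the patch, updated per bounce pass
};

// One bounce pass: each patch is re-lit by light leaving the others.
void BouncePass(std::vector<Patch>& p, double formFactor) {
    std::vector<double> next(p.size());
    for (size_t i = 0; i < p.size(); ++i) {
        double incomingIndirect = 0.0;
        for (size_t j = 0; j < p.size(); ++j)
            if (j != i) incomingIndirect += p[j].radiance * formFactor;
        next[i] = p[i].emitted + p[i].albedo * incomingIndirect;
    }
    for (size_t i = 0; i < p.size(); ++i) p[i].radiance = next[i];
}

int main() {
    // A lit wall facing an unlit floor, each patch "seeing" half of the other.
    std::vector<Patch> patches = {{0.8, 1.0, 1.0}, {0.4, 0.0, 0.0}};
    for (int pass = 1; pass <= 5; ++pass) {
        BouncePass(patches, 0.5);
        std::printf("pass %d: wall %.3f, floor %.3f\n",
                    pass, patches[0].radiance, patches[1].radiance);
    }
}
```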

"The ability to light and render hundreds of millions of polygons in real time is a quantum shift that will change the level of engagement filmmakers have with the images they create," says Miles Perkins, business development manager at Epic Games. "These new technologies will allow creatives to see the totality of their vision without having to disassociate the various parts of their shots: reviewing animation separate from lighting, separate from environments and effects. Everything will be right there in front of them, fully directable. Filmmakers will be able to compose and light shots in real time, regardless of whether they are physical, virtual, or a combination of both."

Perhaps one of the key points here is that Unreal Engine, currently on version 4.25, is already very good indeed.

"We are currently using Unreal Engine 4 heavily in both previs and virtual production," says Hugh Macdonald, chief technology innovation officer at Nviz. "For previs, the real-time nature of UE4 means that we can get incredibly high-quality pictures with minimal render time. We also use it for virtual production, giving better-looking integration than we would historically have been able to."

Nviz uses Unreal in two main tools, which are a good illustration of how it's being used for previs work across the industry and where it could be going. A virtual camera system enables virtual scouting of the previs environment and allows directors and cinematographers to have a hands-on experience with the camera, while a simulcam toolset is tightly integrated into Unreal and gives the production crew an on-set preview of what a shot will look like once the VFX has been added in post.
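As a rough illustration of the simulcam idea only (the names, structs and stub renderer below are assumptions, not Nviz's actual toolset): the tracked physical camera drives a matching virtual camera, the engine renders the virtual elements from that viewpoint, and the result is composited over the live feed so the crew can see an approximation of the finished shot on the monitor.

```cpp
// Conceptual simulcam data flow: tracked camera -> matched CG render -> composite over live plate.
#include <cstddef>
#include <cstdint>
#include <vector>

struct TrackedCamera {          // per-frame data from the on-set camera tracker (hypothetical fields)
    float position[3];
    float rotation[3];
    float focalLengthMm;
};

struct RGBA { std::uint8_t r, g, b, a; };
using Frame = std::vector<RGBA>; // one video frame, width * height pixels

// Stand-in for the game-engine render: a real pipeline would have Unreal render the
// virtual set from a camera matched to the tracked one. Here it just returns a frame
// that is transparent except for a hypothetical CG element along the top row.
Frame RenderVirtualSet(const TrackedCamera& /*cam*/, int width, int height) {
    Frame cg(static_cast<std::size_t>(width) * height, RGBA{0, 0, 0, 0});
    for (int i = 0; i < width; ++i) cg[i] = RGBA{40, 200, 90, 255};
    return cg;
}

// Classic "over" composite: wherever the CG layer has alpha, it covers the live plate.
Frame CompositePreview(const Frame& livePlate, const Frame& cgLayer) {
    Frame out(livePlate.size());
    for (std::size_t i = 0; i < livePlate.size(); ++i) {
        float a = cgLayer[i].a / 255.0f;
        out[i].r = static_cast<std::uint8_t>(cgLayer[i].r * a + livePlate[i].r * (1.0f - a));
        out[i].g = static_cast<std::uint8_t>(cgLayer[i].g * a + livePlate[i].g * (1.0f - a));
        out[i].b = static_cast<std::uint8_t>(cgLayer[i].b * a + livePlate[i].b * (1.0f - a));
        out[i].a = 255;
    }
    return out;
}

int main() {
    const int w = 1920, h = 1080;
    Frame live(static_cast<std::size_t>(w) * h, RGBA{128, 128, 128, 255}); // stand-in live plate
    TrackedCamera cam{{0.f, 1.7f, -3.f}, {0.f, 0.f, 0.f}, 35.f};
    Frame preview = CompositePreview(live, RenderVirtualSet(cam, w, h));
    (void)preview; // in production this frame would go to the on-set monitor
}
```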

"Unreal allows us to ensure that this is both flexible and high-quality," says Macdonald. "Based on what we know so far about UE5, the major jumps are going to be around geometry detail, in that far higher-resolution assets will be able to be used and streamed in. This will fit far better with a film VFX workflow, as the hope is that assets won't need as much processing to make them engine-ready. The fully dynamic lighting that is Lumen will mean that there will be less need for baking the lighting to get the same result. This will allow us to keep scenes fully dynamic, allowing us to adjust the lighting live during production if required, which is often something that is asked for on set, as the physical lighting is changed depending on the shot."

New creativity

As well as an increase in quality, which Ncam's Ventre likens to the jump from UE3 to UE4, UE5 holds out the tantalising prospect of introducing both new ways of working and new ways of creatively exploring virtual spaces.

"Unreal Engine will be a big part of the future of cinematography," says Sam Measure at CVP. "It's bringing back the ability to get practical effects in camera, whether that be interactive lighting on an actor or dynamically changing the backgrounds in real time, even though they have been created in a virtual space. The ability to get instant feedback on how something is going to look is invaluable."

This is going to feed into many more live spaces than on-set previs. Macdonald mentions the theatre and event sector, where video screens have become part of the interplay with lighting to create whole new live spectacles, while there is going to be a further jump in quality on virtual sets, making them close enough to indistinguishable from the real thing that only the presence of a live audience will tip the decision towards using a physical set. Even those audiences might see quite different shows, with blended elements from AR being used seamlessly in the final TX. In these post-Covid times, you will absolutely be able to assemble three guests on a sofa for a chat show without any of them leaving their homes and no one being the slightest bit the wiser.

And then, of course, there is the way it could accelerate the development of live shoots against LED screens, as pioneered by shows such as The Mandalorian.

"While Unreal has been used on LED screens while filming a number of times, the new updates will hopefully allow this to be pushed much further, and get a higher proportion of finished shots directly from the camera," enthuses Macdonald.

And perhaps what is most impressive of all is that the UE5 footage to date comes from a very early stage of development. UE5 isn't due for full release until late 2021 (there will be a preview release earlier in the year), while the PS5 isn't due until at least the end of this current year. In other words, there is a lot of optimisation and performance still to be wrung out of all this.

"I believe we are just scratching the surface of what will be possible," says Perkins. "Game engine technology will not be limited to individual film departments: all departments will be able to make contributions to the virtual sets just as they would on a physical set. In the future, the set will be a big creative sandbox, where the art department, set design, gaffers, cinematographers, visual effects, action designers and directors will all contribute to the set both physically and virtually, bringing the best of their talents to bear in that moment when the film captures the image."
