Films like ‘RRR’ and ‘PS:1’ are only possible because of the video game industry’s Unreal Engine


Films like RRR and Ponniyin Selvan: I are winning all sorts of accolades this year for a number of reasons, but most importantly for how stunning they look and how jaw-dropping certain sequences are. Conceiving a sequence like that is hard enough; actually making or filming it for the screen is significantly more challenging, even for the most seasoned filmmakers.

Believe it or not, it is a tool made for the video game industry that has proven to be a blessing for filmmakers and VFX directors, a tool without which most of the stunning VFX we see in films simply couldn’t be made. One of the biggest tools filmmakers have at their disposal today is Unreal Engine, which is actually a game engine used by game developers to determine how objects in a digital or virtual setting interact with each other.


A significant portion of Bheem’s entry sequence in RRR was shot inside a studio, using sequences made in Unreal Engine. The shot of Bheem’s reflection in the water, with the camera turning 360 degrees, was made entirely in Unreal Engine.

Unreal Engine was, in fact, developed exclusively for video games. Video games are built on an engine on top of which everything else is coded. This engine drives not only the graphics but also the physics that governs how every element in a game interacts with every other element, including natural light. If developers had to recreate the laws of physics, how light interacts with objects, and how different components interact with each other for every game they made, it would take them years. This is where game engines like Unreal Engine come into play.
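To make that idea concrete, here is a deliberately simplified, hypothetical sketch (not Unreal Engine’s actual API) of the kind of loop a game engine runs every frame: advance the physics simulation, resolve how objects interact with the world, then render the result, so individual games never have to re-implement the laws of physics themselves.

```python
# Hypothetical, simplified illustration of what a real-time engine does each
# frame. This is NOT Unreal Engine's actual API; it only shows the idea that
# the engine, not the game, owns physics, interactions and rendering.
from dataclasses import dataclass

GRAVITY = -9.81  # m/s^2: the kind of physical law the engine encodes once

@dataclass
class Body:
    name: str
    height: float    # metres above the ground
    velocity: float  # vertical velocity in m/s

def step_physics(body: Body, dt: float) -> None:
    """Advance one body by dt seconds under gravity, stopping at the ground."""
    body.velocity += GRAVITY * dt
    body.height += body.velocity * dt
    if body.height <= 0.0:  # crude collision with the ground plane
        body.height, body.velocity = 0.0, 0.0

def render(bodies: list[Body]) -> None:
    """Stand-in for the renderer: just report where everything is."""
    print(" | ".join(f"{b.name}: {b.height:.2f} m" for b in bodies))

# The engine's main loop: simulate, then draw, 60 times per simulated second.
bodies = [Body("crate", height=5.0, velocity=0.0)]
for _ in range(60):
    for body in bodies:
        step_physics(body, dt=1 / 60)
    render(bodies)
```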

Some of the most popular games we know of today were made using Unreal Engine. The list includes names like Fortnite, Batman: Arkham City, Star Wars Jedi: Fallen Order, Mass Effect, Sifu and a ton of other games that have been applauded for their visuals.

So what made VFX artists and animation experts turn to a tool that was essentially made for video games, to create some of the most stunning visual effects we have seen in recent films?

We spoke with Yashoda Parthasarthy, a director, animator, motion designer and co-founder of Plexus Motion Pictures, who played a key role in developing the graphics and motion design for PS:1, and with Tilak Raj Shetty, CEO of Graphiti, an animation studio that has put India on the global VFX map. Following are edited excerpts from the conversation.


Some portions of Ponniyin Selvan: Part I were very tricky to shoot because of the manner in which they were conceptualised. Unreal Engine helped the visual effects artists blend in a number of key elements to make the film look as realistic and aesthetically pleasing as it does.

India has been at the forefront of animation and VFX in major Hollywood films. Why is it that only a few Indian movies get the VFX and the animation correct?
Yashoda Parthasarthy: There are a number of reasons for that. Most visual effects projects that get outsourced to Indian studios have supervisors and creative directors based abroad, people who are trained in art, have experience in design and have graduated from the best art and design schools. These are the people who take most of the key technical and creative calls. Even the Indian movies that have great VFX, Brahmastra, for example, have art directors and supervisors who have had that sort of training and are based abroad.

Now, we have seen some Indian studios that have their own internal creative directors and concept artists, who are getting into that zone. However, most Hollywood projects have creative directors who like to have control of the entire project. 

The second factor is time. VFX might be technical, but it is art at the end of the day, and it needs time. Working with reasonable timelines is still something that filmmakers are trying to figure out. And this is not limited to Indian films; you will see films from Marvel as well that fans have criticised for having bad VFX because they were rushed.

Because digital filmmaking and the VFX boom we have seen in India are very recent, a lot of people, including filmmakers, are still unaware of the stages of digital production, how much time certain things take, and why VFX needs time to reach a certain quality. The more such conversations happen, the better we can explain our process to directors and producers, and the quality will improve over time.

Tilak Raj Shetty: I have been in the industry for over 30 years now, and one thing I can say for sure will not change is the nature of how we work. For most projects, we have time for multiple rounds of corrections and post-processing, but only sometimes will filmmakers invest any time in prep work. That’s why you’ll see that good studios will not get involved in projects that don’t allocate time for prep.


How does Unreal Engine fit into prep?
TRS: Life starts at prep and ends at prep. For us, if the prep isn’t good, we can’t move forward with production. New directors are actually very aware that we need to spend time preparing, because they also want to understand how things work. Otherwise, I have seen some of the most prominent directors and technicians at a loss when they need to shoot with green screens. Newer directors, DoPs and other technicians now want to learn what is actually going on: basic things like scale, perspective and various other parameters that one needs to keep in mind. We also need to decide what lenses we should use to shoot a sequence and how to get the proper camera movements. Unreal Engine helps with all of this. It allows us to move seamlessly from one medium to another.
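One of those prep questions, which lens to shoot a sequence with, comes down to simple optics: the field of view follows from the focal length and the sensor width, and a virtual camera in a tool like Unreal Engine can be set up to match a physical lens. A quick sketch of that calculation (the focal lengths and sensor width below are illustrative assumptions, not figures from either film):

```python
# Rough previz helper: horizontal field of view for a given lens and sensor.
# The formula FOV = 2 * atan(sensor_width / (2 * focal_length)) is standard;
# the sensor width and focal lengths below are illustrative assumptions only.
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float = 36.0) -> float:
    """Horizontal field of view in degrees, defaulting to a full-frame sensor width."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

for focal in (24, 35, 50, 85):  # common prime lenses
    print(f"{focal} mm lens -> {horizontal_fov_deg(focal):.1f} degrees horizontal FOV")
```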

What benefits do filmmakers and VFX artists get from using Unreal Engine in their projects?
YP: We have a habit of going to the location of the shoot, recceing it, and then figuring out what lens to use, how to set up the shot, how to stage it and then film it. All of this happens during production, when we’re on a shoot. Unreal Engine allows us to mirror the same pipeline, only quicker. There are a number of assets available on the marketplace that will quickly let you see what your set looks like. Moreover, the camera and the lighting in Unreal Engine are out of this world. Any DoP can just walk in and figure out how they want to stage the characters and all the other elements, without needing to go out and do it on a live-action set.

There is also an ease of working. For example, in the opening sequence of PS:1 it was very easy for us to stage things, to stitch one shot to another seamlessly, and then lens and light the scene as necessary, things that would be extremely difficult on a live-action set. Furthermore, we can illuminate a particular portion of the scene in a way that doesn’t light up or cast a shadow on other elements, which makes our life much easier.
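For readers curious what this kind of virtual recce looks like in practice, here is a minimal sketch using Unreal Editor’s Python scripting plugin: place a cine camera and a light in a level, then try the framing through a particular lens. The class and property names follow recent Unreal Engine releases, but treat the exact calls as assumptions to verify against your engine version’s Python API documentation; the positions and lens value are purely illustrative.

```python
# Minimal previz sketch using Unreal Editor's Python scripting plugin.
# Assumes the Python Editor Script Plugin is enabled and a level is open;
# class/property names follow recent UE versions and should be verified.
import unreal

# Place a cine camera roughly where a DoP might stand on the virtual set.
camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor, unreal.Vector(0.0, -500.0, 170.0))

# Add a directional light to act as the sun for a quick lighting pass.
sun = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.DirectionalLight, unreal.Vector(0.0, 0.0, 1000.0))

# Try the framing through a 35 mm lens, the way one would on a recce.
cine_component = camera.get_cine_camera_component()
cine_component.set_editor_property("current_focal_length", 35.0)
```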

TRS: I believe filmmaking should never be done without prep work. The first thing we do is decide how a project should be seen, what the dimensions of our frame should be, and what aspect ratio we should work in: 16:9, 4:3 and so on. Then we have the storyboard artists and layout artists come in and do their magic. Without prep work like this, animated films can’t be made. I feel live-action films also need to follow this principle. Unfortunately, our industry hasn’t been trained in that manner.

So what Yashoda is talking about is ideally how projects should be started: with thorough prep work, and Unreal Engine forces creators to take prep work seriously. We are actually in a phase of transition, where newer filmmakers are aware that this is the way to go, and they will take to it like fish to water.

We had the same watershed moment when we moved from stock (film) to digital. Some of the biggest filmmakers fell off the wagon because they did not want to adapt, whereas the newer guys who worked with them as assistants, and who understood what digital did and what sort of things it allowed, went on to become the best filmmakers of their generation. Because of Unreal Engine, we are able to deliver what is required of us even when filmmakers have some “unrealistic” demands.

YP: I agree with Tilak, it makes life easier for everyone involved. Moreover, it will allow filmmakers to attempt sequences that they would normally shy away from because of how expensive they think the VFX for a particular sequence might be to create.


How cost-effective is using Unreal Engine for filmmakers?
YP: I will answer this with an example. We had to shoot a promo for Netflix’s Aranyak with Raveena Tandon. Normally, ten promos are shot in one day. We only had four hours to shoot with her, and we had planned a very elaborate sequence using the Bolt camera bot, which let us move the camera in a really tricky and dynamic manner. We also wanted to blend some live-action elements and some VFX seamlessly, in a short span of time. The best way to approach this was to go into Unreal Engine, use the camera bot, and build the sequence in the engine using a dummy. Because of this, we knew exactly what shots to take and how long each take would be when we were on the set. As a result, a shoot that was supposed to take four hours was completed in 45 minutes.

If we function like this as an industry, it will save filmmakers a lot of time and, of course, money in production. They will save a lot of money in post-production as well, because they will know exactly what they want and how to shoot it.

Will we ever see digital avatars of actors in a full-length Indian feature film, like we see actors appearing in games? Or are mainstream actors too insecure to participate in this?
YP: Because of certain NDAs, all I can say is that it is happening, and you will see digital avatars of actors on screen. However, there are nuances of the human body, facial expressions, motion and the actor’s personality that filmmakers will need to recreate, and for that you will need a team of animators who are really good at what they do.

That won’t be easy, but we will get there. We are already working with facial tracking and face and body replacement; Instagram and Snapchat filters are doing it, and doing it well. As animation and VFX artists, we will only do better. In five years or so, if technology keeps developing at the pace it has so far, we may see entirely CG-generated digital avatars done so well that people might not be able to tell.

TRS: We will go that far, but as an audience, I don’t know if we are ready to accept that yet. As a technical novelty, a film might draw audiences based on the fact that it has CG-generated digital avatars, but after 10 or 15 minutes it is the story and the performance of the character that will drive the film.

We connect with animated characters. Be it Mulan or Toy Story, we connected with those characters, but I am not sure if we will connect with CG-generated digital avatars to that level.

As Yashoda said, simulating the micro-movements, the nuances of an actor, will take a lot of time and effort. There are also the audience’s observation and emotions at play here. There is a reason why Amitabh Bachchan is Amitabh Bachchan, or Manoj Bajpayee is Manoj Bajpayee. They get into the skin of the character and blow you away. CG-generated digital avatars will get there, but it will take time. Every time Hollywood has tried to do something like this, it has failed miserably, so we can confidently say this will take time.

YP: I completely agree with Tilak. We emote with animated characters very well. So a character that emotes very well, but does not necessarily look like a human or a real, live being, may work very well. For example, the latest Lion King, which was photorealistic, did not evoke the same response as the animated Lion King, because it lacked those animated, exaggerated eyes, even though the photorealistic film was technically brilliant. We are very attuned to what is supposed to be real and what isn’t. The moment you show a very realistic lion talking, it is jarring, despite the willing suspension of disbelief.

TRS: Moreover, our brains are wired and trained to see everything in a certain way. If we see something that does not abide by that, our brain will simply reject it. With animated films, our brains were already trained to see them in a certain way, so we are able to accept them as they are without running into issues with suspension of disbelief.




