How A Shot Is Made At Animost Studio
Linear vs. Real-Time Pipeline
Traditional linear animation pipelines take an assembly line approach to production, where tasks are performed sequentially. Newer and more holistic pipelines opt for nonlinear and parallel methods to process and distribute data.
(image courtesy of Epic Games)
While a linear pipeline can produce results, it has certain drawbacks:
- Limitations to changes. In any production, animation must be broken down into distinct requirements, each critical to the nuances of performance. Improving one requirement might require a subtle change to an asset's motion, a set's lighting, and so on. In a linear workflow, if such a change is needed while later steps are underway, it can be difficult or time-consuming to propagate throughout the project. This limitation can lead to a reluctance to make changes and improvements, which ultimately affects the artistic quality of the production. Since the entire point of the production is to create the best animation possible, such a limitation can defeat the purpose of the project.
- Post processing required. Traditional linear animation studios commonly produce output as layers and mattes to be composited at the end of the project. While this approach allows a high degree of control over the result, it adds a great deal to the overall budget and schedule. Post processing is often done outside the creating studio, which further divorces this step from the rest of the production.
To minimize these limitations, we used Epic Games' approach to develop a real-time pipeline that eliminates the problems of a linear pipeline, smoothing the production process while maximizing artistic quality:
- Interactive creative process. The entire production pipeline is represented within Unreal Engine, from asset ingestion to animation, effects, lighting, editing, and post production. Reviews can be performed on each part individually or on the project as a whole to determine whether changes are needed. Artists can make iterative changes easily within Sequencer, the cinematic tool within Unreal Engine, and see them reflected on-screen almost instantly, facilitating an interactive creative process.
- Ease of editing. Shots can be created, modified, reordered, or even deleted in real time. Sequencer acts as a production hub, combining aspects of the editorial department with layout and animation in a non-linear timeline editor.
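The non-linear editing idea above can be illustrated with a minimal sketch in plain Python. This is a conceptual model of a shot timeline, not Unreal Engine's actual Sequencer API; the `Shot` and `Timeline` classes are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Shot:
    name: str
    start_frame: int
    duration: int

class Timeline:
    """Minimal non-linear shot timeline: shots can be created,
    reordered, or deleted, and the edit re-flows instantly."""
    def __init__(self):
        self.shots = []

    def add(self, name, duration):
        self.shots.append(Shot(name, 0, duration))
        self._relayout()

    def reorder(self, name, new_index):
        shot = next(s for s in self.shots if s.name == name)
        self.shots.remove(shot)
        self.shots.insert(new_index, shot)
        self._relayout()

    def delete(self, name):
        self.shots = [s for s in self.shots if s.name != name]
        self._relayout()

    def _relayout(self):
        # Recompute start frames so shots stay contiguous in edit order.
        frame = 0
        for s in self.shots:
            s.start_frame = frame
            frame += s.duration

timeline = Timeline()
timeline.add("shot_010", 120)
timeline.add("shot_020", 96)
timeline.add("shot_030", 48)
timeline.reorder("shot_030", 1)   # move shot_030 before shot_020
print([(s.name, s.start_frame) for s in timeline.shots])
# → [('shot_010', 0), ('shot_030', 120), ('shot_020', 168)]
```

Because no frames are pre-rendered, a re-order is just a re-flow of start frames; in a linear pipeline the same edit could invalidate downstream work.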
- Faster output. Post effects can be handled entirely inside Unreal Engine, reducing the need for external compositing.
Real-Time Pipeline: How Animost Studio’s Pipeline Works
In our real-time animation process, there are five main sub-processes. Three of them (layout, compositing, and shooting in a virtual environment; real-time editing and feedback; and real-time feedback for on-set production) were made possible through Unreal Engine. The engine's real-time rendering technology enables on-set production thanks to high-quality visualization and an excellent sense of immersion. To achieve this, asset creation, development, real-time operation, and updates are essential. Animost Studio is currently developing an asset-creation pipeline, an I/O pipeline, a real-time editing data sync system and sub-pipelines, multi-rendering from the user's perspective for feedback, other custom sub-tools, and external drive control functions.
(image courtesy of Epic Games)
Animost Studio's main post-production pipeline applies a real-time approach similar to the one used to create the Fortnite trailer in 2018. The key difference from the existing production method is that all shots are created with real-time ray tracing, bridging the gap between existing digital content creation tools and Unreal Engine's final look.
The effects of Unreal Engine on real-time animation production
Enhancing productivity through real-time rendering
Previously, reviewing the final results required Maya's Playblast feature and final renders, but Unreal Engine's real-time rendering enables directors to communicate changes almost immediately, shortening production times from the pre-production stage through the post-production stage.
It is now possible to add camera blocking and animation in Maya, view the general atmosphere through the master sequence, and fine-tune details by moving around Sequencer in real time. The "what-you-see-is-what-you-get" workflow is a huge benefit that cut several stages from the existing pipeline.
Rendering times also improved dramatically. The previous tool required at least 30 minutes per 2K-resolution frame; Unreal Engine now renders a 4K frame in just five seconds with ray tracing enabled.
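A back-of-the-envelope calculation shows what those per-frame figures mean for a whole shot. The shot length and frame rate below are illustrative assumptions, not numbers from the article:

```python
# Illustrative: a 10-second shot at 24 fps (assumed, not from the article).
FPS = 24
SHOT_SECONDS = 10
frames = FPS * SHOT_SECONDS                        # 240 frames

old_minutes_per_frame = 30                         # at least 30 min per 2K frame
new_seconds_per_frame = 5                          # ~5 s per 4K frame, ray traced

old_hours = frames * old_minutes_per_frame / 60    # total time, previous tool
new_minutes = frames * new_seconds_per_frame / 60  # total time, Unreal Engine

# Per-frame speedup: 30 min expressed in seconds, divided by 5 s.
speedup = (old_minutes_per_frame * 60) / new_seconds_per_frame

print(old_hours, new_minutes, speedup)  # → 120.0 20.0 360.0
```

In other words, a shot that previously tied up a render farm for days of machine time can now be reviewed in minutes, at a higher output resolution.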
Effective communication
Unreal Engine's comprehensive real-time rendering leads to significant innovation by providing real-time feedback not only to directors and actors but to all participants. Participants receive immediate artistic and technical feedback in their areas of expertise, streamlining the long communication cycles between pre-production and post-production. Even the post-production stage has become more efficient, as the director can view the screen and immediately give directions and make decisions on the spot.
Expanding the artist’s boundaries
Reducing the side effects of the waterfall method made production possible with half the manpower. In the previous pipeline, content was passed on from earlier stages and compiled with a single program, like a relay race. In contrast, Unreal Engine lets the team view the final result and work beyond the boundaries of their respective positions.
Cutting asset costs: Houdini-HDA & Megascans
Certain circumstances, such as limited timelines and budgets, did not allow artists to work on manual asset creation. With no artists available who had game-industry experience optimizing assets with levels of detail (LODs), Houdini was used to procedurally model simple yet high-maintenance props, shortening production time.
Compared with importing FBX files, importing through a Houdini Digital Asset (HDA), with the same source file and the same number of meshes, was a much lighter procedure and reduced data size. That's partly because objects added to large-scale environments were created with many variations and then instanced.
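The memory benefit of instancing can be sketched with rough arithmetic: one copy of the geometry plus a small transform per placement, instead of a full duplicate of the mesh for every copy. All numbers below are illustrative assumptions, not measurements from any real asset:

```python
# Rough comparison of data size for N placed copies of one prop.
# Vertex size and counts are assumed for illustration only.
VERTS = 10_000
BYTES_PER_VERT = 32        # position + normal + UV, roughly
TRANSFORM_BYTES = 64       # one 4x4 float matrix per instance
COPIES = 500

# Duplicated meshes: every copy carries full geometry.
duplicated = COPIES * VERTS * BYTES_PER_VERT

# Instanced: geometry stored once, plus a transform per instance.
instanced = VERTS * BYTES_PER_VERT + COPIES * TRANSFORM_BYTES

print(f"duplicated: {duplicated / 1e6:.1f} MB")   # 160.0 MB
print(f"instanced:  {instanced / 1e6:.2f} MB")    # 0.35 MB
```

The gap widens with every additional copy, which is why instancing pays off most for the many-variation props scattered through large environments.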
When creating environments, a small number of artists had to handle a large batch of tasks, making it challenging to collect enough assets for beautiful backdrops at the desired quality. Megascans, available for free, enabled much faster placement of dense nature assets. Different versions of the background were also created using foliage, decals, exponential height fog, atmospheric fog, and post-process volumes.
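The "dense placement" that a foliage tool automates amounts to generating many randomized per-instance transforms over an area. A minimal sketch in plain Python (not Unreal Engine's foliage system; the function and parameter ranges are assumptions):

```python
import random

random.seed(7)  # deterministic scatter, so results are reproducible

def scatter(count, area_size):
    """Generate random placements (position, rotation, uniform scale)
    for foliage instances across a square area."""
    placements = []
    for _ in range(count):
        placements.append({
            "x": random.uniform(0, area_size),
            "y": random.uniform(0, area_size),
            "yaw": random.uniform(0, 360),      # degrees
            "scale": random.uniform(0.8, 1.2),  # slight size variation
        })
    return placements

plants = scatter(count=1000, area_size=100.0)
print(len(plants))  # → 1000
```

Combined with instancing, a scatter like this makes a thousand plants a list of small transforms rather than a thousand hand-placed meshes.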
We use Unreal Engine to view the look and set up lighting in real time, which enables more creative work. Compared to our previous software, this made a huge difference, and we will keep refining our pipeline while looking forward to the possibilities the future brings.