3D Creation Workflow - The Bigger Picture, Intel® Technologies in Action

Marc Potocnik

Düsseldorf, Nordrhein-Westfalen


The animation project "Apollinarisstr." was created for Intel and Maxon to test the power of Intel's latest CPU workstations. The project was created on a 28-core Intel Xeon W-3175X CPU and later re-edited on an even faster machine: a dual 28-core Intel Xeon Platinum 8280M.

Project status: Published/In Market

oneAPI, Graphics and Media, Digital Art

Groups
SIGGRAPH 2020, Creators Artists

Intel Technologies
oneAPI, Intel® Xeon® Processors


Overview / Usage

3D creation is not only about 3D rendering. Explore the steps of a typical 3D creation workflow by looking at the Intel® animation project "Apollinarisstr." by renderbaron, and see the bigger picture. Contributed for Intel at SIGGRAPH 2020.

About the project: An everyday kitchen in spring. Worn out, well used and flooded with light. No high-gloss designer interior, no fancy lifestyle location. Done with Cinema 4D, rendered with the Physical Renderer. Created for Intel® to benchmark their latest Xeon® CPUs: the W-3175X and the dual Platinum 8280M.

This scene also served as an extensive study of highly detailed procedural shading and photorealistic lighting - with Direct Illumination only. Modeling and shading of the fruits and vegetables are 100% procedural, created with the native material system of Cinema 4D. All models and textures were created from scratch.

Methodology / Approach

Shading

The shading of the entire scene, especially the vegetables and fruit, is 100% procedural, created with the native shader toolset of the standard material system in Cinema 4D. To create natural detail with Cinema 4D shaders, it is of course important to have a thorough knowledge of the shaders available, especially the Noise shaders. But then it's amazing how much detail you can achieve with procedural shading - completely rule-based, without touching a single pixel.
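The layering idea behind rule-based detail can be sketched outside Cinema 4D. Here is a minimal value-noise/fBm example in plain Python - the function names and constants are illustrative, not the C4D shader API; C4D's Noise shaders work on the same principle of summing octaves of lattice noise:

```python
import math
import random

def value_noise_2d(x, y, seed=0):
    """Deterministic lattice noise: hash grid corners, smooth-interpolate between them."""
    def lattice(ix, iy):
        # hash the integer lattice point into a repeatable random value
        rnd = random.Random((ix * 73856093) ^ (iy * 19349663) ^ seed)
        return rnd.random()
    ix, iy = math.floor(x), math.floor(y)
    fx, fy = x - ix, y - iy
    # smoothstep fade so the interpolation has no visible grid creases
    sx, sy = fx * fx * (3 - 2 * fx), fy * fy * (3 - 2 * fy)
    n00, n10 = lattice(ix, iy), lattice(ix + 1, iy)
    n01, n11 = lattice(ix, iy + 1), lattice(ix + 1, iy + 1)
    nx0 = n00 + sx * (n10 - n00)
    nx1 = n01 + sx * (n11 - n01)
    return nx0 + sy * (nx1 - nx0)

def fbm(x, y, octaves=4):
    """Fractal sum of noise octaves -- each octave doubles frequency and halves
    amplitude, which is how layered procedural shaders build fine surface detail."""
    total, amp, freq, norm = 0.0, 1.0, 1.0, 0.0
    for _ in range(octaves):
        total += amp * value_noise_2d(x * freq, y * freq)
        norm += amp
        amp *= 0.5
        freq *= 2.0
    return total / norm  # normalized to [0, 1]
```

Feeding such a value into a color gradient or a bump channel is the basic move; the art lies in stacking and remapping many such layers.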

Lighting

In my animation work I prefer to use only Direct Illumination and to avoid Global Illumination. Why? Because I love it... and because I can. :-) Jokes aside: lighting with Direct Illumination greatly sharpens your own lighting skills. It also allows you to directly control every imaginable aspect of light. In addition, it can offer serious advantages in speed and predictability when rendering animations. Nevertheless, the light setup is quite elegant: only 15 light sources make up the photorealistic lighting of this scene. See the video "3D Creation Workflow - The Bigger Picture" for more details.
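What "Direct Illumination only" means in practice can be shown with a toy shading sum: each surface point receives light straight from the light sources, with no bounced contribution - any fill or bounce light has to be faked with extra lights. A minimal Lambertian sketch (the names and the point-light model are illustrative, not a renderer's API):

```python
import math

def lambert_direct(point, normal, lights):
    """Direct illumination only: sum each point light's diffuse contribution
    at a surface point. No indirect bounces -- in a DI workflow, bounce light
    is approximated by placing additional (e.g. fill) light sources."""
    radiance = 0.0
    for pos, intensity in lights:
        to_light = tuple(l - p for l, p in zip(pos, point))
        dist = math.sqrt(sum(c * c for c in to_light))
        direction = tuple(c / dist for c in to_light)
        # Lambert's cosine law; surfaces facing away receive nothing
        cos_theta = max(0.0, sum(n * d for n, d in zip(normal, direction)))
        radiance += intensity * cos_theta / (dist * dist)  # inverse-square falloff
    return radiance
```

With a handful of such lights per scene, every contribution is individually placed and controllable - which is exactly the appeal described above.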

Animation

The animation was meant to look like footage from a shaky handheld camera. The ideal tool for this is the MotionCamera tool from Cinema 4D. With it, you can simulate random-looking handheld camera movements and, if desired, even add dynamic behavior. Details such as camera height, step frequency, and the rotation of the upper body and camera can be adjusted individually, like many other parameters.
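The principle behind such tools can be sketched in a few lines: believable handheld motion is not raw random jitter but a low-pass-filtered random walk. A hypothetical sketch (the parameters and behavior here are illustrative, not the MotionCamera implementation):

```python
import random

def handheld_shake(frames, amplitude=0.5, smoothing=0.9, seed=42):
    """Sketch of handheld-style camera offset over time: occasional drift
    targets (like a shift of body weight) plus a small per-frame tremor,
    smoothed so the motion reads as organic rather than jittery."""
    rng = random.Random(seed)
    offsets, pos, target = [], 0.0, 0.0
    for _ in range(frames):
        # occasionally pick a new drift target
        if rng.random() < 0.05:
            target = rng.uniform(-amplitude, amplitude)
        # exponential smoothing toward the target = low-pass filter
        pos = smoothing * pos + (1 - smoothing) * target
        # small high-frequency tremor on top
        pos += rng.gauss(0.0, amplitude * 0.01)
        offsets.append(pos)
    return offsets
```

A real rig would run one such channel per translation and rotation axis, each with its own amplitude and frequency - which is what the individually adjustable MotionCamera parameters correspond to.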

Technologies Used

Rendering

For rendering I used my own in-house render farm. It consists of Intel Xeon (dual- or single-socket) and Intel Core i9 workstations, which take part in distributed rendering via Cinema 4D's Team Render. CPU rendering offers speed and easy scalability: you just plug in another client machine with the amount of RAM you want, and you're done. On top of that, it offers long-term reliability and a better return on investment.

Cinema 4D's internal CPU render engines, the Physical Renderer and the Standard Renderer, both benefit from an Intel technology called Intel® Embree. Embree is a set of ray-tracing kernels optimized for Intel® CPUs, accelerating the rendering of large amounts of geometry in Cinema 4D. Here's an example: rendering the kitchen scene with only geometry and light, once without Embree and once with Embree switched on, the rendering time with Embree is reduced by almost 50%. Remarkable.
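It is worth keeping the arithmetic straight: a "almost 50% shorter render time" is the same thing as a nearly 2x speedup. A trivial helper makes the relation explicit (the timings below are hypothetical, chosen only to illustrate the 50% case):

```python
def speedup(t_before, t_after):
    """Convert two render timings into a speedup factor and a percentage
    reduction in render time. A 50% reduction equals a 2x speedup."""
    return t_before / t_after, (1 - t_after / t_before) * 100

# hypothetical timings for illustration: 120 s without Embree, 60 s with
factor, pct = speedup(120.0, 60.0)
```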

The final animation was rendered using Cinema 4D's internal Physical Renderer. The Physical Renderer is based on stochastic sampling and therefore benefits from Intel® Open Image Denoise. This deep-learning-based denoiser improves the quality of noisy images and thus saves rendering time, especially at high iteration counts. And the denoiser comes into play where it really matters: for ray-tracing-based features such as area shadows, ambient occlusion, depth of field, reflections, etc. An overly grainy Quasi-Monte-Carlo solution can also be massively accelerated by the Intel denoiser - no matter whether you use the Standard or the Physical Renderer.
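Why does a denoiser save so much time? In stochastic sampling, per-pixel noise falls only as 1/sqrt(N) with the sample count N, so halving the grain by brute force costs 4x the samples. A small Monte Carlo experiment (purely illustrative; a pixel with true value 0.5) shows the scaling that a denoiser lets you sidestep:

```python
import random
import statistics

def mc_pixel(samples, rng):
    """Stochastic estimate of a pixel whose true value is 0.5 (e.g. 50% coverage)."""
    return statistics.mean(rng.random() for _ in range(samples))

def noise_std(samples, trials=2000, seed=1):
    """Empirical per-pixel noise (standard deviation) at a given sample count."""
    rng = random.Random(seed)
    return statistics.stdev(mc_pixel(samples, rng) for _ in range(trials))

# Noise falls roughly as 1/sqrt(N): quadrupling the samples only halves
# the grain. A denoiser that removes the residual grain therefore saves
# whole rounds of render iterations.
```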

After rendering, the results were composited in Adobe After Effects. One purpose of compositing, aside from grading, is to transform the mathematically perfect rendering into credible, analog-looking imagery - just as if it had been filmed with a slightly imperfect optical system. For this purpose I use effects such as chromatic aberration (Red Giant Magic Bullet), glow, 2D motion blur, a vignette and film grain.
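The vignette is the simplest of these "optical imperfections" to illustrate: a radial falloff that darkens pixels toward the frame edges. A minimal sketch on a grayscale image stored as nested lists (the function and its strength parameter are illustrative, not the After Effects implementation):

```python
import math

def vignette(image, strength=0.4):
    """Darken pixels toward the frame edges with a quadratic radial falloff,
    mimicking the light falloff of an imperfect optical system."""
    h, w = len(image), len(image[0])
    cy, cx = (h - 1) / 2, (w - 1) / 2
    max_r = math.hypot(cx, cy)
    out = []
    for y, row in enumerate(image):
        new_row = []
        for x, px in enumerate(row):
            r = math.hypot(x - cx, y - cy) / max_r  # 0 at center, 1 at corners
            new_row.append(px * (1 - strength * r * r))
        out.append(new_row)
    return out
```

The other effects follow the same spirit: each one selectively degrades the too-clean render in a way the eye associates with real lenses and film stock.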
