Ramp up with me… on HPC: What is Rendering?


Like many others, I love animated films.  Age no longer seems to be a factor in interest; people of all ages enjoy them, and studios now make movies aimed at kids, adults, or both. But how are these made? Studios commonly use rendering, a high-performance computing (HPC) workload, to create these stunning works of art.  Ultimately, creating an animated film requires taking hundreds of thousands of still images and putting them together to create the illusion of motion and video.  This is pretty incredible: they are literally mashing together a ton of images to make you think there's movement.  Imagine creating something as basic as a person walking a few steps. You need many images for every second of that person's movements.

 

What is rendering and how does it relate to HPC?

Rendering is used mostly in the media and entertainment industry, though it shows up in other industries as well.  Rendering in HPC refers to using powerful computers to create images or videos from 3D models or scenes. This process involves complex calculations that simulate how light interacts with the objects in a scene, which requires a great deal of processing power. HPC systems provide large amounts of computing resources that can perform these calculations quickly, which is useful for applications such as animation, visual effects, and engineering.
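To get a feel for the kind of math a renderer crunches over and over, here's a tiny, simplified Python sketch of one classic lighting calculation, Lambertian diffuse shading, where a surface point gets brighter the more directly it faces the light. This is just an illustration I put together, not how any production renderer is actually implemented.

```python
# A toy example of one lighting calculation a renderer might perform
# millions of times per frame: Lambertian (diffuse) shading.
# Brightness falls off with the angle between the surface normal
# and the direction toward the light.

import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def diffuse_brightness(surface_normal, light_direction, light_intensity=1.0):
    n = normalize(surface_normal)
    l = normalize(light_direction)
    # Surfaces facing away from the light receive no direct illumination.
    return light_intensity * max(0.0, dot(n, l))

# A surface tilted 60 degrees away from the light gets half the brightness.
angle = math.radians(60)
print(diffuse_brightness((0, 0, 1), (0, math.sin(angle), math.cos(angle))))
```

Now imagine repeating a calculation like this (and far more complicated ones) for every hair, every surface, and every pixel of every frame, and it becomes clear why the processing power adds up.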

 

When I first started in this space, I had a difficult time grasping this concept, and my husband gave me a great example.  Take Sullivan from Monsters, Inc., my favorite character in the movie.  Every single piece of hair (and let's face it, Sullivan is hairy) has to interact differently with the environment. Imagine Sullivan standing outside with the wind blowing, a fairly straightforward and common scene in movies. The speed of the wind, the way he is facing (towards or against the wind), and many other factors all impact how every single piece of his hair shifts.  Or consider how the placement of the sun, whether it's fully up or setting, affects the way the light hits each hair and how shadows are created. All of this has to be reflected in every single frame, and a movie uses thousands of frames. If it isn't done in a way that feels natural and realistic, the movie has lost you.

 

Now, I guess the question is: why would you use high performance computing in rendering? This is where it gets fun.  Every second of an animated show or movie is made up of many frames, and each of these has to be rendered. Think of the old animation flipbooks that used to be popular: each page shows a tiny difference in the drawing, but when you flip through at speed, the animation comes alive.  The number of frames needed per second varies, from around 24 frames a second to over a hundred.  To put this into perspective, let's use 30 frames a second.  This means that for every minute, you have to render 1,800 frames!  Now for a 90-minute movie? 162,000 frames must be rendered, and that's only if they are all perfect the first time.  Keep in mind, the artists creating these films often can't tell whether something is wrong or needs to be changed in a frame until it's fully rendered.  Want to remove something that doesn't look right? You're stuck rendering it all over again, potentially across several frames.
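If you want to check that math yourself, the back-of-the-envelope calculation fits in a few lines of Python:

```python
# Back-of-the-envelope frame counts for an animated film.
frames_per_second = 30
movie_length_minutes = 90

frames_per_minute = frames_per_second * 60              # 1,800 frames per minute
total_frames = frames_per_minute * movie_length_minutes

print(f"{frames_per_minute:,} frames per minute")
print(f"{total_frames:,} frames for a {movie_length_minutes}-minute movie")
# -> 1,800 frames per minute
# -> 162,000 frames for a 90-minute movie
```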

 

Rendering in the cloud

To put this into perspective, I used the Cinebench application, which renders a still image in real time using nothing but your own device's processing power. Rendering the single still image in figure 1 took my laptop a total of 7 minutes and 20 seconds.

 


Figure 1: Rendering an image using the Cinebench application

 

But let's be fair: I'm using a roughly 5-year-old laptop to do this. Imagine how much faster a brand-new desktop computer would be.  We've all experienced the performance increase when we get a new computer, or even switch from a tablet or laptop to a desktop. That increase comes from the hardware you can fit into a desktop versus a small tablet or laptop (though let's face it, what they are fitting in those tiny machines is pretty incredible).  With rendering, many of these companies have traditionally used what are called "render farms".  These are large numbers of computers managed by a render manager, a third-party system that handles each job as it comes in. Essentially, an artist creates the image and sends it to the render manager, which schedules the job and passes it to the next machine that becomes available. It's similar to a line at a bank, where you stand in line until a banker is free to help you. As you can imagine, if the line gets too long, it causes major delays in the amount of work getting done.  This happens often with render farms, because companies are trying to balance having the resources they need against fluctuating demand so they aren't wasting money.
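Here's a minimal sketch of that bank-line idea in Python. It isn't the API of any real render manager, and the numbers are made up; it's just a toy queue where a fixed pool of "machines" pulls frame jobs as they become free, so jobs sit and wait whenever every machine is busy.

```python
# Toy model of a render farm: artists submit frame jobs to a queue,
# and a fixed number of "machines" (worker threads) pull jobs as they free up.
# This is only an illustration, not a real render manager's API.

import queue
import threading
import time

NUM_MACHINES = 3          # a tiny "farm"
SECONDS_PER_FRAME = 0.1   # pretend render time per frame

job_queue = queue.Queue()

def render_machine(machine_id):
    while True:
        frame = job_queue.get()
        if frame is None:                  # sentinel: no more work
            job_queue.task_done()
            return
        time.sleep(SECONDS_PER_FRAME)      # "render" the frame
        print(f"machine {machine_id} finished frame {frame}")
        job_queue.task_done()

# The artist submits 12 frames; with only 3 machines, most frames wait in line.
for frame_number in range(1, 13):
    job_queue.put(frame_number)

workers = [threading.Thread(target=render_machine, args=(i,)) for i in range(NUM_MACHINES)]
for w in workers:
    w.start()

job_queue.join()                           # wait until every frame is rendered
for _ in workers:
    job_queue.put(None)                    # tell each machine to shut down
for w in workers:
    w.join()
```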

 

So why do this in the cloud instead?

What if you could use the best technology as soon as it comes out, and then use hundreds or thousands of computers at once to tackle these frames in parallel? What if you could have all the hardware you need, no matter how much work you throw its way, so nobody is standing in line? The cloud provides all of this: access to better hardware that can get the jobs done faster, and the ability to "scale up", throwing as many resources at the work as you need as the jobs flow in.  If you look closely at figure 1, several parts of the image are being rendered at once, eight to be precise. My example below is significantly slower because, while it's being done in the cloud, I wanted to keep my first try simple.

 

Rendering my first image

To try out rendering my first image, I actually wanted to learn to create a 3D image.  I don't consider myself an artistic person, but I found this to be incredibly approachable and fun to learn.  Since I enjoy Minecraft, I followed this excellent video tutorial that taught me how to make a Minecraft-based structure and create the animation.  From there, I connected to a virtual machine in Azure and rendered the image. I've included the video below for anyone interested in seeing what I did or wanting to follow along.

I'll reiterate that for my purposes, I used a single virtual machine because my job was very small and I'm very new to this process.  Using the Azure Game Development Virtual Machine provided an easy way to do this and better understand the workload, with the added benefit of being able to watch as each individual frame was rendered. However, the typical user in this field generally has significantly larger jobs that require scaling up, meaning they have to use many virtual machines in parallel.  Remember our earlier conversation about what HPC is: the power is in doing many tasks in parallel to achieve faster results.  For rendering, that means completing every pixel of every single frame, and as mentioned before, consecutive frames don't change very much.  Using more virtual machines to fill in frames makes the work go a lot faster, and something like Azure Virtual Machine Scale Sets lets you set up those virtual machines quickly without having to create each one manually, which would take a long time for someone who needs hundreds to thousands of virtual machines.
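To see why spreading frames across many workers helps, here's a small sketch using Python's multiprocessing module on a single machine. The render_frame function is a placeholder I made up to stand in for real rendering work; in an actual cloud setup, the "workers" would be whole virtual machines in a scale set rather than local processes, but the fan-out idea is the same.

```python
# Illustration of rendering frames in parallel.
# render_frame is a made-up placeholder for real rendering work;
# on Azure, each "worker" would be a whole VM in a scale set
# rather than a local process.

import time
from multiprocessing import Pool

def render_frame(frame_number):
    time.sleep(0.2)                      # stand-in for expensive rendering
    return f"frame_{frame_number:05d}.png"

if __name__ == "__main__":
    frames = range(1, 101)               # 100 frames to render

    start = time.time()
    with Pool(processes=8) as pool:      # 8 workers rendering in parallel
        outputs = pool.map(render_frame, frames)
    elapsed = time.time() - start

    print(f"Rendered {len(outputs)} frames in {elapsed:.1f} s")
    # Sequentially this would take ~20 s; with 8 workers it's roughly 2.5 s.
```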

 


Figure 2: 3D animation image (click to watch the full demo)

 

 

Let’s wrap this up

As we have discussed, high performance computing, or HPC, means stringing together a lot of computers or virtual machines to process many tasks in parallel and produce faster results.  Rendering is an incredibly interesting HPC workload, often used by the media and entertainment industry for some of our most beloved animated movies.  Next time I'm sitting on my couch watching a movie, or in a movie theater, I know I'll have a much bigger appreciation for the amount of effort that goes into creating those works of art.

 

Learn More

Interested in learning more about high performance computing?

 

Links used for rendering

 

Ramp up with me on… HPC series

 

More helpful links
