“Artists show their work, the director asks for more of this or less of that, and the cycle repeats many times over. That back-and-forth is what drives our work and shapes how everything looks in the end. Piece by piece, every shot gets better and better until it is perfect, and then we move on to the next frame,” says Joe Wilkie, one of the managers at the special-effects studio Weta Digital in Wellington, New Zealand, describing his work on Avatar. It was here that all the magic of the film that captivated us was created.

12 MB per frame, 288 MB per second, 17.28 GB per minute. The 162 minutes of film that we saw on screen, a huge amount of data in themselves, are only a small fraction of what passed through Weta's new data center, built, as you might have guessed, on HP servers: 34 racks of 32 HP ProLiant BL2x220c blades each, 40,000 processor cores and 104 TB of RAM, enough for seven entries in the Top500 supercomputer list. Let's take a look at the other side of the film, the side left behind the scenes.
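The figures above are easy to verify. A quick sanity check, assuming the standard 24 frames per second and decimal (SI) units:

```python
# Sanity check of the per-frame throughput figures quoted above,
# assuming 24 frames per second and decimal (SI) units.
MB_PER_FRAME = 12
FPS = 24
FILM_MINUTES = 162

mb_per_second = MB_PER_FRAME * FPS            # 288 MB/s
gb_per_minute = mb_per_second * 60 / 1000     # 17.28 GB/min
film_total_gb = gb_per_minute * FILM_MINUTES  # ~2.8 TB for the whole film

print(mb_per_second, gb_per_minute, round(film_total_gb))
```

So the finished film alone works out to roughly 2.8 TB of frame data, and that is before counting the many intermediate renders behind it.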

Weta Digital, which has won repeated awards for best special effects, has a very impressive portfolio. But when the studio received the post-production order from James Cameron, even the proven blade cluster it already had from one of the major vendors was not enough. So in the summer of 2008 Weta completely rebuilt its 950 m² data center.
To keep the entire fleet of dual-node HP BL2x220c blades fed with tasks, Weta laid a fiber-optic network serving 3 PB of storage. And the server racks themselves, linked by multiple 10-gigabit connections, had to be placed closer together to achieve the required throughput.

In a choice rarely heard of from the owners of such data centers today, Weta decided against using virtualization to parallelize its workload. Paul Gunn, the studio's chief system administrator, says virtualization would only have slowed down the creative process.
The point is that the workflow on the film at Weta was organized like this:
- the director discusses his vision of a scene with the artists;
- the artists realize it, each their own part, in packages such as Maya or RenderMan, and submit their work for rendering;
- the artists' jobs are gathered into a hierarchical queue by a Pixar product called Alfred, a system that in turn hands them out to free processors in the cluster, loading each one at 100% with a single task;
- the scene is fully computed exactly by the time the director is ready to review it with the artists and give his comments.
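The queueing scheme above can be sketched in a few lines. This is a hypothetical illustration in the spirit of a render-farm dispatcher like Alfred, not Alfred's actual API: jobs are ordered by review deadline, and each free node receives exactly one job so that it runs at full load on a single task.

```python
import heapq
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical sketch of a render-farm dispatcher: jobs are queued by
# priority (here, the review deadline) and each free node is handed
# exactly one job, so a node runs at 100% load on a single task.
# Names and fields are illustrative, not Alfred's real interface.

@dataclass(order=True)
class RenderJob:
    deadline: int                      # earlier deadlines are served first
    name: str = field(compare=False)   # shot name, not used for ordering

class Dispatcher:
    def __init__(self, free_nodes: List[str]):
        self.queue: List[RenderJob] = []
        self.free_nodes = list(free_nodes)
        self.assignments: Dict[str, str] = {}

    def submit(self, job: RenderJob) -> None:
        heapq.heappush(self.queue, job)

    def dispatch(self) -> None:
        # Hand one job to each free node until jobs or nodes run out.
        while self.queue and self.free_nodes:
            node = self.free_nodes.pop()
            job = heapq.heappop(self.queue)
            self.assignments[node] = job.name

d = Dispatcher(["node-1", "node-2"])
d.submit(RenderJob(deadline=2, name="jungle_flyover"))
d.submit(RenderJob(deadline=1, name="directors_review_shot"))
d.dispatch()
print(d.assignments)
```

The key design point the article describes is the one-job-per-node rule: instead of slicing nodes into virtual machines, each physical node is saturated by a single render task, which is why Weta saw no benefit in virtualization.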
Arranging each scene in the queue fell on the shoulders of Joe Wilkie, known at the studio as the “render wrangler”. In the last month before the film's premiere, the Weta Digital data center processed an average of 1.3 million render jobs per day, 7-8 GB of data per second, 24 hours a day!

“It is a complex system, and when you have a deadline on a project like Avatar, you really want everything to work as it should,” says Paul Gunn. That is why the cooling system became another essential part of the new data center.
Fully water-cooled, with passive backup radiators at the top of each rack, the system cut cooling costs by 40% compared with data centers of similar density cooled by conventional air conditioning. “I don't want to give exact figures, but when we talk about cooling by one degree, we are talking about tens of thousands of dollars in savings,” says Paul. Even if he meant New Zealand dollars, the difference is impressive.
Today Weta Digital owns the most powerful 3D-rendering cluster in the world, and this foundation will let the studio create unrivaled masterpieces for years to come. Avatar was one of them, and I can't wait to see The Hobbit, directed by Guillermo del Toro and expected in theaters next year...