For a final weekend push on Transformers 3, ILM's entire render farm was put to work. ILM calculates that this added up to more than 200,000 rendering hours per day – or the equivalent of 22.8 years of render time in a 24-hour period.
What kind of render time is this, exactly? By a simple calculation, this means that at the time the ILM render farm consisted of at least ~8,300 CPUs. For comparison, the Pixar render farm had 12,500 CPUs in 2011. 25 million hours actually sounds reasonable for a movie such as Bee Movie. Feature animated films like this are very time-consuming to render, so they're always rendered on large collections of computers, such as the Media Grid, since individual computers and even small clusters aren't enough. The Media Grid, for example, has thousands of powerful computers all connected together to render.
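The arithmetic behind the "at least ~8,300 CPUs" estimate can be sketched in a few lines (the figures come from this thread, not from ILM directly):

```python
# Back-of-the-envelope check of the figures quoted above.
render_hours_per_day = 200_000

# One machine can contribute at most 24 render-hours per day, so the farm
# must have had at least this many CPUs running flat out:
min_cpus = render_hours_per_day / 24
print(round(min_cpus))  # 8333

# The same figure expressed as years of single-CPU render time per day:
years_per_day = render_hours_per_day / (24 * 365)
print(round(years_per_day, 1))  # 22.8
```

Both numbers line up with the article's claims: roughly 8,300 CPUs, and 22.8 years of render time compressed into a 24-hour period.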
This is known as a “render farm”. So basically they spent 200,000 hours of rendering per day to process the raw images into the final product you see. Think of this like man-hours: in a typical US workday, a person might work 8 hours, so 10 people working 8 hours gives you 80 man-hours. Mind you, it would take 25,000 people working 8-hour days to match the rendering hours their render farm put out in a single day.
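The man-hour analogy above works out numerically like this:

```python
# The man-hour analogy: hours accumulate across workers in parallel.
people = 10
hours_each = 8
man_hours = people * hours_each
print(man_hours)  # 80

# To match 200,000 render-hours in one day with 8-hour human workdays:
workers_needed = 200_000 / 8
print(int(workers_needed))  # 25000
```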
Paulster2 is correct, but in this case the “man” hour, if you will, is generated by the render farm processing the raw data, so the label is a bit of a misnomer. For example, many mistakenly think a light year is a measure of time, but it's actually a measure of distance. Render hours can be a very ambiguous measurement: sometimes they're counted per thread, so one computer with 4 cores could complete 4 render hours in a single wall-clock hour.
It also depends on the speed of a single CPU. Sometimes an older standard, such as a 2 GHz CPU, is used as the baseline. I was just trying to make it a little more understandable through analogy.
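The two conventions mentioned in this thread (per-thread counting and clock-speed normalization) can be illustrated together. Note that the normalization formula here is a hypothetical example, not an industry standard:

```python
def render_hours(wall_clock_hours, cores=1, clock_ghz=2.0, baseline_ghz=2.0):
    """Count render hours per thread, optionally scaled to a baseline CPU
    speed (e.g. the 2 GHz baseline mentioned above). This linear scaling
    is an assumption for illustration only."""
    return wall_clock_hours * cores * (clock_ghz / baseline_ghz)

# One 4-core machine running for 1 wall-clock hour, counted per thread:
print(render_hours(1, cores=4))                  # 4.0

# The same machine at 3 GHz, normalized against the 2 GHz baseline:
print(render_hours(1, cores=4, clock_ghz=3.0))   # 6.0
```

This is why two studios quoting "render hours" may not be measuring the same thing: the count depends on whether it is per machine or per thread, and on whether it is normalized to a reference CPU.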