Industry Research

For this project I wanted to look further into the world of VFX. It is a field of animation that has always interested me, ever since Double Negative came to my college and gave a talk about their company back in 2012. I will start by giving a quick rundown of the VFX industry and then focus on the sectors that interest me personally and how the industry within those sectors is changing.

It is a visual effects artist's job to recreate real-world phenomena in a digital environment. These effects are used across multiple platforms, from games to films to virtual reality. My main focus is going to be VFX in film, as that is the area I want to go into in the future, so it seems reasonable to focus my research there. There are many different job roles within the VFX industry: compositor, concept artist, FX artist and matchmove artist, to name just a few. The area I want to research in more depth is effects animation: the process of creating things like fire, water and explosions, any type of animation that requires a simulation.

Pre-Production

There are three key phases in a film's VFX pipeline. Pre-production is the first: all the work that needs to be done before shooting begins. Next comes production, the actual shooting of the film, where much of the VFX work is done on set or prepared for the next stage. The final stage is post-production, where all the juicy stuff happens and the majority of the work is produced. The pre-production stage breaks down into different groups. The first, and one of the most important, is R&D: the process of creating, and improving the efficiency of, any tools that will be needed to simulate the desired effects for the project. This step allows post houses to stay current, producing effects that get better and better year on year. Next, low-poly models are created for the previs team, who take the storyboard or scene descriptions and turn them into a low-resolution 3D sequence. For scenes that will contain large amounts of FX, previs can be a great way to experiment with camera moves and set-ups without having to sink a lot of cash into a set. The final sequence is then put together and shown to the clients to get the go-ahead.

Production

Now we move onto the production stage. This stage is essentially about setting everything up in preparation for post-production. During production a member of the studio, normally an FX TD or supervisor, will be on set giving direction on scenes where VFX are present and taking reference images for the artists to use later on. Using a LIDAR scanner, 3D digital scans are taken of the environment and props for the modellers to reference from, and to ensure that the 3D assets line up perfectly with the rest of the scene. Finally, HDR photos are taken so the lighting team can set up the correct image-based lighting in post.

Post-Production

The final step is post-production. There are many more steps in this section, so I will condense it down and focus on the areas I want to research more closely. The first step is to track the camera, which is done with 3D tracking software like Boujou. Once the camera tracking is done, the artists can start matchmoving and/or body tracking any assets that need to be applied to the scene, so that when the final image is composited everything lines up nicely. Lastly, any animation of models or rigs needs to be completed before moving on. After this the effects animation begins; this is the field I find most interesting, so I will focus on it a little later.
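To make the matchmove idea concrete, here is a toy sketch (my own illustration, not any studio's pipeline) of why a solved camera matters: once the tracker has recovered the camera, a CG point placed at a tracked marker's world position projects to the same pixel as the real object. The `project` function and all the values are made up for the example.

```python
# Toy pinhole-camera projection: once a 3D track has solved the camera's
# position and focal length, any CG point can be projected to the same
# pixel the real object occupies. Values are made up for illustration.

def project(point, focal_length, cam_pos):
    """Project a 3D point (x, y, z) to 2D image coordinates."""
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    if z <= 0:
        raise ValueError("point is behind the camera")
    return (focal_length * x / z, focal_length * y / z)

# A CG asset placed at the same world position as a tracked marker
marker = (2.0, 1.0, 10.0)
print(project(marker, focal_length=35.0, cam_pos=(0.0, 0.0, 0.0)))
# -> (7.0, 3.5)
```

If the camera solve is off by even a little, that projected position drifts against the plate, which is exactly why tracking comes first in the pipeline.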

It is the effects artist's job to create any visual elements of the scene that require simulation, such as smoke, fire, clouds, water, steam and explosions. These can be broken down into three groups: particles (anything from dust to snow and rain), dynamics (hair, fur and cloth simulations) and fluids (liquid, fire and smoke simulations). Particles are a huge part of being an effects artist, basically their bread and butter. These can come in the form of either particles or voxels. A particle system is a technique that uses a large number of small particles or sprites to simulate real-world phenomena like dust, stars, magic and rain. The system starts with an emitter, which is simply the generator of the particles. Emitters can be configured to emit particles in pretty much any way imaginable. Much like the emitters, the particles themselves have a set of parameters that can be altered to change how they behave, depending on their purpose. Whereas a particle represents only a single point in 3D space, a voxel represents a volume within 3D space, and an 'imaginary geometry' is used to fill that volume. The word voxel is a combination of volume and pixel. The upside of a voxel-based system is that voxels interact better with each other, as if they were real pieces of geometry. This makes a voxel-based system highly effective for water, as each voxel can blend into its neighbours, creating the illusion of a single body of water instead of a million voxels.
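As a rough illustration of the emitter-plus-parameters idea described above, here is a minimal pure-Python particle system. It is a toy sketch, not how Maya or Houdini implement theirs; all the names and values are my own.

```python
import random

# Minimal particle system: an emitter spawns particles, each carrying its
# own parameters (position, velocity, lifespan), and a per-frame update
# applies gravity and culls particles whose lifespan has run out.

GRAVITY = -9.8
DT = 1.0 / 24.0  # one film frame at 24 fps

def emit(count, seed=0):
    """Emitter: generate `count` particles at the origin with random spread."""
    rng = random.Random(seed)
    return [{"pos": [0.0, 0.0, 0.0],
             "vel": [rng.uniform(-1, 1), rng.uniform(4, 6), rng.uniform(-1, 1)],
             "life": rng.uniform(0.5, 1.5)}          # seconds to live
            for _ in range(count)]

def step(particles):
    """Advance every particle by one frame, then cull the dead ones."""
    for p in particles:
        p["vel"][1] += GRAVITY * DT                  # gravity on the y axis
        p["pos"] = [x + v * DT for x, v in zip(p["pos"], p["vel"])]
        p["life"] -= DT
    return [p for p in particles if p["life"] > 0]

particles = emit(100)
for _ in range(24):                                  # simulate one second
    particles = step(particles)
print(len(particles), "particles still alive after one second")
```

A voxel system would replace the list of free points with a 3D grid of density values, which is what lets neighbouring voxels blend into one continuous body.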

To create these effects, most post houses use a combination of primarily Maya and Houdini. Maya is more often used to create the models and rigs in the earlier stages, and although Maya's fluids and particles can be quite powerful, many artists prefer Houdini. Houdini is a much more procedural piece of software, meaning most of the user input is done through its node-based system. Maya claims to be node based too, but if you have ever opened the Hypergraph at the end of a project, you'll know it looks like a massive tangle of string. Houdini's node editor, by contrast, is much cleaner and easier to read, not too dissimilar from the layout of Nuke. Most nodes have only one input and one output, unlike Maya, where each node can have half a dozen connections. This layout makes it much easier for an artist to come in and really fine-tune each aspect of a simulation, yet still be able to come back later and change some data without disrupting the rest of the sim. That's not to say Maya doesn't have its own advantages, with plugins like RealFlow and FumeFX, both of which produce outstanding results. However, Houdini has it all under one hood and doesn't rely on third-party plug-ins.
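The one-input/one-output chain described above can be sketched in a few lines. This is a toy illustration of the procedural idea, not Houdini's actual API; the `Node` class and the node names are invented for the example.

```python
# Toy node chain in the spirit of Houdini's networks: each node takes one
# input, applies an operation, and passes the result downstream. Tweaking
# a parameter on any node just re-cooks the chain; upstream is untouched.

class Node:
    def __init__(self, op, **params):
        self.op = op
        self.params = params
        self.input = None

    def cook(self):
        data = self.input.cook() if self.input else []
        return self.op(data, **self.params)

def scatter(_, count):                 # generator node: ignores its input
    return [(i / count, 0.0) for i in range(count)]

def lift(points, height):              # modifier node: offsets y
    return [(x, y + height) for x, y in points]

# Wire a two-node chain: scatter -> lift
src = Node(scatter, count=4)
up = Node(lift, height=1.0)
up.input = src

print(up.cook())                       # cooked with height=1.0
up.params["height"] = 2.0              # change one parameter...
print(up.cook())                       # ...only the downstream result changes
```

The point of the sketch is the workflow, not the maths: because every node is a pure function of its input, changing one parameter never corrupts the rest of the network, which is what makes the procedural approach so forgiving to revisit.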

The next step is setting up the lighting rigs, along with texture maps and shaders, to determine the final look of the shot, with the assistance of a look-dev artist until a perfect match has been achieved. This is the stage where the HDR images are used for image-based lighting, so the lighting in the scene reflects the lighting of the real environment. In some scenes it is necessary for a roto artist to extract a character or object from a shot so that effects can be placed in front of and behind them. This can also be achieved with deep compositing, but I will get onto that later. The final stage for the studio is to composite all the various assets, from models and animations to effects, graphics and backgrounds. This is normally done in software such as Nuke or After Effects. Nuke is used more in film than After Effects because of its node-based system, as opposed to After Effects' layer system. The node-based system makes it much easier to create effects at certain points in the pipeline, and to alter those effects on each separate node, whereas in After Effects you need to dig back through multiple pre-comps to find and edit an effect, all while trying not to alter anything else in the scene.
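Underlying both node- and layer-based compositing is the same core "over" operation on premultiplied pixels. Here is a minimal sketch of it; the pixel values are purely illustrative.

```python
# The basic compositing operation that tools like Nuke and After Effects
# chain together is "A over B" with premultiplied alpha:
#     out = A + B * (1 - A.alpha)
# Pixels here are (r, g, b, a) tuples.

def over(a, b):
    """Composite premultiplied pixel a over pixel b."""
    ar, ag, ab_, aa = a
    br, bg, bb, ba = b
    k = 1.0 - aa
    return (ar + br * k, ag + bg * k, ab_ + bb * k, aa + ba * k)

fg = (0.5, 0.0, 0.0, 0.5)   # half-transparent red, premultiplied
bg = (0.0, 0.0, 1.0, 1.0)   # opaque blue background
print(over(fg, bg))
# -> (0.5, 0.0, 0.5, 1.0)
```

In a node graph each Merge node is essentially one call to this function, which is why reordering or retuning a single node is so painless compared with digging through nested pre-comps.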

Deep Compositing

A relatively new innovation in the world of compositing that I have found quite interesting is the art of deep compositing. Deep comping creates a rendered image that doesn't just have a single Z depth for each point in the image; instead it has an array of values that define the distance of each pixel relative to a certain point in space, essentially giving the pixels of an image a position within 3D space. This allows compositors to place other assets inside the image without having to create multiple nodes or layers with separate masks. A good example is the 'Orrery' sequence in 'Prometheus', where each blue particle has its own depth in the image, so the particles pass in front of and behind one another, encasing the actress in the middle. This makes for a much more realistic effect and a more believable scene for the audience.

[Image: deep compositing in the 'Orrery' sequence from 'Prometheus']
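The "array of values" idea can be sketched as follows: each pixel stores a list of (depth, colour, alpha) samples, and new elements merge in simply by adding samples before flattening front to back. This is a simplified single-channel toy, not Nuke's actual deep workflow; all the values are illustrative.

```python
# Deep-pixel sketch: instead of one colour and one Z, a pixel stores a
# list of (depth, colour, alpha) samples. Inserting a new element just
# adds samples; flattening composites them front to back with "over".

def flatten(samples):
    """Composite a pixel's deep samples front to back (nearest first)."""
    colour, alpha = 0.0, 0.0
    for _, c, a in sorted(samples):          # sort by depth
        colour += c * a * (1.0 - alpha)      # attenuated by what's in front
        alpha += a * (1.0 - alpha)
    return colour, alpha

pixel = [(10.0, 1.0, 0.5),                   # particle in front of the actor
         (20.0, 0.8, 1.0)]                   # the actor, opaque
print(flatten(pixel))

# A new particle behind the opaque actor merges in without any extra
# masks, and is correctly hidden when the pixel is flattened:
pixel.append((30.0, 0.3, 1.0))
print(flatten(pixel))
```

Because occlusion falls out of the per-sample depths, elements can weave in front of and behind each other, exactly the effect seen around the actress in the Orrery shot.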

Clarisse Engine

The Clarisse engine has been a major development when it comes to rendering. It is not just another render engine, as some people may believe, and it is not just a compositor. It was born of two Frenchmen who were tired of waiting: tired of waiting for a scene to load, tired of waiting for modifications to happen, and tired of waiting for renders to start and end. Its main purpose is speed and flexibility. Instead of the standard 3D workflow of starting in a 3D package (Maya, 3ds Max, Houdini) and using an external render engine, Clarisse combines everything from 3D software to renderer, and even compositing, in one package, with the final image as the focal point of the software. Clarisse is built for super-fast rendering, allowing artists to quickly edit shaders, textures and lighting set-ups on the fly with near-instant response times. It debuted at SIGGRAPH 2012 and has been picking up momentum at a rapid pace ever since; it has also recently been picked up by Double Negative as their renderer of choice. Clarisse is changing the way we look at rendering and is something we should all keep our eyes on.

Reflection

Looking back over my research, it has become apparent that a good understanding of Houdini is a vital skill to have if I want to pursue a career in the VFX industry; it is clearly a much more powerful piece of kit for producing procedural effects than Maya. I will also continue to practise Maya, as it has plug-ins such as FumeFX, an industry standard for creating fluid effects like fire and explosions. With recent updates adding a Houdini engine for Clarisse, I believe it will be beneficial to acquire a copy of Clarisse, as I too get frustrated with the waiting times from simulation to render. Although I was previously not so interested in compositing, my research for this project and our environments project has opened my eyes to the power of compositing and how fun it can be. I find the workflow of Nuke very appealing, and I would like to explore deep compositing further, as it is a great breakthrough for both compositing and effects animation. I know that in my research I have not touched on intellectual property or economic sectors. However, while doing my research, economic statistics and IP rights relating directly to effects animation were quite hard to come by, and I also didn't want to pad this out with statistics saying there are 5,299 workers in the UK VFX industry and that 9% of them are freelancers. For this project I wanted to research what is really at the heart of VFX and how the whole system works. This topic has led me to understand various techniques and applications of VFX in the industry, and it has sparked my interest in a number of artists and pieces of software that I am itching to learn more about.


Links:

Allan McKay, influential artist in VFX: http://www.allanmckay.com/

Clarisse: http://www.isotropix.com/clarisse

Clarisse: http://www.dneg.com/dneg_vfx/dneg-purchase-global-site-license-for-isotropixs-clarisse/

Clarisse: http://www.awn.com/news/isotropix-showcase-clarisse-ifx-20-siggraph-2015

Clarisse: http://blog.isotropix.net/?p=51

Jordi Bares on why The Revenant's VFX is so good: https://www.creativereview.co.uk/cr-blog/2016/february/why-the-revenant-is-a-landmark-in-vfx/


Clarisse http://blog.isotropix.net/?p=224

Jordi Bares: https://www.creativereview.co.uk/author/jordi-bares/

Jordi Bares http://odforce.net/blog/?p=348

Differences between Maya and Houdini: http://www.tokeru.com/cgwiki/index.php?title=MayaToHoudini#Houdini.27s_Core_concept:_Points_with_data.2C_manipulated_via_clean_networks

Maya and Houdini: https://www.quora.com/Which-is-better-SideFX-Houdini-or-Maya

Maya and Houdini: http://forums.cgsociety.org/archive/index.php?t-975103.html

Voxels: https://en.wikipedia.org/wiki/Voxel

Particles: https://en.wikipedia.org/wiki/Particle_system

Small VFX studio in London: http://www.realtimeuk.com/

Particle and voxels book: https://books.google.co.uk/books?id=9XrjBAAAQBAJ&pg=PA397&lpg=PA397&dq=difference+between+particle+and+voxel+based+systems&source=bl&ots=lX-R_4CHpU&sig=n_0of8U4zw1oj7Az8BfJpVWX7xI&hl=en&sa=X&ved=0ahUKEwik06L5hofNAhWnCMAKHYKbBkEQ6AEILzAB#v=onepage&q=difference%20between%20particle%20and%20voxel%20based%20systems&f=false


Presentation planning

Intro


What does an FX artist do?

It is a visual effects artist's job to recreate real-world phenomena in a digital environment. These effects are used across multiple platforms, from games to films to virtual reality. My main focus is going to be VFX in film, as that is the area I want to go into in the future, so it seems reasonable to focus my research there. There are many different job roles within the VFX industry: compositor, concept artist, FX artist and matchmove artist, to name just a few.

What is the work flow to produce these effects?

How has innovation in hardware changed the use of effects?

What other forms of innovation are there in the industry? Glassworks (Jordi Bares), DNeg (the Clarisse engine), photogrammetry.

Innovation – Jordi Bares creating FX that work in harmony with the environment instead of being over the top; a good combination of live action and FX, used only where needed, creates a much more believable scene.

The Art of Deep Compositing

A relatively new innovation in the world of compositing that I have found quite interesting is the art of deep compositing. Rather than layering multiple 2D images on top of each other and using mattes and masks to manipulate them, deep comping creates a rendered image that doesn't just have a single Z depth for each point in the image; instead it has an array of values that define the distance of each pixel relative to a certain point in space, essentially giving the pixels of an image a position within 3D space. This allows compositors to place other assets inside the image without having to create multiple nodes or layers with separate masks.

[Image: deep compositing in the 'Orrery' sequence from 'Prometheus']

Here is an example of deep comping being used in the 'Orrery' sequence in 'Prometheus'. We can clearly see how each blue particle has its own depth in the image, as the particles pass in front of and behind one another, encasing the actress in the middle. This allows for a much more realistic effect, making the scene more believable for the audience.

PROMETHEUS: Paul Butterworth – VFX Supervisor – Fuel VFX

Stuart Penn

https://www.creativereview.co.uk/author/jordi-bares/

How will the industry change in the future? Obviously VR but explain why I’m not choosing VR.

Conclusion – reflect on how I could implement this in the future and how I will protect my work


VFX Artists

London is a massive hot-spot for VFX globally. There are over 140 dedicated VFX studios there, working on everything from feature films to commercials and games, or a bit of everything.

Double Negative: VFX TD

Job Description: FX Technical Directors gather a variety of 3D assets produced by our Build, Layout and Animation departments and use off the shelf software – primarily Houdini but also Maya – and custom pipeline tools to create particle, rigid body, fluid, cloth, fur and hair simulations.

Responsibilities:
-Use an array of commercial and proprietary software tools to produce photorealistic simulations of real-world phenomena while developing innovative solutions to complex problems.
-Create bespoke effects setups to fulfil client briefs.
-Produce test, and often final, renders of FX elements for a shot.
-Create working test composites for review by the supervision team.
-Work in partnership with lighters & compositors to ensure shots are delivered to the very highest standard.

Required Skills and Experience:
-Proven experience in producing effects such as smoke, fire, clouds, water, steam and explosions, plus experience with particle and voxel based rendering.
-An eye for details, good sense of timing, and thorough understanding of techniques and technologies relating to physical simulation in computer graphics.
-Experience in live action FX work preferred.
-Extensive knowledge of Houdini and/or Maya (Python, HScript or MEL knowledge a plus).

Framestore: FX TD

Job Description: FX Technical Directors create CG renditions of naturalistic physical or magical phenomena such as fire, water, clouds, smoke, physical destruction and particulate.

Responsibilities:
-Creating particle, rigid-body, fluid, cloth, fur and/or hair simulations or animations
-Applying lighting and shaders to produce the final rendered image or passing the simulation on to a Lighting Technical Director
-Using a variety of commercial and proprietary FX tools including Maya, Houdini and Naiad
-Working in partnership with other departments to ensure that shots are delivered to the highest possible standard
-Working within the team to determine the various design solutions needed to create the effects
-Designing and creating images, elements, effects, pipelines, tools and techniques
-Helping to design solutions involving a more procedural approach in order to create the effects required
-Producing reviewable composites of all FX elements for a shot. Sometimes FX TDs will be expected to produce final renders too

Required Skills
-An excellent knowledge of Houdini and/or Maya
-A thorough understanding of the techniques and technologies relating to FX simulation and procedural animation
-Previous experience producing effects such as smoke, fire, clouds, water, steam and explosions in addition to simulation techniques for rigid-bodies, cloth and hair
-A good understanding of animation and an eye for motion
-A working knowledge of Renderman or Arnold
-A good understanding of the entire visual effects process
-A familiarity with pipeline issues, especially working between multiple packages
-A working knowledge of at least one compositing application (e.g. Nuke) and a solid understanding of the compositing process
-Strong Vex, hscript and/or Mel scripting skills

Desirable Skills
-Knowledge in other simulation packages such as Naiad
-Programming skills with C/C++ along with knowledge of the Houdini HDK and/or Maya API
-Python and Unix shell scripting
-Experience with other render packages

As you can see, the requirements to be an FX TD at both DNeg and Framestore are very similar. This is no surprise, as both require a lot of people with varied specialist skills to produce such high-quality content. There is a big emphasis on Maya and Houdini as the main software packages used within the industry. Having already got a basic understanding of Maya, I feel it is a good idea to start learning Houdini as well. I planned to use it for our current 3D environment project but was unable to find the time, so it will be a project for over the summer.

I want to compare the big companies with the smaller ones based in London, to see the difference in the way they operate. Sadly, after looking at a few companies like BASEBLACK and Jellyfish, I found they don't display what is required to apply for a job, so I have contacted them asking if they can supply me with any information.


Fluid Dynamics

Fluid dynamics and particle effects are the bread and butter of any VFX artist's toolbelt. These dynamic effects enable artists to create anything from fire and water to hurricanes and sandstorms. Software and plugins such as Maya, Houdini, RealFlow and FumeFX are all used by FX artists to create these effects.
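As a taste of what these fluid tools do under the hood, here is a toy one-dimensional diffusion step, one small ingredient of grid-based smoke and fire solvers. It is a deliberately simplified sketch of my own; real solvers also handle advection, pressure and more.

```python
# Toy 1D diffusion step: a density field spreads toward its neighbours
# each step, the way a puff of smoke softens over time. This is one tiny
# piece of what grid-based fluid solvers compute every frame.

def diffuse(density, rate=0.25):
    """One explicit diffusion step over a 1D density field."""
    out = density[:]
    for i in range(1, len(density) - 1):
        out[i] = density[i] + rate * (
            density[i - 1] + density[i + 1] - 2.0 * density[i])
    return out

field = [0.0, 0.0, 1.0, 0.0, 0.0]            # a puff of smoke in one cell
for _ in range(3):
    field = diffuse(field)
print([round(d, 3) for d in field])          # the puff has spread out
```

Scaling this idea up to a dense 3D voxel grid, and doing it every frame, is why fluid simulations are so computationally heavy and why render and sim times matter so much to FX artists.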

Here are the showreels of two prominent figures in the VFX industry, Mark Theriault and Allan McKay. These videos are good examples of what is possible with dynamic and fluid effects.

In past projects I used one of Maya's plugins, Bifrost, to simulate a body of water for a river scene, and more recently I have used a combination of particle and fluid dynamics to create a swirling portal for a time machine in my 3D environment.

What industry roles interest me?

After a bit of searching, a few roles have started to stand out to me: character animator, effects animator, animator, VFX artist and concept artist. I plan to look into all of these positions to gain a better understanding and see which ones interest me enough to explore more deeply.

Before starting on the course, being a visual effects artist was a field that interested me greatly, and it still does today. My interest started at college, when Double Negative came in and gave a presentation on working within the VFX industry. They spoke about all the different pathways within the company, from concept artist to compositor, and the process of becoming a member of the team at DNeg. Most people start out as a runner: delivering post, grabbing coffees, printing, being an all-round helper. Currently the DNeg website is offering positions as a 2D or 3D runner, and from being a runner you are promoted to your field of interest, usually after 12 months.