Show Reel


Industry Research

For this project I wanted to look further into the world of VFX. It is a field of animation that has always interested me, ever since Double Negative came to my college and gave a talk about their company back in 2012. I will start by giving a quick rundown of the VFX industry and then focus on the sectors that interest me personally, and how the industry within those sectors is changing.

It is a visual effects artist's job to recreate real-world phenomena in a digital environment. These effects are used across multiple platforms, from games to films to virtual reality. My main focus is going to be VFX in film, as that is the area I want to move into in the future, so it seems reasonable to focus my research there. There are many different job roles within the VFX industry: compositor, concept artist, FX artist and matchmove artist, to name just a few. The area I want to research in more depth is effects animation: the process of creating things like fire, water and explosions, or any type of animation that requires a simulation.

Pre-Production

There are three key phases in a film's VFX pipeline. Pre-production is the first step, made up of all the work that needs to be done before shooting begins. The next step is production: the actual shooting of the film, where much of the VFX is done on set or prepared for the next stage. The final stage is post-production. This is where all the juicy stuff happens and the majority of the work is produced. The pre-production stage is broken down into different groups. The first, and one of the most important, is R&D. This is the process of creating, and increasing the efficiency of, any tools that will be needed to simulate the desired effects for the project. This step allows post houses to remain contemporary, producing effects that get better and better year on year. Next, low-poly models are created for the pre-vis team, who take the storyboard or scene descriptions and turn them into a low-resolution 3D sequence. For scenes that will contain large amounts of FX, pre-vis can be a great way to experiment with camera moves and set-ups without having to sink a lot of cash into a set. The final sequence is then put together and shown to the clients to get the go-ahead.


Now we move onto the production stage. This stage is essentially about setting everything up in preparation for post-production. During production a member of the studio, normally an FX TD or supervisor, will be on set giving direction on scenes where VFX are present, as well as taking reference images for the artists to use later on. Using a LIDAR camera, 3D digital scans are taken of the environment and props for the modellers to reference, and to ensure that the 3D assets line up perfectly with the rest of the scene. Finally, HDR photos are taken so the lighting team can set up the correct image-based lighting system in post.


The final step is post-production. There are a lot more steps to this section, so I will condense it down and focus on the parts I want to research more closely. The first step is to track the camera, which is done with 3D tracking software like Boujou. Once the camera tracking is done, the artists can start matchmoving and/or body tracking any assets that need to be applied to the scene, so that when the final image is composited everything lines up nicely. Lastly, any animation of models or rigs needs to be completed before moving on. After this the effects animation begins; this is the field I find most interesting, so I will be focusing on it a little later.

It is the effects artist's job to create any visual elements of the scene that require simulations, such as smoke, fire, clouds, water, steam and explosions. These can be broken down into three groups: particles, for anything from dust to snow and rain; dynamics, for hair, fur and cloth simulations; and fluids, for all the liquid, fire and smoke simulations. Particles are a huge part of being an effects artist, basically their bread and butter. These can come in the form of either particles or voxels. A particle system is a technique that uses a large number of small particles or sprites to simulate real-world phenomena like dust, stars, magic and rain. The system starts off with an emitter, which is simply the generator of the particles. These emitters can be configured to emit particles in pretty much any way imaginable. Much like the emitters, the particles themselves have a set of parameters that can be altered to change the way they behave, depending on the purpose. Whereas a particle only represents one point in 3D space, a voxel represents a volume within 3D space, and an 'imaginary geometry' is used to fill that volume. The word voxel is a combination of the words volume and pixel. The upside to a voxel-based system is that voxels interact with each other better, as if they were real pieces of geometry. This makes a voxel-based system highly effective for water, as each voxel can blend into its neighbours, creating the illusion of a single body of water instead of a million voxels.
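The emitter-plus-parameters idea above can be sketched in a few lines of code. This is just a minimal toy, assuming a single point emitter, a fixed lifetime and gravity as the only force; real tools expose far more parameters (spread, drag, collisions, sprites and so on):

```python
import random

GRAVITY = -9.8  # metres per second squared, acting on the y axis

class Particle:
    def __init__(self, position, velocity, lifetime):
        self.position = list(position)   # [x, y, z] point in 3D space
        self.velocity = list(velocity)
        self.age = 0.0
        self.lifetime = lifetime         # seconds before the particle dies

class Emitter:
    """Generates particles with a randomised velocity, like a dust or rain source."""
    def __init__(self, position, rate):
        self.position = position
        self.rate = rate                 # particles emitted per step
        self.particles = []

    def step(self, dt):
        # Emit new particles with a small random spread upwards.
        for _ in range(self.rate):
            vel = [random.uniform(-1, 1), random.uniform(2, 5), random.uniform(-1, 1)]
            self.particles.append(Particle(self.position, vel, lifetime=2.0))
        # Advance every live particle under gravity, then cull the dead ones.
        for p in self.particles:
            p.velocity[1] += GRAVITY * dt
            for axis in range(3):
                p.position[axis] += p.velocity[axis] * dt
            p.age += dt
        self.particles = [p for p in self.particles if p.age < p.lifetime]

emitter = Emitter(position=[0.0, 0.0, 0.0], rate=10)
for _ in range(5):
    emitter.step(dt=1.0 / 24)            # one step per film frame
print(len(emitter.particles))            # 50: five frames of ten particles, none dead yet
```

Every knob here (rate, lifetime, initial velocity range) corresponds to the kind of parameter an artist would dial in on a real emitter.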

To create these effects most post houses will use a combination of primarily Maya and Houdini. Maya is more often used to create the models and rigging in the earlier stages, and although Maya's fluids and particles can be quite powerful, many artists prefer Houdini. Houdini is a much more procedural piece of software, meaning most of the user input is done through its node-based system. Now, Maya claims to be node based; however, if you have ever opened up the Hypergraph at the end of a project, you'll know it looks a bit like a massive tangle of string. Houdini's node editor, by contrast, is much cleaner and easier to read, not too dissimilar from the layout of Nuke. Most nodes have only one input and one output, unlike Maya where each node could have half a dozen connections. This layout makes it much easier for an artist to come in and really fine-tune each aspect of a simulation, and still be able to come back later and change some data without disrupting the rest of the sim. That's not to say Maya doesn't have its own advantages, with plugins like RealFlow and FumeFX, both of which produce outstanding results. However, Houdini has it all under one hood and doesn't rely on third-party plugins.
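The appeal of that one-input, one-output procedural chain can be shown with a toy sketch. This is not how Houdini is implemented, just an illustration of the idea: each node cooks its input and applies its own parameters, so tweaking an upstream parameter re-cooks the chain without touching any other node:

```python
# Toy procedural node chain: each node has one input and one output,
# like most nodes in a Houdini-style network.

class Node:
    def __init__(self, func, input_node=None, **params):
        self.func = func          # what this node does to its input
        self.input = input_node   # single upstream connection (or None for a source)
        self.params = params      # the artist-editable parameters

    def cook(self):
        upstream = self.input.cook() if self.input else None
        return self.func(upstream, **self.params)

# Build a small chain: source -> scale -> offset.
source = Node(lambda _, count: list(range(count)), count=4)
scale  = Node(lambda pts, factor: [p * factor for p in pts], source, factor=2)
offset = Node(lambda pts, amount: [p + amount for p in pts], scale, amount=1)

print(offset.cook())      # [1, 3, 5, 7]

# Tweak a parameter in the middle of the chain and re-cook; the nodes
# themselves are untouched, only the data flowing through them changes.
scale.params["factor"] = 10
print(offset.cook())      # [1, 11, 21, 31]
```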

The next step is setting up the lighting rigs along with the texture maps and shaders that determine the final look of the shot, with the assistance of a look-dev artist, until a perfect match has been achieved. This stage is where the HDR images are used as image-based lighting, so the lighting in the scene reflects the lighting of the real environment. In some scenes it is necessary for a roto artist to extract a character or object from a scene so that effects can be placed in front of and behind them. This can also be achieved with deep compositing, but I will get onto that later. The final stage for the studio is to composite all the various assets, from models and animations to special effects, graphics and backgrounds. This process is normally done in software such as Nuke or After Effects. Nuke is used more in film than After Effects due to its node-based system, as opposed to After Effects' layer system. The node-based system makes it much easier to create effects at certain points in the pipeline, and also much easier to alter those effects on each separate node, whereas in After Effects you need to dig back through multiple pre-comps to find and edit effects, whilst trying not to alter anything else in the scene.

Deep Compositing

A relatively new innovation in the world of compositing that I have found quite interesting is the art of deep compositing. Deep comping creates a rendered image that doesn't just have a single Z-depth value for each point on the image; instead, each pixel holds an array of samples, each with its own depth, defining where its contributions sit in space. This essentially gives the pixels of an image a position within a 3D space, allowing compositors to place other assets inside the image without having to create multiple nodes or layers with separate masks. Here is an example of deep comping being used in the 'Orrery' sequence in 'Prometheus'. We can clearly see how each blue particle has its own depth in the image, as the particles pass in front of and behind each other, encasing the actress in the middle. This allows for a much more realistic effect, making the scene more believable for the audience.

Here is an example of deep comping being used in the 'Orrery' sequence in 'Prometheus'.
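The per-pixel array of depth samples can be sketched as code. This is a simplified illustration, assuming each pixel stores (depth, colour, alpha) samples and using a greyscale colour for brevity; a new element can be inserted at any depth and the pixel simply re-merged:

```python
def merge_deep_pixel(samples):
    """Composite one pixel's deep samples front to back with the 'over' operator."""
    colour, alpha = 0.0, 0.0
    for depth, sample_colour, sample_alpha in sorted(samples):
        # Premultiplied 'over': a sample further back only shows through
        # whatever transparency remains in front of it.
        colour += sample_colour * sample_alpha * (1.0 - alpha)
        alpha += sample_alpha * (1.0 - alpha)
    return colour, alpha

# One pixel of an Orrery-style shot: an opaque actress at depth 5,
# with semi-transparent particles both in front of and behind her.
pixel = [
    (5.0, 0.8, 1.0),   # opaque actress
    (2.0, 1.0, 0.5),   # particle in front of her
    (9.0, 1.0, 0.5),   # particle behind her, fully hidden by the opaque sample
]
colour, alpha = merge_deep_pixel(pixel)
print(round(colour, 2), alpha)   # 0.9 1.0 - the rear particle contributes nothing
```

Adding an asset between the actress and the front particle is just one more tuple in the list; there is no need for a separate roto mask per layer.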

Clarisse Engine

The Clarisse engine has been a major development when it comes to rendering. It is not just another render engine, as some people may believe, nor is it just another compositor. It was born of two Frenchmen who were tired of waiting: tired of waiting for a scene to load, tired of waiting for modifications to happen, and tired of waiting for renders to start and end. Its main purpose is speed and flexibility. Instead of the standard 3D workflow of starting in a 3D package (Maya, 3ds Max, Houdini) and using an external render engine, Clarisse combines everything from 3D software to renderer and even compositing in one package, with the final image being the focal point of the software. Clarisse is built for super-fast rendering, allowing artists to quickly edit shaders, textures and lighting set-ups on the fly with near-instant response time. Clarisse debuted at SIGGRAPH 2012 and has been picking up momentum at a rapid pace ever since its launch; it has also recently been picked up by Double Negative as their renderer of choice. Clarisse is changing the way we look at rendering and is something we should all keep our eyes open for.


Looking back over my research, it has become apparent that having a good understanding of Houdini is a vital skill if I want to pursue a career in the VFX industry; it is clearly a much more powerful piece of kit for producing procedural effects than Maya. I will also continue to practise Maya, as it does have plugins such as FumeFX, which is an industry standard for creating fluids such as fire and explosions. With recent updates for a Houdini engine for Clarisse, I believe it will be beneficial to acquire a copy of Clarisse, as I too get frustrated with the waiting times from simulation to render. Although I was previously not so interested in compositing, my research for this project and our environments project has opened my eyes to the power of compositing and how fun it can be. I find the workflow of Nuke very appealing, and I would like to explore deep compositing further, as it is a great breakthrough for both compositing and effects animation. I know that in my research I have not touched on intellectual property or economic sectors. However, whilst doing my research, economic statistics and IP rights directly relating to effects animation were quite hard to come by, and I also didn't want to pad this out with statistics saying there are 5,299 workers in the VFX industry in the UK and that 9% of them are freelancers. For this project I wanted to research what is really at the heart of VFX and how the whole system works. This topic has led me to understand various techniques and applications of VFX in the industry, and has also sparked my interest in a number of artists and pieces of software that I am itching to learn more about.



Further Reading

Allan McKay, influential artist in VFX

Jordi Bares on why The Revenant's VFX is so good

Differences between Maya and Houdini

Small VFX studio in London

Particles and voxels book