Nuke Process

Before starting this project I had never really used Nuke. I had downloaded it over the summer and started to play around with it, but nothing more than that. Because I didn't know much about Nuke, I really valued the lessons we received from Clement, from Escape Studios, as he gave very well structured and detailed lectures, right from the basics up to more advanced techniques like 3D camera tracking and plate cleaning. Alongside these lessons I also attended one of the Ravensbourne short courses on Nuke and compositing. There were three sessions of increasing difficulty, taught by Alex, a past Ravensbourne tutor. These sessions really helped me consolidate what I had learnt from Clement, boosting my confidence in Nuke and allowing me more creative freedom.

The first thing I needed to do in Nuke was to track the camera for shot 5. This was our handheld shot, which we included so that we could have a chance to practise camera tracking. We didn't note down the metadata from the camera whilst we were shooting, as I knew that the Canon 5D records data like f-stop and focal length into the image file. I was then able to extract this data using a piece of software called ExifTool. It's a simple command-line tool that pulls all the metadata from a shot and displays it in the command prompt. I needed data like the sensor size and focal length so that the camera track would be as accurate as possible.
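As a rough illustration of how that works, assuming ExifTool is installed and on the PATH, the tags I cared about can be pulled from a frame with a couple of lines of Python (the file name here is just a placeholder):

```python
import subprocess

# Pull only the tags relevant to the camera track from a single frame.
# Assumes ExifTool is installed and on the PATH; the file name below is
# a placeholder for one of our shot 5 plates.
tags = ["-Model", "-LensModel", "-FocalLength", "-FNumber", "-ISO"]
result = subprocess.run(
    ["exiftool", *tags, "shot05_frame0001.jpg"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```

Running it straight from the command prompt with just the file name dumps every tag, which is how I actually used it.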

As well as inputting the correct metadata, I also needed to correct the lens distortion. This could easily be done using the LensDistortion node. I knew how much to undistort by looking up the distortion factor of the specific lens we were using: a Canon 24-105mm has a distortion factor of 0.015, so to undistort we applied the opposite value of -0.015. After this the footage was ready to be tracked.
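For anyone curious what that number actually means, the node is applying a simple radial distortion model; a minimal sketch of the idea in Python (not Nuke's exact implementation, and the coefficient naming varies between lens models and Nuke versions):

```python
def radial_distort(x, y, k1):
    """Apply a one-coefficient radial distortion to normalised coordinates
    (x, y), where (0, 0) is the centre of the image. A positive k1 pushes
    points outwards (barrel distortion); a negative k1 pulls them inwards."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2
    return x * scale, y * scale

# The lens adds roughly +0.015 of distortion, so to undistort the plate we
# apply the opposite value, -0.015 (an approximation rather than an exact
# mathematical inverse, but close enough for the track).
print(radial_distort(0.8, 0.6, -0.015))
```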

[Image: camera track in Nuke]

Once the camera was tracked, I exported it for Maya using the WriteGeo node, saving it as either a .abc (Alembic) or a .fbx file. These files can then be imported into Maya and lined up with the geometry in our scene.
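If you want to script the export rather than click through it, a rough sketch with Nuke's Python API looks like this (assuming the camera tracker has already generated a camera node called Camera1; the node name, file path and frame range are placeholders):

```python
import nuke

# Feed the solved camera into a WriteGeo node and write it out as an
# Alembic file that Maya can import. Swap the extension to .fbx for FBX.
camera = nuke.toNode("Camera1")                      # placeholder name
write_geo = nuke.nodes.WriteGeo(file="/path/to/exports/shot05_camera.abc")
write_geo.setInput(0, camera)

# Render the node over the frame range of the shot.
nuke.execute(write_geo, 1001, 1120)
```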

By our own mistake, we didn't take any measurements of the distance between the camera and the position of the crane while we were filming. So, by a small stroke of genius, we were able to measure the distance quite accurately using Google Earth's built-in ruler tool.

After everything was rendered out of Maya (which overall took more than a week of pretty much solid rendering!), we split the shots between us. I was to do shots 4 and 5 and Henry was doing shot 7. Because shot 4 contained no live footage and consisted mainly of reflections, it didn't require much compositing, only colour grading and a few alterations.

First off, I apply a simple grade to boost the blacks and whites. I then use a chroma key to extract the greens from the leaves, creating an alpha to use as a mask driving the next grade nodes, where I adjust the black, white and gamma levels to give the foliage its yellow/brown hue. I then use the depth pass, rendered out in my .exr file, to drive a ZDefocus node. This node lets me use the depth pass to get a depth of field effect, like you would get from a real camera lens.
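Roughly, that part of the node tree behaves like the sketch below, written with Nuke's Python API; the file path, Keyer operation name and grade values are placeholders for what I actually dialled in by eye:

```python
import nuke

# Read the rendered shot (path is a placeholder).
read = nuke.nodes.Read(file="/path/to/renders/shot04.####.exr")

# Simple grade to boost the blacks and whites.
base_grade = nuke.nodes.Grade(inputs=[read], blackpoint=0.02, whitepoint=0.9)

# Keyer pulling an alpha from the green foliage; that alpha is then used
# as the mask input of a second Grade that pushes the leaves towards a
# yellow/brown hue. The operation name is as I remember it and the key
# ranges were tweaked by eye in the properties panel.
green_key = nuke.nodes.Keyer(inputs=[base_grade])
green_key["operation"].setValue("greenscreen")
leaf_grade = nuke.nodes.Grade(inputs=[base_grade, green_key])   # input 1 = mask
leaf_grade["gamma"].setValue([1.1, 0.9, 0.7, 1.0])
```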

Next was shot 5; this was the shot where I really wanted to showcase a wide range of skills. First off, we were having some problems using a 'hold out' to create the alpha for where the crane touches the water. It wasn't rendering the alpha correctly, so we had to fix it in Nuke. To create the alpha I used a Keyer node to key out the darker portion at the bottom, caused by the refractions. I then merged this keyed alpha with the top portion of the original alpha to create a composite alpha.
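Stripped right down, that alpha fix looks something like this in the Python API (the key settings and the merge operation are placeholders for what I judged by eye in the graph):

```python
import nuke

crane = nuke.nodes.Read(file="/path/to/renders/shot05_crane.####.exr")

# Key the dark, refracted region at the bottom of the crane to rebuild the
# missing part of the alpha. The Keyer's range was adjusted in the
# properties panel to isolate just that dark area.
bottom_key = nuke.nodes.Keyer(inputs=[crane])
bottom_key["operation"].setValue("luminance key")

# Combine the keyed alpha with the good top portion of the original alpha.
# "max" is a placeholder; the exact operation was chosen by eye.
combined = nuke.nodes.ChannelMerge(inputs=[crane, bottom_key], operation="max")
```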

Then I added the water marks along the bottom of the crane where it meets the water, to get a more realistic paper effect. I used a Tracker node to track the motion of the crane, then applied this tracking data to the transform of my roto shape, which was masking the grade node that gives the water-mark effect. I also had to animate the mask on the neck so that it would follow the crane as it dipped into the water.
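The track-to-roto linking can also be done by expression rather than copying keyframes; a small sketch, assuming a Tracker node called Tracker1 set to match-move and a roto called Roto1 (node and knob names are the standard ones but may differ between Nuke versions):

```python
import nuke

roto = nuke.toNode("Roto1")        # the shape masking the water-mark grade
tracker = nuke.toNode("Tracker1")  # the track of the crane's motion

# Tie the roto's translate to the tracker's translate so the mask follows
# the crane; setExpression(expression, channel) links each axis.
roto["translate"].setExpression(tracker.name() + ".translate.x", 0)
roto["translate"].setExpression(tracker.name() + ".translate.y", 1)
```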

To give the crane the nice rim-light effect on its neck, I shuffled out the specular pass and increased its brightness. I then animated a Roto node to follow the neck as it moved, and merged the result over the crane. As well as extracting the specular pass, I shuffled out the ambient occlusion pass and multiply-merged it over the original crane; this made the shadows much more pronounced, giving the model more depth.
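In node terms, the pass extraction works roughly like the sketch below; the layer names depend on how the AOVs were named out of Maya, so treat them as placeholders:

```python
import nuke

crane = nuke.nodes.Read(file="/path/to/renders/shot05_crane.####.exr")

# Pull the specular AOV out of the multi-layer EXR, brighten it, and add it
# back over the crane for the rim light (a roto, not shown, limits it to
# the neck).
spec = nuke.nodes.Shuffle(inputs=[crane])
spec["in"].setValue("specular")                    # AOV name, placeholder
spec_boost = nuke.nodes.Grade(inputs=[spec], multiply=1.5)
rim = nuke.nodes.Merge2(inputs=[crane, spec_boost], operation="plus")

# Pull the ambient occlusion AOV and multiply it over the result to deepen
# the shadows.
ao = nuke.nodes.Shuffle(inputs=[rim])
ao["in"].setValue("occlusion")                     # AOV name, placeholder
deepened = nuke.nodes.Merge2(inputs=[rim, ao], operation="multiply")
```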

[Image: ambient occlusion and specular passes]

The next step was to grade the crane to match the colour of the back plate. After this, I again use the depth pass with a ZDefocus node to adjust the focus so that the crane is slightly blurred, following suit with the back plate. I did this for all of the elements of the scene: the water and the lily pads.
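The defocus setup was much the same for every element; a minimal sketch of it (ZDefocus2 is the node's internal class name, and the focus point and size values here are placeholders I set by eye):

```python
import nuke

element = nuke.nodes.Read(file="/path/to/renders/shot05_crane.####.exr")

# Drive the defocus from the depth pass rendered into the EXR. The focal
# point was picked from the plate and the size dialled in until the crane
# sat just slightly out of focus, matching the back plate.
defocus = nuke.nodes.ZDefocus2(inputs=[element])
defocus["math"].setValue("depth")             # treat the channel as real depth
defocus["focal_point"].setValue([960, 540])   # placeholder pick point
defocus["size"].setValue(4)
```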

[Image: ZDefocus and depth pass setup]

We had some people walking around at the back of our scene, and we wanted to remove them in case they drew attention away from the rest of the shot. I first used a Tracker node to track the motion of the areas I wanted removed. Then, using the RotoPaint tool, I painted out the people with other areas of the scene. The tracking data is then copied into the transform tab so that the patches follow the motion of the camera.

I was rather pleased with the way the colour grade went on this shot. Shooting with the 5D produced some really crisp footage, which was a pleasure to work with. Here I am layering multiple grade and colour nodes to give myself full control over the look of the scene. I start off by just boosting some of the levels. Then I extract the greens, using a greenscreen keyer, so that I can mask out the leaves and trees and apply the grades that give them the yellow/brown hue.

Working with particles has always been something I have enjoyed, whether it be in Maya, Houdini or After Effects. For this reason I wanted to experiment with particles in Nuke. It has a solid 3D system built in, which meant I could create particles with real depth in the scene, no problem. Here I am using the ParticleEmitter node to simulate the particles. Connected to this are: a Sphere node to give the particles physical geometry; a Cube node to emit and contain the particles; and a series of effect nodes, like wind and turbulence, to give the particles some natural movement.
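A bare-bones version of that particle graph in Python looks something like this; these are the NukeX particle nodes, and the input wiring order is how I remember hooking it up in the graph, so it may need adjusting:

```python
import nuke

# Geometry used as the particle representation, and the cube the particles
# are emitted from (it also contains them in the full graph).
sphere = nuke.nodes.Sphere(radius=0.05)
cube = nuke.nodes.Cube()

# Emit the particles, then layer wind and turbulence on top for some
# natural drifting movement. Knob values were all tuned by eye.
emitter = nuke.nodes.ParticleEmitter()
emitter.setInput(1, sphere)   # particle geometry (input index may vary)
emitter.setInput(2, cube)     # emission region (input index may vary)

wind = nuke.nodes.ParticleWind(inputs=[emitter])
turbulence = nuke.nodes.ParticleTurbulence(inputs=[wind])
```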

[Image: Nuke particle setup]

For the rest of the shots I applied a grade similar to shot 5's to shots 1-3, just extracting the greens and shifting them towards orange. On shot 3 there was not much green, so instead I extracted the blues and increased the saturation a bit, which really made the blue pop, an effect Henry and I both really liked.

 
