Category: OUDF 203 Negotiated studies

OK, so I’m going to try and explain a simple track so I can show I know how to do it. First things first: load the image sequence, set the frame rate it was recorded at and the last frame you want to track to.

Once loaded you can run an auto track and see what it picks up. This gave me 548 tracks, but not all of them last very long. So if you go to Clean Assist you can choose how many tracks to keep per shot and only keep the ones that last for a set number of frames.

You don’t need to run an auto track; you could just track manually, or do both. But you need at least seven tracks.

To manually track, right click and select New Track. Make the bounding box large enough to cover the tracking point and give it some area to work with. You can then, depending on where you are in the timeline, track forwards or backwards.

The area at the bottom of the screen allows you to see the quality of the tracks. Green is a confident solve, yellow is less confident and red is not confident at all. You can go back over the red points, reposition the tracking box and repeat until you have just green or yellow tracks. Once you have all the tracks you need, you do a camera solve.

Once you have solved the camera it will tell you the quality of it at the top of the list of tracks. Again, green is good, yellow not so good, red bad. Really you don’t want any red on your camera solve; it means that the computer isn’t confident about its camera calculations. So revisit the red frames it isn’t sure about, try repositioning the tracking box and try to get the tracks green. I found that after doing this and re-solving, the solve would be more confident and turn yellow or green.

Once you have a good solve you need to create a new co-ordinate system.

Here you enter the distance between tracks and dictate an origin point. It’s at this point I would get confused about what number goes where and in X, Y or Z. But Matt showed me an easier way to achieve the same result: by switching to 3D mode you can drag the X and Z points to where you want them, which is so much easier to get my head around. This is so it can determine which way is up and how the surface sits.

You also need to input some details about the camera: the focal length and the size of the film gate.

Once this is all in place, if you go into 3D mode and press C you can look through the camera and see how the grid sits in the image. There is also an option to place a ghost cube in the scene. Below is a screenshot example from another track. By scrubbing the timeline or playing the footage you can see if there are any problems or sliding occurring.

Once you’re happy with the track you can export it as a Maya (.ma) file and open it in Maya as a renderable camera.

Below is a playblast of my independent track. I can now confidently say I can produce a simple track.

Time for a large one now….


There are a few ways to prepare the footage for a better track.

  • If the footage is interlaced, de-interlace it
  • Export the footage as an image sequence
  • If the footage needs stabilizing for the shot, stabilize it (it would help keep the track)
  • Sharpen the image
  • Colour correct it using RGB channels
  • Boost the contrast.

I have been sharpening the footage since Andy suggested it to me, and I read in ‘The Art and Technique of Matchmoving’ about how a professional preps their footage. Thinking about it, it makes a lot of sense: although the footage may be desaturated or colour corrected, the action and movement of the camera doesn’t change, so once the track is finished you just composite it with the original footage.

By preparing the footage you can create higher-contrast tracking areas, giving the program a better and more confident track.
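As a rough illustration of why this helps (plain Python of my own, not anything the tracking software does internally), a simple linear contrast stretch pushes a murky patch of greyscale values out to the full range, giving the pattern matcher more distinct values to lock onto:

```python
def stretch_contrast(pixels):
    """Linearly stretch greyscale values (0-255) to the full range.

    A higher-contrast patch gives a tracker a more distinct
    pattern to match from frame to frame.
    """
    lo, hi = min(pixels), max(pixels)
    if hi == lo:  # flat patch: nothing to stretch
        return pixels[:]
    return [round((p - lo) * 255 / (hi - lo)) for p in pixels]

# A murky low-contrast patch becomes full range:
print(stretch_contrast([100, 140, 200]))  # [0, 102, 255]
```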

For a really good track you need data!

For the tracks I have been dealing with, I have been recording the size of the film back, the focal length and the frame rate of the camera, as well as the measured distance between the tracking points.

Cameras don’t always have these specifics printed on them, so it’s a case of looking through the manual or finding them online. So it’s always a good idea to take down the name and make of the camera.

In the book ‘The Art and Technique of Matchmoving’, Erica Hornung recommends you collect and record every little bit of data you can.

Details on:

  • the location
  • the buildings
  • furniture
  • trees
  • landscape features
  • light poles
  • chairs
  • cars
  • carpets

Anything that can be measured and stays still long enough counts, and photograph everything for reference later. For the camera she says to record:

  • the lens
  • the focal length
  • the focus distance
  • distance to subject
  • height
  • tilt
  • roll
and anything else you can record.
I have only been doing small tests, but if I were to do a track to production quality standard I will definitely take as many details and measurements as possible.
These measurements and information are then input into a matchmove program to achieve the best track possible.
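For example, the measured distance between two markers is what pins the solve to real-world scale. A hypothetical sketch of that sum (the helper function and the numbers are mine, not MatchMover’s):

```python
import math

def scale_factor(p1, p2, real_distance):
    """Ratio needed to scale a solved scene so that two track points
    end up the measured real-world distance apart."""
    solved = math.dist(p1, p2)  # distance between points in solver units
    return real_distance / solved

# Two solved points 0.5 solver units apart, measured at 30 cm on set:
print(scale_factor((0, 0, 0), (0.5, 0, 0), 30.0))  # 60.0
```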


So the track used for Chris’s model and my spinning top had some issues. You can see them at 0:11 and 0:15 in the video below.

The model appears to get smaller for one frame and larger in a few others.

So I took a look at what was happening to the camera in Maya.

Taking note of what second the problem occurred at, I did a little maths to work out roughly what frame it was happening on. So the 11th second × 25 frames a second suggested the problem was around frame 275; observing frame by frame I spotted it at 337. I also did the same for the problem around the 15th second, and this is what I found.
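The sum above is just seconds multiplied by frame rate; a quick sketch (my own little helper, not part of either program):

```python
def second_to_frame(seconds, fps=25):
    """Convert a timestamp in seconds to an approximate frame number."""
    return round(seconds * fps)

# Glitches seen at the 11th and 15th seconds of 25 fps footage:
print(second_to_frame(11))  # 275 - then step frame by frame nearby
print(second_to_frame(15))  # 375
```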

Now there are a few reasons this could be happening. When you import the exported matchmove camera, the camera is still editable, so I could have accidentally scrolled the camera forward thinking I was in perspective view, and it auto-keyed. This can be easily prevented by locking all the attributes of the camera in Maya. But it could also be a calibration problem in MatchMover.
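Locking those attributes can also be scripted rather than done by hand in the Channel Box. A sketch in Python; the camera name matchmover_cam is a made-up example, and the actual locking lines (commented out) would need to run inside Maya with maya.cmds:

```python
# Sketch: build the attribute names to lock on the imported matchmove
# camera so a stray scroll in the viewport can't be auto-keyed.
# "matchmover_cam" is a hypothetical camera name.
KEYED_ATTRS = ["translateX", "translateY", "translateZ",
               "rotateX", "rotateY", "rotateZ"]

def attrs_to_lock(camera="matchmover_cam"):
    """Full attribute names in the form Maya's setAttr expects."""
    return ["{}.{}".format(camera, a) for a in KEYED_ATTRS]

# Inside Maya this would then be:
#   import maya.cmds as cmds
#   for attr in attrs_to_lock("matchmover_cam"):
#       cmds.setAttr(attr, lock=True)
print(attrs_to_lock()[:2])
```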

The red areas in the tracks mean that MatchMover isn’t confident in the quality of the track. It could be these that confuse the program and cause it to calculate the camera in the wrong place. You can eliminate these red spots by placing the tracking box back in the right place manually and tracking forward from there. If it happens again, just pick up the track again from the red point. Having little or no red on your tracks will give you a good camera solve (which creates the camera in 3D space).

I tried to put the camera problem right in Maya by zooming the camera in and out, and it seemed to work. There is just one more thing that was an issue.

At approximately 0:14 in this playblast the camera actually goes through the face of the model. So this is something you need to be aware of when filming: how big is the model and what space would it take up if it were actually in that live environment? Alternatively, the model could have been moved slightly back in the Maya scene to prevent this from happening.


Using the same footage and MatchMover exported camera from the test I did with Chris’s model, I wanted to see how an object would look moving around a tracked space. I didn’t want to spend a lot of time modelling or animating something, so I thought I would quickly model a spinning top and put it on a motion path.

I took a quick refresher from Matt to remember how to attach it to the path using CV curves and locators. Something I didn’t know was that you can get the animation to play infinitely using pre and post infinity in the graph editor, so that was handy.

Doing it this way, it only took about half an hour to model, apply a simple UV map and animate. Using the multicoloured UV map meant that you could see it spinning, whereas if it was just one colour you wouldn’t see the spinning animation.

Low rez playblast

It seemed to move nicely around the space with no problems, but it didn’t look quite as placed in the scene as my last test with Chris’s alien. It just needed some colour correction, blur and a shadow adding.


After my crash course with Annabeth she showed me a little test she had done with match moving.

This is right up my alley so I wanted to give it a go. I didn’t want to spend time on modelling, so I asked permission from Chris Luk to use the little alien he had made for our previous module. It would just mean importing the model or camera into the Maya scene file. I thought it would be nice to have him sitting on my desk at college.

I drew 7 dots on a sheet of paper and recorded a few seconds of footage. I used the video recording on my camera, but the footage was pretty bad.

The frame rate of my camera and the strip lights in the studio made the footage flash in different colours. We thought this would interfere with the track, but we tracked using greyscale rather than colour and it tracked fine; not fantastic, but enough to try out a 3D track using MatchMover.

The auto track worked well; it tracked most of the dots and also some areas on the keyboard. I did some manual tracks on the dots that didn’t track so well, solved the camera, input the focal length of the camera and the film gate, and created a co-ordinate system.

When tracking you need to take note of a few specifics about the camera and environment to help the software calculate the position of the camera in 3D space, but I want to go more into that in another blog post.

I exported the scene from MatchMover as a Maya (.ma) file and opened it in Maya. The file created the camera in Maya with all the keyframes needed to simulate the live camera.

To see the footage you can import the image sequence onto the image plane of the camera and test how it sits and where in the scene it is placed.

Low rez Maya playblast.

It’s just a case then of rendering out the scene from the imported MatchMover camera and layering them together in a program like After Effects.

On the first render out I forgot to tick alpha, so the background was black and I couldn’t tidily remove it. Luckily it didn’t take long to render out with alpha applied.

First rendered out composition.

I was so chuffed with this when I saw it, but between Andy and myself we spotted some problems with it. At 0:11 the model appears to vanish for a frame, but looking at the frame, the model is actually just very small.

Also at 0:15 and a few other points it flashes larger for a couple of frames. We later discovered the possible reasons for this, but I would like to go into that in another post.

So for the time being I cheated slightly and edited over it so I could put it in my presentation.

I had quite a bit of help from Annabeth, so now I need to get to a stage where I can do a track independently. I also want to see how something looks when it is moving around a tracked space, so I will quickly animate something in Maya using the same camera as this test.

I thought I would have a quick go at removing the piece of paper from the scene using the perspective corner pin track in After Effects. But it wouldn’t work; I think it is because of the quality of the footage. I tried several times, increasing and decreasing the sizes of the bounding boxes, but it wasn’t able to keep a good track all the way through.

I would like to use dots or markers on the actual surface next time to see if i can remove them from the footage.

After Annabeth walked myself and Andy through match moving we wanted to give it a go ourselves, but we weren’t really very successful. There were a few things we wanted to do differently:

  • Use a better HD camera
  • Use dots on the surface to track from instead of paper so they could easily be removed from the footage later
  • and see if we could do it without help


  • The hand-held HD camera wasn’t that much better than the one on my camera I used before. They don’t work very well with artificial light.
  • The dots on the table couldn’t be seen from a reasonable distance (not high enough contrast from a distance).
  • We used green dots, but the table had green flecks in it, so the program would have had trouble distinguishing the dots.
  • The footage was far too shaky; in some frames there was so much blur that the colour of the dots couldn’t be distinguished from the surrounding colours, losing the track completely.

I’m not going to say that it was a total loss, because we learnt from it. I could now probably analyse footage beforehand to see if it’s possible to track. I understand that the movement of the footage can’t be too sharp or jerky, because tracks work frame by frame around an area, and if that tracking point were to jump out of frame or blur too much it loses the track.

So we did a second test, hoping to get better trackable footage. We did it outside for better light, made sure to avoid any fast movements and did a test with some poker chips Andy had. The blue chips barely showed up on the camera, but the yellow was really vibrant. We didn’t think at the time that we probably wouldn’t need to add tracking points; the crosses on the pavement may have been enough to track the footage. The chips were quite reflective, and the light would change the colour of the chips in some frames.

Again, the footage wasn’t great. I think the reflections would lose the track, and the footage was still a little too shaky. I like the handycam look; sometimes it adds an air of realism, but thinking about it now, that could be done in post.

I was quite disappointed about spending the whole day on match moving without achieving a good enough track to be able to export a camera. But looking back on the day, I did learn a lot about what is trackable and why or why not.

Millennium square

I popped into town today to take some photos and test footage of Leeds Millennium Square for my large track shot. But it was fenced off ready for Live at Leeds! So I couldn’t take any test shots, but it’s only on from the 4th–7th of May, so hopefully I can grab some footage and test shots on Tuesday and film this week or the next. I guess this is where a location manager would come in, making sure the location is available and gaining permission from the council and owners. It’s a good job I went to check the location beforehand and didn’t just turn up expecting to film! It would have been a waste of time, effort and, if it were a real shoot, money.

I wanted to write up in my own words what MatchMover does, but it is a mouthful and very technical. If I was to write it in my own words I don’t think it would make much sense. I have an understanding in my head of how it does it visually, but words are not always my strong point, so I have found an excellent description of what the software does by Autodesk themselves:

Matchmoving is the computation of the global 3D geometry of a scene including camera path, internal parameters, and moving object. By exporting the real 3D camera path and parameters to animation software, the position and motion of virtual cameras can be accurately established. With the motion of the virtual cameras, new, matched image sequences can be created whose virtual objects are seamlessly composited into live action footage.

Matchmoving lets you accurately place 3D objects into a film, video or image sequence. For virtual objects to appear as part of the scene, the objects have to be rendered by a virtual camera whose motion exactly matches the motion of the actual camera that shot the film. Using Autodesk MatchMover you can generate the exact camera parameters that match the motion of the actual camera used in the sequence.

So MatchMover creates a virtual camera that mimics the movement of the real live camera from some footage. You can then import it into animation programs like Maya and render from it, so that the rendered images will match the movement when composited back into the live footage… I think that makes sense.

OK, I found this interesting. I typed motion tracking into Google and it doesn’t seem to be referred to as motion tracking but as match moving. This is probably why I have been finding it hard to find books on motion tracking.

On Wikipedia it says to refer to:

  • Match moving
  • Optical motion tracking
  • Video tracking

This will definitely help when searching for articles, tutorials and books.