
Friday, April 02, 2010

Visual effects service - The Big Picture

(Note: I'm on the board of the VES but all postings to this blog are my own and do not represent the VES.) (As always, I could be wrong about anything.)

If we’re re-examining the current VFX situation we need to take a step back and look at the whole process.

I think most of us think of visual effects as a service. But is it?

History
Years ago the studios had their own visual effects departments with people on staff and basic optical and animation equipment. When the studios closed those departments, labs and small optical companies took their place. I can remember looking over lab and vfx company price lists in the mid-70s. A dissolve was so much per foot. A matte was so much per foot with a minimum cost of X dollars. Some of the places had small insert stages they would rent for so much per day. If you needed something special they could give you a quote, but the majority of the work was on a time and materials basis.

For larger vfx projects the productions themselves would set up a full department somewhere. This was the case with 2001, Logan's Run, Close Encounters, Star Wars and other films. The production would lease a building and set it up from scratch with the people needed to run and operate it. For Close Encounters we were in an industrial building in Marina del Rey. They had to custom make the matte painting stands and other equipment as needed. They purchased or leased optical printers and an animation camera. Everyone working there was paid by the production. This is in fact how productions usually run: they become their own company with a group of people that round up or build whatever is required for the film. In live action this would be the construction of the sets, special rigs, etc. Once those shows were done the one-off facility they set up would usually be closed down and the crew laid off. In the case of ILM, since George Lucas had such success with Star Wars and thought he might like to do more things for himself and his friends, he re-created it in Northern California. Apogee was formed by John Dykstra and others using the original ILM building and some of the same gear(?) used on Star Wars. Many of the Close Encounters people went to Universal Hartland, where Universal set up a facility to handle Buck Rogers and Battlestar Galactica.

When we formed Dream Quest we did most of the initial projects on a time and materials basis, especially the large ones. If someone wanted us to shoot motion control we charged so much per week for the stages and the crew. This saved us in a number of cases where the director or the vfx supervisor they had hired made changes or threw out work because of a change.

Today if you’re working on a commercial the cost of a telecine is so much per hour and so much for tape, etc. The work on the Flame system or equivalent is on a per hour basis. They may provide a rough estimate but it’s always up to the client how much time is used. If the agency wants to tweak something all day that’s fine. They get billed for it and the video house doesn’t have to worry about making a budget.

Fixed Bids
Somewhere along the way the studios wanted a fixed bid on vfx work for feature films. Estimates were no longer good enough and the vfx companies would now have to stick to the budget. This changes a number of dynamics. In the eyes of some studios and directors the vfx people were no longer working directly for the production; they were working for the vfx company. The vfx crew became another step further removed from the film crew. Outsiders. Now there's always the ping-pong of trying to please the director but not going over budget. Filing change orders and having discussions with the studios regarding the costs became standard practice. At times the vfx company can be pushed into a corner. The vfx supervisor was now that guy from the vfx company. The name of the vfx company became the main selling point. The vfx supervisor, not so much. If the client doesn't like the first proposed company supe then another one at the company will be swapped in on a whim. The crew having to work overtime was now the vfx company's problem. We still had crazy hours at times in the 'old days'. On Star Trek: The Motion Picture I worked several weeks straight, 12 hrs a day, 7 days a week (and a few 24-hour days). But the studio knew it and actually visited the facility. Today they're removed from those details.

So are there any other areas of film production that are completely farmed out to a 3rd party company besides VFX? All the other main leads tend to be hired directly: DP, Production Designer, Wardrobe, etc. Even though most Special Effects people have companies, I believe most are hired as a team of people or at least paid on a time basis. Sound mixing or the DI? I assume these are on a time and materials basis as well. The previs team is frequently brought in to work in the same office down the hall from the director. They typically bill by man-days or hours. Most of the set construction I see is done by a team of people working for production, with special projects (cars, etc.) farmed out.

When I think of a service I think of a dentist, a car shop where they work on your car or a plumber that comes to your house. In these cases they do work but don’t tend to produce anything. The costs are based on time and materials.

Custom manufacturing?
Should vfx be considered custom manufacturing? We actually create something when we finish our work, whether it's from scratch or a montage of material provided. That's what the studios want, not the actual service part.

Here is where things get crazier. Each shot is unique like a snowflake: its own little world of issues, handwork and tweaks. You try like anything to make shots as consistent as possible and to be able to run them through the exact same process, but it's never fully automated. For all the talk about computers in our business it's still a very labor-intensive process. The number of people and the time required to do a shot from start to finish would astound most outsiders.

If you go to most manufacturers and request custom work you will be required to put specific requirements in writing. (I.e. you want cabinet style 32 but in this specific color of blue. You want a custom cake that says Happy Birthday. It will be yellow cake with vanilla ice cream and chocolate frosting.) And that is what you will get. They seldom show you the work in progress or take your input at every single stage. The other thing is a custom manufacturer will tell you when it will be done. They dictate the schedule. In the film business it's the opposite of all of this. The studio specifies when the delivery will be. It's almost always less than the time that would have been arrived at by a normal scheduling process for the facility.

On a VFX project you start with the script, which provides a wide-open interpretation of what the final visuals will look like. In pre-production the director hopefully approves concept art, does storyboards and ideally previs. While most previs lays a good foundation the number of nuances and changes required for the final shots can be enormous. The director usually wants something never seen before that will require a lot of R&D. Not just custom but a totally unknown look or process that needs to be invented. Just how much time and money will that take? The vfx companies have to provide a bid for all of this before the film is even shot. During shooting things will change. During post-production things will continue to change.

This is a creative process so there will be changes, but think of it this way: the vfx company is making 1,000 custom oil paintings that technically have to be delivered on a hard date for a fixed price (at least initially). This process could cost tens of millions of dollars, make up half of the film budget and fill up half of the screen time. There are some rough thumbnails but not enough information to simply deliver the finished paintings. The director is involved at every step of the process for every single painting. In some cases, for every brush stroke. Some directors only want to see the final pieces. In these cases you can end up with 'no, now that I see it I don't want apples in the painting, I want pears'. So much for the time and effort to create the initial painting. If a director changes one painting that may change two dozen that are almost finished. Remember, the due date will not move, regardless of the changes. And of course shots are not paintings but moving images, so time and motion present another infinite number of possibilities.

In how many other areas does the director really work in this much minutiae? Normally when they're working with Directors of Photography, Production Designers, etc. they discuss and try to get in sync regarding the general look and style they want. The director may be asked about the color of the pillows on a set, but at some point they pass the details on to their key creatives. The director is unlikely to ask to change the 3rd brick from the right on the set or ask the DP to reduce a specific light by ½ stop. And yet at times it can be that way when working with visual effects.

With visual effects the director has unlimited control. Every pixel of every frame can be changed. If production has an on-set stunt or action, the director shoots what takes they feel are appropriate and will select one. The fact that the stuntman's hand is raised a little doesn't cause problems. The best take will be selected and production moves on. With the advent of digital visual effects that's not the end of the story. What would have been fine previously in any movie is now something to be scrutinized and analyzed by the director, editor and studio. Now it may be an added shot for the vfx crew to fix that hand position. And while they're working on the shot, can they change that thing back there and that other thing over there? A shot with a jet may get a request to roll the jet another 3 degrees. Will the audience notice 3 degrees? Will it make it a better shot? Obviously if production had paid for and shot a real jet they would be unlikely to schedule another shoot day simply to get the jet to roll 3 degrees more.

On the set the director knows it will take a certain amount of time to make a change, so they always have to balance that because time is their gold standard. They have so many days to shoot the show, have only 2 days scheduled for this set and need to shoot 20 setups a day. With vfx that time balance is thrown out the window. Most of the work is done after filming. The amount of time and effort to make the change is all hidden. It's happening elsewhere by unseen people. It's no longer the director's or producer's responsibility to complete this phase of production on time; it's up to the vfx company. To add to this difficulty is the fact that the live action shoot can and does go over schedule. Problems during shooting may now require additional, unplanned work to be done by the vfx company. But the vfx company cannot go over schedule. They are the end of the road, so every delay during shooting, every added fix, shot or change needs to happen by the deadline. That's the finals date that was set before the vfx company even started bidding on the show. Not only does the vfx company have to do all the work they initially agreed to do in that time, they have to absorb most production issues that have accumulated and rippled down the pipeline since pre-production began. Add into that mix the requirement by the studio to make last minute changes, possibly based on test screenings, possibly based on an idea from an executive.

Are there other non-film businesses set up like vfx companies in terms of the requirements and client involvement? That would be useful to look at and learn from. Unfortunately I can't really think of anything on the same scale or dealing with the same types of issues. Many construction projects are of course custom and involve a lot of money and people. However, they have blueprints that have been signed off on. They have colors that were selected to paint the walls, and the client has approved the carpet and the tiles. Sure there will be some changes, but the majority of the work is usually very well specified. Any major changes will involve a change of completion date or will require the client to pay large fees to have the work accelerated.

Summary
Visual effects is a very labor-intensive business. The labor is made up of dedicated and highly skilled and trained people. There's the requirement to complete hundreds of works of unique, never before seen art (shots), based on rudimentary starting points, that are constantly being scrutinized and changed. And all of this has to be done with as much adherence to a fixed bid as possible and, above all, has to be finished by the deadline - no ifs, ands or buts.

I do want to go on record that I support all the directors I work for and that I'm all for anything that can make a film better. All vfx artists want the best possible film. What I hope this posting will illustrate is just how complex this issue is. We have art, technology and commerce all colliding. The vfx companies are put in a tough situation, and the vfx artists are put in a tough situation trying to balance this all out. The end result is that any process or structure that helps balance this issue - creating the best creative work while keeping things reasonable for the vfx artists - will be a welcome relief.

Related posts:
Pass me a nail
Risk and subsidies
Oh, the mess we're in!



Thursday, September 10, 2009

Visual Effects Producer Book

This was written by Charles Finance and Susan Zwerman, both with the Visual Effects Society.
Susan is also one of the editors of the VES Handbook that is in progress.

If you're interested in visual effects producing, this is the book. It covers primarily the budgeting, scheduling and workflow issues but also covers the basics of the technical issues along with pros and cons. 377 pages.



Update:
Another book to check out is The VES Handbook of Visual Effects, which was released at the end of July 2010.

Sunday, September 16, 2007

Naming conventions and workflow

Since this question has come up a few times I’ll try to address it here. Note that if a company is already up and running chances are they have a workflow in place along with naming conventions. I’ll describe some of the more typical approaches but there are no standards in the industry.

It's important to try to standardize naming conventions and also directory configurations, since you'll have dozens or even hundreds of files associated with each shot. (All the live action pieces such as bluescreen, background, dust elements and explosions; each model will have multiple files and associated texture maps; each shot will have multiple animation files, shaders, compositing scripts, etc.) Multiply those by the number of takes or versions and then multiply that by the number of shots in the film (hundreds or thousands). That's a lot of data to manage.

When the initial movie script arrives it usually hasn't been broken down by production yet, so that means when bidding you have to assign your own shot names. Even if the script is broken down, it's only broken down to the scene level. Each scene may have a large number of shots. From a live action standpoint the shot numbers are assigned during shooting and usually relate to the shooting order (i.e. 17A over shoulder, 17B extreme closeup, etc.). This allows the live action side of the project to be organized (script supervisor notes, asst camera slates and camera reports, editorial, etc.). But by the time the shooting starts the VFX team has already created storyboards and possibly previs with numbers, along with a budget breakdown of each and every shot. These days you may also have a number of pure virtual shots (no live action) such as all-CG or pure matte painting shots.

The typical method in VFX is to assign a 2- or 3-letter ID to each sequence that at least hints at what the sequence is. If you had a rocket landing on the moon sequence you might label it RLM (Rocket Lands Moon). (Try to label by the gist of the scene rather than the location, since the location may change and you'll forever have to explain to people what the ID used to mean and how it relates now.) From there, as you create the storyboards you increment the shot numbers by 10 to allow adding new shots in between as the director and artists modify the sequences. These are the numbers that are used in the bids and schedules and will hopefully remain through the production.
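To make the numbering scheme concrete, here's a minimal Python sketch (the RLM sequence code, the helper name and the four-digit padding are just assumptions for this example, not an industry standard):

```python
# Hedged sketch of "count by ten" shot numbering for a sequence.
def shot_name(seq_code, index, step=10, digits=4):
    """Build a shot ID like RLM0010, RLM0020, ... leaving room for inserted shots."""
    return f"{seq_code.upper()}{index * step:0{digits}d}"

# First five boarded shots of the hypothetical "Rocket Lands Moon" sequence
print([shot_name("rlm", i) for i in range(1, 6)])
# ['RLM0010', 'RLM0020', 'RLM0030', 'RLM0040', 'RLM0050']
# A shot added later between RLM0020 and RLM0030 might simply become RLM0025.
```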

When shooting, the VFX people work with the script supervisor. Somewhat standard is to place a V in front of the slated shot number to note that it's a visual effects shot. If space on the slate permits, the asst cameraman may note the VFX number on the slate as well. The script supervisor and the VFX coordinator keep a running cross-reference of the VFX shot names and the live action shot IDs. Note that in editing a shot may be re-used or used for an entirely different shot.

Once a sequence is edited and locked, the editorial department provides information on the shots: they used 83B, take 5 for this shot and it starts and ends at specific frames (based on keycode on the film or based on the Avid timecode info). If it's a large show the VFX companies may have their own editors who get a copy of the Avid editing bins, which they break down. The negative (assuming it's shot on film) will be shipped from the lab to whoever is scanning the film. The VFX editor works with the scanning company regarding the labeling process. In some cases the VFX company may scan it themselves, but now a lot of this work is done by scanning companies.

The digital files are delivered to the VFX company by high-speed connections, hard drives or other means. Each VFX company may rename them to its internal naming conventions when the files are brought online. Because of the massive amount of data, not all shots and elements are 'online' at the VFX house. They may be on high-density tape storage or another storage system and brought online by support staff as needed. There's usually a whole team of people doing this.

A shot may have a name like rlm0030.cc.dr. The cc might stand for color corrected; dr might be for dirt removal; bs might be for bluescreen, or they may choose to label the shots a, b, c, etc. for each live action element. In addition to the standard suffixes for the files themselves (.exr, .cin, .jpg), each frame will have a number. Numbers may use a fixed number of digits (4 or 5) (i.e. myshot.0001.jpg, myshot.0002.jpg) or may float (i.e. myshot.1.jpg, myshot.2.jpg). Even the choice of leading zeros or no leading zeros will make a difference depending on your software.
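To make the padding issue concrete, here's a small hedged sketch of building and parsing padded frame filenames (the shot and element codes follow the rlm0030.cc example above; the function names are invented for illustration):

```python
import re

def frame_file(shot, element, frame, ext="exr", pad=4):
    """Build a frame filename like rlm0030.cc.0001.exr with fixed zero padding."""
    return f"{shot}.{element}.{frame:0{pad}d}.{ext}"

def frame_number(filename):
    """Pull the frame number back out of a padded (or unpadded) filename."""
    m = re.search(r"\.(\d+)\.\w+$", filename)
    return int(m.group(1)) if m else None

print(frame_file("rlm0030", "cc", 1))          # rlm0030.cc.0001.exr
print(frame_number("rlm0030.cc.0001.exr"))     # 1
```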

Most VFX work is done on a frame level rather than passing QuickTimes around. This allows work on frames such as compositing even while other frames are being rendered. It also allows spreading the rendering/compositing over multiple machines and more flexibility in file formats.

As the shot goes through the different stages the complexity grows. Matchmoving (or layout) may need a couple of passes to get their basic animation information correct. That's take 2 of that file. The animator might be working on take 5 while the Technical Director is rendering take 4 (from the day before). The model may have different version numbers since production may start before the CG models are finalized (or someone, such as the director, may change it). It's possible on take 8 of the final render the director decides he actually liked the animation of take 3 better, but with the new lighting and render of take 8. It's also likely you'll have multiple people messing with files at the same time (animator, compositor, TD).

This soon turns into a nightmare if you're not on top of it. The typical approach is to standardize on a directory structure. Details tend to be unique to each company, but you may have a shot folder which will hold an animation folder, a model alias or reference folder, a composite folder, etc. Each of these will have sub-folders for different types of files or work. This directory is then configured on any machines where you may want to spread the work. You may want to render and composite frames 1-100 on computer 1 and 101-200 on machine 2. Or it may be on a special render farm system. In many cases you'll be using aliases or pointers to the actual image or data files so they can be kept in one place. That minimizes having to initially copy over all the large files, but it does mean they may have to be moved during the render/composite process.
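As a rough illustration of scaffolding a standardized shot directory, here's a minimal Python sketch (the folder names are just one plausible layout made up for this example, not any particular facility's convention):

```python
from pathlib import Path

# Hypothetical per-shot sub-folders; every facility defines its own.
SHOT_SUBDIRS = ["plates", "anim", "model_ref", "comp", "renders", "scripts"]

def make_shot_dirs(show_root, seq, shot):
    """Create the standard folder tree for one shot under the show root."""
    shot_dir = Path(show_root) / seq / shot
    for sub in SHOT_SUBDIRS:
        (shot_dir / sub).mkdir(parents=True, exist_ok=True)
    return shot_dir

make_shot_dirs("shows/myshow", "rlm", "rlm0030")
```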

So to deal with all of this type of data each VFX company has written multiple databases and scripts. These may be UNIX shell scripts, Python, or any number of scripting languages. Databases can range from FileMaker up to full-blown custom code. There are now some off-the-shelf products (or modified off-the-shelf products). Luckily I personally don't tend to have to get into the nitty gritty. Some of these problems are similar to programming large projects, where you have version control to allow people to access files yet still lock people out or merge the differences.

During the course of the day each person works on his specific area and then submits a request. The TD may run a test frame and calculate the number of processor hours required to do the shot. All of these requests are submitted and the CG supervisor may review this list with the producer and the supervisor. Since there may not be enough processors and time (likely the case) they will have to decide what the priorities are supposed to be. Some shots may be on a slow render to be finished in a few days and other shots may need to be done the next day for the director to review. Some shots may be done as plastic renders or without fur, just for checking.

The script or software will then distribute the shots across multiple machines and be sure to grab the latest version of the animation and model, etc.
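The frame-range splitting mentioned above can be sketched like this (the machine names and chunk size are made-up examples; a real render farm manager handles this, along with dependencies and versions):

```python
def split_frames(first, last, chunk):
    """Yield (start, end) frame ranges of at most `chunk` frames each."""
    start = first
    while start <= last:
        end = min(start + chunk - 1, last)
        yield start, end
        start = end + 1

machines = ["render01", "render02", "render03"]   # hypothetical render nodes
for i, (start, end) in enumerate(split_frames(1, 200, 100)):
    print(f"{machines[i % len(machines)]}: frames {start}-{end}")
# render01: frames 1-100
# render02: frames 101-200
```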

In some cases editorial has scripts to automatically assemble all the new renders into the cut. Additional software may be used to keep track of daily notes from the supervisor and the director.

Individuals or new companies starting out should review their software packages and see what they can import and export in regard to frame numbering. Also review your software to see if and how it handles networking across multiple computers (whether it's After Effects or a full rendering package). Keep an eye on the number of characters you can have in a filename. Create scripts and database programs that can deal with this naming and directory structure. Test it all out on small test projects.

[Update 12/14/2012
Steve Molin provided a directory recommendation he uses.  Steve was a key CG person at ILM, LAIKA, and Image Movers]

from Steve:


Over the years that computers have been used in creating graphics for film and video, the question has come up time and again: how do we lay out the directories on the hard drive so that we can all work together efficiently? What follows is the way I like to see it done, based on my years in the industry. - Steve Molin
$SHOW
assets
$TYPE
$NAME
modeling
rigging
surfacing
mattePainting
shaderDevelopment
sequences
tools
shaders
common (a sequence with data common to all sequences)
$SEQ
$SHOT
common (a shot with data common to shots in this seq)
capture
animation
characterFinaling
lighting
compositing
mattePainting
dynamics (other than charFin, eg dirt, splashes, flames)
rotoscoping
  1. If a shot ends up with eg two lighters working on it, the second and subsequent would get appended the username, eg light_smolin.
  2. it is preferable to create directories only when needed to reduce number of empty dirs
  3. sequences to be named with mnemonic codes, eg rr for raptor rotunda
  4. shots to be named with sequence and number, eg rr1, rr2 etc. “Count by ten” allowed, eg  rr10, rr20 to facilitate insertions if desired.
  5. development is to occur in developer-specific shotdirs, eg sequences/rr/dev_smolin
  6. Each of the lowest level would have the same structure, as below:
maya (under here, maya has complete control)
images
    source (ie input from another discipline)
    reference
    renders (ie output from this discipline)
scripts (eg Nuke scripts, Katana scripts … Maya scenes?)
tools (eg Python scripts, compiled binaries)
shaders
curves (eg animation curves)
points (eg point clouds, brick maps)
assets (copies from assets branch, to make shotdirs self-contained)
    models
    rigs
    surfaces
    digitmattes


Related links:
RaysInBlue Blog has a great set of links on VFX and Animation pipelines
Art of CG Supervision goes into extensive details

Sunday, November 27, 2005

Moving Camera

Using a moving camera when filming live action for Visual Effects.
Locked off camera, Post Moves, Motion Control, 2D Motion Tracking, 3D Match Moving and Face Markers are all covered in this podcast relative to live action photography.
Approx 25 minutes.

Transcript
Today I'll be discussing the moving camera.

Locked Off
The simplest type of camera move is no move, or the locked off shot. The camera is placed on a tripod or dolly and isn't moved. This makes it easier to add visual effects later or to do multiple elements of the same setup. Examples of this could include adding a matte painted house to the top of a hill. If you wanted to create a shot where part of the actor is removed later – such as a leg as in Forrest Gump or most of the body as in an invisible man – you would shoot the shot two times: once with the actor and another time with no actors. This shot without the actors is called a clean plate and is pretty common for many visual effects. Because the camera doesn't move you have identical images – one with the actor and one without. In post production if we remove part of the image of the actor, the clean version gives us the image without the actor. Special rigs are removed the same way. This same process is easily used for creating twins from one actor. Shoot the scene once with the actor on the left side and then shoot the same thing with the actor on the right side. Because these shots don't change position you can do a simple split down the middle of the scene.

Normally the camera operator is making slight pan and tilt adjustments while sitting on the dolly and the asst cameraman is making slight focus and exposure adjustments. To obtain the best quality clean plate, get everyone away from the camera and off the dolly, and avoid changing the settings, even to do slates. The size and position in frame will be different if anything changes, including the exposure. You can spend time fixing this in post but it's better if you can avoid the problem.

Even though a locked off camera makes it easier to accomplish visual effects it may not fit with the look of the rest of the film or the requirements of the shot.

Post Move
Doing a move on an image in post is sometimes called Pan and Scan. This also can refer to transferring a widescreen film to full frame video.

The scene is photographed normally and then in the composite stage the image is enlarged and a synthetic move is added by moving the image digitally. It's also possible to do this type of move with some scanners for better quality.

The problem here is the loss of resolution from enlarging the image. If your end production is video then it's possible to scan the film at a higher resolution.

In the past another way around the resolution problem was to shoot on VistaVision or 65mm cameras. VistaVision is a format which shoots with special cameras that run 35mm motion picture film sideways, much like a still film camera. This larger film size allowed for blowing up without as much quality loss in the days of optical printing. Large formats were also common when doing any effects-heavy productions in the days of optical compositing. 2001: A Space Odyssey, Close Encounters and Blade Runner are some of the films that used 65mm. Star Wars and most ILM films used VistaVision for effects until the last few years.

If the live action element only makes up a portion of the frame then a post move doesn't cause a quality loss. An example of this is starting on a scene which is all live action and then pulling back to reveal a large matte painting extension to the scene. In the old days a motion controlled camera might be used to film a physical matte painting on glass. A rear-projector would be one method of adding the live action with the same move. With the advent of digital matte painting the matte painter creates the painting at a higher resolution. If the original live action scan is 2000 pixels across, the matte painting might be done at 4000 pixels across if the final scene was going to show twice as wide.

I'll discuss 2D match-moving later in this podcast, which is a form of post move on an element.

One of the problems with post moves is the very limited 2D or two-dimensional appearance of the moves. There's no perspective change or feeling of depth. To obtain true 3D camera moves of the original live action there are a couple of processes.

Motion Control
One way to deal with this is using motion control. Motion control is used frequently with shooting miniatures and models but here I'll focus on live action use. Normally the camera is mounted on a pan and tilt mechanism known as a camera head. The operator has a hand wheel for pan and one for tilt. For motion control you have a special head and usually a special dolly where all of these motions are controlled by motors. In some cases the operator moves this just like a regular head and a computer records the positions by using position encoders. In other systems the operator uses a joystick or a remote camera head that is designed to just record his hand moves. In this case the remote head looks like a camera head but it's just a box with the hand wheels.

Note that non-recording remote camera heads are also used with live action frequently when the operator can't be at the camera itself. The camera is on a boom arm or other remote system where the operator watches a video monitor and remotely controls the camera.

With true motion control once a camera move has been programmed the move may be repeated over and over again exactly in sync with the camera. The repeatability means you can shoot the same actor walking through the scene multiple times with a complex camera move. This would be done to create twins using one actor during a camera move. Motion control also allows you to film moving clean plates so you have many of the same benefits as a locked off camera such as rig or actor removal. Since the move is recorded you can also take this data and use it later in a motion control system for models or convert the data and use it in the computer for adding CG elements.

Motion control can also be used to shoot secondary elements such as bluescreen people or objects that exactly match a previous live action plate. In this case the data for the move comes from the original motion control shoot or from a process called matchmoving, which I'll be covering shortly.

The downside of motion control is the requirement for a special system that has to be setup and be programmed. Most directors hate it because of the extra time and process. I recommend it only when you really have to get repeatable motion on the set or location.

I'll cover more details of motion control in a future podcast.

MatchMoving
The other option when a moving camera is required is to do matchmoving in postproduction. This is now one of the most common techniques and came about with the advent of the computer. The simplest case is a shot with just a pan and tilt that requires another image to be added later in the composite. As an example, if we pan and tilt an outdoor scene and wish to add the image of a hovering flying saucer, we could matchmove a point near the horizon or at infinity, like a distant mountain or building.

This is done after the film images have been scanned into the computer. These days most professional compositing programs allow you to specify an area in the moving footage that you want to track. Frame by frame the computer compares the image in this area and tries to find the best fit around where it was last. We call this 2D tracking since it's only calculated in 2 dimensions, x and y.

Once this is done the image of the flying saucer is composited over the background and the motion of the background is applied to it. Now you have a scene with a pan and tilt of a cityscape and a flying saucer that hovers over a building no matter what move you make. This creates the illusion it was there at the time of photography. If you have 2 distant points that you can track then it's possible for the computer to calculate the rotation and size, so you can not only pan and tilt but roll the camera and even do some basic dolly or zoom moves. I suggest distant points to track so you can obtain a more accurate motion track without having to worry about parallax or perspective. You typically also want to track a point in the scene close to where you want to place the additional image. Make sure you always have at least one or 2 points to track during the entire scene. If you're doing a big pan move make sure there's another suitable point that you can switch the motion tracking to during the shot. Ideally these would be close to the same distance away from camera.
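For the curious, here's a very rough sketch of what a 2D point tracker is doing under the hood: a brute-force sum-of-squared-differences search for the reference patch in the next frame. Real trackers are far more sophisticated (sub-pixel accuracy, rotation, adaptive patches); this assumes grayscale frames as numpy arrays and a tracked point well away from the frame edges:

```python
import numpy as np

def track_point(prev_frame, next_frame, x, y, patch=8, search=16):
    """Return the (x, y) in next_frame that best matches the patch around
    (x, y) in prev_frame, using a brute-force SSD search."""
    ref = prev_frame[y - patch:y + patch, x - patch:x + patch].astype(float)
    best_err, best_xy = None, (x, y)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = next_frame[y + dy - patch:y + dy + patch,
                              x + dx - patch:x + dx + patch].astype(float)
            if cand.shape != ref.shape:
                continue  # search window ran off the edge of the frame
            err = np.sum((cand - ref) ** 2)
            if best_err is None or err < best_err:
                best_err, best_xy = err, (x + dx, y + dy)
    return best_xy
```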

Even though this is a 2D track it's possible to obtain 3D pan and tilt information. By knowing the lens used to photograph the scene, the computer can calculate and output a 3D camera file that can then be used in a 3D program for the 3D camera to pan and tilt. This renders an object that matches the move exactly.
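The pan and tilt recovery from a distant 2D track boils down to simple trigonometry once the lens is known. A hedged sketch (the film aperture and focal length below are example values, not taken from any particular show):

```python
import math

def offset_to_pan_tilt(dx_px, dy_px, image_width_px, aperture_width_mm, focal_mm):
    """Convert the pixel drift of a distant tracked point into approximate
    pan/tilt angles (degrees), given the scan width, film aperture and lens."""
    mm_per_px = aperture_width_mm / image_width_px
    pan = math.degrees(math.atan2(dx_px * mm_per_px, focal_mm))
    tilt = math.degrees(math.atan2(dy_px * mm_per_px, focal_mm))
    return pan, tilt

# A 40-pixel drift on a 2048-wide scan of a ~24.9mm wide aperture,
# shot with a 50mm lens, works out to roughly half a degree of pan.
print(offset_to_pan_tilt(40, 0, 2048, 24.9, 50))
```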

Another variation on the 2D tracker is a 4-corner track. If you have a billboard in a moving shot and want to replace it with a different image, you could track each corner of the billboard on the computer and tell it to apply the move to each corresponding corner of the new image. This will distort the new image to fit it onto the billboard even as the original scene moves.
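Under the hood a 4-corner (corner pin) track amounts to solving for the 3x3 perspective transform that maps the new image's corners onto the four tracked billboard corners. Here's a bare-bones sketch using numpy's linear solver; compositing packages handle this for you, per frame, with filtering on top (the corner coordinates below are invented for the example):

```python
import numpy as np

def corner_pin_matrix(src_corners, dst_corners):
    """Solve for the 3x3 homography H that maps each source corner
    (x, y) to its tracked destination corner (X, Y)."""
    A, b = [], []
    for (x, y), (X, Y) in zip(src_corners, dst_corners):
        A.append([x, y, 1, 0, 0, 0, -X * x, -X * y]); b.append(X)
        A.append([0, 0, 0, x, y, 1, -Y * x, -Y * y]); b.append(Y)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

# Pin a 100x50 replacement image onto tracked billboard corners for one frame.
H = corner_pin_matrix([(0, 0), (100, 0), (100, 50), (0, 50)],
                      [(310, 122), (402, 130), (398, 188), (312, 178)])
```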

3D matchmoving
The next level of match moving is 3D match moving.
When shooting a scene measurements are taken of any landmarks in the scene (windows, doors, tables, that sort of thing) and in some cases special markers are placed in the scene. Later after the film has been scanned into the computer a person called a matchmover will build a rough replica of the set or location in the computer. The virtual camera, which is a simulated camera in the computer graphics program, is now animated to match the move of the film camera that filmed the live action scene. Originally this was done as a manual process of lining up the CG set with the image from the film. Now much of it is done using special 3D motion tracking software.

Note that some measurements such as the focal length of the lens may be a bit different than marked. If it's labeled as a 50mm lens it could in fact be a 48mm or a 55mm lens. As such you may need to make some adjustments manually to get the final matchmove perfect. Also be sure to check any automated 3D matchmoves since it's possible to fool the computer.

Now that we have a CG camera that matches we can now place a CG object such as a box on the match move floor and it will look like it was on the floor in the original photography. It still has to be lit and composited into the scene but it stays locked with the correct position and perspective in the scene even if the camera operator is moving a hand held camera around the imaginary object.

If a CG creature will be walking on rough terrain hopefully the matchmover has been able to recreate that so the animator can always keep the creature on the ground. This terrain shape is likewise used to cast CG shadows onto. The matchmover has also included any large objects such as tables and trees so the animator can avoid those when moving his creature. If the actor in the scene is interacting with the creature then the matchmover moves a CG pawn that represents the actor. In some cases the matchmover has to do a very tight match of the arms and legs of the actor so the animator can time and match the creature to the actor. Even though I'm using the term creature this could be any 3D object or effect such as CG sparks.

So now that we have the basics of the matchmove process down, let's take a look at how that works when we shoot. If you're going to be using 2D tracking then survey the scene and make sure you have some definable points: any place of sharp contrast or a spot. The corner of a building against the sky provides a clearly defined area that is fixed. The edge of a tree blowing in the wind is not a good point since it's changing. The edge of a building isn't a good point because a tracking point needs to be one specific and unique point.

These same requirements apply to four-corner tracking. If you're replacing the image of an old TV you may find that the TV screen is rounded and almost blends into the frame of the TV. In this case you may want to put fluorescent dots from the office supply store on the corners of the area you want to track. You'll need to paint or composite out the dots in the final scene.

If tracking points don't exist in the scene and you need them, then you have to create them. If you're shooting an actor in front of a blue screen to create a distant vista in post production, then place markers on or in front of the bluescreen. These may be 2 to 6 inch plus marks made out of tape or plastic. If it's a cloth screen material you can put Velcro on the back of the plastic markers. If you can't touch the screen use c-stands to hold them, but note that there will be more removal work to do in post. It's important for any markers to be solidly locked down, so don't suspend them with thin wires.

You want the markers to be visible on the final film scans but not too large or you'll be doing a lot of extra paint out. Same with the number of markers. You want to always have 2 or 3 markers visible even when you pan and tilt the camera. If you plan to pan the camera 180 degrees then you'll need a number of markers around the camera. The actors or props may obscure some of the markers. If you're doing a full day of shooting in front of the bluescreen then it's best to just set up a grid pattern of the plus marks. Note that if the depth of field makes the markers way out of focus, or if there are no markers visible in the shot, it becomes a matter of experimenting in post to create a move on the background that seems to work with the foreground. This can be very time consuming and frustrating, especially if your actor is jumping or moving.

If you're doing 3D motion tracking you always want to have at least 3 trackers visible. As always the matchmover will be recording the lens and tilt information from the camera, possibly with the help of the coordinator and certainly with the help of the camera assistant. If you're shooting on a set the matchmover should be getting measurements of anything with straight lines that are clearly defined, such as the tabletop or windows. With this information they can build a rough CG version of the set. Set drawings are seldom used to build the CG matchmove version, since there may have been adjustments and changes made to the real set that aren't reflected in the blueprints. You want to make sure to build the CG world to what's actually captured on film, not what was planned. To help with this work the matchmover usually shoots stills and especially Polaroids, where they can mark the actual dimensions on the image. These and the notes will be used months from now when the visual effects are being started. In some cases stereo images are taken or multiple cameras are used to help document the relationship of the objects in the set.

For organic sets such as caves or outdoor natural landscapes, small ball markers are typically used. It's important for these markers to show up, so they are usually ping-pong balls or tennis balls painted a fluorescent color. Bright LEDs can also be placed in ping-pong balls. This is especially useful on darker sets or night shoots. At times a box frame of known size is filmed in the scene as well to provide a defined object and perspective check.

If the scene is supposed to be a close-up of a creature but there is no fixed object in frame then vertical rods may be used that hold a marker in place. As always there should be at least 3 markers.

If it's a large exterior scene on uneven ground, a grid of markers may be set down or at least measured. Now the animator will be able to match the creature's feet directly to the real ground. On DragonHeart and most of the shows afterward, when filming in a large natural setting we used a surveyor's transit system connected to a PowerBook or a handheld computer that records the true position in 3D space: x, y and z.

The matchmover back at the studio creates a 3D ball in the computer matching those coordinates. In the past the computer camera would be moved manually to match the position. Now there are special 3D tracking programs that can do this automatically.

Face Markers
Sometimes we're required to create CG prosthetics. This is done for shots that can't be done as normal makeup work since it's going to change during the shot or it requires removal of sections of the real actor. This is the case with the jaw extensions in Van Helsing and the facial electronics in something like the latest Terminator movie. Small colored dots from an office supply store can be applied to a face or arm. It's also possible for the makeup person to apply makeup marks. In these cases it's important to know where to place the dots to get the best match. Because of muscle changes you typically want some on the cheekbones or areas that won't change as much, in addition to the edge of the imaginary prosthetic. The animator, model maker and/or matchmover will then have to change the shape and position of the CG prosthetic to match frame by frame.

Note that matchmoving doesn't solve the problem of repeating live action camera moves, such as required for twin shots, and doesn't provide us with a clean plate. It is possible to take the matchmove data and convert it into a move for a motion control system later when you're shooting live action or miniature elements. If you need a clean plate (for paint restoration) then the operator tries to repeat his same movement as accurately as he can without the actors, and then the compositor will have to massage it in post to try to make it work.

Low Budget
Most beginning filmmakers want to make as complex a scene as possible, but it's important to keep it simple and to learn the basics of the craft before overwhelming yourself.

As mentioned before the simplest process is the locked off camera. Attach your camera to a solid tripod and film your different elements, including your clean plate as needed, without touching the camera or tripod. This allows you to focus on the art of compositing without getting tied up in the difficulties of matchmoving.

If you're creating CG elements measure the set or area and try to create a replica of this. Use real units of measure in your computer graphics program. Measure your camera as well. This would include the height from the ground and the distance from camera. Technically you want the nodal point of the lens but middle of lens or film plane is usually accurate enough to start with.

You also want to record the tilt of the camera. You can use an inclinometer to measure this tilt. You can find these on the internet or at larger hardware stores. You don't need the fancy, all-digital versions. A simple type based on a bubble level is fine. These are 3 or 6 inch vertical disks with a flat bottom. A needle always points up to provide a readout of the number of degrees of tilt. Place it on something level to the film plane, such as the tripod head plate.

If you can't find one of these or they're too expensive, then get a simple 6 inch protractor. Tie a knot in the end of a one-foot piece of string. Thread this through the hole in the protractor from the back. Tie a weight such as a large nut or bolt to the other end. Place the protractor upside down with the straight edge of the protractor resting against the straight edge of the tripod head. Read the degrees where the string passes the protractor. You may find in your CG program that you have to add or subtract 90 degrees from this number or you may have to negate the number. There's no absolute position for pan so we don't bother to measure it.

Place your CG camera at this point and use the set you built as your guide for where to place the CG creature or object. You also want to be able to cast shadows from your creature onto this ground plane and any large objects such as tables.

Once you've done these types of shots then you can experiment with 2D motion tracking and 3D matchmoving. Start the 3D matchmoving by using just pan and tilt when shooting.

Monday, November 21, 2005

Filming

Filming live action for visual effects on feature films. I discuss the process of shooting on locations and sets, using references, creating interactions and things to watch for. In addition I give some suggestions on how to apply this to low budget filmmaking.
I'll probably discuss moving cameras and matchmoving in the next podcast. Filming of bluescreen, miniatures and other elements will be covered in future podcasts.

Transcript
In today's podcast I'll be covering the filming process relative to visual effects. First I'll focus on how it's done on a feature film and at the end I'll provide additional suggestions for filmmakers. I've actually split up this podcast so moving cameras will be discussed in the next podcast.

There are infinite possibilities when shooting a film, which is one of the reasons why visual effects are interesting. I'll be covering the basics of shooting with actors or what we term as live action. Later podcasts will discuss shooting bluescreen and miniatures.

First a review of a few terms I'll be using.
CG stands for computer graphics.
An element is an image that will be part of a composite.
A plate is a live action element.
A clean plate is a version of the shot without actors that can be used to remove any unwanted items from the real shot.

The shooting of a feature film can take two to six months. Much of the shooting depends on the project. Some projects such as Dragonheart are shot almost all outdoors, regardless of weather. This makes it like a camping trip with 200 or 300 other people. Other projects may be primarily on sound stages in front of bluescreens. Most projects are a balance of exterior and stage shooting.

Prepping for the shoot
Before shooting begins it's important that the cameras are checked and prepped. This is handled by the camera assistant but for visual effects we request a steady test. When film goes through a movie camera the camera movement may cause the film to shift a bit from frame to frame. This isn't visible in a typical shot projected on the movie screen but if you composite multiple images you may well see them moving against each other. In some older movies when you watch the titles you may see them shaking against the background. This was because the original camera wasn't completely steady.
To test the steadiness a grid of white lines is applied to a black backing. This can be tape lines on a 4 by 8 foot black flat. This is filmed with the camera fixed on a tripod or pedestal. Depending on the test, the film can be rewound and re-exposed to the same grid, offset halfway by a grid space. When this is processed and projected you can see if the camera is steady and repeatable to itself. The preferred steady test is to scan the grid from the camera with your film input scanner and confirm that it's not moving relative to your control system, the input scanner.

Film live action for visual effects

The visual effects crew directly involved with the shooting is fairly small. Normally the team consists of the visual effects supervisor or a plate supervisor, 1 to 4 matchmovers, a coordinator and possibly the effects producer as well. If the show is heavy with animated creatures the animation supervisor may also be part of the team. The remainder of the visual effects crew is back at a facility working on creating CG and real models, along with preparing for the full render and composite processes. The full effects crew won't start until a sequence has been shot and edited since they need to work with the footage. Because of the deadlines most shows are editing simultaneously with the shooting, so finished edited sequences will be done even before the entire movie has finished shooting.

The visual effects supervisor or plate supervisor is in charge of making sure the required footage is shot correctly so the effects can be done later. The plate supervisor is called that since live action pieces or elements are frequently referred to as plates. A background plate for a bluescreen would be called a BG plate. The matchmover position was created primarily in the digital age. It's important to be able to reproduce the camera and objects exactly, and that means recording all the camera information such as lens and tilt and also measuring specific items in the scene. The coordinator helps to organize all of this and to facilitate the passing of information. The effects producer is normally busy at the effects house overseeing that process, but the production itself may have their own effects producer on the location to help the different departments and to make sure things are moving smoothly.

Before the photography begins there's usually some pre-production at the location. Part of this is a key meeting with all the department heads, including visual effects, so everyone is clear on the requirements of each sequence and who's doing what. It's also a chance to flag any problems. Typically the storyboards and animatics are shown to the department heads and the actors so they're aware of what the final shots will look like.

Each day there is a call sheet passed out to all crew members that lists the shots/sequences for the next day as well as when each crew member is required on set. A shooting day is usually 7am to 7pm and night shooting is 6 or 7 pm to 7 am. Shooting is 5 or 6 days a week.

The setup
First thing in the morning is a brief huddle of key personnel with the first asst director and director. The director has his shot list, which is the list of all shots he plans to shoot that day. For shots that require visual effects the first task is to figure out the camera position and blocking. A director may run through with the actors first to get a handle on how he wants the scene covered. For camera placement it's important to consider anything that will be added later in postproduction. I typically have a set of storyboards that have been reduced to half size so they fit in my side bag or jacket pocket. I've gone through and made notes about every element or piece of film we have to shoot for each shot and have a technique in mind for each shot. We do a quick review of the storyboard. Since the storyboard is only a guide, the director or circumstances may require a change. As a visual effects supervisor I have to be able to quickly determine any implications these changes might have. Will other elements be required? Does this change the technique? Have we already shot something else that this needs to work with? Will there be any major cost differences?

For Dragonheart I wrote some software for a powerbook that would display the dragon in the correct size on a location still, but we found it faster to take the poseable dragon model in front of the camera. With the actor standing in the scene and the physical model of the dragon we can easily check the composition and framing. I'd have a matchmover measure the distance to the actor and would calculate the scale distance for the model. If the actor was 50 feet away it might mean the model should be 23 and ½ inches away. Now the director, DP and myself would review the framing. Once the initial framing is setup we review the specific storyboard and animatic with the camera operator and actors along with key people. The practical effects and stunt crews start any rigging necessary while the director of photography lights the scene.
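The scale-distance arithmetic behind that is simple: if the stand-in model is 1/Nth the size of the full-size creature, it has to sit 1/Nth of the actor's distance from the lens to frame up the same. A tiny hedged sketch (the scale ratio here is an assumption for illustration, not the actual Dragonheart model's scale):

```python
def model_distance_inches(actor_distance_ft, scale_ratio):
    """Distance to hold a 1/scale_ratio model so it frames up the same as
    the full-size creature standing at the actor's distance."""
    return actor_distance_ft * 12.0 / scale_ratio

print(model_distance_inches(50, 25.5))   # ~23.5 inches, as in the example above
```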

To help the actors and the camera operator interact with a creature we try to provide a stand-in. For Dragonheart we had a monster stick, an expandable pole with 2 disks on each side representing the eyes. For something like Mr. Hyde on Van Helsing a 2d foam core cutout was attached to a helmet worn by an actor for reference. Roger Rabbit used 2d cutouts and a full-size 3d practical model. If the creature is moving, someone moves the stand-in around. The actors can use the eyes as a guide for where to look and react, and the camera operator can make sure they provide enough room in the film frame for the creature. At least one reference take is shot with this stand-in moving around. This can be used as a temp in the edit stage since the action and framing are clearer with the reference. We try to do the majority of takes without the reference to avoid having to do paint-out removal of it in post. In some cases though the complexity of the action may require the stand-in be in all the takes.

Interaction
Any time you're combining multiple images you want them to blend together to make them appear to be a real scene or location. To help with this illusion we try to provide interaction between the different elements, such as a creature that lifts up the actor and then swings its tail to knock over a bookcase. This is where the special effects crew gets involved. As mentioned before, technically special effects in movie credits are practical or mechanical types of effects. Normally this will have already been discussed with the special effects supervisor, who will have planned and constructed some type of rig to do this. If it involves moving the actor or a stunt person then the stunt coordinator is involved. New ideas from the director at the time of the shoot require everyone to scramble to make it happen. If it was preplanned and there is an animatic we use that as a starting point for the timing of the action. If there is no animatic the animation supervisor or the visual effects supervisor is involved in working out the timing with the director. We need enough time for the actor and the imaginary creature or object to react. The asst director will usually provide a verbal cue so the special effects crew can do their part at the right time. If the creature is supposed to speak then someone on the set will be a stand-in for the creature by speaking off camera, or this may come from a pre-taped voice. An actual person on the set delivering the dialog is preferred since the timing and dialog can change easily.
Another form of interaction is the lighting from the director of photography. If colored objects are supposed to emerge from a glass then the DP has to create an interactive light on the faces of the actors so when the objects are added later by the visual effects crew they look like they're creating the light effect on the faces.

Rolling film
Each shot and take is noted on the camera report and slated. For visual effects shots there are usually specific shot names or ID codes that relate to the sequence. These names may have been defined months before in pre-production and relate to specific storyboards.
The first take is usually done with any interactive reference material such as monster sticks or a stand-in. After that the shots may be done without these, depending on the specific requirements.
During each take the visual effects supervisor is making sure everything is working as required. Is the actor's eye line correct? Is he looking at where the creature will be at that specific moment? Has he missed a mark and put his arm through where the creature is supposed to be? Are there additional items that may need to be removed? Will the special effects rigging work, and will everything look correct once the shot is finished in post? Are any cables causing a bunching up of the costumes that gives away the fact the actor is on wires? Is the camera move correct and timed right? There are a hundred items to keep an eye on.

The actor's eye line becomes even more difficult when there are multiple actors. If you have a lot of extras it becomes very difficult. If you have a small object flying and hovering, the illusion won't work if everyone is looking in a different direction. So to handle this you do the run-through with the reference and may need to shout out timings to choreograph this eye motion. You may need to work with the special effects people to create a small target that can be moved around for people to watch. For Dragonheart we used an ultralight plane for some shots where the dragon is flying. This provides everyone with a clear idea of where to look, and the plane itself is removed in post.

Multiple video cameras may be mounted on tripods to provide additional reference for the animators and match movers.
I try to list in my storyboard book any additional items that may be critical to keep an eye on in a specific shot, along with the additional elements that may be required.
Most of this checking involves looking at a video monitor fed from the video camera mounted in the film camera. This is known as the video tap, and unfortunately the quality of tap cameras is about the same as surveillance cameras. The monitor itself is at what's known as the video village. There's an operator who runs it and records video of the different takes for reference. It's around this small monitor or two that the director, script supervisor and other key people may be grouped during a take.

If there are changes to be made, the visual effects supervisor discusses them with the director. Normally you let the director be the one giving direction to the actors to avoid confusion.
Once the director is satisfied with the takes for the main action, any additional elements are filmed. This may be the clean plate as previously discussed, or a practical effect such as a dirt hit, or an additional actor. For things like dust hits, a black flat may be placed behind the dust so the element can be screened or luma-keyed into the final composite. If it's an additional actor, it may be shot against a small portable bluescreen on the location. One of the advantages of shooting these elements on the location or set is that the lighting and camera position will match exactly. Trying to re-create sunlight months later on a stage and have it match is very difficult. The downside is that this takes up more time when shooting the live action.
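For those curious what that screen or luma-key step amounts to mathematically, here's a minimal sketch in Python/numpy. It assumes the background plate and the element (shot against black) are already loaded as matching float RGB arrays in the 0-1 range; a real compositing package obviously offers far more control:

```python
import numpy as np

def screen_comp(background, element):
    """Screen an element shot against black over the background.
    Both arrays are float RGB in the 0-1 range with the same shape."""
    # Screen: 1 - (1 - bg) * (1 - element); pure black in the element leaves the bg untouched
    return 1.0 - (1.0 - background) * (1.0 - element)

def lumakey_comp(background, element, threshold=0.05, softness=0.1):
    """Use the element's own luminance as its matte (a simple luma key)."""
    luma = (0.2126 * element[..., 0] + 0.7152 * element[..., 1] + 0.0722 * element[..., 2])
    matte = np.clip((luma - threshold) / softness, 0.0, 1.0)[..., None]
    return element * matte + background * (1.0 - matte)
```

For something like a dust hit the screen version usually reads best, since the dust is essentially adding light over the plate; a luma key is handy when you need a harder-edged matte to work with.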

Once any additional elements are shot, references are filmed. For 3D work, normally half of a gray sphere is shot. This provides a controlled reference for the technical director who lights the CG scene later: it shows where the light is coming from and the basic balance of the lights. Half of a chrome sphere, which may be the other side of the gray sphere, is also filmed. This provides an image for the reflection map that is wrapped around the CG object, supplying some of the ambient illumination as well as the imagery for reflections. Sometimes stills are taken with a fisheye lens to provide this same map.
A color card or grayscale may be filmed to provide a color reference. I usually try to have a reference of the object that will be added later. If we're adding a CG version of a clock, such as in The Mask, I'll move a real one through the action so the technical directors (TDs) can use it as a guide for how the material and lighting interact. If there's a creature, the model shop may provide reference material such as a section of fur if the creature is furry. By having as many real-world references as possible, the final results will be based on what the object would really look like in that environment. Even with fantastical creatures the aim is to make them photo real. It's actually more important to make them photo real: since the eye and mind know they're not real, they search for any discrepancies.
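As a rough illustration of how the chrome sphere gets used later, here's a minimal sketch that pulls a dominant light direction from a cropped, centered mirror-ball still. It's only an approximation under assumed conditions (orthographic view of the ball, a single key light); a real pipeline would unwrap the whole ball into an environment map and usually merge multiple exposures:

```python
import numpy as np

def light_direction_from_chrome_ball(img):
    """Estimate the dominant light direction from a cropped, centered mirror-ball image.
    img: float grayscale array (H, W); the camera is assumed to look down -Z at the ball."""
    h, w = img.shape
    # Normalized ball coordinates: x to the right, y up, each in [-1, 1]
    ys, xs = np.mgrid[0:h, 0:w]
    x = (xs - w / 2.0) / (w / 2.0)
    y = -(ys - h / 2.0) / (h / 2.0)
    inside = x**2 + y**2 < 1.0

    # The brightest pixel on the ball marks the key light's reflection
    masked = np.where(inside, img, -np.inf)
    iy, ix = np.unravel_index(np.argmax(masked), img.shape)
    nx, ny = x[iy, ix], y[iy, ix]
    nz = np.sqrt(max(0.0, 1.0 - nx**2 - ny**2))   # sphere normal at that pixel

    # Reflect the view ray V = (0, 0, -1) about the normal to get the light direction
    n = np.array([nx, ny, nz])
    v = np.array([0.0, 0.0, -1.0])
    r = v - 2.0 * np.dot(v, n) * n
    return r / np.linalg.norm(r)                  # unit vector pointing toward the light
```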

So finally we have one shot done. There may be another 20 to do that day, each with its own concerns and rigging. On a large show they may be running 2 cameras all the time to provide more angles and coverage. For complex action scenes they may be running 4 or more cameras at the same time. Any of these shots that will require visual effects needs to be watched and measured. A large film usually has a second unit crew shooting additional scenes, pickups and inserts at a different location. Each of those shots has to be dealt with the same way if there are any visual effects.

On a large effects film there comes a time when the crew, along with the director and assistant director, start assuming that you can fix everything in post, and they may become a bit sloppy about removing things from the scene, making final adjustments, or providing what you need. There is always pressure to keep moving on a film shoot, and in some cases it will be cheaper to do something in post rather than take an hour, at a cost of tens of thousands of dollars, to do it on location. The visual effects supervisor has to weigh these two issues and choose his battles wisely. If the quality will suffer because of production shortcuts then it's critical to flag it for the director and discuss the issue.

Low Budget
So how can we apply all of this to a low budget filmmaker? As always, planning and preproduction are vital for keeping production moving smoothly and keeping the costs down. Make sure the effects are there to tell the story.

Try to do a rough storyboard of all your effects shots. Don't get wrapped up in creating elaborate animatics or 3D storyboards. Usually the added value of these, especially for simple projects, is minimal, but the time required can be enormous. It's better to put your additional time and energy into the real shot.

Analyze the different elements required to make the final shots. Remember to keep it simple. If you can do it using 2 elements instead of 5, do it with 2. List those elements with the storyboards.

Keep the storyboards with you while shooting and check off the elements as you shoot them.

Do tests ahead of time to check your technique and to determine if there are other requirements when shooting.

If this is a really low budget film you may be directing and running the camera in addition to doing the effects. This makes it even more critical to have a checklist and make sure everything is shot correctly.

Communicate with your crew so they understand what you need done. Make sure the actors understand what's required of them. The storyboards will help here.

Don't rush the shoot. If you don't get what you need or get poor quality elements then you're likely to spend a lot of time in postproduction just to create something marginal.

Be prepared. Have your tools with you. Storyboards, notepad, pens, tape measure, etc. A fanny pack or other bag is useful to hold these items.

Slate and label everything. Try to be organized. When you're doing effects work you may end up with several elements for each shot. It's very easy to get overwhelmed by all the different variations. On a feature film we have people whose focus is to keep track of all these bits.
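Even a very simple element checklist helps here. A minimal sketch in Python, with made-up shot IDs purely for illustration:

```python
# Planned elements per shot; the shot IDs below are invented just for this example.
shots = {
    "SC012_010": ["main action plate", "clean plate", "gray/chrome sphere", "color chart"],
    "SC012_020": ["main action plate", "dust hit element", "clean plate"],
}

shot_log = []   # append (shot_id, element, take) as things are actually filmed

def report_missing(shots, shot_log):
    """Print which planned elements still have no take logged against them."""
    captured = {(shot_id, element) for shot_id, element, _take in shot_log}
    for shot_id, elements in shots.items():
        missing = [e for e in elements if (shot_id, e) not in captured]
        if missing:
            print(f"{shot_id}: still need {', '.join(missing)}")

shot_log.append(("SC012_010", "main action plate", 3))
report_missing(shots, shot_log)
```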

Double-check your camera settings. If you review the footage on your video camera, be sure to cue it up past the last take. This avoids recording over a shot and keeps the timecode continuous if your camera supports timecode.

Just because you can do something in post doesn't mean you should. As an example, if someone left a C-stand in the shot you could paint it out later, but why bother when you can take 5 minutes and move it before shooting? Don't try to fix everything in post. Balance the time on location against the amount of work required and the final quality.

Keep the shots simple, especially if this is one of your first projects. Even when on set, make sure you're not making things too complicated. When we were filming The Mask we had planned a wide shot where large props are being pulled out of Jim Carrey's pants. Since this included items such as a tuba and a bazooka, and we saw his entire body, we were going to make CG pants and stretch them to show the objects being pulled out. On the night of the shoot the director changed it to a cowboy framing, which means the bottom of the frame is between the knees and waist. Jim's costume was a very baggy zoot suit, so the stretching-pants gag became unnecessary with this framing and costume. I suggested we cut the pants into shorts and cut holes in the pockets. We then had two people shoving real props up into the pant pockets.
We eliminated an effects shot and ended up with something better.

If you're shooting on film, especially 16mm, do steady tests before shooting or your images may jump against each other. If you're shooting film but finishing on video, make sure the telecine is done with a steady film movement. In the past you had to request a special pin-registered transfer, but film movements have gotten better. Just check with the video house and make sure to steady test it.
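If you want to put a number on a steady test instead of just eyeballing it, one rough approach is to shoot a locked-off grid chart and measure the frame-to-frame drift. A minimal sketch using phase correlation, assuming the frames are already loaded as 2-D grayscale float arrays:

```python
import numpy as np

def frame_shift(a, b):
    """Estimate the (dy, dx) pixel offset between two frames of a locked-off chart
    using phase correlation. a and b are 2-D float arrays of the same size."""
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    R = A * np.conj(B)
    R /= np.abs(R) + 1e-8                      # normalize, keeping only the phase
    corr = np.fft.ifft2(R).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    # Peaks past the halfway point wrap around to negative offsets
    size = np.array(a.shape, dtype=float)
    peak[peak > size / 2] -= size[peak > size / 2]
    return peak                                # (dy, dx); near (0, 0) means a steady image
```

Run it between consecutive frames of the test; if the offsets wander by more than a fraction of a pixel, your elements will visibly float against each other in the composite.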

If you're shooting video, try to shoot progressive if you can, with a camera like the Panasonic DVX100. You can shoot visual effects with a standard video camera, but the interlacing makes the process a little more difficult. Likewise, if you're planning to shoot video and do a lot of greenscreen work, consider shooting on a higher-end system than miniDV. You can pull a greenscreen key from miniDV, but a higher-end format will give you better quality with less effort.
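To see why interlacing adds work, here's a crude sketch that splits an interlaced frame into its two fields and line-doubles them (a basic "bob" deinterlace). Real deinterlacers interpolate far more intelligently, but the point is that each field only carries half the vertical resolution at any instant in time:

```python
import numpy as np

def bob_deinterlace(frame):
    """Split an interlaced frame into two progressive images, one per field.
    frame: array of shape (H, W, 3) with an even number of rows.
    Missing scan lines are simply doubled; real deinterlacers interpolate."""
    top = np.repeat(frame[0::2], 2, axis=0)     # top field (even lines)
    bottom = np.repeat(frame[1::2], 2, axis=0)  # bottom field (odd lines)
    h = frame.shape[0]
    return top[:h], bottom[:h]
```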

Shoot with a locked-down camera if possible. This will simplify most effects a lot.

Try to shoot a clean plate when possible. To make sure there's a good match you can simply keep the camera rolling and have the actors leave the scene. This avoids the chance of the camera or the lighting changing as the sun moves.
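As a sketch of why that clean plate pays off later, here's a minimal rig-removal comp: wherever a rough garbage matte covers the wire or rig, the clean plate is revealed instead of the action plate. It assumes matching float RGB plates from a locked-off camera and a hand-painted matte:

```python
import numpy as np

def remove_rig(action_plate, clean_plate, garbage_matte):
    """Reveal the clean plate through a rough matte painted over the rig.
    action_plate, clean_plate: (H, W, 3) float images from the same locked-off camera.
    garbage_matte: (H, W) float matte, 1.0 over the rig, 0.0 elsewhere."""
    m = garbage_matte[..., None]
    return clean_plate * m + action_plate * (1.0 - m)
```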

If you're going to purchase compositing software, try to get one with motion tracking. If you're in school, check on educational pricing for all software. And certainly take advantage of cameras that may be available through your school.

High quality visual effects take a lot of time and work. Even with all the tools available, nothing is as simple and easy as you would think it would be. Accept that and keep moving forward.