Sometimes you are simply given a video that you need to track, and the difficulty of doing so depends directly on how the video was shot. If it wasn’t shot with some basic considerations in mind, you’ll probably have a hard time tracking it: it may not contain enough features to track, or it may be very blurry or fast moving. Keep in mind that in a movie, for example, there can be a lot of these tricky shots to track, but also remember that those shots are handled by people with significant experience, using expensive software and putting in a lot of effort (sometimes even performing manual “magic”) to make the 3D camera fit perfectly with the camera that shot the real footage. If you want to create your own videos and keep your shots from being very difficult to track, you just need to know how the tracker works.
Camera tracking uses what is called perspective shift, or parallax, to detect the perspective in your footage. Imagine you are inside a train looking out through a window: objects close to you appear to move really fast, while objects far away, such as clouds, are almost stationary. That is what perspective shift is all about: an object closer to the camera shifts its perspective faster than one that is very far away.
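The parallax effect can be sketched with the math of a simple pinhole camera (this is an illustrative model, not Blender’s actual solver): a point at depth Z projects onto the sensor at x = f·X / Z, so sliding the camera sideways by t shifts the projection by f·t / Z, and nearby points shift far more than distant ones.

```python
# Illustrative pinhole-camera sketch of parallax (hypothetical helper,
# not part of Blender): a sideways camera translation t shifts a point
# at depth Z by f * t / Z on the sensor.

def image_shift(focal_length_mm, camera_translation_m, depth_m):
    """Apparent shift on the sensor (in mm) of a point at the given depth
    when the camera translates sideways."""
    return focal_length_mm * camera_translation_m / depth_m

# A tree 2 m away vs. a cloud 2,000 m away; camera moves 0.5 m, 35 mm lens:
near_shift = image_shift(35.0, 0.5, 2.0)     # 8.75 mm across the sensor
far_shift = image_shift(35.0, 0.5, 2000.0)   # 0.00875 mm, nearly static
```

The thousand-fold difference between the two shifts is exactly the depth cue the tracker exploits to reconstruct perspective.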
Knowing this, you can understand that a video in which you move the camera to show that perspective shift will help Blender determine where the markers are. If that doesn’t fit the idea you have for the shot, don’t worry! You can shoot a few reference frames that capture perspective shift before shooting the actual video; then you can use those frames of the reference footage to show Blender the correct perspective, and the rest of the video will inherit it. This gives you better results when you’re working with a video that has minimal perspective shift. When you finally edit the video, you can, of course, cut out those reference frames at the beginning.
Here are some other useful tips on how to shoot a video for tracking:
Because of perspective shift, it helps to have tracking features in both the foreground and the background; this will give Blender a better reference to analyze.
Make sure that there are enough recognizable features that you will be able to track throughout the video (high-contrast elements with 90-degree corners offer the best results when tracking). The more often a feature appears during the video, the more stable the solution for the camera motion will be. If you feel there are not a lot of features, place something in the scene that can help you: a little stone here, a piece of paper there. Just add things that will help you later and won’t distract the viewer. Another alternative is to add physical markers (usually small designs containing contrasting shapes or corners that you can print and place in your scene during filming); however, this is trickier because you’ll have to remove them later in postproduction, so it’s better to place only small, unobtrusive objects in strategic positions.
Avoid zooming if possible: changes in focal length while shooting the footage can compromise the tracking and make it much trickier.
Try to avoid very fast moves that might blur the image. If you have blurry footage, chances are you’ll have to track it almost manually, and the result probably won’t be very precise because you won’t be able to see the tracking features clearly.
Shooting a good-quality video makes tracking a lot easier. If the video has compression artifacts or low resolution, small (and even big) features will change a lot from one frame to the next, making them very difficult for Blender’s automatic tracker to follow and forcing you to do a lot of manual work.
To track the camera movement, you can only use features that are static. Don’t track features on objects that move, because they can break Blender’s perspective analysis. Keep this in mind when you are planning how to shoot the scene, to make sure you have enough static features to track. Remember that the more features you track, the more stable and faithful to the real footage the 3D camera movement will be.
If possible, take note of the focal length and other camera parameters you used while shooting the video footage. That information will help Blender solve the 3D camera’s motion.
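The reason the focal length helps is that, together with the sensor width, it fixes the camera’s angle of view, one less unknown for the solver to estimate. As a minimal sketch (assuming a pinhole model and, for the example, a hypothetical 36 mm full-frame sensor):

```python
import math

# Sketch of how focal length and sensor width determine the horizontal
# field of view. The 36.0 mm default is an assumption (full-frame sensor);
# substitute your camera's actual sensor width.

def horizontal_fov_degrees(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal angle of view of a pinhole camera, in degrees."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

fov = horizontal_fov_degrees(35.0)  # roughly 54 degrees for a 35 mm lens
```

Knowing this value in advance means the tracker can concentrate on solving the camera’s position and rotation instead of also guessing the lens.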