Simultaneous Camera and Object Tracking in SynthEyes — Transcript

Learn how to perform simultaneous camera and object tracking in SynthEyes using MaskML and auto trackers for a clean 3D solve.

Key Takeaways

  • MaskML enables effective separation of scene elements for targeted tracking.
  • Auto trackers can be generated and refined for both camera and object tracking.
  • Manual adjustments and constraints improve object placement accuracy in 3D space.
  • Using mesh constraints reduces drift and maintains alignment throughout the shot.
  • SynthEyes workflow supports semi-automated tracking with minimal manual supervision.

Summary

  • Introduction to semi-automated object tracking in SynthEyes without supervised trackers.
  • Use of MaskML in the Roto Masking tab to create exclusion maps and object-specific layers.
  • Propagation of multiple mask layers to separate elements like the man, ocean waves, and surfboard.
  • Adding a moving object track and adjusting layer roles from garbage matte to object.
  • Tweaking tracker generation settings for a 4K shot and generating 2D auto trackers.
  • Evaluating camera and object tracker distribution and solving the scene for 3D tracking.
  • Manual adjustment of object distance from the camera using visual ground reference points.
  • Importing and aligning 3D mesh geometry with trackers using move, rotate, scale, and pin tools.
  • Projecting trackers onto the mesh to create constraints and refining the solve to reduce drift.
  • Final step involves exporting the tracked and refined 3D scene.

Full Transcript

00:00
Speaker A
Today we're going to walk through the basics of simultaneous camera and object tracking in SynthEyes. I want to perform a semi-automated object track. What I mean by that is that I won't create any supervised trackers, I'll only use auto features. Let's jump straight into the Roto Masking tab
00:24
Speaker A
and use MaskML to distinguish different elements. When opening the MaskML dialog, it automatically creates a layer. I'll use this first layer to make an exclusion map for the man in the center,
00:43
Speaker A
so I'll name it accordingly and then click on the viewer to make a selection.
00:59
Speaker A
For this shot, I'm going to create multiple layers, then propagate them all at once. So let's create another one by clicking the Create button. I'll use this second layer
01:25
Speaker A
to avoid tracking the motion of the ocean waves, since I definitely don't want SynthEyes placing auto trackers on the water. We might even want to have multiple reference
01:39
Speaker A
frames for the ocean layer to make sure the selection stays consistent, so I'll scrub through the timeline and create another selection later in time.
01:50
Speaker A
Finally, I click Create one more time. This third layer won't be for exclusion. Instead, we will use it to tell SynthEyes which specific object we want to track. In this case, the surfboard.
02:11
Speaker A
I make a selection and name the layer "surf." Now let's propagate the masks for all layers by clicking this button.
02:28
Speaker A
This shouldn't take more than a minute. Great! The last step is to tell SynthEyes that one of these layers should actually be used for an object track. I go to Shot, Add Moving Object, and the green diamond appears in the viewer indicating
02:37
Speaker A
that now we have an object in our scene. Now in the left panel, I change the surf layer from being a garbage matte to an object. The rest of the layers stay as they are. Now
02:45
Speaker A
we are ready to move forward and generate some auto trackers. I'll
02:57
Speaker A
jump over to the features room. However, I want to tweak some default settings for tracker generation, so I'll open the Advanced settings.
03:09
Speaker A
My shot is 250 frames long, so I'll slightly bump up the tracker count to 200.
03:22
Speaker A
Since I'm working with a 4K plate and the areas where I want my trackers are slightly out of focus, I'll also increase the small and big blip sizes to 12 and 24 respectively.
03:28
Speaker A
Now to generate the trackers, let's go ahead and click Blip All Frames, then Peel All, and finally Clear All Blips.
03:37
Speaker A
Now we've got a bunch of 2D trackers spread across the shot. Since we've separated the areas with MaskML, we can switch here between which set of trackers is highlighted.
03:54
Speaker A
For the camera, we have a solid number of trackers on the ground and the back cliff.
04:09
Speaker A
As for the object, the surfboard had some sand and dirt on it, meaning there were enough fine details to generate a good amount of object trackers.
04:21
Speaker A
Even though some of these trackers might be a bit slippery and not pixel perfect, the fact that we have so many means they reinforce each other and the errors get averaged out. So
04:34
Speaker A
this should work fine for our solve. Now I'll turn down the alpha mask opacity,
04:46
Speaker A
jump over to the solver room and see if I can get a clean solve. I click the big green Go button and in just a moment we get our first solve. The hpix error for both the camera and
05:04
Speaker A
object looks decent enough. Looking at the top 3D view, we can see the moving camera's path, along with the cloud of trackers moving throughout the space.
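The earlier point that many slightly slippery trackers reinforce each other is a standard averaging argument: independent errors shrink roughly with the square root of the tracker count. A toy simulation in plain Python (not SynthEyes code, just the statistics) illustrates it:

```python
import math
import random

def mean_error_vs_individual(n_trackers=50, n_trials=200, noise=1.0, seed=42):
    """Compare the RMS error of individual noisy 2D trackers against
    the RMS error of their averaged position (true position = origin)."""
    rng = random.Random(seed)
    indiv_sq, mean_sq = [], []
    for _ in range(n_trials):
        xs = [rng.gauss(0.0, noise) for _ in range(n_trackers)]
        ys = [rng.gauss(0.0, noise) for _ in range(n_trackers)]
        # Error of each tracker on its own...
        indiv_sq.extend(x * x + y * y for x, y in zip(xs, ys))
        # ...versus the error of the averaged position.
        mx, my = sum(xs) / n_trackers, sum(ys) / n_trackers
        mean_sq.append(mx * mx + my * my)
    rms_indiv = math.sqrt(sum(indiv_sq) / len(indiv_sq))
    rms_mean = math.sqrt(sum(mean_sq) / len(mean_sq))
    return rms_indiv, rms_mean
```

With 50 trackers the averaged position's RMS error comes out roughly 7× smaller than an individual tracker's, matching the 1/√N rule, which is why quantity compensates for trackers that aren't pixel perfect.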
05:21
Speaker A
However, notice that the object's path doesn't look quite right in 3D. It appears as if the surfboard is moving towards the camera, where in reality it's not.
05:26
Speaker A
It's also placed much further away than the set of ground features where it roughly should be. The reason for this is that SynthEyes doesn't know how far the object is from the camera.
05:35
Speaker A
Luckily, there is an easy fix. I don't have exact measurements of the scene, but I can estimate a close enough position by eye based on the features on the ground. Here's how I do that. First, I'd find a 3D feature on the ground that appears to be closest to where the
06:00
Speaker A
surfboard should start. This spot right under the man's feet is a perfect candidate. I'll mark it with a different color here, changing it from green to something else.
06:25
Speaker A
The reason I do this is that I can now use it as a visual reference.
06:41
Speaker A
Since I know that the surfboard should be just slightly behind the spot, it gives me a solid idea of where the object should be placed in its starting position.
07:00
Speaker A
Now it should be easy to move the object. Make sure you have the object tracker selected here. Then, on the left panel, enable the Whole button and choose Scale. By adjusting this parameter, you can move the object closer to or further from the camera. Now I can switch back to the camera trackers
07:24
Speaker A
to make our reference tracker visible. Notice how the entire object's path is affected and changes its shape as I move it closer to the camera. It shifted from moving towards the camera
07:35
Speaker A
to moving more to the left, which now looks like the proper motion. Now let's bring in the actual geometry and see if everything fits. First, make sure you have the object
08:02
Speaker A
tracker selected here. Then go to File, Import, Mesh.
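A note on what the Scale adjustment from a moment ago does geometrically: the object trackers constrain the direction from the camera but not the depth, so scaling slides the whole solved path along each frame's camera-to-object ray. A minimal sketch with a hypothetical helper (tuples for points, not SynthEyes' API):

```python
def rescale_object_path(camera_path, object_path, s):
    """Slide an object's solved path toward/away from the camera by
    scaling each frame's camera-to-object vector by factor s.
    Paths are lists of (x, y, z) tuples, one entry per frame."""
    out = []
    for (cx, cy, cz), (ox, oy, oz) in zip(camera_path, object_path):
        out.append((cx + s * (ox - cx),
                    cy + s * (oy - oy + oy - cy) if False else cy + s * (oy - cy),
                    cz + s * (oz - cz)))
    return out
```

Because every frame moves along its own view ray, the projection into the camera stays the same while the 3-D path changes shape, which is exactly the behavior seen in the top view.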
08:12
Speaker A
The next step is to align it with the trackers and place it correctly in 3D space. Use these buttons on the left to choose whether to move, rotate, or scale the mesh.
08:23
Speaker A
Click and drag in any of the 3D views to perform the chosen action. Once you have a rough position, it's better to use pins for precise alignment. Switch to the perspective view and enable the pinning toolbar.
08:33
Speaker A
You can hide the trackers by hitting J on the keyboard. Choose "Create added pins," then click on a mesh to create one.
08:41
Speaker A
With one pin, you can adjust only the X and Y positions. With two pins, you can adjust rotation and scale. And with three or more, you can move the mesh in all dimensions. To delete a pin, hold CTRL and left-click on it. You can delete all pins at once by clicking the
08:47
Speaker A
related button in the toolbar. At this stage, I'd use the pins to align the mesh.
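The two-pin case described above (two point correspondences fixing rotation and scale) has a neat closed form if you treat 2-D points as complex numbers. This is a generic math sketch of that idea, not SynthEyes internals:

```python
def two_pin_similarity(p1, p2, q1, q2):
    """Solve the 2-D rotation + uniform scale + translation that maps
    pin sources p1 -> q1 and p2 -> q2 (points as (x, y) tuples)."""
    pa, pb = complex(*p1), complex(*p2)
    qa, qb = complex(*q1), complex(*q2)
    m = (qb - qa) / (pb - pa)   # rotation and scale as one complex factor
    t = qa - m * pa             # translation
    def apply(pt):
        z = m * complex(*pt) + t
        return (z.real, z.imag)
    return apply
```

Multiplying by a complex number rotates and scales in one step, which is why two pins are exactly enough to pin down rotation and scale, while a third pin adds the remaining freedom.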
09:02
Speaker A
Unfortunately, the model I have doesn't perfectly match the photography. But it's close enough to get the job done and demonstrate the workflow.
09:28
Speaker A
So, I've aligned the mesh the way I'm happy with it, but if I scroll through the shot, you'll notice it slides a bit.
09:45
Speaker A
That's because we positioned the model based on the perspective view in just one frame, but the mesh isn't quite in the right spot in 3D space.
09:55
Speaker A
To fix this, we can project the trackers onto the mesh and use it as a constraint. First, select a few object trackers while holding SHIFT, then go to the Track menu and choose Drop Onto Mesh.
10:10
Speaker A
SynthEyes will create a set of constraints based on the mesh shape. Now, to make sure SynthEyes uses those constraints, turn on the corresponding checkbox in the Solver Room. Change the solver algorithm from Automatic to Refine for both the camera and the object. Now, let's refine the scene and click Go.
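Conceptually, dropping a tracker onto a mesh amounts to casting a ray from the camera through the tracker's 2-D position and finding where it hits the mesh surface. The standard ray/triangle test (Möller–Trumbore) sketches that step; this is a generic implementation, not SynthEyes' own code:

```python
def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Return the 3-D point where a ray hits a triangle, or None.

    Möller–Trumbore: solve origin + t*direction = (1-u-v)*v0 + u*v1 + v*v2
    for t, u, v and accept hits in front of the ray, inside the triangle.
    """
    def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def cross(a, b): return (a[1] * b[2] - a[2] * b[1],
                             a[2] * b[0] - a[0] * b[2],
                             a[0] * b[1] - a[1] * b[0])
    def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, e2)
    a = dot(e1, h)
    if abs(a) < eps:                  # ray parallel to the triangle plane
        return None
    f = 1.0 / a
    s = sub(origin, v0)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:            # outside the triangle
        return None
    q = cross(s, e1)
    v = f * dot(direction, q)
    if v < 0.0 or u + v > 1.0:        # outside the triangle
        return None
    t = f * dot(e2, q)
    if t <= eps:                      # intersection is behind the camera
        return None
    return tuple(o + t * d for o, d in zip(origin, direction))
```

Each tracker that lands on the mesh this way yields a 3-D point tied to the mesh surface, which is what lets the refine pass use the geometry as a constraint; rays that miss every triangle simply contribute no constraint.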
10:19
Speaker A
Okay, notice that after setting the constraints and refining the scene, the object error went up a bit. That's because we essentially limited some of SynthEyes' freedom. But that's fine,
10:34
Speaker A
because if I play through the shot, the mesh stays aligned with the imagery much better throughout the shot and it no longer drifts. The final step, of course, is exporting the scene.
Topics: SynthEyes, camera tracking, object tracking, MaskML, auto trackers, 3D tracking, mesh alignment, tracking constraints, visual effects, Boris FX

Frequently Asked Questions

What is the purpose of using MaskML in SynthEyes?

MaskML is used to create multiple mask layers that distinguish different elements in the scene, such as exclusion areas and specific objects to track, enabling more accurate automatic tracker placement.

How does SynthEyes handle object distance from the camera in tracking?

SynthEyes does not inherently know the object's distance from the camera, so manual adjustment using visual reference points in 3D space is necessary to correctly position the object and improve tracking accuracy.

What benefits do mesh constraints provide in the tracking workflow?

Mesh constraints help maintain the alignment of the 3D object with the tracked imagery by limiting SynthEyes' solver freedom, reducing drift, and ensuring the object stays properly positioned throughout the shot.
