Video / Tips and Solutions

Timecode versus Sync: How They Differ and Why It Matters


Though many of us are familiar with cameras having two ports―a timecode in/out port and a genlock port―we may not consider why both might be needed. Perhaps we dismiss genlock as just a legacy holdover from the times before switchers had built-in frame synchronizers. The terms “sync” and “timecode” are often used interchangeably, and the fact that timecode can be used to sync devices only compounds this confusion. Sync (genlock) is like a beat that calls out when a field (as well as a line, with tri-level) occurs, while timecode indexes each frame (or the equivalent period of time for an audio recorder) so that it can be uniquely identified in post.

"... timecode alone, while sufficient to provide syncing points for post production, may not be a reliable way to keep devices in sync with each other."

If we need to synchronize multiple cameras, or a camera and audio recorder, we tend to just use timecode. The problem is, timecode alone, while sufficient to provide syncing points for post production, may not be a reliable way to keep devices in sync with each other.

With timecode, unless the cameras or recorders are re-jam-synced frequently, their timecode will start to drift apart. The problem is that the quartz crystal (or other piezoelectric material) clocks used in many cameras and quite a few audio recorders are too imprecise for critical applications. This means that two different clocks may have two different opinions on how long a second is, causing recordings to drift apart over time. Depending on the camera or recorder, the timecode may drift as much as a field in a surprisingly short span of time (30 minutes in some cases). Even a field offset is enough that sound and picture will be noticeably out of sync when the two elements are combined later, in post.
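To put rough numbers on this, here is a minimal Python sketch of the drift arithmetic. The ppm figures are illustrative assumptions for consumer-grade quartz, not specifications for any particular camera or recorder:

```python
# Back-of-the-envelope clock-drift math. A free-running consumer quartz
# oscillator is often accurate only to within tens of parts per million
# (ppm); the values below are illustrative assumptions.

def drift_seconds(accuracy_ppm: float, elapsed_s: float) -> float:
    """Worst-case drift, in seconds, after running free for elapsed_s."""
    return accuracy_ppm * 1e-6 * elapsed_s

def seconds_until_field_off(accuracy_ppm: float, field_rate: float) -> float:
    """Time until worst-case drift equals one field duration."""
    field_duration = 1.0 / field_rate
    return field_duration / (accuracy_ppm * 1e-6)

# Two clocks each off by 5 ppm in opposite directions diverge at
# 10 ppm relative to one another.
relative_ppm = 10.0
t = seconds_until_field_off(relative_ppm, field_rate=59.94)
print(f"One field (59.94i) of drift after ~{t / 60:.0f} minutes")
```

At a 10 ppm relative error, the two recordings are a full field apart after roughly 28 minutes, which is consistent with the "30 minutes in some cases" figure above.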

In light of this limitation, several responses come to mind:

  • Genlock the devices, forcing them to "fire" at the same instant
  • Rather than jam-syncing, constantly feed in timecode from a central source
  • Fix it in post, perhaps using sync software

What is genlock?

To start, it is worth revisiting genlock. Genlock (generator locking) appeared in the broadcast world as a way to synchronize multiple video sources, including cameras, VTRs, and external feeds, so that their field rates are all in phase with each other. This was required to prevent so-called "jumping," an artifact that occurs when switching in live productions if all of the video sources are not in step with each other. For a long time it was necessary to genlock each device―camera or deck―by sending a black burst or composite signal into it, originating from a common house clock. These days, many switchers handle frame sync on their own by using a frame buffer that can hold the field (or frame, if the signal is progressive) momentarily until it falls into alignment with the program feed.

In the HD world genlock is still with us, but using a composite signal as a reference has largely been supplanted by tri-level sync, which emits pulses that clock both the frame rate and the line rate.

Because it synchronizes frames, genlock can be used to keep multiple devices from drifting apart. But it is tricky because, unlike with timecode, the camera or recorder must always be hardwired to the source. On top of that, where cables of different lengths are used, each camera must be calibrated to take into account the length of the cable run. In a fixed studio setting this isn't a big problem, but it can be in the field. And it is especially problematic in cinema production, where the setup might change with every shot.
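To see why cable length matters at all, here is a quick sketch of the propagation delay involved. The 0.66 velocity factor is an assumed typical value for solid-dielectric coax, not a specification for any particular cable:

```python
# Why cable length matters when genlocking: sync pulses travel through
# coax at roughly two-thirds the speed of light (velocity factor ~0.66
# is an assumed typical figure for solid-dielectric cable), so unequal
# runs deliver the reference to each camera at different times.

C = 299_792_458.0        # speed of light in a vacuum, m/s
VELOCITY_FACTOR = 0.66   # assumed typical for solid-dielectric coax

def cable_delay_ns(length_m: float) -> float:
    """Propagation delay, in nanoseconds, for a cable of length_m meters."""
    return length_m / (C * VELOCITY_FACTOR) * 1e9

# Skew introduced by a 100 m mismatch between two camera runs:
print(f"{cable_delay_ns(100):.0f} ns of skew per 100 m of extra cable")
```

Roughly 500 ns per 100 m is tiny compared with a field, but it is why cameras on unequal runs offer a phase adjustment to bring their timing back into step with the reference.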

Incidentally, another application for genlock is to ensure both sensors on two-camera 3D rigs fire at the same time. A common misconception is that you can simply link the two cameras together, feeding the composite output of one camera into the genlock input on the second. While this works in theory, the risk of miscalibrating the receiving camera means it is a much safer option to use a separate sync box that is wired to both cameras with cables of identical length.

What about wireless timecode?

A popular, if dubious, way to sync multiple devices in the field is to continuously feed timecode into them using a wireless audio transmitter and receiver. Much like black burst being a type of composite signal, timecode such as SMPTE 12M LTC can be passed as an analog audio signal, and produces a well-known "telemetry" noise if played through speakers. Since we live in the Age of the App, there are also many app-driven solutions that promise timecode sync over Wi-Fi. This can work, but the inherent unreliability of wireless technologies comes into play; inexpensive wireless systems may even digitally process the signals, introducing delay. Also, when audio or video is being sent wirelessly, a loss of signal will be noticed right away―people on set are watching for that. But if the timecode signal drops, the camera just reverts to its internal clock, and the problem may not get noticed until it is too late. Plus, using audio hardware, not to mention a Wi-Fi network, rather than dedicated timecode hardware adds variables that may compound troubleshooting when things do―as they inevitably will―go wrong.
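The reason LTC fits comfortably in an audio channel can be sketched in a few lines. The 80-bit frame comes from SMPTE 12M; the tone-frequency mapping follows from LTC's biphase-mark coding:

```python
# Why LTC travels happily over an audio channel: SMPTE 12M LTC packs
# 80 bits into every video frame, biphase-mark encoded, so the signal's
# fundamental tones land well inside the audible band.

BITS_PER_FRAME = 80  # per SMPTE 12M

def ltc_bit_rate(frame_rate: float) -> float:
    """LTC data rate, in bits per second, at a given frame rate."""
    return BITS_PER_FRAME * frame_rate

for fps in (23.976, 25.0, 29.97, 30.0):
    rate = ltc_bit_rate(fps)
    # Biphase-mark coding: a '0' bit sounds at rate/2, a '1' bit at rate.
    print(f"{fps:>6} fps -> {rate:7.1f} bit/s (tones ~{rate/2:.0f}-{rate:.0f} Hz)")
```

At 2 kHz or so, the signal sits right in the band a voice-oriented wireless audio link is designed to carry, which is exactly why that workaround is tempting.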

What about syncing in software?

If the footage is being recorded, it's going to need to be edited anyway, so why not just use an application like Red Giant PluralEyes or Final Cut Pro X's built-in facility to sync audio and video? If the material is sufficiently broken up, or if you are able to cut the clips into segments of less than 30 minutes, this will probably work, at least within a half-frame margin of error.* But for an all-day live event, during which each camera is being recorded separately, this solution would create a lot of extra work. It would be nice, if timecode is available anyway, to be able to use that exclusively when syncing cameras and audio in post.

The ideal solution, when all cameras can't be hardwired to a reference source as in a TV studio, is to connect each individually to a reliable sync device such as an Ambient Recording Lockit Box or a Sound Devices recorder with sync output. These devices use high-precision, temperature-compensated, voltage-controlled crystal oscillator (TCVCXO) technology that boasts less than one frame of drift per day. For situations in which timecode has to be relayed wirelessly, there is also wireless hardware based on the same TCVCXO technology which, while not as reliable as a hardwired solution, is the next best thing, since it is optimized to send timecode data rather than natural sound in the frequency range of the human voice.

To sum it up, true synchronization without drift requires two things: a reliable timecode source to avoid drift, and genlock to make sure the fields or frames are hitting the same beat. In some cases, sync problems can be fixed in post. But this can be time consuming, and the cost of getting it right on set may win out at the end of the day. The bad news is that, with the reduced need for genlock in broadcast, fewer and fewer cameras have sync ports, especially at the prosumer level. Eventually, we may be stuck with Wi-Fi but, hopefully, by then cameras will feature better internal clocks, so that an hour in camera A will match an hour in camera B―even if their timecodes aren't perfectly aligned.

*A video field in NTSC areas lasts just over 1/60 of a second. Meanwhile, audio is typically sampled at either 48 kHz or 96 kHz. That's 48 or 96 thousand samples per second for audio versus roughly 60 samples per second for video. Unfortunately, most NLEs only let you adjust audio with a precision of one frame or, at best, one field. This means you will never get audio that wasn't already in sync to line up precisely without resorting to dedicated tools.
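The footnote's arithmetic can be worked out in a few lines of Python:

```python
# How coarse frame-level trimming is compared with audio's native
# resolution: count the audio samples in one video frame, then the
# residual error left when audio can only be nudged by whole frames.

def samples_per_frame(sample_rate: float, frame_rate: float) -> float:
    """Audio samples elapsing during one video frame."""
    return sample_rate / frame_rate

def worst_residual_ms(frame_rate: float) -> float:
    """Snapping to the nearest frame leaves up to half a frame of error."""
    return 0.5 * 1000.0 / frame_rate

print(f"{samples_per_frame(48_000, 29.97):.1f} samples per 29.97 fps frame")
print(f"worst-case residual after frame snapping: {worst_residual_ms(29.97):.1f} ms")
```

At 48 kHz and 29.97 fps, each frame spans roughly 1,602 audio samples, so frame-granularity adjustment can leave the audio as much as about 800 samples (nearly 17 ms) away from true alignment.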


We are using a Sony EX3 camera, a Sony PMW-150 camera, and a Sound Devices T702 audio recorder.

We use the T702 to jam the timecode into the cameras. We edit with Adobe Premiere Pro CS6, and after syncing the video and audio there is always a one-frame difference between video and audio.

Recently we purchased two NanoLockit devices to sync everything. That should solve the problem, but the problem is still the same.

Is there anyone who knows how to solve this problem?

Is it possible to genlock using an SDI splitter?

If I'm doing a 3-camera interview with one C300 and two FS7s (with the XDCA-FS7 Extension Unit), the C300 has a SYNC OUT. Could this go to the SDI IN on the splitter, and then connect the outs on the splitter to the genlock inputs on the FS7s?

And can you also use another SDI splitter to split a timecode signal from one camera to the others?

Hi Lars - 

Sounds like a plan on both proposals, Lars. But we wouldn't mind reviewing your workflow for you.

Please contact us via e-mail if you have additional questions.

What was the outcome of Lars's question? I'm wondering about the same thing. Thanks!

I need to run genlock cables in a large conference room. The video will be projected, so sync between the projection and the speaker is critical. To eliminate latency, I've been asked to genlock all equipment. I have 4 cameras to lock, and there is no room in the budget for purchasing SDI/BNC cables. Our local rental house shut down and liquidated all assets. Can genlock be run over surveillance-camera BNC? 150' of surveillance-camera BNC is $15.00, compared to SDI/BNC at $100.00+.

Thanks in Advance

Hi Neo - 

With high-quality RG-59 cable, quality BNC connectors, and careful, accurate crimping, you will max out at about 300'. RG-6 cable is only a bit heavier and thicker but offers much better signal-loss characteristics.

Quick question: Could you suggest a solution for shutter sync for witness cameras, used in VFX to help with matchmoving? There's no need for sound in this application, but it is very useful to have the shutters fire at the same time on multiple cameras (even if the primary recording unit isn't on the same lock). Are there any cheaper cameras available that you think this would be possible with? I've seen some info about Atomos systems with the Sony DSLRs, but it would be great to hear your thoughts on the options available. Wireless would be handy, despite its limitations: the takes are short and reset frequently.

Hi Wil - 

Please send us an e-mail on this topic with your specific needs and cameras.

I need to genlock signals running between several buildings. For the most distant one (a couple of blocks away), I'm considering using a sync generator locked to GPS (the master sync generator is also locked to GPS). Can I expect that to deliver a correct lock between them?

For this type of question, your best bet would really be to consult with a studio engineer. Though, if you send us an email, a couple of our agents in the Video Department would be happy to see what they can come up with.

Sorry, newbie question.

So, if I use genlock with multiple video cameras, will they fire their shutters at EXACTLY the same moment?

I want to create a multi-camera rig for "bullet time" effects to capture high-speed motion with no jitter when switching cameras. Any advice on low-cost sync generator and cameras?

Hi Jimmy -

Genlocking would keep multiple cameras firing at the same time; timecode, however, is used to sync audio and video. Genlock is also used to ensure both sensors on two-camera 3D rigs fire at the same time. With long cable runs combining an audio mixer and cameras, there can be a frame or two of delay. Timecode allows you to then sync the audio with the video so the cable delays are no longer an issue. The combination of both provides more reliable and flexible synchronization.

The Lynx Technik AG yellobrik SPG 1707 HD/SD Sync Pulse Generator with Genlock is a resilient and versatile sync generator that delivers simultaneous sources of HDTV tri-level and SDTV bi-level sync. The module provides three SD sync outputs, three HD sync outputs, and a separate audio sync output that can be switched between 48 kHz Word Clock or Digital Audio Reference Signal (DARS).

The genlock capability allows cross-locking to any sync signal, regardless of the output sync format selected. The unit is suitable when you need a source of HDTV tri-level sync locked to an SDTV black reference, or if you need to lock across standards. For example, you can supply PAL sync and get frequency-locked NTSC sync and HDTV tri-level sync outputs.

The HDTV tri-level sync outputs can be set to any of the available HDTV standards, and the SDTV bi-level outputs can be set for NTSC, PAL, or PAL M/N. The SDTV, HDTV, and audio sync signals are all frequency-locked together, or locked to the reference input, if connected. The SDTV sync outputs can be color bars, black burst, or sync only, with a selectable 7.5 IRE pedestal for NTSC standards and burst phase adjustable in 8 increments.

Please contact us via e-mail if you have additional questions:

I have a question from a post point of view.

When I edit a multicamera shoot, my goal is to make sure that there is a single common sync point for all my tracks. This can be done manually or through audio waveform recognition. Once I have all the tracks synced, my sequence plays perfectly and all tracks remain in sync. 
After reading your article, I would expect all my cameras to drift out of sync without using hardware. Why does my workflow not fall out of sync?

The reason I am asking is that I am putting together a tutorial on using Premiere Pro's new multicam sync, which uses labels to place "start and stop" cameras on a single track. This new feature (CC 2015.2) does not work with audio sync and must use timecode. So I arrived at B&H to find a jam-sync type device to recommend to my viewers, to give them the ability to use this new feature with the kinds of cameras they use, which are mostly DSLRs and Blackmagic cameras.


Hi Colin Smith-

To be clear, the amount of drift is very slight. I think around one frame per 30 minutes is typical for the timing devices prosumer cameras use these days. So if each segment you place in the timeline is less than 30 minutes, drift won't be a problem. Additionally, if you are using waveform sync software (and not relying on timecode), you can always break longer clips up into segments of less than 30 minutes so everything aligns properly. The takeaway for the average user is that drift is probably never going to be something to worry about. Having said that, it is important to understand that simply jam-syncing timecode at the start will not keep cameras in sync hours later, which may be an issue for certain types of productions. Hope this helps clarify.

Hello Mr. McDougal,

Thank you for such an informative article! I am a founder of a high-tech startup in the Boston area. Some of the technologies we've developed have to do with wireless high-precision time and frequency synchronization. The accuracy is on the order of nanoseconds for time and a few ppb (parts per billion) for frequency. Our technology also enables much-improved throughput for wireless links, which is very relevant to the video, camera, and drone industries. I am very much interested in speaking with you privately, as I find you to be a highly knowledgeable expert in these fields. If you don't mind, please email me back via your personal email so I can tell you more about our company and technology. Looking forward to the follow-up. Thank you.

Good read. Just want to understand: if I genlock the cameras so they are all firing at the exact same time, every time, what do I need timecode for? They are already running together...

It seems like genlock is the better way to go, isn't it? Apart from the fact that they have to stay connected all the time, but that is also preferable with timecode, so what's to gain here?


Genlocking would keep multiple cameras firing at the same time; timecode, however, is used to sync audio and video. With long cable runs combining an audio mixer and cameras, there can be a frame or two of delay. Timecode allows you to then sync the audio with the video so the cable delays are no longer an issue. The combination of both provides more reliable and flexible synchronization.