Sponsored by Fujifilm

Understanding Various Video “Rates”

For photographers making the transition into moving pictures, there’s a lot to learn—not the least of which are several video-specific technical terms. Some of them—such as bitrate, frame rate and refresh rate—sound awfully similar, but they refer to very different things. Here’s an explanation of the differences between these important “rates” when it comes to capturing and displaying video.


Bitrate

Measured in megabits per second (Mb/s), the bitrate of a file determines how much data is included in every second of video. You can think of it similarly to color bit depth, in that more bits mean better quality. In video, bitrate is the main lever for trading image quality against file size. Set your camera to capture 60 Mb/s versus 100 Mb/s, for instance, and you’ll see that the higher bitrate produces much larger files that fill your media much faster. It should also produce sharper images with more detail, but whether that difference is visible and worth the storage, you’ll have to test and judge for yourself.
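To see how bitrate translates into storage, a quick back-of-the-envelope calculation helps. Here’s a sketch in Python using the two example bitrates (the only assumptions are 8 bits to a byte and decimal gigabytes, as card makers use):

```python
def file_size_gb(bitrate_mbps: float, minutes: float) -> float:
    """Approximate video file size in gigabytes for a given bitrate and clip length."""
    megabits = bitrate_mbps * minutes * 60  # total megabits recorded
    megabytes = megabits / 8                # 8 bits per byte
    return megabytes / 1000                 # decimal gigabytes, as card capacities are rated

# Ten minutes of footage at the two example bitrates:
print(file_size_gb(60, 10))   # 4.5 GB
print(file_size_gb(100, 10))  # 7.5 GB
```

Ten minutes at 100 Mb/s eats roughly 7.5 GB, versus 4.5 GB at 60 Mb/s, which is why the higher setting fills a card so much faster.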

Bitrate also pertains to the very media you use to record your video files. How quickly an SD or CF card can read and write data—not much of an issue for most still photography situations—becomes a big deal when it comes to capturing video. This is especially true when shooting at a higher bitrate, as a too-slow card will create a bottleneck, leading to dropped frames or recordings that stop unexpectedly.

It’s imperative, then, to invest in media that can handle the data your camera delivers. This is why SD and CF cards list read and write speeds among their primary specifications. For shooting 4K video at high bitrates, be sure to use cards rated UHS Speed Class 3 (U3) or Video Speed Class V30 and above, which guarantee the minimum sustained write speeds that high-bitrate recording demands.

Frame Rate

The frame rate setting on your camera determines how many individual images comprise a single second of video. Popular frame rates include 24, 30 and 60 frames per second (fps), and there’s no single “right” answer for every situation. 24 and 30 fps are the two most useful, in my opinion, and can be oversimplified as 24 fps for a more cinematic look and 30 fps for more of a traditional TV look. The faster 30 fps frame rate looks smoother, and while that may sound like a pure benefit, it isn’t always.

Because we’re conditioned to see the 24 fps and 30 fps frame rates differently, a movie shot at 30 fps simply doesn’t look quite cinematic enough. At 24 fps, there’s a subtle stroboscopic effect that emulates the frame rate of a 35mm motion picture film camera. When shooting fast-moving subjects (sports, for instance) or emulating the video look of a traditional sitcom, 30 fps is an ideal frame rate.

Faster frame rates are even better for sharp and detailed motion, which is where 60 fps becomes useful, and faster frame rates still come into play to create slow-motion video. Shot at 120 fps and played back at 30 fps, a fast-moving object is slowed to one-quarter speed, since every second of capture yields four seconds of playback (120 ÷ 30 = 4). Even faster capture frame rates make for even slower slow-mo playback, a striking special effect that has never been easier to achieve.
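The slowdown factor is simply the capture frame rate divided by the playback frame rate, a relationship small enough to sketch in a few lines of Python:

```python
def slowdown_factor(capture_fps: float, playback_fps: float) -> float:
    """How many times slower than real time footage plays back
    when frames captured at one rate are played at another."""
    return capture_fps / playback_fps

print(slowdown_factor(120, 30))  # 4.0: one second of action plays over four seconds
print(slowdown_factor(240, 30))  # 8.0: even slower slow-mo
```

The same formula works in reverse: to know what capture rate you need for a given slow-mo effect, multiply your playback frame rate by the desired factor.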

The other thing to remember about frame rates is the idea of matching shutter speed to frame rate. This is due again to what we’re used to seeing from generations of film and TV. In order to create a natural amount of motion blur in each frame of video, the rule of thumb is to use the shutter speed closest to double the frame rate. Therefore, when shooting at 30 fps, the ideal shutter speed is 1/60th. At 24 fps, double the frame rate works out to 1/48th, so most cameras’ closest option is 1/50th. Much faster and the individual frames themselves will become visible, leading to a stuttering, choppy effect that’s especially noticeable with moving elements in a scene.
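That rule of thumb reduces to shutter speed = 1 ÷ (2 × frame rate), rounded to the nearest speed the camera actually offers. A small Python sketch, with an illustrative (not exhaustive) list of shutter speeds:

```python
def ideal_shutter(frame_rate: float,
                  available=(24, 30, 48, 50, 60, 100, 120, 250)) -> int:
    """Return the denominator of the available shutter speed closest to
    double the frame rate (the rule of thumb for natural motion blur).
    The 'available' list is illustrative; real cameras vary."""
    target = 2 * frame_rate
    return min(available, key=lambda d: abs(d - target))

print(ideal_shutter(30))  # 60: shoot at 1/60th
print(ideal_shutter(24))  # 48: 1/48th, or 1/50th on cameras without a 1/48 setting
```

On a camera whose speed list lacks 1/48, the same function would land on 1/50, which matches what shooters do in practice at 24 fps.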

Refresh Rate

When it comes to refresh rate in video, it’s a question of how the footage is displayed. Monitors and TVs have a refresh rate—the frequency with which the image is changed—which is measured in hertz (Hz), indicating the number of cycles per second. Many displays have long used a 50 Hz or 60 Hz refresh rate, though in recent years manufacturers have promoted higher 120 Hz and sometimes even 240 Hz refresh rates.

The lower the refresh rate, the more motion blur will be evident in a moving image; conversely, the reduced blur at higher refresh rates often translates to our eyes and brains as “sharpness.” Again, much like with frame rates, a little bit of blur can often appear pleasing to the eye. For some viewers—particularly sports fans who often want the crispest, clearest, sharpest images available—setting displays to utilize higher available refresh rates may be preferable. But if a viewer finds their 120 Hz TV simply makes some things look too sharp and “video-like,” they may prefer lowering the refresh rate to the 60 Hz more typical of traditional televisions.

When shooting stills or, in particular, video of a computer monitor or television, the refresh rate will also need to be taken into account. Have you ever photographed a CRT screen that, even though exposed correctly, shows a dark horizontal bar across a portion of the image? That’s visual evidence of the refresh rate, and it’s particularly problematic when shooting video.

With CRT displays in particular, it’s imperative to match the camera’s frame rate to the screen’s refresh rate. Start by assuming a CRT uses a 60 Hz refresh rate, so a 30 fps or 60 fps frame rate should safely avoid the issue. Even with LCD displays, frame rates should be chosen deliberately to avoid the appearance of flickering screens. Matching the frame rate to the refresh rate should alleviate the problem, and because the flicker is visible to the naked eye on playback, it’s easy to confirm whether it worked.
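The safe combinations come down to whether the refresh rate is a whole-number multiple of the frame rate, so each captured frame spans complete refresh cycles. A minimal check, sketched in Python under that assumption:

```python
def flicker_safe(frame_rate: int, refresh_hz: int) -> bool:
    """True when the display's refresh rate is an exact whole-number
    multiple of the camera's frame rate, so each frame captures
    complete refresh cycles and no dark band drifts across the screen."""
    return refresh_hz % frame_rate == 0

# Assuming a 60 Hz screen, as suggested above:
print(flicker_safe(30, 60))  # True:  safe
print(flicker_safe(60, 60))  # True:  safe
print(flicker_safe(24, 60))  # False: expect a rolling dark band
```

This also shows why 24 fps is the troublemaker in front of a 60 Hz screen, and why switching to 30 fps (or finding a display that refreshes at a multiple of 24) clears it up.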
