Grow Your Market: Learn to Create Netflix-Quality Subtitles

by Max Troyer


This is a guest post from one of our partners, the Middlebury Institute of International Studies.


Content creators are increasingly turning to video to expand their audiences. This proliferation of video content has created a new market for video subtitling, but do you know how to take advantage of this opportunity?

As a freelancer, you may not be sure how to start offering subtitling as a service. Or, you may already be doing so, but you’d like to improve the quality of your work. You can find plenty of online examples of subtitling, but creating world-class subtitles is part art, part science, and it’s also the result of quite a bit of research into what the human body is capable of doing. (Really.)

Understanding Subtitles

Subtitles are onscreen text that typically represents spoken dialogue, but in closed captioning for the deaf and hard of hearing it can also include other pertinent audio information, such as music. In brief: subtitles provide access to video content that would otherwise be inaccessible.

Subtitles and Closed Captioning

The purpose of subtitles is to convey meaning, and that often means simplifying the dialogue a bit. A subtitle frequently doesn’t match the spoken dialogue word for word; this is inevitable given the limits of how fast humans can read.


Most of the time when we discuss subtitles, we’re talking about subtitling only the spoken dialogue. Subtitlers don’t include sound effects and music because they assume the viewer can hear them. But what about viewers who are deaf or hard of hearing? They need the sound effects and music captioned to understand what’s going on. In the U.S., any TV larger than 13 inches must include a closed-captioning decoder, and closed captions are also mandated for DVD, videotape, broadcast TV, cable, and so on.


I don’t think the subtitle in Figure 1 is meant to be funny, but subtitles are definitely amusing sometimes.


Figure 1: Zombies banging.

Where Do Subtitles Go?

Divide the screen into thirds. Subtitles normally appear in the lower third (see Figure 2), because there is usually not much action in that part of the screen. Imagine a video with a “talking head”: usually the person’s head is at the top or middle of the screen and the torso is at the bottom. Subtitles should never cover a speaker’s face.



Figure 2: Subtitles generally appear in the lower third portion of the screen.


Types of Subtitles

There are basically two types of subtitles: open and closed. 


“Open” subtitles cannot be disabled and are “burned” into the video. 


“Closed” subtitles can be enabled or disabled. They’re also sometimes referred to as “soft” subtitles. 


Whether our clients request open or closed subtitles, as providers we need to be able to deliver what they want. 
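Closed (soft) subtitles are typically delivered as a separate sidecar file alongside the video. As a minimal illustration, here is what a two-cue subtitle file looks like in the widely used SubRip (.srt) format (the cue text is invented for this example):

```
1
00:00:01,000 --> 00:00:03,500
Subtitles appear when
someone starts talking...

2
00:00:04,000 --> 00:00:06,200
...and disappear when they stop.
```

Each cue consists of a sequence number, start and end timecodes (hours:minutes:seconds,milliseconds), and one or two lines of text. The player shows and hides the text according to these timecodes, which is what makes closed subtitles toggleable.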

Managing Subtitling Workflow

Transcription: The first step in subtitling is transcription: you can work from a script, transcribe manually, or use an automated tool such as Premiere Pro. (I cover the Premiere Pro workflow in my on-demand course, Subtitling for Streaming.)


Spotting: Once you have a transcript, you can use Premiere Pro to create an initial round of subtitles, or you can take the transcript and perform spotting manually in a subtitling tool such as VisualSubSync Enhanced or Subtitle Edit. Spotting is essentially chunking the transcript and fitting it to the timeline. While you can use Premiere Pro for automated captions, my course covers how to use VisualSubSync Enhanced, which has built-in quality assurance checks for post-processing Premiere’s auto-generated subtitles into compliance.


Template Creation: Subtitles should appear when someone starts talking and disappear when they stop. There’s a lot of nuance in the rules: the minimum and maximum duration of a subtitle, how many characters can be in a line, and even how many lines you can have (typically two). You also need to subtitle any plot-pertinent onscreen text unless it’s mentioned in the dialogue. Including more than dialogue, such as plot points or cultural references added to increase translation accuracy, is what Netflix calls “template” creation.
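Many of these duration, line-length, and reading-speed rules can be automated as simple checks. Here is a minimal sketch in Python using illustrative limits loosely based on Netflix’s English guidelines (42 characters per line, at most two lines, roughly 5/6-second minimum and 7-second maximum duration, and a reading speed of up to 20 characters per second); confirm the current values in Netflix’s Timed Text guidelines before relying on them:

```python
from dataclasses import dataclass

# Illustrative limits loosely based on Netflix's English timed-text
# guidelines; always check the current official values.
MAX_CHARS_PER_LINE = 42
MAX_LINES = 2
MIN_DURATION = 5 / 6      # seconds
MAX_DURATION = 7.0        # seconds
MAX_READING_SPEED = 20.0  # characters per second

@dataclass
class Cue:
    start: float      # seconds
    end: float        # seconds
    lines: list[str]  # one or two lines of subtitle text

def check_cue(cue: Cue) -> list[str]:
    """Return a list of rule violations for one subtitle cue."""
    problems = []
    duration = cue.end - cue.start
    text_length = sum(len(line) for line in cue.lines)
    if len(cue.lines) > MAX_LINES:
        problems.append(f"too many lines ({len(cue.lines)} > {MAX_LINES})")
    for line in cue.lines:
        if len(line) > MAX_CHARS_PER_LINE:
            problems.append(f"line too long ({len(line)} chars): {line!r}")
    if duration < MIN_DURATION:
        problems.append(f"duration too short ({duration:.2f}s)")
    if duration > MAX_DURATION:
        problems.append(f"duration too long ({duration:.2f}s)")
    if duration > 0 and text_length / duration > MAX_READING_SPEED:
        problems.append(f"reading speed too high ({text_length / duration:.1f} cps)")
    return problems

# A one-second cue crammed with too much text fails two checks.
cue = Cue(start=1.0, end=2.0,
          lines=["This subtitle flashes by far too quickly to read comfortably."])
print(check_cue(cue))
```

A checker like this is no substitute for studying the guidelines, but it catches the mechanical violations so you can focus on the judgment calls.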


Netflix Compliance: To create Netflix-compliant subtitles, you need to study Netflix’s Timed Text General Requirements, Timing Guidelines, and the English Guide. The Netflix guides contain nonnegotiable rules (the “science”) but also leave some leeway for the choices a subtitler makes (the “art”).


YouTube: YouTube’s automatic “subtitles” are in fact machine transcriptions, often rife with mistakes. YouTube’s technology isn’t smart enough to simplify the text, and YouTube certainly isn’t trying to meet Netflix subtitling guidelines. Even though you can technically download YouTube captions with timing included, post-processing them is so much work that it’s usually easier to start from scratch (although you may be able to salvage the transcription). That doesn’t mean YouTube captions aren’t useful; I rely on them when I need to understand something in another language. You can also upload proper subtitles, which is what I do for my clients’ YouTube and Vimeo videos: disable the automatic captions and replace them with high-quality, broadcast-style subtitles.
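Salvaging the transcription can be as simple as stripping the timing and markup from a downloaded caption file. Here is a minimal sketch, assuming a caption track in WebVTT format (real downloaded files may carry extra metadata lines that need more cleanup):

```python
import re

def vtt_to_transcript(vtt_text: str) -> str:
    """Extract plain dialogue text from a WebVTT caption file,
    dropping the header, cue timings, and inline markup."""
    lines = []
    for line in vtt_text.splitlines():
        line = line.strip()
        # Skip the header, blank lines, and timing lines like
        # 00:00:01.000 --> 00:00:03.500
        if not line or line == "WEBVTT" or "-->" in line:
            continue
        # Drop inline tags such as <c>...</c> or <00:00:01.500>
        lines.append(re.sub(r"<[^>]+>", "", line))
    return " ".join(lines)

sample = """WEBVTT

00:00:01.000 --> 00:00:03.500
Content creators are increasingly

00:00:03.500 --> 00:00:05.000
turning to video.
"""
print(vtt_to_transcript(sample))
# → Content creators are increasingly turning to video.
```

The recovered text still needs correcting and re-spotting, but it gives you a head start over transcribing from nothing.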

Translating Subtitles

Once you’ve created source-language subtitles for your client, you may be asked to translate them. To translate subtitles, I highly recommend using a translation platform that supports visual quality assurance of your subtitle translation. In my course, Audio-Visual and Games Localization, I include a module on subtitle translation where I show students how to use the memoQ Video Preview Tool as well as Studio Subtitling, a Trados Studio plugin. Both are great options that let you QA (quality-assure) your translated subtitles as you translate, avoiding a separate round of QA.


Before you translate a single subtitle, check the Netflix Partner Help Center’s extensive collection of Timed Text guidelines for all of the languages Netflix supports. Pay particular attention to the characters-per-line limit, the reading speed, and any subtitling rules specific to the target language.


I personally love the subtitling challenge posed by a fast-talking panel: think of a CNN or FOX rapid-fire, back-and-forth argument between multiple people. If you enable broadcast closed captioning on one of these interview shows, the captions are so delayed that it’s almost impossible to follow what’s going on. When I watch virtually any Netflix Original, by contrast, I’m impressed at how the subtitles simply fade into the background and don’t distract from the visuals. It’s easy to make mistakes, but when we take the time to study the Netflix Timed Text guidelines, use a visual subtitling tool for our source-language subtitling, and translate in a platform with visual QA, we increase our chances of creating subtitles nobody notices, which is the highest compliment.


This article is an extract from the course Subtitling for Streaming.


About the author: Max Troyer is passionate about preparing future professionals to join the localization industry. He’s an Associate Professor of Professional Practice and the Grover Hermann Endowed Program Chair for the Translation and Localization Management (TLM) program at the Middlebury Institute of International Studies at Monterey (MIIS).

