I recently watched Chris Nolan’s film Tenet. I’m not sure how anyone could have understood any of the dialogue — let alone the plot — without subtitles. I was reminded why I’ve come to appreciate them.
Back in my high school days, I hated subtitles and the way they distracted me. But since meeting my wife, whose first language is Chinese, we've used subtitles or closed captions on films and TV shows that we watch together. As a result, I've found that I prefer them, even for English movies and TV shows, since I can understand more clearly what's going on. In the classroom, I now always enable them for my students, who have almost always been English language learners.
This post will look at what subtitles are, examine how they can be used in the classroom, and consider what impact advancements in AI are having on auto-translate.
Subtitles vs. closed captions
Technically, subtitles translate dialogue from a foreign language, while closed captions display the dialogue in the same language it is spoken. There are also open captions, which are embedded in the film itself and cannot be switched off. Think of a 1980s Hong Kong gangster movie with both English and Chinese text. For the purposes of this blog I'll just refer to them all as subtitles.
Subtitles in the classroom
Whenever subtitles are available, they should be turned on. This benefits the language learners in the classroom, but it also assists the English speakers by reinforcing the way words are spelled and pronounced. Subtitles can help deal with the issues of a noisy classroom or poor audio quality. It's a myth that ELLs' language development is harmed if they watch English media with English subtitles.
Subtitles may be counterproductive if students always use their own language's subtitles to watch English media. That said, it's often fine to give students the opportunity to use their first-language subtitles (perhaps at home, where they will use them anyway) when the goal of showing the media is related to course content or concepts. A great option for older, more self-directed students is to have them rewatch films they've already seen in their first language, so that they can focus closely on the language without being distracted by trying to figure out the plot.
Which educational media use subtitles well
TED.com and TED-Ed videos almost always have English subtitles available. Popular videos often have dozens of translated subtitle tracks to choose from as well. Some YouTube channels provide properly transcribed subtitles; examples include Up and Atom, CrashCourse, Veritasium, and Numberphile.
Many schools subscribe to BrainPOP, which offers English subtitles for all their media. Netflix always has English subtitles for English media and generally English for films in other languages as well.
The future of subtitles
You may not have played around with YouTube’s auto-generated and auto-translated subtitles. Google’s machine transcription services are improving all the time and can offer good to very good closed captions. I suspect that many people have used them thinking they were manually transcribed.
The production quality of YouTube channels has improved greatly, so it has become easier for both AI and humans to pick out the dialogue from the background noise. Where transcription tends to break down is with less common names or more heavily accented voices, though even there I have seen great improvements over the last few years.
I have played around with live translated subtitles from another language, and this is clearly where things are headed. YouTube (Google, really) will translate and transcribe on the fly into any language within Google Translate's repertoire. If a video clip has Spanish dialogue, for example, auto-translated English subtitles can be switched on.
Here’s a TED talk with embedded Chinese and English subtitles (open captions). Toggle through the auto-translate options and you’ll see that they are very, very accurate. The auto-transcribed English captions lack punctuation, but the auto-translated subtitles are about as accurate as any Google translation. The auto-translate works well in this clip because there is a single original language and a clear recording, even though Ron Gutman has a slight French accent.
Here’s a clip of an Argentinian, Chinese-speaking vlog host profiling a Russian family living in Taiwan that also speaks Chinese. The video contains embedded Chinese and English translations. Watch from 2:46–4:56 as they move from Chinese to English, then back to Chinese again. The auto-generated subtitles lag a few seconds as the system seems to figure out which language is being spoken. Unfortunately I can’t embed this video on this page, but it’s worth a watch.
Auto-translation and auto-captioning will only improve in the future, but the tools available to us now are already powerful enough to be used in the classroom. So, to sum up, make every effort to enable English subtitles on any media you show in the classroom. Everyone will benefit, especially the English language learners. And if the content creator did not manually transcribe subtitles, YouTube’s auto-generated ones can work in a pinch.
Auto-captioning has a long way to go before it can handle the fast, spontaneous conversations of video gamers. I transcribe these conversations every day and find the auto-generated subtitles next to useless. I’m not complaining; the task is really, really difficult.
Is that your job to transcribe? That really is time-consuming.
I’ve done a bit of captioning myself, and you’re right that it’s very difficult. But whereas a few years ago I might have decided I liked a video clip enough to caption it for my students, now, more often than not, the auto-captioned one can work.
Would you admit there’s been a large improvement over the last few years? I feel that the point where auto-translate can catch up to fast conversations will come sooner rather than later. You don’t think so?