A proposed workflow for the creation of integrated titles based on eye tracking data
In:
Seeing into Screens: Eye Tracking and the Moving Image
Abusive, creative, hybrid and integrated are just some of the many terms that have emerged in recent years to describe new kinds of subtitles: titles that appear all over the screen, that imitate or contrast with the film's images, and that follow modern concepts of design and perception in audiovisual translation. Instead of being confined to the bottom of the screen, these subtitles are placed in close relation to what is currently happening in the image. Several films have incorporated integrated titles in order to translate an additional language within an English-language film, as in John Wick (Stahelski and Leitch 2014). There are also early examples of integrated titles being used to translate an entire film into another target language, both for hearing audiences (see Night Watch/Nochnoy Dozor [Bekmambetov 2004]) and for hearing-impaired audiences (see Notes on Blindness [Spinney and Middleton 2016]). However, no clear rules or guidelines appear to be followed in the creation of these integrated titles. Based on an eye-tracking study that illustrates how integrated titles can enhance image exploration, detail perception and overall entertainment value, a first workflow is proposed and tested in this chapter.