Sandwich analogy: The Bread

If a scene in a film is a sandwich and we are consuming it as a whole, what would the bread be? The image? The frame? The screen?

In his book Audio-Vision: Sound on Screen, Michel Chion writes:

“Why in the cinema do we speak of ‘the image’ in the singular, when a film has thousands of them? The reason is that even if there were millions, there would still be only one container for them: the frame. What ‘the image’ designates in the cinema is not content but container: the frame.

The frame can start out black and empty for a few seconds, or even for several minutes. But it nevertheless remains perceivable and present for the spectator as the visible, rectangular, delimited place of the projection. The frame thus affirms itself as a preexisting container, which was there before the images came on and which can remain after the images disappear.

What is specific to film is that it has just one place for images – as opposed to video installations, slide shows, sound and light shows, and other multimedia genres, which can have several.”

Like the frame, the bread is the “container” for the components within.

References:

Chion, M. (1994). Audio-Vision: Sound on Screen. Columbia University Press.

Juxtaposition of music, sound and image; and audiovisual dissonance

Michel Chion developed the idea that there are two ways for music in film to create a specific emotion in relation to the situation depicted on the screen (Chion, 1985). On one hand, music can directly express its participation in the feeling of the scene by taking on the scene’s rhythm, tone and phrasing; such music obviously participates in cultural codes for things like sadness, happiness and movement. In this case we can speak of empathetic music, from the word empathy, the ability to feel the feelings of others (Chion, 1994).

On the other hand, music can also exhibit conspicuous indifference to the situation, by progressing in a steady, undaunted and ineluctable manner: the scene takes place against this very backdrop of “indifference”. This juxtaposition of scene with indifferent music has the effect not of freezing emotion but rather of intensifying it, by inscribing it on a cosmic background. Chion calls this second kind of music anempathetic. The anempathetic impulse in the cinema produces those countless musical bits from player pianos, celestas, music boxes and dance bands, whose studied frivolity and naivete reinforce the individual emotion of the character and of the spectator, even as the music pretends not to notice them. There are also cases of music that is neither empathetic nor anempathetic, which has either an abstract meaning or a simple function of presence, a value as a signpost: at any rate, no precise emotional resonance.

The anempathetic effect can also occur with noise: when, for example, in a very violent scene, some sonic process continues after the death of a character, like the noise of a machine, the hum of a fan or a shower running, as if nothing had happened. Examples can be found in Hitchcock’s Psycho and Antonioni’s The Passenger.

Sound can also influence the perception of movement, of speed, and of time in the image.

Audiovisual dissonance

Audiovisual dissonance occurs when image and sound follow two totally different tracks. It is not enough for sound and image to differ in nature (in their respective content or spatial characteristics). Audiovisual counterpoint will be noticed only if it sets up an opposition between sound and image on a precise point of meaning. This kind of counterpoint influences our reading by postulating a certain linear interpretation of the meaning of the sounds. Take, for example, the moment in Godard’s First Name: Carmen when we see the Paris metro and hear the cries of seagulls. Critics identified this as counterpoint because the seagulls were considered signifiers of a “seashore setting” and the metro image a signifier of an “urban setting”. This reduces the audio and visual elements to abstractions at the expense of their multiple concrete particularities, which are much richer and more ambiguous. Such counterpoint thus reduces our reading to a stereotyped meaning of the sounds, drawing on their codedness (seagulls = seashore) rather than on their own sonic substance, their specific characteristics in the passage in question.

So the problem with counterpoint-as-contradiction, or rather with audiovisual dissonance, is that it implies a prereading of the relation between sound and image. It forces us to attribute simple, one-way meanings, since it is based on an opposition of a rhetorical nature (“I should hear X, but I hear Y”).

There exist hundreds of possible ways to add sound to any given image. Of this vast array of choices, some are wholly conventional. Others, without formally contradicting or “negating” the image, carry the perception of the image to another level. Audiovisual dissonance is merely the inverse of convention, and thus pays homage to it, imprisoning us in a binary logic that has only remotely to do with how cinema works.

References:

Chion, M. (1985). Le son au cinéma. Vol. 5. Cahiers du cinéma.

Chion, M. (1994). Audio-Vision: Sound on Screen. Columbia University Press.

Montage (cont.) – the Kuleshov experiment and Eisenstein’s theory of montage

In the 1920s, a filmmaker named Lev Kuleshov took three identical shots of the well-known prerevolutionary actor Moszhukin and intercut them with shots of a plate of soup, a woman in a coffin, and a little girl. According to V. I. Pudovkin (a filmmaker and student of Kuleshov), who later described the results of the experiment, audiences exclaimed at Moszhukin’s subtle and affecting ability to convey such varied emotions: hunger, sadness and affection. In his two major works, Pudovkin developed from the root of his experiments with Kuleshov a varied theory of cinema centered on what he called “relational editing”. For Pudovkin, montage was “the method which controls the ‘psychological guidance’ of the spectator”. In this respect, his theory was simply Expressionist – that is, mainly concerned with how the filmmaker can affect the observer. But he identified five separate and distinct types of montage: contrast, parallelism, symbolism, simultaneity, and leitmotif. He saw montage as the complex, pumping heart of film, but he also felt that its purpose was to support narrative rather than to alter it.

Eisenstein set up his own theory of montage – as collision rather than linkage – in opposition to Pudovkin’s theory. For Eisenstein, montage had as its aim the creation of ideas, of a new reality, rather than the support of narrative, the old reality of experience. As a student, he had been fascinated by Oriental ideograms that combined elements of widely different meaning in order to create entirely new meanings, and he regarded the ideogram as a model of cinematic montage. Taking an idea from the literary Formalists, he conceived of the elements of a film being “decomposed” or “neutralised” so that they could serve as fresh material for dialectic montage.

Eisenstein extended this concept of dialectics even to the shot itself. As shots related to each other dialectically, so the basic elements of a single shot – which he called its “attractions” – could interrelate to produce new meanings. Attractions as he defined them included “every aggressive moment … every element … that brings to light in the spectator those senses or that psychology that influence his experience – every element that can be verified and mathematically calculated to produce certain emotional shocks in a proper order within the totality …” [Film Sense, p. 231].

Because attractions existed within the framework of that totality, a further extension of montage was suggested: a montage of attractions. “Instead of a static ‘reflection’ of an event with all its possibilities for activity within the limits of the event’s logical action, we advance to a new plane – free montage of arbitrarily selected, independent … attractions …” [p. 232].

Later, Eisenstein developed a more elaborate view of the system of attractions in which one was always dominant while others were subsidiary. The problem here was that the idea of the dominant seemed to conflict with the concept of neutralisation, which supposedly prepared all the elements to be used with equal ease by the filmmaker.

Possibly the most important ramification of Eisenstein’s system of attractions, dominants and dialectic collision montage lies in its implications for the observer of film. Whereas Pudovkin had seen the techniques of montage as an aid to narrative, Eisenstein reconstructed montage in opposition to straight narrative. If shots A and B were to form an entirely new idea, C, then the audience had to become directly involved: they had to work to understand the inherent meaning of the montage. Eisenstein was thus suggesting an extreme Formalism in which photographed reality ceased to be itself and became instead simply a stock of raw material – attractions, or “shocks” – for the filmmaker to rearrange as he saw fit.

References:

Monaco, J. (2013). How to Read a Film: Movies, Media, and Beyond. Oxford University Press, pp. 448-56.

Eisenstein, S. (1943). The Film Sense. Ed. Jay Leyda. London: Faber & Faber.

Montage and Juxtaposition

Montage is the European term for putting together the shots of a film, whereas the American term is “cutting” or “editing”. Montage suggests that a film is constructed rather than edited (Monaco, 2013).

Montage is used in a number of different ways. While maintaining its basic meaning, it also has two more specific usages:

  • A dialectical process that creates a third meaning out of the original two meanings of the adjacent shots; and
  • A process in which a number of short shots are woven together to communicate a great deal of information in a short time.

Montage, literally translated from French, means assembly: the process by which an editor takes two pieces of film or tape and combines them to emphasise their meaning (Azia, 2015). Visualise, for example, shot A, a pumpkin, and shot B, a hammer coming down. Place the two shots together and you get a new meaning, C: the pumpkin is assumed to be destroyed by the hammer.

Sergei Eisenstein is an important figure in the world of editing because, in The Film Sense, he developed a theory of fast editing and juxtaposition. The school of thought at the time was that shots should complement each other: if you showed a person walking, the next shot should help continue the action. Eisenstein instead developed the idea of juxtaposition, the process of showing two things which are unrelated so that, in combination, they create a new meaning.

References:

Monaco, J. (2013). How to Read a Film: Movies, Media, and Beyond. Oxford University Press, pp. 239-49.

Azia, R. (2015). Montage Theory [online]. Available from <http://www.main-vision.com/richard/montage.shtml> [accessed 13/4/2015].

5 channels of information in film

In my previous posts, I talked about how a movie scene is split into different elements. Christian Metz discussed and defined this division, identifying five channels of information in film (Monaco, 2013):

  1. The visual image
  2. Print and other graphics
  3. Speech
  4. Music
  5. Noise (sound effects)

Interestingly, the majority of these channels are auditory rather than visual. Examining these channels with regard to the manner in which they communicate, we discover that only two of them are continuous – the first and the fifth. The rest are intermittent – they are switched on and off – and it is easy to conceive of a film without print, speech or music. The two continuous channels themselves communicate in distinctly separate ways. We “read” images by directing our attention; we do not read sound, at least not in the same conscious way. Sound is not only omnipresent but also omnidirectional. Because it is so pervasive, we tend to discount it. Images can be manipulated in many different ways, and the manipulation is relatively obvious; with sound, even the limited manipulation that does occur is vague and tends to be ignored.

It is the pervasiveness of sound that is its most attractive quality. It acts to realise both space and time. It is essential to the creation of a locale; the “room tone”, based on the reverberation time, harmonics, and so forth of a particular location, is its signature. A still image comes alive when a soundtrack is added, creating a sense of the passage of time. In a utilitarian sense, sound shows its value by creating a ground bass of continuity to support the images, which usually receive more conscious attention. Speech and music naturally receive attention because they have specific meaning. But the “noise” of the soundtrack – “sound effects” – is paramount: this is where the real construction of the sound environment takes place.

References:

Monaco, J. (2013), How to Read a Film: Movies, Media, and Beyond. Oxford University Press, 2013, pp. 235-6.

Research on film editing – updated and gathered for learning agreement

Process of film editing

The film editor’s job is to select the best shots from the raw footage of film and to assemble those shots into a final cut (Film Foundation, 2014).

Editing is the entire process of putting a film together into its final form, which includes the selection and shaping of shots; the arrangement of shots, scenes, and sequences; the mixing of all sound tracks; and the integration of the sound tracks with the images (Konigsberg, 1985).

Crittenden (1996) devoted a chapter of his book to the procedure of editing a film. However, the book is now 19 years old, so there are notable differences today owing to changes in technology and the ease of modern editing. His procedure runs as follows:

1. Checking material for faults

2. Synchronising the rushes (picture and sound)

3. Coding the rushes

Procedures 2 and 3 have become less relevant in recent years, since sound is now often recorded directly by the video camera itself.

4. Logging

5. Transcribing (documentary) or marking up the script (fiction)

6. Viewing the sync rushes (previewing the footage with synced sound)

7. Breaking down the film (putting all the slates of one scene into the same can or, in today's terms, putting all the files of one scene into the same folder; see the sketch after this list)

8. Cutting

9. Video editing (adding soundtracks and transition effects)

10. Random access (a term more applicable in the past)
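
To make step 7 concrete in today's file-based terms, here is a minimal Python sketch that performs the breaking-down step by sorting rushes into one folder per scene. The "rushes" folder and the scene-numbered filename convention (e.g. scene03_slate2_take1.mov) are my own assumptions for illustration, not part of Crittenden's procedure; real projects name their material in many different ways.

```python
import re
import shutil
from pathlib import Path

# Hypothetical naming convention: clips look like "scene03_slate2_take1.mov".
SCENE_PATTERN = re.compile(r"(scene\d+)", re.IGNORECASE)

def break_down_film(rushes_dir: Path) -> None:
    """Move every clip into a per-scene subfolder: the digital
    equivalent of putting all the slates of one scene in one can."""
    for clip in sorted(rushes_dir.glob("*.mov")):
        match = SCENE_PATTERN.search(clip.stem)
        if match is None:
            continue  # leave clips with unrecognised names where they are
        scene_dir = rushes_dir / match.group(1).lower()
        scene_dir.mkdir(exist_ok=True)  # one "can" per scene
        shutil.move(str(clip), str(scene_dir / clip.name))

if __name__ == "__main__":
    break_down_film(Path("rushes"))  # assumed folder of raw clips
```

The filename pattern is the only project-specific part: any naming scheme that reliably links a file to its scene would serve equally well.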

How does the editor work with the director?

One indication of the importance of the relationship between editor and director is that the same editor will frequently work with a director on a number of films; directors will even alter their schedules to fit the availability of their favourite editor (Crittenden, 1996). The relationship varies enormously from case to case: some directors merely require a good and reliable technician, while others expect and depend upon a more creative contribution. There is also considerable variation in working methods. Most editors do not like having directors breathing down their necks all the time; the most effective approach is to meet regularly, discuss what has been cut since the last meeting, and review the next material. The nature of the material will, of course, affect how often and for how long these meetings need to take place; ideally both parties are completely relaxed and in tune with each other, which prevents tension (Crittenden, 1996).

The editor’s primary role takes place in the post-production phase. Once production has been completed, sound and music are added during this phase, as are special effects. Aside from shortening the film, the editor must find a rhythm for it; working closely with the director and sometimes the producer, the editor presents options, points out areas of confusion, and identifies redundant scenes. The winnowing process is an intuitive search for clarity and dynamism (Dancyger, 2006).

Has editing changed in recent years?

In the last few decades, there has been a revolution in the technology of film editing. The combination of digital storage and random access to material gives the editor fingertip control in an instant; it is now possible to carry out editing decisions faster than the brain can envisage them. Never has it been more important for the prospective editor to master the craft, so as to avoid being mastered by the machinery. Writing in the 1990s, Crittenden saw this technological revolution as only just arriving, and noted that new generations may never know the difference: cutting on film had already started to become a thing of the past.

In 1996, Crittenden predicted that sophisticated editing systems would be available in schools, colleges and every home in “the near future”. He also wrote: “In the profession it is clear that there is nothing to stop directors cutting their own films at home, making the editor redundant. What worries the traditionalist is that because the technology can offer infinite alternatives at the flick of a switch, the need to understand the material before you make a cutting decision no longer exists.” This certainly applies to the countless amateur video makers on websites like YouTube and Vimeo, who direct, film and edit their own videos. Programs like Premiere Pro do offer near-infinite ways to manipulate footage, and film editing can be done virtually anywhere: laptops and tablets run Premiere and similar programs, and even on mobile phones footage can be edited instantly. Technology has advanced far beyond Crittenden’s 1996 prediction.

Ten years later, in his 2006 book, Dancyger described the “digital revolution” in filmmaking:

“The digital revolution has heavily influenced editing in many ways, in both sound and image. Technology has transformed the technology of editing, the speed of editing and, conceptually, the aesthetics of editing. A film is made in three phases (pre-production, production and post-production), and their goal is storytelling. But have the digital changes influenced storytelling? In the digital age, when an image can be amended and altered to look real, what is real and what is unreal?

The list of technological changes is long and, with the high technology of television and video, it is growing rapidly. Today, motion pictures are often recorded on film but edited on video. This gives the editor more sophisticated choices.

Stanley Kubrick proved that technology and creativity were not mutually exclusive. Technology in and of itself need not be used creatively, but, in the right hands, it can be. Technology plays a critical role in shaping film, but it is only a tool in the human hands of the artists who ply their ideas in this medium.”

References:

Crittenden, R. (1996). Film and Video Editing. 2nd ed. Routledge.

Dancyger, K. (2006). The Technique of Film and Video Editing. 4th ed. Focal Press.

Film Foundation (2014). The Filmmaking Process [online]. Available from <http://www.film-foundation.org/common/11041/pdfs/tg_chapter2.pdf> [accessed 15/11/2014].

Konigsberg, I. (1985). The Complete Film Dictionary. New York: Meridian.
