Translate

Saturday, July 4, 2015

Cinema, Editing and Movie Music...Week 2 notes (partial part 2)



II. Cinematography
All of the camera’s functions
From Wikipedia:
Cinematography (from Greek: κίνημα, kinema "movements" and γράφειν, graphein "to record") is the art or science of motion picture photography.[1] It is the technique of film photography, including both the shooting and development of the film.[2] The cinematographer could also be referred to as the film director's main visual collaborator.[3]
Cinematography is an art form in the field of filmmaking. Although the exposing of images on light-sensitive elements dates to the early 19th century,[4] motion pictures demanded a new form of photography and a new aesthetic.
On June 19, 1873, Eadweard Muybridge successfully photographed a horse named "Sallie Gardner" in fast motion using a series of 24 stereoscopic cameras. The cameras were arranged along a track parallel to the horse's, and each camera shutter was controlled by a trip wire triggered by the horse's hooves. They were 21 inches apart to cover the 20 feet taken by the horse stride, taking pictures at one thousandth of a second.[5]
Nine years later, in 1882, French scientist Étienne-Jules Marey invented a chronophotographic gun, which was capable of taking 12 consecutive frames a second, recording all the frames of the same picture. The second experimental film, Roundhay Garden Scene, filmed by Louis Le Prince on October 14, 1888 in Roundhay, Leeds, England, is the earliest surviving motion picture. The first though to design a successful apparatus was W. K. L. Dickson, working under the direction of Thomas Alva Edison, called the Kinetograph, and patented in 1891. This camera took a series of instantaneous photographs on standard Eastman Kodak photographic emulsion coated onto a transparent celluloid strip 35 mm wide. The results of this work were first shown in public in 1893, using the viewing apparatus also designed by Dickson, and called the Kinetoscope. Contained within a large box, only one person at a time looking into it through a peephole could view the movie. It was not a commercial success but in the following year, Charles Francis Jenkins and his projector, the Phantoscope, made a successful audience viewing while Louis and Auguste Lumière perfected the Cinématographe, an apparatus that took, printed, and projected film in Paris in December 1895.

Click below to continue reading about cinema, cinematography, film editing and music in film.


Film technique

Georges Méliès (left) painting a backdrop in his studio
The first film cameras were fastened directly to the head of a tripod or other support, with only the crudest kind of levelling devices provided, in the manner of the still-camera tripod heads of the period. The earliest film cameras were thus effectively fixed during the shot, and hence the first camera movements were the result of mounting a camera on a moving vehicle. The first known of these was a film shot by a Lumière cameraman from the back platform of a train leaving Jerusalem in 1896, and by 1898 there were a number of films shot from moving trains. Although listed under the general heading of "panoramas" in the sales catalogues of the time, those films shot straight forward from in front of a railway engine were usually specifically referred to as "phantom rides".
In 1897, Robert W. Paul had the first real rotating camera head made to put on a tripod, so that he could follow the passing processions of Queen Victoria's Diamond Jubilee in one uninterrupted shot. This device had the camera mounted on a vertical axis that could be rotated by a worm gear driven by turning a crank handle, and Paul put it on general sale the next year. Shots taken using such a "panning" head were also referred to as "panoramas" in the film catalogues of the first decade of the cinema.
The standard pattern for early film studios was provided by the studio which Georges Méliès had built in 1897. This had a glass roof and three glass walls constructed after the model of large studios for still photography, and it was fitted with thin cotton cloths that could be stretched below the roof to diffuse the direct ray of the sun on sunny days. The soft overall light without real shadows that this arrangement produced, and which also exists naturally on lightly overcast days, was to become the basis for film lighting in film studios for the next decade.

Effects

Unique among all the one minute long films made by the Edison company, which recorded parts of the acts of variety performers for their Kinetoscope viewing machines, was The Execution of Mary Queen of Scots. This showed a person dressed as the queen placing her head on the execution block in front of a small group of bystanders in Elizabethan dress. The executioner brings his axe down, and the queen's severed head drops onto the ground. This trick was worked by stopping the camera and replacing the actor with a dummy, then restarting the camera before the axe falls. The two pieces of film were then trimmed and cemented together so that the action appeared continuous when the film was shown.
This film was among those exported to Europe with the first Kinetoscope machines in 1895, and was seen by Georges Méliès, who was putting on magic shows in his Theatre Robert-Houdin in Paris at the time. He took up filmmaking in 1896, and after making imitations of other films from Edison, Lumière, and Robert Paul, he made Escamotage d'un dame chez Robert-Houdin (The Vanishing Lady). This film shows a woman being made to vanish by using the same stop motion technique as the earlier Edison film. After this, Georges Méliès made many single shot films using this trick over the next couple of years.

Double exposure

A scene inset inside a circular vignette showing a "dream vision" in Santa Claus (1898)
The other basic technique for trick cinematography involves double exposure of the film in the camera, which was first done by George Albert Smith in July 1898 in the UK. Smith's The Corsican Brothers (1898) was described in the catalogue of the Warwick Trading Company, which took up the distribution of Smith's films in 1900, thus:
"One of the twin brothers returns home from shooting in the Corsican mountains, and is visited by the ghost of the other twin. By extremely careful photography the ghost appears *quite transparent*. After indicating that he has been killed by a sword-thrust, and appealing for vengeance, he disappears. A 'vision' then appears showing the fatal duel in the snow. To the Corsican's amazement, the duel and death of his brother are vividly depicted in the vision, and overcome by his feelings, he falls to the floor just as his mother enters the room."
The ghost effect was done by draping the set in black velvet after the main action had been shot, and then re-exposing the negative with the actor playing the ghost going through the actions at the appropriate point. Likewise, the vision, which appeared within a circular vignette or matte, was similarly superimposed over a black area in the backdrop to the scene, rather than over a part of the set with detail in it, so that nothing appeared through the image, which seemed quite solid. Smith used this technique again in Santa Claus (1898).
Georges Méliès first used superimposition on a dark background in La Caverne maudite (The Cave of the Demons) made a couple of months later in 1898, and elaborated it with multiple superimpositions in the one shot in Un Homme de têtes (The Four Troublesome Heads). He created further variations in subsequent films.

Other special techniques

G.A. Smith initiated the technique of reverse motion and also improved the quality of self-motivating images. This he did by repeating the action a second time, while filming it with an inverted camera, and then joining the tail of the second negative to that of the first. The first films using this were Tipsy, Topsy, Turvy and The Awkward Sign Painter, the latter which showed a sign painter lettering a sign, and then the painting on the sign vanishing under the painter's brush. The earliest surviving example of this technique is Smith's The House That Jack Built, made before September 1901. Here, a small boy is shown knocking down a castle just constructed by a little girl out of children's building blocks. A title then appears, saying "Reversed", and the action is repeated in reverse, so that the castle re-erects itself under his blows.
Cecil Hepworth took this improved upon this technique by printing the negative of the forwards motion backwards frame by frame, so that in the production of the print the original action was exactly reversed. To do this, Hepworth built a special printer in which the negative running through a projector were projected into the gate of a camera through a special lens giving the same-size image. This arrangement came to be called a "projection printer", and eventually an "optical printer". With it Hepworth made The Bathers in 1900, in which bathers who have undressed and jumped into the water appear to spring backwards out of it, and have their clothes magically fly back onto their bodies.
The use of different camera speeds also appeared around 1900. Robert Paul's On a Runaway Motor Car through Piccadilly Circus (1899), had the camera turn so slowly that when the film was projected at the usual 16 frames per second, the scenery appeared to be passing at great speed. Cecil Hepworth used the opposite effect in The Indian Chief and the Seidlitz Powder (1901), in which a naïve Red Indian eats a lot of the fizzy stomach medicine, causing his stomach to expand and then he then leaps around balloon-like. This was done by cranking the camera faster than the normal 16 frames per second giving the first "slow motion" effect.

The cinematographer

In the infancy of motion pictures, the cinematographer was usually also the director and the person physically handling the camera. As the art form and technology evolved, a separation between director and camera operator emerged. With the advent of artificial lighting and faster (more light sensitive) film stocks, in addition to technological advancements in optics, the technical aspects of cinematography necessitated a specialist in that area.
Cinematography was key during the silent movie era—with no sound apart from background music and no dialogue, the films depended on lighting, acting, and set.
In 1919, in Hollywood, the new motion picture capital of the world, one of the first (and still existing) trade societies was formed: the American Society of Cinematographers (ASC), which stood to recognize the cinematographer's contribution to the art and science of motion picture making. Similar trade associations have been established in other countries, too.
The ASC defines cinematography as:
a creative and interpretive process that culminates in the authorship of an original work of art rather than the simple recording of a physical event. Cinematography is not a subcategory of photography. Rather, photography is but one craft that the cinematographer uses in addition to other physical, organizational, managerial, interpretive and image-manipulating techniques to effect one coherent process.[6]

Aspects

Numerous aspects contribute to the art of cinematography, including:

Image sensor and film stock

Cinematography can begin with rolls of film or a digital image sensor. Advancements in film emulsion and grain structure provided a wide range of available film stocks. The selection of a film stock is one of the first decisions made in preparing a typical film production.
Aside from the film gauge selection — 8 mm (amateur), 16 mm (semi-professional), 35 mm (professional) and 65 mm (epic photography, rarely used except in special event venues) — the cinematographer has a selection of stocks in reversal (which, when developed, create a positive image) and negative formats along with a wide range of film speeds (varying sensitivity to light) from ISO 50 (slow, least sensitive to light) to 800 (very fast, extremely sensitive to light) and differing response to color (low saturation, high saturation) and contrast (varying levels between pure black (no exposure) and pure white (complete overexposure).
Advancements and adjustments to nearly all gauges of film created the "super" formats wherein the area of the film used to capture a single frame of an image is expanded, although the physical gauge of the film remains the same. Super 8 mm, Super 16 mm and Super 35 mm all utilize more of the overall film area for the image than their "regular" non-super counterparts.
The larger the film gauge, the higher the overall image resolution clarity and technical quality.
The techniques used by the film laboratory to process the film stock can also offer a considerable variance in the image produced. By controlling the temperature and varying the duration in which the film is soaked in the development chemicals and by skipping certain chemical processes (or partially skipping all of them), cinematographers can achieve very different looks from a single film stock in the laboratory. Some techniques that can be used are push processing, bleach bypass and cross processing.
Although the majority of cinema still uses film, much of modern cinema uses digital cinematography and has no film stocks[citation needed], but the cameras themselves can be adjusted in ways that go far beyond the abilities of one particular film stock. They can provide varying degrees of color sensitivity, image contrast, light sensitivity and so on. One camera can achieve all the various looks of different emulsions, although it is heavily argued as to which method of capturing an image is the "best" method. Digital image adjustments (ISO, contrast etc.) are executed by estimating the same adjustments that would take place if actual film were in use, and are thus vulnerable to the cameras sensor designers perceptions of various film stocks and image adjustment parameters.

Filters

Filters, such as diffusion filters or color-effect filters, are also widely used to enhance mood or dramatic effects. Most photographic filters are made up of two pieces of optical glass glued together with some form of image or light manipulation material between the glass. In the case of color filters, there is often a translucent color medium pressed between two planes of optical glass. Color filters work by blocking out certain color wavelengths of light from reaching the film. With color film, this works very intuitively wherein a blue filter will cut down on the passage of red, orange and yellow light and create a blue tint on the film. In black-and-white photography, color filters are used somewhat counter intuitively; for instance a yellow filter, which cuts down on blue wavelengths of light, can be used to darken a daylight sky (by eliminating blue light from hitting the film, thus greatly underexposing the mostly blue sky), while not biasing most human flesh tone. Certain cinematographers, such as Christopher Doyle, are well known for their innovative use of filters. Filters can be used in front of the lens or, in some cases, behind the lens for different effects.

Lens

Lenses can be attached to the camera to give a certain look, feel, or effect by focus, color, etc.
As does the human eye, the camera creates perspective and spatial relations with the rest of the world. However, unlike one's eye, a cinematographer can select different lenses for different purposes. Variation in focal length is one of the chief benefits. The focal length of the lens determines the angle of view and, therefore, the field of view. Cinematographers can choose from a range of wide-angle lenses, "normal" lenses and long focus lenses, as well as macro lenses and other special effect lens systems such as borescope lenses. Wide-angle lenses have short focal lengths and make spatial distances more obvious. A person in the distance is shown as much smaller while someone in the front will loom large. On the other hand, long focus lenses reduce such exaggerations, depicting far-off objects as seemingly close together and flattening perspective. The differences between the perspective rendering is actually not due to the focal length by itself, but by the distance between the subjects and the camera. Therefore, the use of different focal lengths in combination with different camera to subject distances creates these different rendering. Changing the focal length only while keeping the same camera position doesn't affect perspective but the camera angle of view only. A Zoom lens allows a camera operator to change their focal length within a shot or quickly between setups for shots. As prime lenses offer greater optical quality and are "faster" (larger aperture openings, usable in less light) than zoom lenses, they are often employed in professional cinematography over zoom lenses. Certain scenes or even types of filmmaking, however, may require the use of zooms for speed or ease of use, as well as shots involving a zoom move. As in other photography, the control of the exposed image is done in the lens with the control of the diaphragm aperture. 
For proper selection, the cinematographer needs that all lenses be engraved with T-Stop, not f-stop, so that the eventual light loss due to the glass doesn't affect the exposure control when setting it using the usual meters. The choice of the aperture also affects image quality (aberrations) and depth of field (see below).

Depth of field and focus

 stern looking man and a woman sit on the right side of a table with documents on the table. A top hat is on the table. An unkempt man stands to the left of the picture. In the background a boy can be seen through a window playing in the snow.
A deep focus shot from Citizen Kane (1941): everything, including the hat in the foreground and the boy (young Charles Foster Kane) in the distance, is in sharp focus.
Focal length and diaphragm aperture affect the depth of field of a scene — that is, how much the background, mid-ground and foreground will be rendered in "acceptable focus" (only one exact plane of the image is in precise focus) on the film or video target. Depth of field (not to be confused with depth of focus) is determined by the aperture size and the focal distance. A large or deep depth of field is generated with a very small iris aperture and focusing on a point in the distance, whereas a shallow depth of field will be achieved with a large (open) iris aperture and focusing closer to the lens. Depth of field is also governed by the format size. If one considers the field of view and angle of view, the smaller the image is, the shorter the focal length should be, as to keep the same field of view. Then, the smaller the image is, the more depth of field is obtained, for the same field of view. Therefore, 70mm has less depth of field than 35mm for a given field of view, 16mm more than 35mm, and video cameras even more depth of field than 16mm. As videographers try to emulate the look of 35 mm film with digital cameras, this is one issue of frustration - excessive depth of field with digital cameras and using additional optical devices to reduce that depth of field.
In Citizen Kane (1941), cinematographer Gregg Toland and director Orson Welles used tighter apertures to create ring every detail of the foreground and background of the sets in sharp focus. This practice is known as deep focus. Deep focus became a popular cinematographic device from the 1940s onwards in Hollywood. Today, the trend is for more shallow focus.
To change the plane of focus from one object or character to another within a shot is commonly known as a rack focus.

Aspect ratio and framing

The aspect ratio of an image is the ratio of its width to its height. This can be expressed either as a ratio of 2 integers, such as 4:3, or in a decimal format, such as 1.33:1 or simply 1.33.
Different ratios provide different aesthetic effects. Standards for aspect ratio have varied significantly over time.
During the silent era, aspect ratios varied widely, from square 1:1, all the way up to the extreme widescreen 4:1 Polyvision. However, from the 1910s, silent motion pictures generally settled on the ratio of 4:3 (1.33). The introduction of sound-on-film briefly narrowed the aspect ratio, to allow room for a sound stripe. In 1932 a new standard was introduced, the Academy ratio of 1.37, by means of thickening the frame line.
For years, mainstream cinematographers were limited to using the Academy ratio, but in the 1950s, thanks to the popularity of Cinerama, widescreen ratios were introduced in an effort to pull audiences back into the theater and away from their home television sets. These new widescreen formats provided cinematographers a wider frame within which to compose their images.
Many different proprietary photographic systems were invented and utilized in the 1950s to create widescreen movies, but one dominates film today: the anamorphic process, which optically squeezes the image to photograph twice the horizontal area to the same size vertical as standard "spherical" lenses.
The first commonly used anamorphic format was CinemaScope, which used a 2.35 aspect ratio, although it was originally 2.55. CinemaScope was used from 1953 to 1967, but due to technical flaws in the design and its ownership by Fox, several third-party companies, led by Panavision's technical improvements in the 1950s, now dominate the anamorphic cine lens market.
Changes to SMPTE projection standards altered the projected ratio from 2.35 to 2.39 in 1970, although this did not change anything regarding the photographic anamorphic standards; all changes in respect to the aspect ratio of anamorphic 35 mm photography are specific to camera or projector gate sizes, not the optical system.
After the "widescreen wars" of the 1950s, the motion-picture industry settled into 1.85 as a standard for theatrical projection in the United States and the United Kingdom. This is a cropped version of 1.37. Europe and Asia opted for 1.66 at first, although 1.85 has largely permeated these markets in recent decades. Certain "epic" or adventure movies utilized the anamorphic 2.39.
In the 1990s, with the advent of high-definition video, television engineers created the 1.78 (16:9) ratio as a mathematical compromise between the theatrical standard of 1.85 and television's 1.33, as it was not practical to produce a traditional CRT television tube with a width of 1.85. Until that point, nothing had ever been originated in 1.78. Today, this is a standard for high-definition video and for widescreen television. Some cinema films are now shot using HDTV cameras.

Lighting

Light is necessary to create an image exposure on a frame of film or on a digital target (CCD, etc.). The art of lighting for cinematography goes far beyond basic exposure, however, into the essence of visual storytelling. Lighting contributes considerably to the emotional response an audience has watching a motion picture.

Camera movement

Main article: Cinematic techniques
Camera on a small motor vehicle representing a large one
Cinematography can not only depict a moving subject but can use a camera, which represents the audience's viewpoint or perspective, that moves during the course of filming. This movement plays a considerable role in the emotional language of film images and the audience's emotional reaction to the action. Techniques range from the most basic movements of panning (horizontal shift in viewpoint from a fixed position; like turning your head side-to-side) and tilting (vertical shift in viewpoint from a fixed position; like tipping your head back to look at the sky or down to look at the ground) to dollying (placing the camera on a moving platform to move it closer or farther from the subject), tracking (placing the camera on a moving platform to move it to the left or right), craning (moving the camera in a vertical position; being able to lift it off the ground as well as swing it side-to-side from a fixed base position), and combinations of the above.
Cameras have been mounted to nearly every imaginable form of transportation.
Most cameras can also be handheld, that is held in the hands of the camera operator who moves from one position to another while filming the action. Personal stabilizing platforms came into being in the late 1970s through the invention of Garrett Brown, which became known as the Steadicam. The Steadicam is a body harness and stabilization arm that connects to the camera, supporting the camera while isolating it from the operator's body movements. After the Steadicam patent expired in the early 1990s, many other companies began manufacturing their concept of the personal camera stabilizer.

Special effects

Main article: Special effect
The first special effects in the cinema were created while the film was being shot. These came to be known as "in-camera" effects. Later, optical and digital effects were developed so that editors and visual effects artists could more tightly control the process by manipulating the film in post-production.
For examples of many in-camera special effects, see the work of early filmmaker Georges Méliès.

Frame rate selection

Main article: Frame rate
Motion picture images are presented to an audience at a constant speed. In the theater it is 24 frames per second, in NTSC (US) Television it is 30 frames per second (29.97 to be exact), in PAL (Europe) television it is 25 frames per second. This speed of presentation does not vary.
However, by varying the speed at which the image is captured, various effects can be created knowing that the faster or slower recorded image will be played at a constant speed.
For instance, time-lapse photography is created by exposing an image at an extremely slow rate. If a cinematographer sets a camera to expose one frame every minute for four hours, and then that footage is projected at 24 frames per second, a four hour event will take 10 seconds to present, and one can present the events of a whole day (24 hours) in just one minute.
The inverse of this, if an image is captured at speeds above that at which they will be presented, the effect is to greatly slow down (slow motion) the image. If a cinematographer shoots a person diving into a pool at 96 frames per second, and that image is played back at 24 frames per second, the presentation will take 4 times as long as the actual event. Extreme slow motion, capturing many thousands of frames per second can present things normally invisible to the human eye, such as bullets in flight and shockwaves travelling through media, a potentially powerful cinematographical technique.
In motion pictures the manipulation of time and space is a considerable contributing factor to the narrative storytelling tools. Film editing plays a much stronger role in this manipulation, but frame rate selection in the photography of the original action is also a contributing factor to altering time. For example, Charlie Chaplin's Modern Times was shot at "silent speed" (18 fps) but projected at "sound speed" (24 fps), which makes the slapstick action appear even more frenetic.
Speed ramping, or simply "ramping", is a process whereby the capture frame rate of the camera changes over time. For example, if in the course of 10 seconds of capture, the capture frame rate is adjusted from 60 frames per second to 24 frames per second, when played back at the standard film rate of 24 frames per second, a unique time-manipulation effect is achieved. For example, someone pushing a door open and walking out into the street would appear to start off in slow-motion, but in a few seconds later within the same shot the person would appear to walk in "realtime" (normal speed). The opposite speed-ramping is done in The Matrix when Neo re-enters the Matrix for the first time to see the Oracle. As he comes out of the warehouse "load-point", the camera zooms in to Neo at normal speed but as it gets closer to Neo's face, time seems to slow down, foreshadowing the manipulation of time itself within the Matrix later in the movie.

Personnel

A camera crew from the First Motion Picture Unit
In descending order of seniority, the following staff are involved:
In the film industry, the cinematographer is responsible for the technical aspects of the images (lighting, lens choices, composition, exposure, filtration, film selection), but works closely with the director to ensure that the artistic aesthetics are supporting the director's vision of the story being told. The cinematographers are the heads of the camera, grip and lighting crew on a set, and for this reason they are often called directors of photography or DPs. In British tradition, if the DOP actually operates the camera him/herself they are called the cinematographer. On smaller productions it is common for one person to perform all these functions alone. The career progression usually involves climbing up the ladder from seconding, firsting, eventually to operating the camera.
Directors of photography make many creative and interpretive decisions during the course of their work, from pre-production to post-production, all of which affect the overall feel and look of the motion picture. Many of these decisions are similar to what a photographer needs to note when taking a picture: the cinematographer controls the film choice itself (from a range of available stocks with varying sensitivities to light and color), the selection of lens focal lengths, aperture exposure and focus. Cinematography, however, has a temporal aspect (see persistence of vision), unlike still photography, which is purely a single still image. It is also bulkier and more strenuous to deal with movie cameras, and it involves a more complex array of choices. As such a cinematographer often needs to work co-operatively with more people than does a photographer, who could frequently function as a single person. As a result, the cinematographer's job also includes personnel management and logistical organization. Given the in-depth knowledge a cinematographer requires not only of his or her own craft but also that of other personnel, formal tuition in analogue or digital filmmaking can be advantageous.[7]

Evolution of technology: new definitions

Traditionally the term "cinematography" referred to working with motion-picture film emulsion, but it is now largely synonymous with videography and digital video due to the popularity of digital cinematography.
Modern digital image processing has also made it possible to radically modify pictures from how they were originally captured. This has allowed new disciplines to encroach on some of the choices that were once the cinematographer's exclusive domain.

See also

Wikimedia Commons has media related to Cinema.
Today with digital goes well behind manual focus, and lense changes. Now settings on light, image, saturation and more done with the camera as well as in post.
Shots:
Wide Shot: a wide shot of all the action. Used for establishing scene, larger events, showing relationship in space and more.
Midshot: is a shot from the waste up.
Try to have each shot convey more information than the last…
Closeup: head and shoulders or shot of the face. Can mean close shot of action.
Storyboard: a rough illustration of your planned shots. Helps to break down script and make visual decisions.
Framing: put major lines or object one third up, one third down or one third across. Horizon line is uncomfortable if in the center .
Looking Room: give them room to look through. And be aware of eye line. Our eyes do not like to see things in the middle of a frame. It is distracting and uncomfortable.
Eye Line: Needs to be match shots. Can be jarring if they do not match. Not always logical in how you shoot…usually the actor has to “cheat”.
Camera Angel: create moods, impressions and character traits.
Match Shots: have costume, actions, props all the same so they edit as a single shot, even if they are shot days or weeks apart.
Films are shot out of order: often by location or shot…a location or shot needed at another time or place in the film may be easier to shoot if you are already there or it is near where you were just shooting.
Wide Lens: used for wide shots.
Zoom Lens: zoomed out it is a WIDE LENS; zoomed in, a LONG LENS. For quality, most filmmakers use a different lens for each shot rather than a single zoom lens. Never use digital (electronic) zoom if you want full quality in the final product (though it can be used deliberately for effect).
Camera Movement: Smooth, focused. Dolly, crane, train, steadicam.
Zoom is a two-dimensional movement on the subject.
A tracking shot physically moves the camera, giving a different perspective on the subject.
Handheld without a Steadicam works best if the subject is moving. The camera is usually too light, so add weight.
Autofocus is not used in filmmaking (except documentary).
Pulling focus is adjusting the focus during the shot.
Each successive still picture is a “frame.” Film is traditionally shot at 24 frames per second; video commonly runs at 25 (PAL) or 30 (NTSC) frames per second.
Lighting, set design, sound, continuity (keeping things the same so shots match), costume, make-up, craft services (the munchies on the set), catering, transportation and many other key elements go into a production.



The Camera
During filming, one of three types of lenses is used: wide-angle, normal, or telephoto. Often all three are used at different times within the same film. Each type of lens has different properties and creates different images.

Choice of lens, aperture (or opening), and film stock largely determine the depth of field, or distance in front of the camera in which all objects are in focus.
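The interplay of lens, aperture, and distance described above can be made concrete with the standard depth-of-field formulas. This is a rough illustrative sketch using common optics-textbook formulas (they are not from the text above); the function name and the example values are my own, and all lengths are in millimetres.

```python
# Standard depth-of-field approximation via the hyperfocal distance.
# All lengths in millimetres.

def depth_of_field(f: float, N: float, s: float, c: float = 0.03):
    """Near and far limits of acceptable focus.
    f: focal length, N: f-number (aperture), s: subject distance,
    c: circle of confusion (0.03 mm is a common 35 mm full-frame value)."""
    H = f * f / (N * c) + f                       # hyperfocal distance
    near = H * s / (H + (s - f))
    far = H * s / (H - (s - f)) if s < H else float("inf")
    return near, far

# Example: a 50 mm lens at f/2.8 focused at 3 m keeps roughly
# 2.73 m to 3.33 m in acceptable focus -- a fairly shallow zone.
near, far = depth_of_field(f=50, N=2.8, s=3000)
print(round(near), round(far))
```

Opening the aperture (smaller N) shrinks this zone further, which is why cinematographers treat aperture as a storytelling choice, not just an exposure control.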

Diffusers may be placed in front of a light source or in front of a camera lens to soften lines in the subject, to glamorize, or to lend a more spiritual or ethereal look.

Camera distance helps determine how large the subject will appear within the frame, what details will be noticeable, and what will be excluded from the frame.

By changing the camera lens and the camera distance between shots or during a shot, filmmakers can change perspective: the relative size and apparent depth of subjects and setting in the photographic image.

The angle from which the subject is filmed influences the expressiveness of the images. There are four basic camera angles—bird’s-eye view, high angle, eye-level angle, and low angle—and countless other angles in between.

In point-of-view (p.o.v.) shots, the camera films a subject from the approximate position of someone, or occasionally something, in the film. Such camera placements may contribute to the viewer’s identification with one of the subjects and sense of participation in the action.

A motion-picture camera may remain in one place during filming. While filming with a camera fixed in one place, the camera may be pivoted up or down (tilting) or rotated sideways (panning).

Panning too quickly blurs the footage; such a blurred pan is called a swish pan (sometimes used deliberately as a transition).

Ways to move the camera around during filming include dollying, tracking, using a crane, and employing a Steadicam. Like other aspects of cinematography, camera movement can be used in countless expressive ways.

Digital Cinematography
Film and video images can be scanned or transferred into a computer, changed there, and transferred back to film. Computers can be used to modify colors and contrast (digital intermediate), correct errors, and change the images in ways impossible or more troublesome and costly to do with film alone.

Mainly for reasons of economy and convenience, more and more movies are being filmed in high-definition video and transferred to film for theatrical showings, though the results do not yet match the detail and nuance of the best film stocks.


III. Music in A Film
Score:  The music that is the underlying theme and background for a film. Score is often used to bring out emotion related to a character or an event. It may be used for continuity and enhancement of imagery and storytelling. Some directors prefer little scoring, some use it exclusively as background, others use music as a major part of their film. Among the most famous contemporary composers of film music is John Williams, without whose scores much of the work of Spielberg, Lucas and other filmmakers from the 1970s to the current day would simply not be the same (see Jaws video clip).

Songs. Films have long used contemporary music and hit artists as a way to lure people to the box office, or to enhance and underline portions of imagery and plot. Woody Allen uses jazz, James Bond films use hit performers singing title songs for each release, and John Ford preferred folk music that appeared contemporary to the films themselves.
From Wikipedia: A film score (also sometimes called film music, background music, or incidental music) is original music written specifically to accompany a film. The score forms part of the film's soundtrack, which also usually includes dialogue and sound effects, and comprises a number of orchestral, instrumental or choral pieces called cues which are timed to begin and end at specific points during the film in order to enhance the dramatic narrative and the emotional impact of the scene in question.[1] Scores are written by one or more composers, under the guidance of, or in collaboration with, the film's director and/or producer, and are then usually performed by an ensemble of musicians – most often comprising an orchestra or band, instrumental soloists, and choir or vocalists – and recorded by a sound engineer.
Film scores encompass an enormous variety of styles of music, depending on the nature of the films they accompany. The majority of scores are orchestral works rooted in Western classical music, but a great number of scores also draw influence from jazz, rock, pop, blues, new-age and ambient music, and a wide range of ethnic and world music styles. Since the 1950s, a growing number of scores have also included electronic elements as part of the score, and many scores written today feature a hybrid of orchestral and electronic instruments.[2]
Since the invention of digital technology and audio sampling, many low-budget films have been able to rely on digital samples to imitate the sound of live instruments, and many scores are created and performed wholly by the composers themselves, by using sophisticated music composition software.
Songs are usually not considered part of the film's score,[3] although songs do also form part of the film's soundtrack. Although some songs, especially in musicals, are based on thematic ideas from the score (or vice-versa), scores usually do not have lyrics, except for when sung by choirs or soloists as part of a cue. Similarly, pop songs which are "needle dropped" into a specific scene in film for added emphasis are not considered part of the score, although occasionally the score's composer will write an original pop song based on his themes, such as James Horner's "My Heart Will Go On" from Titanic, written for Celine Dion.

Process of creation

Spotting

The composer usually enters the creative process towards the end of filming, at around the same time as the film is being edited, although on some occasions the composer is on hand during the entire film shoot, especially when actors are required to perform with or be aware of original diegetic music. The composer is shown an unpolished "rough cut" of the film, before the editing is completed, and talks to the director or producer about what sort of music is required for the film in terms of style and tone. The director and composer will watch the entire film, taking note of which scenes require original music. During this process the composer will take precise timing notes so that he or she knows how long each cue needs to last, where it begins, where it ends, and of particular moments during a scene with which the music may need to coincide in a specific way. This process is known as "spotting".[4]
Occasionally, a film maker will actually edit his film to fit the flow of music, rather than the other way around, which is the norm. Director Godfrey Reggio edited his films Koyaanisqatsi and Powaqqatsi based on composer Philip Glass's music.[5] Similarly, the relationship between director Sergio Leone and composer Ennio Morricone was such that the finale of The Good, the Bad and the Ugly and the films Once Upon a Time in the West and Once Upon a Time in America were edited to Morricone's score as the composer had prepared it months before the film's production ended.[6]
In another notable example, the finale of Steven Spielberg's E.T. the Extra-Terrestrial was edited to match the music of his long-time collaborator John Williams: as recounted in a companion documentary on the DVD, Spielberg gave Williams complete freedom with the music and asked him to record the cue without picture; Spielberg then re-edited the scene later to match the music.
In some circumstances, a composer will be asked to write music based on his or her impressions of the script or storyboards, without seeing the film itself, and is given more freedom to create music without the need to adhere to specific cue lengths or mirror the emotional arc of a particular scene. This approach is usually taken by a director who does not wish to have the music comment specifically on a particular scene or nuance of a film, and which can instead be inserted into the film at any point the director wishes during the post-production process. Composer Hans Zimmer was asked to write music in this way in 2010 for director Christopher Nolan's film Inception;[7] composer Gustavo Santaolalla did the same thing when he wrote his Oscar-winning score for Brokeback Mountain.[8]

Syncing

When writing music for film, one goal is to sync dramatic events happening on screen with musical events in the score. There are many different methods for syncing music to picture, including using sequencing software to calculate timings, using mathematical formulas, and free timing with reference timings. Composers work using SMPTE timecode for syncing purposes.[9]
When syncing music to picture, generally a leeway of 3-4 frames late or early allows the composer to be extremely accurate. Using a technique called Free Timing, a conductor will use either (a) a stop watch or studio size stopclock, or (b) watch the film on a screen or video monitor while conducting the musicians to predetermined timings. These are represented visually by vertical lines (streamers) and bursts of light called punches. These are put on the film by the Music Editor at points specified by the composer. In both instances the timings on the clock or lines scribed on the film have corresponding timings which are also at specific points (beats) in the composer/conductor score.
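As a small illustration of the timecode arithmetic involved, here is a minimal sketch of converting non-drop-frame SMPTE timecode (HH:MM:SS:FF) to elapsed seconds. The function name and defaults are invented for this example, and drop-frame timecode (used with 29.97 fps video) needs a correction not shown here.

```python
# Convert non-drop-frame SMPTE timecode to elapsed seconds.
# Format is HH:MM:SS:FF, where FF is a frame count below the frame rate.

def smpte_to_seconds(timecode: str, fps: int = 24) -> float:
    """'HH:MM:SS:FF' (non-drop-frame) -> seconds at the given frame rate."""
    hh, mm, ss, ff = (int(part) for part in timecode.split(":"))
    return hh * 3600 + mm * 60 + ss + ff / fps

# At 24 fps, a cue starting at 00:01:30:12 begins 90.5 seconds in.
print(smpte_to_seconds("00:01:30:12", fps=24))  # 90.5
```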

Written click track

A written click track is a method of writing bars of music in consistent time values (e.g. 4 beats in 2⅔ seconds) to establish a constant tempo in lieu of a metronome value (e.g. 88 BPM). A composer would use a written click if they planned to conduct live performers. When using other methods such as a metronome, the conductor has a perfectly spaced click playing in his ear which he conducts to; this can yield stiff and lifeless performances in slower, more expressive cues. One can convert a standard BPM value to a written click, where x represents the number of beats per bar and W represents the duration in seconds, using the following equation:

W = (60 / bpm) × x

Written clicks are expressed in 1/3-second increments, so the next step is to round the result to the nearest 0, 1/3, or 2/3 of a second. The following is an example for 88 BPM:

(60 / 88) × 4 ≈ 2.73

2.73 rounds to 2⅔ (≈ 2.67), so the written click is 4 beats in 2⅔ seconds.
Once the composer has identified the location in the film they wish to sync with musically, they must determine the musical beat on which this event occurs. To find it, they use the following equation, where bpm is beats per minute, sp is the sync point in real time (e.g. 33.7 seconds), and B is the resulting beat number:

B = (bpm × sp) / 60 + 1
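These written-click calculations are simple enough to sketch in code. The following is an illustrative translation (function names are my own), with the rounding to the nearest third of a second applied as described.

```python
# Sketch of the written-click calculations: tempo expressed as a bar
# duration rounded to the nearest 1/3 second, plus the beat number on
# which a given sync point falls.

def written_click(bpm: float, beats_per_bar: int) -> float:
    """Duration W (seconds) of `beats_per_bar` beats at `bpm`,
    snapped to the nearest 1/3-second increment."""
    w = (60.0 / bpm) * beats_per_bar        # W = (60 / bpm) * x
    return round(w * 3) / 3                 # snap to 0, 1/3 or 2/3 second

def beat_at_sync_point(bpm: float, sp: float) -> float:
    """Beat number B of a sync point at `sp` seconds:
    B = (bpm * sp) / 60 + 1 (beats counted from 1)."""
    return bpm * sp / 60.0 + 1.0

# Example from the text: 88 BPM, 4 beats per bar -> 4 beats in 2 2/3 s.
print(round(written_click(88, 4), 3))          # 2.667
# A hit at 33.7 seconds at 88 BPM falls on roughly beat 50.4.
print(round(beat_at_sync_point(88, 33.7), 1))  # 50.4
```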

Writing

Once the spotting session has been completed and the precise timings of each cue determined, the composer will then work on writing the score. The methods of writing the score vary from composer to composer; some composers prefer to work with a traditional pencil and paper, writing notes by hand on a staff and performing works-in-progress for the director on a piano, while other composers write on computers using sophisticated music composition software such as Digital Performer, Logic Pro, Finale, Cubase, or Protools.[10] Working with software allows composers to create MIDI-based demos of cues, called MIDI mockups, for review by the filmmaker prior to the final orchestral recording.
The length of time a composer has to write the score varies from project to project; depending on the post-production schedule, a composer may have as little as two weeks, or as much as three months to write the score. In normal circumstances, the actual writing process usually lasts around six weeks from beginning to end.
The actual musical content of a film score is wholly dependent on the type of film being scored, and the emotions the director wishes the music to convey. A film score can encompass literally thousands of different combinations of instruments, ranging from full symphony orchestral ensembles to single solo instruments to rock bands to jazz combos, along with a multitude of ethnic and world music influences, soloists, vocalists, choirs and electronic textures. The style of the music being written also varies massively from project to project, and can be influenced by the time period in which the film is set, the geographic location of the film's action, and even the musical tastes of the characters. As part of their preparations for writing the score the composer will often research different musical techniques and genres as appropriate for that specific project; as such, it is not uncommon for established film composers to be proficient at writing music in dozens of different styles.

Orchestration

Once the music has been written, it must then be arranged or orchestrated in order for the ensemble to be able to perform it. The nature and level of orchestration varies from project to project and composer to composer, but in its basic form the orchestrator's job is to take the single-line music written by the composer and "flesh it out" into instrument-specific sheet music for each member of the orchestra to perform.
Some composers, notably Ennio Morricone, orchestrate their own scores themselves, without using an additional orchestrator. Some composers provide intricate details in how they want this to be accomplished, and will provide the orchestrator with copious notes outlining which instruments are being asked to perform which notes, giving the orchestrator no personal creative input whatsoever beyond re-notating the music on different sheets of paper as appropriate. Other composers are less detailed, and will often ask orchestrators to "fill in the blanks", providing their own creative input into the makeup of the ensemble, ensuring that each instrument is capable of performing the music as written, and even allowing them to introduce performance techniques and flourishes to enhance the score. In many cases, time constraints determined by the film's post-production schedule dictate whether composers orchestrate their own scores, as it is often impossible for the composer to complete all the required tasks within the timeframe allowed.
Over the years several orchestrators have become linked to the work of one particular composer, often to the point where one will not work without the other. Examples of enduring composer-orchestrator relationships include Jerry Goldsmith with Arthur Morton, Alexander Courage and Herbert W. Spencer; Miklos Rozsa with Eugene Zador; Alfred Newman with Edward Powell, Ken Darby and Hugo Friedhofer; Danny Elfman with Steve Bartek; David Arnold with Nicholas Dodd; Basil Poledouris with Greig McRitchie; and Elliot Goldenthal with Robert Elhai. Others have become orchestrators-for-hire, and work with many different composers over the course of their careers; examples of prominent film music orchestrators include Pete Anthony, Jeff Atmajian, Brad Dechter, Bruce Fowler, John Neufeld, Thomas Pasatieri, Conrad Pope, Nic Raine and J.A.C. Redford.
Once the orchestration process has been completed, the sheet music is physically printed onto paper by one or more music copyists, and is ready for performance.

Recording

When the music has been composed and orchestrated, the orchestra or ensemble then performs it, often with the composer conducting. Musicians for these ensembles are often uncredited in the film or on the album and are contracted individually (and if so, the orchestra contractor is credited in the film or the soundtrack album). However, some films have recently begun crediting the contracted musicians on the albums under the name Hollywood Studio Symphony after an agreement with the American Federation of Musicians. Other performing ensembles that are often employed include the London Symphony Orchestra (performing film music since 1935)[11] the City of Prague Philharmonic Orchestra (an orchestra dedicated mostly to recording), the BBC Philharmonic, and the Northwest Sinfonia.[citation needed]
The orchestra performs in front of a large screen depicting the movie, and sometimes to a series of clicks called a "click-track" that changes with meter and tempo, assisting the conductor to synchronize the music with the film.[12]
More rarely, the director will talk to the composer before shooting has started, so as to give more time to the composer or because the director needs to shoot scenes (namely song or dance scenes) according to the final score. Sometimes the director will have edited the film using "temp (temporary) music": already published pieces with a character that the director believes to fit specific scenes.

Elements of a film score

Temp tracks

In some instances, film composers have been asked by the director to imitate a specific composer or style present in the temp track.[13] On other occasions, directors have become so attached to the temp score that they decide to use it and reject the original score written by the film composer. One of the most famous cases is Stanley Kubrick's 2001: A Space Odyssey, where Kubrick opted for existing recordings of classical works, including pieces by composer György Ligeti rather than the score by Alex North,[14] although Kubrick had also hired Frank Cordell to do a score. Other examples include Torn Curtain (Bernard Herrmann),[15] Troy (Gabriel Yared),[16] Pirates of the Caribbean: The Curse of the Black Pearl (Alan Silvestri),[17] Peter Jackson's King Kong (Howard Shore),[18] and The Bourne Identity (Carter Burwell).[19]

Structure

Films often have different themes for important characters, events, ideas or objects, an idea often associated with Wagner's use of leitmotif.[20] These may be played in different variations depending on the situation they represent, scattered amongst incidental music. An example of this technique is John Williams' score for the Star Wars saga, and the numerous themes associated with characters like Darth Vader, Luke Skywalker, and Princess Leia Organa (see Star Wars music for more details).[21] The Lord of the Rings trilogy uses a similar technique, with recurring themes for many main characters and places. Others are less known by casual moviegoers, but well known among score enthusiasts, such as Jerry Goldsmith's underlying theme for the Borg in Star Trek: First Contact, or his Klingon theme from Star Trek: The Motion Picture which other composers carry over into their Klingon motifs, and he has brought back on numerous occasions as the theme for Worf, Star Trek: The Next Generation's most prominent Klingon.[citation needed] Michael Giacchino employed character themes in the soundtrack for the 2009 animated film Up, for which he received the Academy Award for Best Score. His orchestral soundtrack for the television series Lost also depended heavily on character and situation-specific themes.
In 1983, a non-profit organization, the Society for the Preservation of Film Music, was formed to preserve the "byproducts" of creating a film score:[22] the music manuscripts (written music) and other documents and studio recordings generated in the process of composing and recording scores which, in some instances, have been discarded by the movie studios. The written music must be kept to perform the music on concert programs and to make new recordings of it. Sometimes only after decades has an archival recording of a film score been released on CD.

Source music

Most films have between 40 and 120 minutes of music. However, some films have very little or no music; others may feature a score that plays almost continuously throughout. Dogme 95 is a filmmaking movement whose films use music only from sources within the film, such as a radio or television. This is called "source music" (or a "source cue") because it comes from an on-screen source that can actually be seen or inferred (in academic film theory such music is called "diegetic" music, as it emanates from the "diegesis" or "story world").[23] An example of source music is the use of the Frankie Valli song "Can't Take My Eyes Off You" in Michael Cimino's The Deer Hunter. Alfred Hitchcock's 1963 thriller The Birds is an example of a Hollywood film with no non-diegetic music whatsoever.

Relation with directors

Sometimes, a composer may unite with a director by composing the score for many films of a same director. For example, Danny Elfman did the score for all the movies directed by Tim Burton, with the exception of Ed Wood (score by Howard Shore) and Sweeney Todd: The Demon Barber of Fleet Street (score by Stephen Sondheim). Other examples are John Williams and Steven Spielberg, Jerry Goldsmith with Joe Dante and Franklin Schaffner, Ennio Morricone with Sergio Leone, Mauro Bolognini and Giuseppe Tornatore, Alan Silvestri and Robert Zemeckis, Angelo Badalamenti and David Lynch, James Newton Howard and M. Night Shyamalan, Éric Serra and Luc Besson, Patrick Doyle and Kenneth Branagh, Howard Shore and David Cronenberg, Carter Burwell and Joel & Ethan Coen, Hans Zimmer and Christopher Nolan, Harry Gregson-Williams and Tony Scott, and Clint Mansell and Darren Aronofsky.

Production music

Main article: Production music
Many companies such as Jingle Punks, Associated Production Music, VideoHelper and Extreme Music provide music to various film, TV and commercial projects for a fee. Sometimes called library music, the music is owned by production music libraries and licensed to customers for use in film, television, radio and other media. Unlike popular and classical music publishers, who typically own less than 50 percent of the copyright in a composition, music production libraries own all of the copyrights of their music, meaning that it can be licensed without seeking the composer's permission, as is necessary in licensing music from normal publishers. This is because virtually all music created for music libraries is done on a work for hire basis.[citation needed] Production music is therefore a very convenient medium for media producers – they can be assured that they will be able to license any piece of music in the library at a reasonable rate.
Production music libraries will typically offer a broad range of musical styles and genres, enabling producers and editors to find much of what they need in the same library. Music libraries vary in size from a few hundred tracks up to many thousands. The first production music library was set up by De Wolfe in 1927 with the advent of sound in film; the company originally scored music for use in silent film.[28] Another music library was set up by Ralph Hawkes of Boosey & Hawkes Music Publishers in the 1930s.[29] APM, the largest US library, has over 250,000 tracks.[30]


IV. Film Editing
See documentary on Film Editing included with course materials.

The invention of film editing (1903) is equivalent to the invention of flight...both changed society forever.

Every individual frame is important, and it is the “cutter” or film editor who decides which ones disappear and are never seen again (with video some do show up as “blooper” reels or in “expanded” versions of a project).

Editing is the process of bringing movement and life to the film, taking shots and takes and assembling them in a way that tells the story, invokes emotion, informs the audience and takes advantage of all the other elements of film.

The editor assembles close-ups, flashbacks, parallel action, match shots, tight cuts, inference and image manipulation; all of these make a film work and, if done well, go unnoticed by the audience.

From Wikipedia, the free encyclopedia
A film editor at work in 1928
Film editing is part of the creative post-production process of filmmaking. The term film editing is derived from the traditional process of working with film, but now it increasingly involves the use of digital technology.
The film editor works with the raw footage, selecting shots and combining them into sequences to create a finished motion picture. Film editing is described as an art or skill, the only art that is unique to cinema, separating filmmaking from other art forms that preceded it, although there are close parallels to the editing process in other art forms like poetry or novel writing. Film editing is often referred to as the "invisible art"[1] because when it is well-practiced, the viewer can become so engaged that he or she is not even aware of the editor's work. On its most fundamental level, film editing is the art, technique, and practice of assembling shots into a coherent sequence. The job of an editor isn’t simply to mechanically put pieces of a film together, cut off film slates, or edit dialogue scenes. A film editor must creatively work with the layers of images, story, dialogue, music, pacing, as well as the actors' performances to effectively "re-imagine" and even rewrite the film to craft a cohesive whole. Editors usually play a dynamic role in the making of a film. Sometimes, auteur film directors edit their own films. Notable examples are Akira Kurosawa and the Coen brothers.
With the advent of digital editing, film editors and their assistants have become responsible for many areas of filmmaking that used to be the responsibility of others. For instance, in past years, picture editors dealt only with just that—picture. Sound, music, and (more recently) visual effects editors dealt with the practicalities of other aspects of the editing process, usually under the direction of the picture editor and director. However, digital systems have increasingly put these responsibilities on the picture editor. It is common, especially on lower budget films, for the assistant editors or even the editor to cut in music, mock up visual effects, and add sound effects or other sound replacements. These temporary elements are usually replaced with more refined final elements by the sound, music, and visual effects teams hired to complete the picture.
Film editing is an art that can be used in diverse ways. It can create sensually provocative montages; become a laboratory for experimental cinema; bring out the emotional truth in an actor's performance; create a point of view on otherwise obtuse events; guide the telling and pace of a story; create an illusion of danger where there is none; give emphasis to things that would not have otherwise been noted; and even create a vital subconscious emotional connection to the viewer, among many other possibilities.

Contents

History

Early films were short films that were one long, static, and locked-down shot. Motion in the shot was all that was necessary to amuse an audience, so the first films simply showed activity such as traffic moving on a city street. There was no story and no editing. Each film ran as long as there was film in the camera.
Screenshot from Scrooge, or, Marley's Ghost, the first film to feature multiple exposures.
The use of film editing to establish continuity, involving action moving from one sequence into another, is attributed to British film pioneer Robert W. Paul's Come Along, Do!, made in 1898 and one of the first films to feature more than one shot.[2] In the first shot, an elderly couple is outside an art exhibition having lunch and then follows other people inside through the door. The second shot shows what they do inside. Paul's 'Cinematograph Camera No. 1' of 1896 was the first camera to feature reverse-cranking, which allowed the same film footage to be exposed several times and thereby to create super-positions and multiple exposures. This technique was first used in his 1901 film Scrooge, or, Marley's Ghost.
The further development of action continuity in multi-shot films continued in 1899-1900 at the Brighton School in England, where it was definitively established by George Albert Smith and James Williamson. In that year Smith made Seen Through the Telescope, in which the main shot shows a street scene with a young man tying the shoelace and then caressing the foot of his girlfriend, while an old man observes this through a telescope. There is then a cut to a close shot of the hands on the girl's foot shown inside a black circular mask, and then a cut back to the continuation of the original scene.
Excerpt from the movie Fire! directed by James Williamson
Even more remarkable was James Williamson's Attack on a China Mission Station, made around the same time in 1900. The first shot shows the gate to the mission station from the outside being attacked and broken open by Chinese Boxer rebels; then there is a cut to the garden of the mission station, where a pitched battle ensues. An armed party of British sailors arrives, defeats the Boxers and rescues the missionary's family. The film used the first "reverse angle" cut in film history.
James Williamson concentrated on making films taking action from one place shown in one shot to the next shown in another shot in films like Stop Thief! and Fire!, made in 1901, and many others. He also experimented with the close-up, and made perhaps the most extreme one of all in The Big Swallow, when his character approaches the camera and appears to swallow it. These two film makers of the Brighton School also pioneered the editing of the film; they tinted their work with color and used trick photography to enhance the narrative. By 1900, their films were extended scenes of up to 5 minutes long.[3]
Scene from The Great Train Robbery (1903), directed by Edwin Stanton Porter
Other filmmakers then took up all these ideas, including the American Edwin S. Porter, who started making films for the Edison Company in 1901. Porter worked on a number of minor films before making Life of an American Fireman in 1903. The film was the first American film with a plot, featuring action, and even a closeup of a hand pulling a fire alarm. The film comprised a continuous narrative over seven scenes, rendered in a total of nine shots.[4] He put a dissolve between every shot, just as Georges Méliès was already doing, and he frequently had the same action repeated across the dissolves. His film, The Great Train Robbery (1903), had a running time of twelve minutes, with twenty separate shots and ten different indoor and outdoor locations. He used the cross-cutting editing method to show simultaneous action in different places.
These early film directors discovered important aspects of motion picture language: that the screen image does not need to show a complete person from head to toe and that splicing together two shots creates in the viewer's mind a contextual relationship. These were the key discoveries that made all non-live or non live-on-videotape narrative motion pictures and television possible—that shots (in this case whole scenes since each shot is a complete scene) can be photographed at widely different locations over a period of time (hours, days or even months) and combined into a narrative whole.[5] That is, The Great Train Robbery contains scenes shot on sets of a telegraph station, a railroad car interior, and a dance hall, with outdoor scenes at a railroad water tower, on the train itself, at a point along the track, and in the woods. But when the robbers leave the telegraph station interior (set) and emerge at the water tower, the audience believes they went immediately from one to the other. Or that when they climb on the train in one shot and enter the baggage car (a set) in the next, the audience believes they are on the same train.
Sometime around 1918, Russian director Lev Kuleshov did an experiment that demonstrated this point. (See Kuleshov Experiment.) He took an old film clip of a head shot of a noted Russian actor and intercut the shot with a shot of a bowl of soup, then with a child playing with a teddy bear, then with a shot of an elderly woman in a casket. When he showed the film to people, they praised the actor's acting—the hunger in his face when he saw the soup, the delight in the child, and the grief when looking at the dead woman.[6] Of course, the shot of the actor had been filmed years before the other shots, and he never "saw" any of the items. The simple act of juxtaposing the shots in a sequence made the relationship.
The original editing machine: an upright Moviola.

Film editing technology

Before the widespread use of non-linear editing systems, the initial editing of all films was done with a positive copy of the film negative called a film workprint (cutting copy in UK) by physically cutting and pasting together pieces of film, using a splicer and threading the film on a machine with a viewer such as a Moviola, or "flatbed" machine such as a K.-E.-M. or Steenbeck. Today, most films are edited digitally (on systems such as Avid or Final Cut Pro) and bypass the film positive workprint altogether. In the past, the use of a film positive (not the original negative) allowed the editor to do as much experimenting as he or she wished, without the risk of damaging the original.
When the film workprint had been cut to a satisfactory state, it was then used to make an edit decision list (EDL). The negative cutter referred to this list while processing the negative, splitting the shots into rolls, which were then contact printed to produce the final film print or answer print. Today, production companies have the option of bypassing negative cutting altogether. With the advent of digital intermediate ("DI"), the physical negative does not necessarily need to be physically cut and hot spliced together; rather the negative is optically scanned into computer(s) and a cut list is conformed by a DI editor.
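To make the EDL/conform step above concrete, here is a hedged illustration of what two events in the widely used CMX 3600 EDL format can look like (the title, reel names, and timecodes are invented for this sketch):

```
TITLE: REEL_1_CUT
FCM: NON-DROP FRAME

001  TAPE01  V  C        01:00:10:00 01:00:15:00 00:00:00:00 00:00:05:00
002  TAPE02  V  D    024 02:03:00:00 02:03:04:00 00:00:05:00 00:00:09:00
```

Each line gives the event number, the source reel, the track (V for video), the transition (C for a straight cut; D for a dissolve, followed by its length in frames), then the source in/out and record in/out timecodes. A negative cutter or DI conform system reads exactly this kind of list to pull the correct frames from each source reel and assemble them in record order.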

Post-production

Main article: Post-production

Editor's cut

Main article: Editor's cut
There are several editing stages, and the editor's cut is the first. An editor's cut (sometimes referred to as the "assembly edit" or "rough cut") is normally the first pass at what the final film will be when it reaches picture lock. The film editor usually starts working while principal photography is still under way. Prior to cutting, the editor and director will likely have seen and/or discussed "dailies" (raw footage shot each day) as shooting progresses. Screening dailies gives the editor a ballpark idea of the director's intentions. Because it is the first pass, the editor's cut might be longer than the final film. The editor continues to refine the cut while shooting continues, and often the entire editing process goes on for many months and sometimes more than a year, depending on the film.

Director's cut

Main article: Director's cut
When shooting is finished, the director can then turn his or her full attention to collaborating with the editor and further refining the cut of the film. This is the time that is set aside where the film editor's first cut is molded to fit the director's vision. In the United States, under DGA rules, directors receive a minimum of ten weeks after completion of principal photography to prepare their first cut.
While collaborating on what is referred to as the "director's cut", the director and the editor go over the entire movie in great detail; scenes and shots are re-ordered, removed, shortened and otherwise tweaked. Often it is discovered that there are plot holes, missing shots or even missing segments which might require that new scenes be filmed. Because of this time working closely and collaborating – a period that is normally far longer, and far more intimately involved, than the entire production and filming – most directors and editors form a unique artistic bond.

Final cut

Main article: Final cut privilege
Often after the director has had his or her chance to oversee a cut, the subsequent cuts are supervised by one or more producers, who represent the production company and/or movie studio. There have been several conflicts in the past between the director and the studio, sometimes leading to the use of the "Alan Smithee" credit, signifying that a director no longer wants to be associated with the final release.

Continuity

Continuity is a film term that suggests that a series of shots should be physically continuous, as if the camera simply changed angles in the course of a single event. For instance, if in one shot a beer glass is empty, it should not be full in the next shot. Live coverage of a sporting event would be an example of footage that is very continuous. Since the live operators are cutting from one live feed to another, the physical action of the shots matches very closely. Many people regard inconsistencies in continuity as mistakes, and often the editor is blamed. In film, however, continuity is very nearly last on a film editor's list of important things to maintain.
Technically, continuity is the responsibility of the script supervisor and film director, who are together responsible for preserving continuity and preventing errors from take to take and shot to shot. The script supervisor, who sits next to the director during shooting, keeps the physical continuity of the edit in mind as shots are set up. The script supervisor is the editor's watchman. If shots are taken out of sequence, as is often the case, he or she will be alert to make sure that the beer glass is in the appropriate state. The editor utilizes the script supervisor's notes during post-production to log and keep track of the vast amounts of footage and takes that a director might shoot.

Methods of montage

In motion picture terminology, a montage (from the French for "putting together" or "assembly") is a film editing technique.
There are at least three senses of the term:
  1. In French film practice, "montage" has its literal French meaning (assembly, installation) and simply identifies editing.
  2. In Soviet filmmaking of the 1920s, "montage" was a method of juxtaposing shots to derive new meaning that did not exist in either shot alone.
  3. In classical Hollywood cinema, a "montage sequence" is a short segment in a film in which narrative information is presented in a condensed fashion.

Soviet montage

Main article: Soviet montage theory
Lev Kuleshov was among the very first to theorize about the relatively young medium of the cinema in the 1920s. For him, the unique essence of the cinema—that which could be duplicated in no other medium—was editing. He argued that editing a film is like constructing a building: brick by brick (shot by shot) the building (film) is erected. His often-cited Kuleshov Experiment established that montage can lead the viewer to reach certain conclusions about the action in a film. Montage works because viewers infer meaning based on context.
Although, strictly speaking, U.S. film director D. W. Griffith was not part of the montage school, he was one of the early proponents of the power of editing—mastering cross-cutting to show parallel action in different locations, and codifying film grammar in other ways as well. Griffith's work in the 1910s was highly regarded by Kuleshov and other Soviet filmmakers and greatly influenced their understanding of editing.
Sergei Eisenstein was briefly a student of Kuleshov's, but the two parted ways because they had different ideas of montage. Eisenstein regarded montage as a dialectical means of creating meaning. By contrasting unrelated shots he tried to provoke associations in the viewer, which were induced by shocks.

Montage sequence

Main article: Montage sequence
A montage sequence consists of a series of short shots that are edited into a sequence to condense narrative. It is usually used to advance the story as a whole (often to suggest the passage of time), rather than to create symbolic meaning. In many cases, a song plays in the background to enhance the mood or reinforce the message being conveyed. One famous example of montage appears in the 1968 film 2001: A Space Odyssey, depicting humankind's early development from apes to tool-users. Another example employed in many films is the sports montage: the star athlete trains over a period of time, each shot showing more improvement than the last. Classic examples include Rocky and The Karate Kid.

Continuity editing

Main article: continuity editing
What became known as the popular "classical Hollywood" style of editing was developed by early European and American directors, in particular D. W. Griffith in films such as The Birth of a Nation and Intolerance. The classical style ensures temporal and spatial continuity as a way of advancing narrative, using such techniques as the 180-degree rule, the establishing shot, and shot reverse shot.

Alternatives to continuity editing (non-traditional or experimental)

Early Russian filmmakers such as Lev Kuleshov further explored and theorized about editing and its ideological nature. Sergei Eisenstein developed a system of editing that was unconcerned with the rules of the continuity system of classical Hollywood that he called Intellectual montage.
Alternatives to traditional editing were also the pursuit of early surrealist and dada filmmakers such as Luis Buñuel (director of the 1929 Un Chien Andalou) and René Clair (director of 1924's Entr'acte, which starred famous dada artists Marcel Duchamp and Man Ray). Both filmmakers, Clair and Buñuel, experimented with editing techniques long before what is now referred to as "MTV style" editing.
The French New Wave filmmakers such as Jean-Luc Godard and François Truffaut and their American counterparts such as Andy Warhol and John Cassavetes also pushed the limits of editing technique during the late 1950s and throughout the 1960s. French New Wave films and the non-narrative films of the 1960s used a carefree editing style and did not conform to the traditional editing etiquette of Hollywood films. Like its dada and surrealist predecessors, French New Wave editing often drew attention to itself by its lack of continuity, its demystifying self-reflexive nature (reminding the audience that they were watching a film), and by the overt use of jump cuts or the insertion of material not often related to any narrative.

Editing techniques

Vsevolod Pudovkin noted that the editing process is the one phase of production that is truly unique to motion pictures: every other aspect of filmmaking originated in a different medium (photography, art direction, writing, sound recording), but editing belongs to film alone.[citation needed] Kubrick was quoted as saying: "I love editing. I think I like it more than any other phase of film making. If I wanted to be frivolous, I might say that everything that precedes editing is merely a way of producing film to edit."[7]
Edward Dmytryk lays out seven "rules of cutting" that a good editor should follow:[8]
  • "Rule 1: NEVER make a cut without a positive reason."
  • "Rule 2: When undecided about the exact frame to cut on, cut long rather than short."[9]
  • "Rule 3: Whenever possible cut 'in movement'."[10]
  • "Rule 4: The 'fresh' is preferable to the 'stale'."[11]
  • "Rule 5: All scenes should begin and end with continuing action."[12]
  • "Rule 6: Cut for proper values rather than proper 'matches'."[13]
  • "Rule 7: Substance first—then form."[14]
According to Walter Murch, when it comes to film editing, there are six main criteria for evaluating a cut or deciding where to cut. They are, in order of importance (most important first, with Murch's notional percentage weightings):
  • Emotion (51%) — Does the cut reflect what the editor believes the audience should be feeling at that moment?
  • Story (23%) — Does the cut advance the story?
  • Rhythm (10%) — Does the cut occur "at a moment that is rhythmically interesting and 'right'" (Murch, 18)?
  • Eye-trace (7%) — Does the cut pay respect to "the location and movement of the audience's focus of interest within the frame" (Murch, 18)?
  • Two-dimensional plane of the screen (5%) — Does the cut respect the 180 degree rule?
  • Three-dimensional space of action (4%) — Is the cut true to the physical/spatial relationships within the diegesis?
Murch assigned the notional percentage values to each of the criteria: "Emotion, at the top of the list, is the thing that you should try to preserve at all costs. If you find you have to sacrifice certain of those six things to make a cut, sacrifice your way up, item by item, from the bottom."
According to writer-director Preston Sturges:
[T]here is a law of natural cutting and that this replicates what an audience in a legitimate theater does for itself. The more nearly the film cutter approaches this law of natural interest, the more invisible will be his cutting. If the camera moves from one person to the next at the exact moment that one in the legitimate theatre would have turned his head, one will not be conscious of a cut. If the camera misses by a quarter of a second, one will get a jolt. There is one other requirement: the two shots must be approximately of the same tone value. At any given time, the camera must point at the exact spot the audience wishes to look at. To find that spot is absurdly easy: one has only to remember where one was looking at the time the scene was made.[15]

Two Editing Tables

V. Mise en scène
Film Tools and Techniques Introduction

In this and other publications, the term mise en scène signifies the major aspects filmmaking shares with staging a play. It refers to the selection of setting, subjects, and composition of each shot. Normally in complex film productions, the director makes final decisions about mise en scène.

Settings

A setting is the place where filmed action occurs. It is either a set, which has been built for use in the film, or a location, which is any place other than a film studio that is used for filming.

Depending on the needs of the scene, settings may be limbo (indistinct), realistic, or nonrealistic.

A setting can be the main subject of a shot or scene but usually is not. Settings often reveal the time and place of a scene, create or intensify moods, and help reveal what people (in a documentary film) or characters (in a fictional film) are like. Throughout a film, changes in settings can also mirror changes in situations and moods.

Subjects

In films, fictional characters or real people are the usual subjects, and their actions and appearances help reveal their nature.

Performers may be stars, Method actors, character actors, or nonprofessional actors. There is some overlap among these categories: a star, for example, may also be a Method actor. Depending on the desired results, actors may be cast by type or against type.

Usually film actors must perform their scenes out of order, in brief segments, and often after long waits.

Effective performances may depend on the script, casting, direction, editing, and music. There is no one type of effective performance: what is judged effective depends in part on the viewers’ culture and the film’s style or its manner of representing its subject.

Composition: The Uses of Space

Filmmakers, especially cinematographers and directors, decide the shape of the overall image. They also decide how to use the space within an image. They decide when and how to use empty space and what will be conveyed by the arrangement of significant subjects on the sides of the frame, in the foreground, or in the background. Filmmakers also decide if compositions are to be symmetrical or asymmetrical.

Composition influences what viewers see positioned in relationship to the subject and how the subject is situated within the frame; what information is revealed to viewers that the characters do not know; and what viewers learn about the characters’ personalities or situations.

Many films are seen in an aspect ratio (or shape of the image) other than the one the filmmakers intended, and the compositions, meanings, and moods conveyed are thus altered.

Mise en Scène and the World outside the Frame

Mise en scène can be used to promote a political viewpoint or commercial product (the latter practice is called product placement).

Mise en scène can be used to parody human behavior or a text (such as a film). It can also be used to pay homage or tribute to an earlier text or part of one.

From Phillips, William H. (2013, 4th ed.), Film: An Introduction. University of Wisconsin–Eau Claire. Boston / New York: Bedford/St. Martin's.



p. 16
Figure 1.8
The Cabinet of Dr. Caligari
See the scene beginning approximately 1 minute and 40 seconds into the following clip:
http://www.youtube.com/watch?v=xrg73BUxJLI
pp. 27, 29
Method acting
This site has many specifics about Method acting from a theatrical group based in St. Louis. Includes sections on relaxation, sense memory, concentration, affective memory, objects, and other subjects, along with interviews and an annotated bibliography.
http://www.theatrgroup.com/Method
p. 53
For examples of product placement in specific movies, see:
http://www.brandchannel.com/brandcameo_films.asp