Radio Production Unit 4


BASICS OF RADIO

PROGRAMMING AND
PRODUCTION
UNIT 4
POST PRODUCTION AND EVALUATION

 POST PRODUCTION DEFINE:


 Audio post production is the general term for all stages of production happening
between the actual recording in a studio and the completion of a master recording.
 It involves sound design, sound editing, audio mixing, and the addition of effects.
 EDITING AND MIXING:
 The basic purpose of editing is to put an audio or video programme together with clarity, continuity and impact, and in an interesting manner.

 BASIC STEPS TO BE REMEMBERED FOR EDITING:


 Preview your prerecorded audio or video materials carefully and patiently once,
twice and even more if you have time.
 Make a proper log sheet and note down all important points and precise details that come to your mind.
 Take some time to ponder over recorded materials and re-clarify your ideas about the
overall shape of the programme - its central theme, its objectives, style, music, pace,
its organization, its beginning and end etc.
 Take a decision about what is important and relevant to the purpose of your
programme and what is not.
 Discard any portion or footage, however beautiful, that does not contribute to the theme of your programme.
 Select only the most effective and good-quality sequences and shots for your final version.
 Look for any gaps and re-record or re-shoot additional essential material if it can fill them and add to the quality and purpose of your programme.
 Now that you have a clear idea about the final shape or overall story of your programme, develop the final edit-script.
 That is: the precise order and continuity of audio bits, video shots, of sound and music,
use of transitions, cut-aways and reaction shots that can achieve a smooth flow and
desired effect.
 You are now ready to begin the actual editing.
 Estimate how much time you need for editing.
 Try to finish it in one go.
 While editing, stick to your final editing-script as far as possible.

 STAGES OF EDITING:
 The editing process takes place in several steps or phases, both for radio and television.
 These are:
 Recording or shooting phases.
 Review (Listening and Viewing) Phase.
 Decision - Making Phase.
 Final or Operational Stage.

 RECORDING OR SHOOTING STAGE:


 In a way, the bulk of audio or video editing is largely predetermined by the way the
material is recorded or shot.
 For example, to allow for convenient edits at the post-production stage, it is advisable to let an audio or video shot continue silently for just a few seconds.
 This will make it easier to bring in a designed transition and maintain proper audio/video continuity while joining it to the next shot or sequence.
 REVIEW PHASE:
 This phase is essentially concerned with the listening and viewing of the prerecorded
audio/video materials for their quality and suitability.
 In this phase the producer is required to listen, view and time the audio or video programme from beginning to end and prepare a detailed ‘LOG SHEET’, giving a brief description of each shot or portion and marking ‘Good’ or ‘NG’ (No Good).
 This review naturally leads you to the next phase, i.e. the decision-making phase.

 DECISION-MAKING PHASE:
 At this stage, the whole programme story lies bare before you, of course in disconnected sequences.
 Now you have a little more time to think and contemplate on the course of your
editing in a rather patient way.
 Studying, listening to and viewing the raw materials (individual shots and sequences), you begin to decide on the final shot sequence.
 It is at this stage that you re-clarify your ideas about the programme.
 FINAL OPERATIONAL STAGE:
 The operational phase refers to the process in which the planned edits are actually
performed using the edit script as a reference.
 Editing audio or video can best be learnt during the actual process, with hands on the materials and the machines.
 Today, a variety of models and types of editing equipment, including computerized
and digital control units are available.
 In the actual editing phase, it is always important to estimate your editing time in advance.
 Book all the facilities and machines you need; all tapes, log sheets and edit scripts must be kept ready by your side.
 Ideally, the editing task for a programme must be so planned that it can be
accomplished in one go, without interruption.
TYPES OF SOUND EDITING

 MECHANICAL/LINEAR SOUND EDITING:


 Before computers came into wide use for sound editing in the 1990s, everything was
done with magnetic tape.
 To make edits using magnetic tape, you literally had to cut the tape, remove the
piece of audio that you didn’t want and splice the tape back together again.
 The machine of choice for mechanical audio editing was the reel-to-reel tape
recorder.
 With this piece of equipment, you could record and playback audio from circular
reels of magnetic audiotape.
 You also needed several pieces of specialized editing equipment: a razor blade, an
editing block and editing tape.
 DIGITAL/NON- LINEAR SOUND EDITING:
 Digital audio workstations are multi-track systems that greatly simplify and enhance
the sound editing process for all types of professional audio production.
 Digital audio workstations vary greatly in size, price and complexity.
 The most basic systems are simply software applications that can be loaded onto a
standard personal computer.
 More professional systems, like Digidesign’s Pro Tools, require a special sound card
and are typically used in conjunction with large digital mixing boards and are
compatible with hundreds of effects and virtual instrument plug-ins.
 With digital file formats and increased computer processing speed, the number of tracks in digital editing is virtually limitless.
 Besides multiple dialogue tracks, an editor can add dozens of background effects and
layers and layers of Foley and music.
 Multiple tracks can be cut, copied, pasted, trimmed and faded at once.
 Each track comes with dozens of controls for volume, stereo panning and effects,
which greatly simplifies the mixing process.
SOUND EDITOR FUNCTIONS

 COMPRESSION:
 It shows you how to perform the most important editing process: reducing your file sizes for the Web.
 When working with sound files, there are two completely different types of compression.
 One type decreases the size of the sound file and the other reduces the dynamic
range of a signal.

 CUTTING:
 It describes how to delete unwanted sections of your sound file.
 One of the easiest ways to reduce the size of your sound file and improve the general
quality of sound is to simply delete unwanted sections or random noise within your
sound clip.
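 As a rough illustration (not from the original text), the sketch below removes an unwanted stretch of audio from a mono signal held in a NumPy array; the sample rate and cut points are hypothetical.

```python
import numpy as np

def cut_section(audio, sample_rate, start_s, end_s):
    """Delete the samples between start_s and end_s (in seconds) from a mono signal."""
    start = int(start_s * sample_rate)
    end = int(end_s * sample_rate)
    return np.concatenate([audio[:start], audio[end:]])

# Example: remove 2.0 s - 3.5 s of unwanted noise from a 10 s test tone at 44.1 kHz.
sr = 44100
t = np.linspace(0, 10, 10 * sr, endpoint=False)
tone = 0.5 * np.sin(2 * np.pi * 440 * t)
shorter = cut_section(tone, sr, 2.0, 3.5)
print(len(tone) / sr, "->", len(shorter) / sr, "seconds")
```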
 EQUALIZING:
 It shows you how to obtain a proper ratio of treble to bass in your sound files.
 While some people enjoy cranking up the bass on their home or car stereos, it isn't a
good idea to do the same with an audio file for the Web.
 NORMALIZE:
 It describes how to boost or tone down the levels of your audio file.
 Normalizing increases the level of the entire sound file so that the loudest part of the sound is at the maximum playback level before distortion; it then increases the rest of the sound proportionally.
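 A minimal sketch of the peak normalization described above, assuming a floating-point mono signal in the range -1.0 to 1.0:

```python
import numpy as np

def normalize(audio, target_peak=1.0):
    """Scale the whole signal so its loudest sample sits at target_peak,
    raising (or lowering) everything else proportionally."""
    peak = np.max(np.abs(audio))
    if peak == 0:
        return audio          # silence: nothing to normalize
    return audio * (target_peak / peak)

quiet = 0.2 * np.random.uniform(-1, 1, 44100)   # a quiet one-second noise burst
loud = normalize(quiet, target_peak=0.99)        # leave a little headroom below full scale
print(np.max(np.abs(quiet)), "->", np.max(np.abs(loud)))
```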
 CHANGING PLAYBACK RATE:
 It describes how to alter how fast or slow your sound is.
 To create Alvin and the Chipmunks-type effects or turn your sound into a s-l-o-w motion sound, try speeding up or slowing down your sound file with your audio editor.
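 One crude way to speed a sound up or slow it down is to resample it by index, which also shifts the pitch (the chipmunk effect); a rough sketch, assuming a mono NumPy signal:

```python
import numpy as np

def change_playback_rate(audio, rate):
    """rate > 1.0 plays faster (higher pitch), rate < 1.0 plays slower (lower pitch)."""
    old_idx = np.arange(0, len(audio), rate)          # positions to read from the original
    return np.interp(old_idx, np.arange(len(audio)), audio)

sr = 44100
t = np.linspace(0, 2, 2 * sr, endpoint=False)
voice = 0.5 * np.sin(2 * np.pi * 220 * t)
chipmunk = change_playback_rate(voice, 1.5)   # 50% faster and higher in pitch
slow_mo = change_playback_rate(voice, 0.5)    # half speed and an octave lower
print(len(voice), len(chipmunk), len(slow_mo))
```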
DIFFERENT AUDIO FORMATS

 WAV:
 It is the standard audio file format used mainly on Windows PCs.
 Commonly used for storing uncompressed (PCM), CD-quality sound files, which
means that they can be large in size - around 10MB per minute of music.
 MP3:
 The MPEG Layer-3 format is the most popular format for downloading and storing music.
 AIFF:
 The standard audio file format used by Apple.
 It is like a wav file for the Mac.
 MIDI - MUSICAL INSTRUMENT DIGITAL INTERFACE (.MID):
 Short for musical instrument digital interface, MIDI is a standard adopted by the electronic music industry for controlling devices, such as synthesizers and sound cards, that emit music.
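 As an illustration of the uncompressed WAV format described above, Python's standard-library wave module can report the properties that determine a PCM file's size; the filename here is hypothetical.

```python
import wave

# Open a PCM WAV file and report the properties that determine its size.
with wave.open("example.wav", "rb") as wf:       # hypothetical file
    channels = wf.getnchannels()
    sample_width = wf.getsampwidth()             # bytes per sample (2 = 16-bit)
    sample_rate = wf.getframerate()
    n_frames = wf.getnframes()

duration = n_frames / sample_rate
size_bytes = n_frames * channels * sample_width
print(f"{channels} ch, {sample_width * 8}-bit, {sample_rate} Hz, "
      f"{duration:.1f} s, about {size_bytes / 1e6:.1f} MB of raw audio")
```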
ADDING SOUND EFFECTS AND
MUSIC

 SOUND EFFECTS:
 Sound effects are artificially created or enhanced sounds, or sound processes used to
emphasize artistic or other content of films, television shows, live performance,
animation, video games, music, or other media.
 In motion picture and television production, a sound effect is a sound recorded and
presented to make a specific storytelling or creative point without the use of dialogue
or music.
 The term often refers to a process applied to a recording, without necessarily referring
to the recording itself.
 In professional motion picture and television production, dialogue, music, and sound
effects recordings are treated as separate elements.
 The most realistic sound effects originate from original sources; the closest sound to
machine-gun fire that we can replay is an original recording of actual machine guns.
 In music and radio production, typical effects used in recording and amplified
performances are:

 ECHO:
 To simulate the effect of reverberation in a large hall or cavern, one or several delayed signals are added to the original signal.
 To be perceived as a distinct echo, the delay has to be on the order of 50 milliseconds or more.
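 A minimal sketch of the echo described above: one delayed, attenuated copy of the signal mixed back in, with the delay well above 50 ms.

```python
import numpy as np

def add_echo(audio, sample_rate, delay_s=0.3, decay=0.5):
    """Add a single echo: a copy delayed by delay_s seconds and scaled by decay."""
    delay = int(delay_s * sample_rate)
    out = np.zeros(len(audio) + delay)
    out[:len(audio)] += audio                  # the dry (original) signal
    out[delay:] += decay * audio               # the delayed, quieter copy
    return out

sr = 44100
t = np.linspace(0, 1, sr, endpoint=False)
clap = np.sin(2 * np.pi * 880 * t) * np.exp(-6 * t)   # a short decaying burst
echoed = add_echo(clap, sr, delay_s=0.3, decay=0.5)    # 300 ms echo, clearly audible
```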
 FLANGER:
 To create an unusual, sweeping sound, a delayed signal is added to the original signal with a continuously variable delay (usually smaller than 10 ms).
 This effect is now done electronically using DSP, but originally the effect was created
by playing the same recording on two synchronized tape players, and then mixing
the signals together.
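 A rough sketch of the flanging idea, assuming a mono NumPy signal: a copy whose delay is swept below 10 ms by a slow sine (an LFO) is mixed with the original.

```python
import numpy as np

def flanger(audio, sample_rate, max_delay_s=0.005, lfo_hz=0.5, depth=0.7):
    """Mix the signal with a copy whose delay sweeps between 0 and max_delay_s."""
    n = np.arange(len(audio))
    # Instantaneous delay in samples, swept by a low-frequency oscillator.
    delay = (max_delay_s * sample_rate / 2) * (1 + np.sin(2 * np.pi * lfo_hz * n / sample_rate))
    read_pos = n - delay                                # fractional read positions
    delayed = np.interp(read_pos, n, audio, left=0.0)   # linear interpolation, silence before t=0
    return audio + depth * delayed

sr = 44100
t = np.linspace(0, 2, 2 * sr, endpoint=False)
noise = 0.3 * np.random.uniform(-1, 1, len(t))   # broadband input makes the sweep easy to hear
swooshed = flanger(noise, sr)
```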
 CHORUS:
 A delayed signal is added to the original signal with a constant delay.
 The delay has to be short in order not to be perceived as echo, but above 5 ms to be
audible.
 If the delay is too short, it will destructively interfere with the un-delayed signal and
create a flanging effect.
 COMPRESSION:
 The reduction of the dynamic range of a sound to avoid unintentional fluctuation in
the dynamics.
 Level compression is not to be confused with audio data compression, where the
amount of data is reduced without affecting the amplitude of the sound it represents.
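 A much-simplified sketch of level compression: samples above a threshold are scaled down by a ratio. A real compressor adds attack and release smoothing; this only shows the basic gain reduction.

```python
import numpy as np

def compress(audio, threshold=0.5, ratio=4.0):
    """Reduce, by `ratio`, how far each sample exceeds `threshold` (instantaneous, no smoothing)."""
    mag = np.abs(audio)
    over = mag > threshold
    out = np.copy(audio)
    out[over] = np.sign(audio[over]) * (threshold + (mag[over] - threshold) / ratio)
    return out

sr = 44100
t = np.linspace(0, 1, sr, endpoint=False)
signal = np.sin(2 * np.pi * 200 * t) * np.linspace(0.1, 1.0, sr)   # swells from quiet to loud
evened_out = compress(signal, threshold=0.5, ratio=4.0)
print(np.max(np.abs(signal)), "->", np.max(np.abs(evened_out)))
```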
 3D AUDIO EFFECTS:
 These effects place sounds outside the stereo basis.
 REVERSE ECHO:
 A swelling effect created by reversing an audio signal and recording echo and/or
delay whilst the signal runs in reverse.

 MODULATION:
 Modulation changes the frequency or amplitude of a carrier signal in relation to a predefined signal.
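 A small sketch of one concrete case, amplitude modulation (tremolo): the audio acts as the carrier and a slow predefined sine varies its amplitude.

```python
import numpy as np

def tremolo(audio, sample_rate, mod_hz=5.0, depth=0.5):
    """Amplitude modulation: multiply the carrier (audio) by a slow sine wave."""
    t = np.arange(len(audio)) / sample_rate
    modulator = 1.0 - depth * (0.5 + 0.5 * np.sin(2 * np.pi * mod_hz * t))
    return audio * modulator

sr = 44100
t = np.linspace(0, 2, 2 * sr, endpoint=False)
organ = 0.5 * np.sin(2 * np.pi * 440 * t)
wobbling = tremolo(organ, sr, mod_hz=5.0, depth=0.6)   # volume pulses five times a second
```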

 RESONATORS:
 Resonators emphasize harmonic frequency content at specified frequencies.
 ROBOTIC VOICE EFFECTS:
 It is used to make an actor's voice sound like a synthesized human voice.
 FILTERING:
 In the general sense, frequency ranges can be emphasized or attenuated using low-
pass, high-pass, band-pass or band-stop filters.
 Band-pass filtering of voice can simulate the effect of a telephone because telephones
use band-pass filters.

 OVERDRIVE:
 Overdrive effects, such as the use of a fuzz box, can be used to produce distorted sounds, for example to imitate robotic voices or to simulate a distorted radio telephone.
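 A minimal sketch of overdrive-style distortion: the signal is driven into a soft-clipping curve (here tanh), which flattens the peaks and adds the harsh harmonics described above.

```python
import numpy as np

def overdrive(audio, drive=8.0):
    """Soft-clip the signal: higher drive pushes more of the waveform into saturation."""
    return np.tanh(drive * audio) / np.tanh(drive)   # normalize so full scale stays near 1.0

sr = 44100
t = np.linspace(0, 1, sr, endpoint=False)
clean = 0.8 * np.sin(2 * np.pi * 110 * t)
fuzzy = overdrive(clean, drive=8.0)     # square-ish, buzzy version of the same note
```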
 PITCH SHIFT:
 Similar to pitch correction, this effect shifts a signal up or down in pitch.
 For example, a signal may be shifted an octave up or down.
 This is usually applied to the entire signal and not to each note separately.
 TIME STRETCHING:
 It is the opposite of pitch shift, that is, the process of changing the speed of an audio
signal without affecting its pitch.
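 A naive overlap-add (OLA) sketch of time stretching: frames are read from the input at one hop size and laid down at another, so the duration changes while the pitch inside each frame stays put. Real tools add phase or waveform alignment to avoid the artefacts this simple version can produce.

```python
import numpy as np

def time_stretch(audio, rate, frame=2048, hop_out=512):
    """Overlap-add time stretch. rate > 1.0 shortens the sound, rate < 1.0 lengthens it."""
    hop_in = max(1, int(round(hop_out * rate)))    # read hop in the input signal
    window = np.hanning(frame)
    n_frames = max(1, (len(audio) - frame) // hop_in)
    out = np.zeros(n_frames * hop_out + frame)
    norm = np.zeros_like(out)
    for i in range(n_frames):
        seg = audio[i * hop_in : i * hop_in + frame] * window
        out[i * hop_out : i * hop_out + frame] += seg
        norm[i * hop_out : i * hop_out + frame] += window
    norm[norm < 1e-8] = 1.0                        # avoid dividing by zero at the edges
    return out / norm

sr = 44100
t = np.linspace(0, 2, 2 * sr, endpoint=False)
phrase = np.sin(2 * np.pi * 330 * t)
slower = time_stretch(phrase, rate=0.5)            # roughly twice as long, same pitch
print(len(phrase) / sr, "->", len(slower) / sr)
```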
MUSIC
 DEFINE:
 Music is the soul of radio.
 Music is an art form whose medium is sound.
 Music is also used as signature tunes or theme music of various radio programmes.
 Music adds colour and life to any spoken word programme.
 Music can break monotony.
 Music can suggest scenes and locations.
 ELEMENTS OF MUSIC:
 PITCH:
 Pitch represents the perceived fundamental frequency of a sound.
 It is one of the four major auditory attributes of sounds along with loudness, timbre
and sound source location.
 RHYTHM:
 Rhythm is a "movement marked by the regulated succession of strong and weak elements, or of opposite or different conditions."
 While rhythm most commonly applies to sound, such as music and spoken language, it may also refer to visual presentation, as "timed movement through space."
 TIMBRE:
 In music, timbre is the quality of a musical note or sound or tone that distinguishes
different types of sound production, such as voices or musical instruments.
 The physical characteristics of sound that mediate the perception of timbre include
spectrum and envelope.
 TEXTURE:
 In music, texture is the way the melodic, rhythmic, and harmonic materials are
combined in a composition determining the overall quality of sound of a piece.
 Texture is often described in regard to the density, or thickness, and range, or width
between lowest and highest pitches, in relative terms as well as more specifically
distinguished according to the number of voices, or parts, and the relationship
between these voices.
 KINDS OF MUSIC:
 RAP:
 Rap is a fast singing rhyming kind of music.
 It is the latest kind of music.
 COUNTRY:
 Not a lot of kids listen to country music.
 It’s a typical old kind of music.
 ROCK:
 Rock is a kind of music that usually uses drums, keyboards, and electric guitars, and rock singers sing very loudly.
 DISCO:
 A lot of kids liked this music years ago.
 People take disco and mix it with rap.
 POP:
 Pop is like a regular kind of music.
 Kids listen to it.
 Sometimes when you listen to pop, you can hear two of every kind of instrument
from each family of instruments.
 FAMILIES OF MUSICAL INSTRUMENTS:
 STRING:
 String instruments are instruments that have strings.
 All you have to do is pluck the strings.
 They are made of different materials.
 Examples of string instruments: Harp, Guitar, Cello, Viola, Violin, Mandolin, Bass.
 WOODWINDS:
 Woodwind instruments are instruments that you blow into to make music.
 Each instrument has a lot of different holes on top to cover so you can make music.
 BRASS:
 Brass instruments are instruments that are made from brass.
 Most of them are long.
 They make different tones.
 They have buttons or slides to make different notes.
 You have to blow in them.
 PERCUSSION:
 Percussion instruments are instruments that you have to hit to make different sounds.
 Examples of percussion instruments are the drum and the piano.
AUDIO FILTERS: TYPES, NEED AND
IMPORTANCE

 DEFINE:
 An audio filter is a frequency dependent amplifier circuit, working in the audio
frequency range, 0 Hz to beyond 20 kHz.
 Audio filters can amplify ("boost"), pass or attenuate ("cut") some frequency ranges.
 Many types of filters exist for different audio applications including hi-fi stereo
systems, musical synthesizers, sound effects, sound reinforcement systems,
instrument amplifiers and virtual reality systems.

 TYPES OF AUDIO FILTERS:


 HIGH-PASS FILTER:
 It is a filter that passes high frequencies well but attenuates (i.e., reduces the
amplitude of) frequencies lower than the filter's cutoff frequency.
 The actual amount of attenuation for each frequency is a design parameter of the
filter.
 It is sometimes called a low-cut filter or bass-cut filter.
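 A minimal first-order high-pass sketch (the simple RC form), assuming a mono NumPy signal; frequencies well below the cutoff are attenuated while those above pass largely unchanged.

```python
import numpy as np

def high_pass(audio, sample_rate, cutoff_hz):
    """Simple first-order high-pass: y[n] = a * (y[n-1] + x[n] - x[n-1])."""
    rc = 1.0 / (2 * np.pi * cutoff_hz)
    dt = 1.0 / sample_rate
    a = rc / (rc + dt)
    out = np.zeros_like(audio)
    out[0] = audio[0]
    for n in range(1, len(audio)):
        out[n] = a * (out[n - 1] + audio[n] - audio[n - 1])
    return out

sr = 44100
t = np.linspace(0, 1, sr, endpoint=False)
rumble_plus_voice = np.sin(2 * np.pi * 30 * t) + 0.3 * np.sin(2 * np.pi * 1000 * t)
cleaned = high_pass(rumble_plus_voice, sr, cutoff_hz=200)   # the 30 Hz rumble is cut
```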
 LOW-PASS FILTER:
 Low-Pass Filter is a filter that passes low-frequency signals but attenuates (reduces
the amplitude of) signals with frequencies higher than the cutoff frequency.
 The actual amount of attenuation for each frequency varies from filter to filter.
 It is sometimes called a high-cut filter, or treble cut filter when used in audio
applications.
 A low-pass filter is the opposite of a high-pass filter, and a band-pass filter is a
combination of a low-pass and a high-pass.
 Low-pass filters exist in many different forms, including electronic circuits, digital filters for smoothing sets of data, acoustic barriers, blurring of images, and so on.
 The moving average operation used in fields such as finance is a particular kind of low-
pass filter, and can be analyzed with the same signal processing techniques as are
used for other low-pass filters.
 Low-pass filters provide a smoother form of a signal, removing the short-term
fluctuations, and leaving the longer-term trend.
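 Since the moving average is mentioned above as a particular low-pass filter, here is a minimal sketch using convolution; widening the window lowers the effective cutoff.

```python
import numpy as np

def moving_average_lowpass(audio, window_len=32):
    """Smooth the signal by averaging each sample with its neighbours (a crude low-pass)."""
    kernel = np.ones(window_len) / window_len
    return np.convolve(audio, kernel, mode="same")

sr = 44100
t = np.linspace(0, 1, sr, endpoint=False)
bass_plus_hiss = np.sin(2 * np.pi * 100 * t) + 0.2 * np.random.normal(size=sr)
smoothed = moving_average_lowpass(bass_plus_hiss, window_len=32)   # hiss reduced, bass kept
```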
 BAND-PASS FILTER:
 Band-Pass Filter is a device that passes frequencies within a certain range and rejects
(attenuates) frequencies outside that range.
 An example of an analogue electronic band-pass filter is an RLC circuit (a resistor-inductor-capacitor circuit).
 These filters can also be created by combining a low-pass filter with a high-pass
filter.
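 A sketch of a band-pass filter built as the text suggests, by combining a high-pass and a low-pass response (here a Butterworth design from SciPy, which is an extra dependency); the 300-3400 Hz band mimics the telephone effect mentioned earlier under FILTERING.

```python
import numpy as np
from scipy.signal import butter, lfilter

def band_pass(audio, sample_rate, low_hz, high_hz, order=4):
    """Butterworth band-pass: in effect a high-pass at low_hz combined with a low-pass at high_hz."""
    nyquist = sample_rate / 2.0
    b, a = butter(order, [low_hz / nyquist, high_hz / nyquist], btype="band")
    return lfilter(b, a, audio)

sr = 44100
t = np.linspace(0, 1, sr, endpoint=False)
voice = np.sin(2 * np.pi * 150 * t) + np.sin(2 * np.pi * 1000 * t) + np.sin(2 * np.pi * 8000 * t)
telephone = band_pass(voice, sr, 300, 3400)   # classic telephone band: mostly the 1 kHz tone survives
```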

 LINEAR FILTER:
 A linear filter applies a linear operator to a time-varying input signal.
 Linear filters are very common in electronics and digital signal processing, but they
can also be found in mechanical engineering and other technologies.
 They are often used to eliminate unwanted frequencies from an input signal or to
select a desired frequency among many others.
 There is a wide range of filter types and filter technologies.
 Regardless of whether they are electronic, electrical, or mechanical, or what frequency
ranges or timescales they work on, the mathematical theory of linear filters is
universal.
 EQUALIZATION FILTER:
 An equalization filter is a filter, usually adjustable, designed to compensate for the unequal frequency response of some other signal processing circuit or system.
 In audio engineering, the EQ filter is more often used creatively to alter the frequency response characteristics of a musical source or a sound mix.
 An EQ filter typically allows the user to adjust one or more parameters that determine the overall shape of the filter's transfer function.
 It is generally used to improve the fidelity of sound, to emphasize certain instruments, to remove undesired noises, or to create completely new and different timbres.
 Equalizers may be designed with peaking filters, shelving filters, band-pass filters, or high-pass and low-pass filters.
 Dynamic range circuitry can be linked with an EQ filter to make timbre changes only after a signal passes an amplitude threshold, or to dynamically increase or reduce amplitude based on the level of a frequency band.
 Such circuitry is involved in de-essing and in pop-filtering.
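 Real equalizers are built from peaking and shelving filters, but a crude sketch of the idea is to scale frequency bands directly in the FFT domain and transform back; the band edges and gains below are hypothetical.

```python
import numpy as np

def simple_eq(audio, sample_rate, bands):
    """Crude static EQ: bands is a list of (low_hz, high_hz, gain) tuples applied to the spectrum."""
    spectrum = np.fft.rfft(audio)
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sample_rate)
    for low, high, gain in bands:
        spectrum[(freqs >= low) & (freqs < high)] *= gain
    return np.fft.irfft(spectrum, n=len(audio))

sr = 44100
t = np.linspace(0, 1, sr, endpoint=False)
mix = np.sin(2 * np.pi * 80 * t) + np.sin(2 * np.pi * 1000 * t) + np.sin(2 * np.pi * 6000 * t)
# Cut the boomy bass a little, leave the mids alone, lift the treble slightly.
shaped = simple_eq(mix, sr, bands=[(0, 200, 0.5), (4000, 20000, 1.5)])
```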


 NOTE:
 DISTORT:
 Unwanted distortion is caused by a signal which is "too strong".
 If an audio signal level is too high for a particular component to cope with, then parts of the signal will be lost.
 This results in a rasping, distorted sound.
 Sometimes distortion is used along with echo.
 IMPORTANT NOTE:
 Distortion can occur at almost any point in the audio pathway, from the microphone
to the speaker.
 The first priority is to find out exactly where the problem is.
 Ideally, you would want to measure the signal level at as many points as possible, using a VU (Volume Unit) meter or similar device.
 DISTORTION’S FACTORS:
 Is the distortion coming from a microphone?
 Are any volume or gain controls in your system turned up suspiciously high?
 Could the distortion be caused by faulty equipment?
 Are your speakers being driven too hard?
 If the distortion is coming from occasional peaking, consider adding a
compressor.

 EVALUATION: PROCESS AND MEASUREMENT TECHNIQUES:

 Programmes can be assessed from several viewpoints.


 POST AND QUALITY EVALUATION.
 AUDIENCE EVALUATION.
 COST EVALUATION.
