Listening - in the ear of the beholder
Listening - consciousness studies
We process sound events through the ear-to-brain mechanism, with central processing of sound in the auditory pathways. The brain has learned patterns for thousands of different sound events, such as rain, speech, trains and music. You even hear things that your ears did not register. In 'Music Matters', David Elliott writes, 'we don't hear music as it is, we hear it as we are'. http://www.davidelliottmusic.com/musicmat/index.html There are 30,000 fibres in the auditory nerve, carrying information about every sound event we hear. After one-twentieth of a second, the messages reach the auditory cortex (in the temporal lobe), where the conscious perception of organized sound takes place. We do not hear a sound until the message reaches consciousness. While this coded signal is on its way, it undergoes a great deal of processing - like a computer, but much more complicated.
LOUDNESS IS SUBJECTIVE: There is no simple one-to-one correspondence between physical amplitude (increased amplification leading to greater movement in radio or TV loudspeakers) and the loudness we experience.
There is a distinction between the amplitude of a sound pressure wave (measurable objective property of the sound) and its loudness (subjective property of the sound - mental activity in the brain).
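The gap between objective amplitude and subjective loudness can be sketched numerically: decibels grow with the logarithm of sound pressure, and perceived loudness grows more slowly still. The sketch below uses Stevens' power law as one standard approximation; the exponent 0.3 is the commonly cited value, not a figure from this page.

```python
import math

REF_PRESSURE = 20e-6  # reference pressure in pascals (threshold of hearing)

def spl_db(pressure_pa):
    """Sound pressure level in decibels for a given RMS pressure."""
    return 20 * math.log10(pressure_pa / REF_PRESSURE)

def relative_loudness(db_change):
    """Approximate change in perceived loudness for a change in level,
    using Stevens' power law (loudness ~ intensity ** 0.3)."""
    return 10 ** (db_change / 10 * 0.3)

# Doubling the physical amplitude only adds about 6 dB to the level...
print(round(spl_db(0.04) - spl_db(0.02), 2))  # ~6.02
# ...and a 10 dB increase is heard as roughly a doubling of loudness.
print(round(relative_loudness(10), 2))        # ~2.0
```

So a doubling of physical amplitude is far from a doubling of experienced loudness - the one-to-one correspondence fails at both steps.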
See Perceptual filling-in - necessary compensation by the listener for what is blurred or indistinct or omitted
Ashley Scott's system (below on this page) - EMITTER, EMISSION, LISTENING SUBJECT
Difference between how we listen in the Lifeworld and listening to radio plays (below)
See Cocktail Party Effect (on another page)
Film Sound Terminology at http://www.filmsound.org/terminology.htm
Some links about the science of hearing (below on this page)
Randy Thom articles about film sound at http://www.filmsound.org/randythom/
Sound - sound effects - sound events - physics of sound (on another page)
Two Gary Ferrington articles (below)
Speed of listening (below)
What are the differences between seeing and hearing? (below)
Mistakes of hearing & ambiguity of hearing (below)
McGurk effect (illusion) - hearing AND seeing talk (on another page)
Some points from Murray Schafer's The Soundscape (below)
Listening as a secondary activity - see audiences (reception theory) - on another page
Plack, Christopher, 2005, The Sense of Hearing, London: Lawrence Erlbaum Associates, Publishers - (below)
Academic debate about the senses and how they connect - see Internalist Model of the senses (on another page)
Sound paradise for the listener - created by the radio drama director (on another page)
LOUDNESS - A Web Tutorial by Mark Huckvale at http://www.phon.ucl.ac.uk/cgi-bin/wtutor
PITCH - A Web Tutorial by Mark Huckvale at http://www.phon.ucl.ac.uk/cgi-bin/wtutor
"Sound travels extremely quickly, at about 340 meters per second, and so takes, at most, about 700 millionths of a second to cross the head." The auditory system is a complex sensory system with seven (or more) distinct processing levels.
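The 700-microsecond figure follows directly from distance over speed. In this sketch, the head-path value of 0.238 m is an assumed figure chosen to match the quoted estimate; the resulting delay is the interaural time difference the brain uses to locate sound sources.

```python
SPEED_OF_SOUND = 340.0   # metres per second, in air at room temperature
HEAD_PATH = 0.238        # metres - assumed acoustic path across the head

def crossing_time_us(path_m, speed=SPEED_OF_SOUND):
    """Time in microseconds for sound to travel a given path."""
    return path_m / speed * 1e6

# A sound from one side reaches the far ear about 700 microseconds later.
print(round(crossing_time_us(HEAD_PATH)))  # 700
```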
(See about hearing and binaural hearing at http://www.ihr.gla.ac.uk/research/binaural_notes.php)
'The eye is superficial, the ear is profound and inventive.'
Bresson, Robert, Notes on Cinematography, trans. Griffin, Jonathan, 1977, New York: Urizen Books, p.39.
Scott, Ashley, 'Extract from 'Modes for Listening', a Work in Progress' http://audiolabo.free.fr/revue1999/content/scott.htm
There is a chain from the sound event (emission from a sound source) to sound waves (the acoustic stage) to the ear.
Scott suggests the following:
a) an EMITTER (ER): a material object, machine or agent which, due to some form of excitation or work, transmits energy as audible vibrations. The ER is the sound source, or 'sounding object'. It belongs in the order of action or motion.
b) an EMISSION (EN): what is given off, produced by the emitter due to some action. The EN is often characterised as a material (especially in a recorded, spatial state), but let us characterise it as an articulation of sound waves, to distinguish it from the 'stronger' force involved with the ER.
c) a LISTENING SUBJECT (SU): the subject is here dealt with as the site of perception, interpretations and judgement. These are the vital interior functions which will enter the account. For the SU, one can say that there exists an order of aspects which describes the subject's interrogation of the EN.
This constitutes what will be taken as a fundamental case, even though it is but one of a number of action/signal/perceptual chains which might be invented to study the transmission of sound sequences. The above chain plots the itinerary of 'a sound event' from source to its 'final destination': someone's ear.
The divisions of this itinerary are demonstrated in phonetics, where the vocal apparatus constitutes an articulatory stage (ER); the sound waves, an acoustic stage (EN); and the hearing apparatus, an auditory stage (SU).
Gary Ferrington, 'Keep Your Ear-Lids Open' Journal of Visual Literacy (1994) at http://interact.uoregon.edu/Medialit/wfae/library/articles/ferrington_earlids.pdf
(in the World Forum for Acoustic Ecology (WFAE) Library)
Gary Ferrington, 'Audio Design: Creating Multi-Sensory Images For The Mind' in Journal of Visual Literacy (1993) at http://interact.uoregon.edu/Medialit/wfae/library/articles/ferrington_design.pdf
(in the World Forum for Acoustic Ecology (WFAE) Library)
Gary Ferrington - other articles in the World Forum for Acoustic Ecology (WFAE) Library at http://interact.uoregon.edu/MediaLit/WFAE/library/articles/index.html
Note Ferrington's distinction between hearing and listening.
Here is more:
From - Smith, Bruce R., 1999, The Acoustic World of Early Modern England, London: University of Chicago Press, page 6
You can hear sound waves within a certain range of frequencies and a certain range of amplitudes, but you choose to listen with varying degrees of attention. About hearing you have no choice: you can shut off vision by closing your eyes, but from birth to death, in waking and in sleep, the coils of flesh, the tiny bones, the hair cells, the nerve fibers are always at the ready. To listen, however, is a choice.
What's more, you can choose how to listen (Truax 1984: 19-24). At your most alert, you can listen in search of something you want to hear.
A perfect example is speech. What we hear when someone speaks is a stream of constantly changing sounds in which consonant sounds merge with vowel sounds merge with consonant sounds merge with vowel sounds, according to a principle speech physiologists call "coarticulation." What we listen for when someone speaks is a series of discrete, recognizable sounds. Hearing is a physiological constant, listening is a psychological variable.
(Truax, Barry 1984. Acoustic Communication. Norwood, NJ: Ablex. )
We are often disappointed with reality .....
Beck, Alan, 2002, The Death of Radio? An essay in radio-philosophy for the digital age, electronic book published by Sound Journal, http://www.savoyhill.co.uk/deathofradio/
As in all audio works, there is a difference between natural sound (the sound of an actual source) and characteristic sound - manipulated sound that achieves the effect of what the sound 'should' be, meeting the audience's expectations. Carlsson, defining these terms on his 'Film Sound Theory' site, gives the example of the natural sound of a .38 pistol - 'sounding like an "anemic" cap pistol' - and what the audience hears in the film-manipulated FX: a '"dramatic ricochet" with gradually decreasing pitch'. Our perception of natural sounds is influenced by our 'ideology about sounds', or what he also calls the 'contract of convention'. '[W]e are often disappointed with reality', and the choice for the director is between 'authenticity and dramatic impact', with a preference for the latter.
See Sven Carlsson, SOUNDS OF ILLUSIONS, http://www.filmsound.org/articles/moving_images.htm
See http://www.filmsound.org/ for more
How does sound relate to space?
From - Smith, Bruce R., 1999, The Acoustic World of Early Modern England, London: University of Chicago Press, pages 7-8
Sound is a periodic displacement of molecules in the air. Now closer together, now farther apart, the molecules set up a sound wave that takes a certain number of microseconds to complete a cycle. The number of times per second a vibration pattern repeats itself constitutes its frequency. That much is a matter of time. But the wave also takes place in space: the vibration exerts pressure on the air molecules, and the greater that pressure happens to be, the greater the displacement of air molecules. The degree of the molecules' displacement constitutes that sound wave's amplitude. Through all the variables, cultural as well as individual, one thing is certain: sound is inescapable. It is as pervasive as the air that constitutes its primary medium.
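Smith's distinction between frequency (a matter of time) and amplitude (a matter of displacement) can be illustrated with a minimal sketch of a pure tone; the 440 Hz frequency and 0.02 Pa amplitude here are arbitrary example values, not figures from the text.

```python
import math

def sine_pressure(amplitude_pa, freq_hz, t_s):
    """Instantaneous pressure deviation (Pa) of a pure tone at time t."""
    return amplitude_pa * math.sin(2 * math.pi * freq_hz * t_s)

freq = 440.0                 # frequency: cycles (vibration patterns) per second
period_us = 1 / freq * 1e6   # time for one full cycle, in microseconds
amplitude = 0.02             # amplitude: peak pressure displacement (Pa)

# A quarter of the way through a cycle the wave reaches its peak amplitude:
peak = sine_pressure(amplitude, freq, 1 / (4 * freq))
print(round(period_us), round(peak, 3))  # 2273 0.02
```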
Susanne K. Langer, Mind: An Essay on Human Feeling, vol.2 (Baltimore: Johns Hopkins University Press, 1972), p.134:
Sound, to our ears, is diffuse, like smell, or relatively massive, impinging without any precise spatial articulation. Such detail as it may convey is temporal. The crash of dishes sliding off an unbalanced tray, or the rustling of a mouse fleeing through dry leaves, has certainly more audible structure than an explosion, but it is the progress of an event through time that it conveys, not spatial form.
In the detailed analysis of an early radio play excerpt below, the focus is on how listeners interpret radio space in the aural compression of space-time-motion. It is a defining aspect of sound that it exists in space (Ferrington, 1994, 'Aural Information'). It is characterised by the space it occupies, its resonance and acoustic, and by the surfaces off which it reflects and into which it is partially absorbed. Dialogue itself gives constant depth cues.
In the skills of perception, the ear analyzes faster than the eye. Take a rapid visual movement - a hand wave - and compare it to a sudden sound trajectory of the same duration. The fast visual movement will not enter the memory as a precise picture. The sound trajectory will succeed in outlining a clear and definite form, individuated, recognizable, distinguishable from others... the eye is more spatially adept and the ear more adept in time.
Adapted from Chion, Michel (trans. Gorbman, Claudia) (1994) Audio-Vision: Sound on Screen. New York: Columbia.
What are the differences between seeing and hearing?
There are things you can hear that you cannot see. Look around you, right now, and pay especial notice to the objects you can see, with yourself at the centre. (Your point-of-listening in the Lifeworld.) What can you see?
SEEING - we rely on hue, saturation and brightness, along with the colours which characterize surfaces or volumes. These give depth cues to locations, shapes and volumes.
MASKING: Objects mask other objects in a certain way in the field of vision. How do sounds mask other sounds? 'Physical' stimuli and 'perceived' stimuli are not the same phenomena. How do we process sound? How do we process visual stimuli?
Don Ihde notes that vision fixes objects 'out there'. We connect to objects we see in a different way, situated in geometric space. (Ihde, Don, Technology and the Lifeworld: From Garden to Earth. Bloomington: Indiana Univ. Press, 1990.) But sounds reverberate inside us. Sound enters the body through ENTRAINMENT.
Stephen Handel: 'Listening is centripetal; it pulls you into the world. Looking is centrifugal; it separates you from the world' (Listening: An Introduction to the Perception of Auditory Events, 1989, xi). When we think we are getting enough information with our eyes, we tend not to use our ears fully.
See McGurk effect (illusion) - hearing AND seeing talk. Visual dominance occurs when conflicting information comes in from different senses - see the McGurk effect (illusion) at http://ccms.ntu.edu.tw/~karchung/Phonetics%20II%20page%20seventeen.htm - "How much do the visual cues (what we see) that we get from the face of a speaker influence what we hear? Quite a bit, in some cases."
SIGHT AND SOUND ARE NOT USUALLY SEPARATED: We use sight to aid hearing and hearing to aid sight. Sounds assist our imaging, calling up a visual image from memory - the sound of a waterfall, the rustle of grass, the bark of a dog. We hear and visualize things and events that are important to us, not the actual sound waves or light waves.
POINT-OF-LISTENING (POL) - Where we are situated is important. We are hardly detached, objective observers of the Lifeworld. Through the theoretical model of phenomenology, we are subjects who are immersed in our experiences and processing. Phenomenology (philosophy) is an important part of the theoretical approach on this site.
'By bringing a durative, motional world of time and space simultaneously to front and back, top and bottom, and left and right, an alignment suffuses the entire fixed or moving body. This is why hearing and voicing link the felt sensations of sound and balance to those of physical and emotional presence'.
Steven Feld and Donald Brenneis, 'Doing Anthropology in Sound', Special web supplement to: American Ethnologist 31:4 - November 2004
FROM - Acoustics FAQ at http://www.faqs.org/faqs/physics-faq/acoustics/
*** 2.6 How does the ear work ?
The eardrum is connected by three small jointed bones in the air-filled middle ear to the oval window of the inner ear or cochlea, a fluid-filled spiral coil about one and a half inches in length. Over 10,000 hair cells on the basilar membrane along the cochlea convert minuscule movements to nerve impulses, which are transmitted by the auditory nerve to the hearing center of the brain.
The basilar membrane is wider at its apex than at its base, near the oval window, whereas the cochlea tapers towards its apex. Different groups of the delicate hair sensors on the membrane, which varies in stiffness along its length, respond to different frequencies transmitted down the coil. The hair sensors are one of the few cell types in the body which do not regenerate. They may therefore become irreparably damaged by large noise doses.
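The tonotopic layout described above - different places along the basilar membrane responding to different frequencies - is often approximated with Greenwood's place-frequency function. The sketch below uses the commonly cited constants for the human cochlea; treat the exact figures as approximations.

```python
def greenwood_freq(x):
    """Approximate characteristic frequency (Hz) at fractional distance x
    along the basilar membrane, from the cochlear apex (x = 0) to the
    base near the oval window (x = 1), via Greenwood's function."""
    return 165.4 * (10 ** (2.1 * x) - 0.88)

# Low frequencies resolve near the apex, high frequencies near the base,
# spanning roughly the 20 Hz - 20 kHz range of human hearing:
print(round(greenwood_freq(0.0), 1))  # ~19.8
print(round(greenwood_freq(1.0)))     # ~20677
```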
Yost, WA. (2000) Fundamentals of Hearing: An Introduction (Academic Press; fourth edition)
Moore, BCJ. (1997) An Introduction to the Psychology of Hearing (Academic Press; fourth edition)
McAdams, Stephen, and Bigand, Emmanuel, Thinking in Sound. The Cognitive Psychology of Human Audition, 1993, Oxford: Oxford University Press.
Just as the illusions of seeing are much studied, so also there are ambiguities in hearing. Information available at the level of the sense organs may be too ambiguous for correct processing. The result is the perception of an unreal sound object.
TWO WATCHES TICKING
We pay most attention to regularities in the noise or the timbral qualities of the materials being set into vibration. Ernst Weber pointed this out in 1846. He used the example of two watches whose ticking noises could not be discriminated from one another. With the watches placed on either side of the head, one can hear two distinct instances of qualitatively the same ticking noise. See Boring 1942, 383.
Boring, Edwin G. (1942). Sensation and Perception in the History of Experimental Psychology. New York: Appleton Century.
This is a key collection of essays on hearing and cognitive science: McAdams, Stephen, and Bigand, Emmanuel, Thinking in Sound. The Cognitive Psychology of Human Audition, 1993, Oxford: Oxford University Press.
Here is: Bregman, Albert, 1993, 'Auditory scene analysis: hearing in complex environments', chapter 2 in McAdams, Stephen, and Bigand, Emmanuel, Thinking in Sound. The Cognitive Psychology of Human Audition, Oxford: Oxford University Press, pp 10-36.
NOTE: Bregman here challenges experimental psychologists for taking their experiments out of the Lifeworld and into the laboratory. The 'phenomenon of auditory streaming' mentioned below (discussed in the book on page 22) concerns the ear-to-mind ability to make patterns out of incoming signals - sometimes believing they come from a single source and sometimes from two sources.
A correct solution is probable as long as the [hearing] system is operating in the rich natural environment in which it evolved. When it is placed in a soundproof room and presented with sounds in which many of the natural properties of the sound are missing, it is likely to produce strange illusions, such as the phenomena of auditory streaming and homophonic continuity that were discussed earlier, or the musical illusions described by Deutsch (e.g. 1975). These percepts represent the best the system can do when its strategies are evoked by unnatural data. Fortunately, these illusions display the nature of the rules to the experimental psychologist. (page 34)
FURTHER NOTE: Bregman's conclusion is that there is a 'large number of strategies for grouping and interpreting the sensory data', because 'each one is subject to error'. One strategy groups sounds according to their origins in space, though this is less effective in 'reverberant environments'. Another groups sounds by 'partials' (in relation to their harmonics). 'In the worst cases, listeners may simply not be able to penetrate the mixture of sounds. It is a testament to the power of the scene-analysis process that such total failures are quite rare.' (pages 33-4)
Schafer, R. Murray, The Soundscape. Our Sonic Environment and the Tuning of the World, 1977, Rochester, Vermont: Destiny Books.
ECOLOGICAL NICHE AS A THEORY
NOTE: Researchers on bird calls have divided them into: pleasure calls, distress calls, territorial-defense calls, alarm calls, flight calls, flock calls, nest calls, feeding calls.
Equivalents for many of these can be found in human soundmaking. To take some obvious examples: the territorial calls of birds are reproduced in automobile horn blowing, their alarm calls are reproduced in police sirens and their pleasure calls in the beach-side radio. In the territorial calls of birds we encounter the genesis of the idea of acoustic space, with which we will be much concerned later. The definition of space by acoustic means is much more ancient than the establishment of property lines and fences; and as private property becomes increasingly threatened in the modern world, it may be that principles regulating the complex network of overlapping and interpenetrating acoustic spaces as observed by birds and animals will again have greater significance for the human community also.
Alan Beck comments on Plack's The Sense of Hearing (listed above): This is an accessible and thorough student textbook. Plack begins: 'Hearing is the sense that obtains information about the world around us using the pressure fluctuations in the air (sounds) that are produced by vibrating objects. In most situations in which we find ourselves, the air is full of sound, and it is therefore full of information. The ear evolved to make use of that information, to make us better at coping with the "struggle for existence," as Charles Darwin put it (Darwin, 1859).'
Plack asks 'Why Study Hearing?' and replies that it is 'central to the interaction of human beings' and that music allows the expression of powerful emotions. Then there is the distinction between auditory physiology (structures in the ear and brain which can be simulated by microphones etc.) and psychoacoustics (the psychological or behavioural study of hearing - how we process the data). Chapters 2 and 3 cover physical acoustics, the spectrum, resonance and signal processing, and Chapter 4 gets into the anatomy of the ear, along with Chapter 5, on frequency selectivity.
RESEARCH IN THE HEARING SCIENCES - University of California, Berkeley http://ear.berkeley.edu/~ear/
Acoustics FAQ http://ear.berkeley.edu/~ear/acoustics.html
Learn about human hearing at (Wikipedia) http://www.answers.com/main/ntquery?method=4&dsid=2222&dekey=Hearing+%28sense%29&gwp=8&curtab=2222_1&linktext=hearing
Frequencies audible to humans: about 20-20,000 Hz
Learn about auditory illusion at (Wikipedia)
Learn about audio compression (Wikipedia)
Learn about dynamic range at (Wikipedia)
Learn about audio at (Wikipedia)
Learn about sound reproduction at (Wikipedia)
Learn about audio editing at (Wikipedia)
Learn about digitizing sound at (Wikipedia)
Learn about transmission at (Wikipedia)
Learn about radio at (Wikipedia)
Learn about Open Source software at (Wikipedia)
Learn about the microphone at (Wikipedia)
Learn about the preamplifier at (Wikipedia)
Learn about the loudspeaker at (Wikipedia)
Learn about audio formats at (Wikipedia)
Learn about audio system measurements at (Wikipedia)
This site is 'Radio Drama - directing, acting, technical, learning & teaching, researching, styles, genres'. See INDEX to navigate also. Complete curriculum of scripts, techniques (acting & directing & post-production & genre styles), advice, sound files - effects and atmoses (with no copyright and so free to use), detailed script commentaries, etc.
Academic material on this site by Alan Beck is licensed under a Creative Commons Attribution-Non-Commercial-Share Alike 2.0 UK: England & Wales License.
Learn about radio drama on this site along with my book - Beck, Alan, Radio Acting, London: A & C Black, ISBN 0-7136-4631-4. Available on Amazon.
Any opinions expressed in this site are the personal opinions of the owner of the site. IF YOU HAVE COMMENTS, PLEASE EMAIL TO : firstname.lastname@example.org