
Level cues in isolation generally provide more accurate information than DRR cues in isolation (Zahorik et al., 2005), although level and DRR cues can provide equally accurate information for discriminating distance in highly reverberant environments (Kolarik, Cirstea, & Pardhan, 2013a).



Auditory distance perception plays a major role in spatial awareness, enabling location of objects and avoidance of obstacles in the environment. However, it remains under-researched relative to studies of the directional aspect of sound localization. This review focuses on the following four aspects of auditory distance perception: cue processing, development, consequences of visual and auditory loss, and neurological bases. The various auditory distance cues vary in their effective ranges in peripersonal and extrapersonal space. The primary cues are sound level, reverberation, and frequency. Nonperceptual factors, including the importance of the auditory event to the listener, also can affect perceived distance. Basic internal representations of auditory distance emerge at approximately 6 months of age in humans.

Although visual information plays an important role in calibrating auditory space, sensorimotor contingencies can be used for calibration when vision is unavailable. Blind individuals often manifest supranormal abilities to judge relative distance but show a deficit in absolute distance judgments.

Following hearing loss, the use of auditory level as a distance cue remains robust, while the reverberation cue becomes less effective. Previous studies have not found evidence that hearing-aid processing affects perceived auditory distance. Studies investigating the brain areas involved in processing different acoustic distance cues are described. Finally, suggestions are given for further research on auditory distance perception, including broader investigation of how background noise and multiple sound sources affect perceived auditory distance for those with sensory loss.

The ability to judge the distance of sounds is important for building up a representation of the environment and for the interpretation of those sounds. Audition is the main means of evaluating distance when vision is degraded, due to environmental or physiological factors, or when the sound-producing object is outside of the visual field. In contrast to light, sound is generally able to travel around occluding objects. Thus, audition provides us with important cues when evaluating the distance of objects that are not visible.

Whereas touch can only provide spatial information for objects within reaching and grasping distance, the auditory modality can be used to detect and judge objects that are farther away from the listener. Furthermore, audition plays a key role in guiding locomotion by the central nervous system (CNS) when vision is not available, for which an accurate internal representation of the distance between the organism and the target is essential. However, auditory estimates of distance are generally poorer than those for azimuth (left-front-right judgments; Middlebrooks & Green). A distinction can be made between sounds in peripersonal space, i.e., sounds that are within reaching and grasping distance (approximately 1 m from the listener), and farther sounds in extrapersonal space. This distinction is useful because the range over which distance cues are operable varies, and some cues are only useful within peripersonal space, a region where internal representations of distance are based on both auditory and tactile information (Serino, Canzoneri, & Avenanti).

Peripersonal space is a region especially relevant to behavior. Many important everyday events, such as personal conversations, occur with sound sources that are close to the listener, and appropriate selection of a target voice from a mixture of voices may require accurate spatial information (Shinn-Cunningham, Kopčo, & Martin). Nearby auditory events may require immediate motor responses, especially if the signal is threatening or particularly interesting (Serino et al.), and accurate auditory distance information is needed to coordinate this. The issue of how auditory space is generated, calibrated, and maintained when vision or hearing is impaired is of considerable interest in neuroscience and psychology (Collignon, Voss, Lassonde, & Lepore; Gori, Sandini, Martinoli, & Burr; Lewald; Voss et al.). A current question is how the external world is internally represented in blind people, who cannot use visual information to calibrate auditory space.

A large body of evidence shows that severe visual loss leads to an enhancement of directional localization abilities, especially for signals located in peripheral space (Doucet et al.; Gougoux, Zatorre, Lassonde, Voss, & Lepore; Lessard, Pare, Lepore, & Lassonde; Simon, Divenyi, & Lotze). These enhanced abilities often are coupled with cortical reorganization, such that visually deafferented brain regions within the occipital cortex are recruited to process auditory input (for reviews, see Voss, Collignon, Lassonde, & Lepore; Voss & Zatorre). The effect of visual loss on auditory distance perception is considerably less clear, due in part to the small number of behavioral studies on this topic and the scarcity of neural data. It is still largely unknown whether visual loss leads to cortical reorganization that affects auditory distance perception, although recent work involving distance-to-sound learning with sensory substitution devices (SSDs) suggests that occipital areas are recruited for auditory distance processing following visual loss (Chan et al.; Tao et al.).

The literature on the effects of sensory loss on auditory distance perception has not previously been reviewed and is discussed below. We discuss evidence suggesting that visual loss systematically affects auditory distance perception, thereby leading to decreased abilities to judge absolute auditory distance but enhanced abilities to judge relative distance. We argue that severe visual loss distorts internal spatial representations of the environment while enhancing abilities to discriminate between sound sources.

In this review, we examined the psychophysical and neuronal bases of human auditory distance perception, and the effects of sensory loss. We first describe the various acoustic cues that are used to perceive distance and the non-acoustic factors that influence this. A summary of research investigating the development of auditory distance perception is presented. The means by which auditory distance is calibrated in peripersonal and extrapersonal space and its effectiveness for guiding locomotion are reviewed. Findings of studies that have investigated the effects of visual and auditory loss on auditory distance perception are summarized. Research that has explored the neural processes associated with auditory distance is described.

Finally, we highlight potential avenues for future research relevant to auditory distance perception and the impact of sensory loss.

Perceiving distance using sound

Knowledge about the processing of auditory distance cues has been advanced by the development of binaural technology that allows simulation of different acoustical environments via headphone presentation (Zahorik). Such technology allows realistic simulation of sounds presented from different distances for various listener positions. It also allows auditory distance cues to be manipulated independently in a controlled way.

This technology was used in many of the studies described below. On average, perceived distance to sound sources in peripersonal space tends to be overestimated, while distance to sounds in extrapersonal space is generally underestimated, for normally sighted and normally hearing humans (Fontana & Rocchesso; Kearney, Gorzel, Rice, & Boland; Parseihian, Jouffrais, & Katz; Zahorik; Zahorik, Brungart, & Bronkhorst; Zahorik & Wightman). This is illustrated in the figure described below, which shows distance judgments for noise bursts presented at virtual distances (via a headphone simulation) between 0.3 and 14 m. More veridical judgments are made when close sound sources are presented laterally relative to the listener (Kopčo & Shinn-Cunningham).

This is contrary to azimuthal localization, which is generally more accurate for sources near the midline (Middlebrooks & Green). Auditory distance judgments are generally most accurate for sound sources approximately 1 m from the listener. Zahorik et al. demonstrated that systematic biases in distance estimates occur across a wide range of stimulus conditions, acoustic environments, and psychophysical procedures. Based on previous findings, they showed that compressive power functions of the form r′ = k·r^a gave good fits to distance judgments, where r′ is the judged distance, r is the actual distance, and k and a are adjustable parameters (with a < 1, reflecting the compressive nature of the judgments).
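To make the fitting procedure concrete, the sketch below fits such a power function by linear regression on log-log coordinates, exploiting the fact that r′ = k·r^a implies log r′ = log k + a·log r. The data are invented for illustration and are not Zahorik et al.'s measurements; a fitted exponent below 1 indicates compression.

```python
import numpy as np

# Hypothetical distance judgments (metres); illustrative only,
# not data from Zahorik et al.
actual = np.array([0.3, 0.6, 1.2, 2.4, 4.8, 9.6])   # source distance r
judged = np.array([0.5, 0.8, 1.3, 2.0, 3.1, 4.9])   # judged distance r'

# r' = k * r**a  =>  log(r') = log(k) + a * log(r), a line in log-log space.
a, log_k = np.polyfit(np.log(actual), np.log(judged), 1)
k = np.exp(log_k)

print(f"exponent a = {a:.2f} (a < 1 indicates compressive judgments)")
print(f"constant k = {k:.2f}")
```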

Figure: Average apparent distance estimates (10 estimates/distance/participant, n = 5) plotted as a function of sound source distance. A power function was fitted to the data, and the exponent, a, which on double-logarithmic coordinates equals the slope of the linear fit, is reported in the bottom right. The dashed diagonal line indicates where veridical judgments would lie. Adapted from "Loudness constancy with varying sound source distance," by Zahorik and Wightman, Nature Neuroscience, 4. Copyright 2001 by Nature Publishing Group. Reprinted with permission.

In addition to being biased, auditory distance estimates appear to be considerably less precise than visual distance estimates.

This reduction in precision, or distance "blur," is evident in the considerable variability often observed in (un-averaged) auditory distance estimates. For example, Anderson and Zahorik reported that the average standard deviation of sound source distance estimates was approximately 1.6 times the distance of the target. This corresponds to nearly twice the variability observed for comparable estimates of distance to visual targets (Anderson & Zahorik).

There are multiple acoustic cues available for perceiving the distance between a listener and a sound source. The number of cues available and their reliability can vary substantially depending upon the stimulus, the properties of the environment, and the direction of the sound source.

Two types of auditory distance cues can be distinguished. Absolute cues allow distance to be judged based on single presentations of sounds to independent groups of listeners (Mershon, Ballenger, Little, McMurtry, & Buchanan). Relative cues allow sounds at different distances to be discriminated.

In addition, there is now a considerable body of work showing that visual information and nonperceptual factors can influence estimates of perceived distance. Zahorik et al. and Coleman previously reviewed the auditory distance cues used by humans, and Naguib and Wiley summarized the use of auditory distance cues by humans and animals.

In the following sections, we summarize work that has investigated the cues used for auditory distance perception by normally sighted and normally hearing humans.

Binaural cues

For close sound sources, auditory distance judgments tend to be more accurate when the sound is presented laterally relative to the listener (Kopčo & Shinn-Cunningham).

This is due to the added benefit of binaural cues, which are noticeable even in the presence of prominent level and DRR cues. When sounds are heard laterally or when the listener turns their head, the signal at the ear farther from the source is attenuated and delayed. This produces interaural level differences (ILDs) and interaural time differences (ITDs) between the ears. Although ITDs are approximately independent of distance, ILDs change substantially as a function of distance in the acoustic near field (Brungart, Durlach, & Rabinowitz; Duda & Martens). ILD provides a distance cue for distances up to approximately 1 m, beyond which it becomes roughly independent of source distance (Brungart et al.; Greene). In particular, ILDs for low-frequency sounds can be large for nearby sources but are very small for distant sources.
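The distance dependence of ILD near the head can be illustrated with a deliberately simplified geometric sketch: treat each ear as a point receiver in free field and attribute the level difference solely to the different path lengths, ignoring head shadowing entirely (which in reality contributes strongly, especially at high frequencies). The assumed ear spacing and the inverse-square law are the only ingredients, so the numbers are qualitative.

```python
import numpy as np

EAR_SEPARATION = 0.18  # approximate inter-ear distance in metres (assumed)

def toy_ild_db(source_distance_m, azimuth_deg=90.0):
    """Level difference (dB) between ears for a point source in free field,
    attributing the ILD to path-length differences alone (no head shadow),
    so values are qualitative, not realistic HRTF-based ILDs."""
    az = np.radians(azimuth_deg)
    # Head centre at origin; ears on the x-axis; source in the x-y plane.
    src = source_distance_m * np.array([np.sin(az), np.cos(az)])
    near_ear = np.array([EAR_SEPARATION / 2, 0.0])
    far_ear = np.array([-EAR_SEPARATION / 2, 0.0])
    r_near = np.linalg.norm(src - near_ear)
    r_far = np.linalg.norm(src - far_ear)
    return 20 * np.log10(r_far / r_near)  # inverse-square level difference

for d in [0.25, 0.5, 1.0, 2.0, 4.0]:
    print(f"{d:4.2f} m -> ILD ~ {toy_ild_db(d):4.1f} dB")
```

Even this crude model reproduces the qualitative pattern described above: the ILD is several decibels for a lateral source at 0.25 m but falls below 1 dB beyond about 2 m.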

Theoretical work by Hartley and Fry suggested that the distance of a pure tone could be estimated at near distances using ILD information. However, an experiment of Wightman and Firestone showed that listeners were unable to judge the distances of pure-tone stimuli. Hirsch theorized that listeners could combine information from ILDs and ITDs to determine source distance. Molino modified Hirsch’s theory so that it would apply to cases where the source direction was known, but again found that listeners could not make distance judgments for pure-tone stimuli.


Duda and Martens suggested that the use of pure tones may have resulted in the stimuli not being heard as external to the listener’s head, potentially explaining why distance judgments were difficult for pure-tone stimuli. Only small benefits of head movements that introduced binaural cues were reported by Gardner for judging distances to speech in an anechoic room.

Holt and Thurlow reported that listeners were not able to judge the distance of thermal noise (similar to white noise) presented frontally at distances beyond 1.8 m, but performance improved when the sound sources were oriented laterally. Cochran et al. found that the orientation of the head had no effect on distance judgments for speech presented at distances greater than 1 m, and Simpson and Stanton found that head movements did not aid judgments of distance for pulse trains at distances between 0.3 and 2.66 m.

The binaural cues available at low frequencies, as measured using the head-related transfer function (HRTF, the transfer function from a sound source to the eardrum of the listener), have been approximated by modelling the human head as an ideal rigid sphere (Duda & Martens; Hartley & Fry; Shinn-Cunningham, Santarelli, & Kopco; Stewart). HRTFs also have been measured in the acoustic near field using a Knowles Electronics Manikin for Acoustic Research (KEMAR) (Brungart & Rabinowitz; Calamia & Hixson; Kopčo & Shinn-Cunningham). Distance judgments for lateral sounds were more accurate than for sounds in the median plane, consistent with HRTF measurements indicating that ILD varied with distance (Brungart et al.). Low-frequency ILD cues are relatively robust to room reverberation, and perceived distance judgments for close sound sources may be more veridical in a reverberant room, where DRR cues are available in addition to ILD cues (Shinn-Cunningham et al.). Results from a study by Kopčo and Shinn-Cunningham in a simulated reverberant room suggested that distance judgments could be explained on the basis of listeners using a fixed frequency-dependent mapping of DRR to distance, despite the presence of potential ILD cues.

The authors suggested that further experiments were needed to establish whether ILD cues contribute to auditory distance judgments in reverberant space as well as in anechoic environments. HRTF parallax also may be used to determine the distance to sound sources that are relatively close to the listener. Acoustic parallax occurs when a sound is relatively close to the head, introducing a difference between the angle of the source relative to the left ear, and the angle of the source relative to the right ear. Assuming that the direction from each ear can be determined, presumably using pinna cues that can be quantified by measuring HRTFs, the parallax angle can be calculated using the difference between the directions from each ear to the sound source. This varies as a function of source distance.

For frontal sources, the parallax angle is larger for closer sources than for farther sources. Kim, Suzuki, Takane, and Sone obtained distance judgments using acoustic parallax with pink noise presented at virtual distances between 0.1 and 2 m; the stimuli were synthesized to remove level and DRR cues. Distance judgments increased with increasing source distance up to approximately 1 m, consistent with observations that HRTFs are almost independent of sound source distance beyond 1 m (Otani, Hirahara, & Ise). HRTF parallax may account for instances where participants were able to report auditory distance for frontally presented sounds at near distances even when level and ILD cues were unavailable (Ashmead et al.).
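The parallax geometry described above is easy to quantify under the simplifying assumption of two point receivers and a frontal source; the ear spacing is an assumed value, and real parallax cues would additionally be shaped by the pinnae and head.

```python
import numpy as np

EAR_SEPARATION = 0.18  # metres, assumed

def parallax_angle_deg(source_distance_m):
    """Angle between the directions from each ear to a source straight
    ahead of the head centre. Each ear sees the source offset by
    atan((ear separation / 2) / distance) from the median plane."""
    half = EAR_SEPARATION / 2
    return 2 * np.degrees(np.arctan2(half, source_distance_m))

for d in [0.1, 0.25, 0.5, 1.0, 2.0]:
    print(f"{d:4.2f} m -> parallax angle ~ {parallax_angle_deg(d):5.1f} deg")
```

The angle shrinks rapidly with distance in this toy geometry (roughly 84° at 0.1 m, about 10° at 1 m, and 5° at 2 m), consistent with the observation that the cue is informative mainly within about 1 m.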

Combining auditory distance cues

An internal representation of the distance to a sound source is built up by combining information from the various cues that are available. For example, it is not possible to make accurate distance judgments using a fixed mapping of DRR to distance, because the mean DRR values corresponding to specific distances depend on the source spectral content, on the characteristics of the room, and on whether the DRR is measured at the near or far ear relative to the sound (Kopčo & Shinn-Cunningham).

Different cues vary in terms of reliability and are dependent on sound-source properties and the environment. Thus, the relative weighting of each cue in determining the percept of distance needs to be flexible. Zahorik showed that the perceptual system did indeed weight level and DRR cues flexibly to produce a single distance percept depending on the stimulus and the angular position of the sound source relative to the listener.
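A standard formal account of such flexible weighting is inverse-variance (maximum-likelihood style) cue combination, in which the less reliable cue receives the smaller weight. The sketch below is a generic illustration with invented numbers, not the specific weighting model or weights that Zahorik estimated.

```python
import numpy as np

def combine_cues(estimates, variances):
    """Inverse-variance weighted average: each cue's weight is proportional
    to 1/variance, so more reliable cues dominate the combined percept."""
    weights = 1.0 / np.asarray(variances, dtype=float)
    weights /= weights.sum()
    return float(np.dot(weights, estimates)), weights

# Hypothetical single-trial distance estimates (metres) from two cues,
# with assumed reliabilities (level more reliable than DRR here).
level_estimate, drr_estimate = 2.0, 3.5
combined, weights = combine_cues([level_estimate, drr_estimate], [0.25, 1.0])
print(f"combined estimate = {combined:.2f} m, weights = {np.round(weights, 2)}")
```

Under this scheme the weights shift automatically as cue reliabilities change with the stimulus, environment, and source direction.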

However, the processes underlying the combination and weighting of other cues, such as spectrum and binaural information, have yet to be explored. Kopčo and Shinn-Cunningham suggested that the auditory system may optimally combine DRR and ILD information in reverberant rooms to improve the precision of distance estimates for lateral sound sources. However, this requires further testing.

Development of auditory distance processing

Infants' perception of auditory distance has generally been assessed by measuring how their actions match spatial information conveyed by proximal sensory stimulation, such as reaching to grasp sound-producing objects (Ashmead, Clifton, & Perris; Clifton, Perris, & Bullinger; Litovsky & Clifton) or moving to avoid approaching objects (Freiberg, Tually, & Crassini). Such actions suggest that the sound-producing object is perceived in spatial terms relative to the location of the infant (van der Meer, Ramstad, & Van der Weel).

The literature on developmental aspects of auditory distance perception has not been reviewed previously and is discussed in this section. Clifton et al. showed that infants were able to distinguish between objects in near and far space on the basis of sound by 6 months of age. Sounds were presented in the dark either within reach at 15 cm, or out of reach at 60 cm. Infants reached more frequently towards the location of the sound when it was within reach than when it was out of reach. This was replicated in a follow-up study by Litovsky and Clifton, who further demonstrated that infants correctly discriminated between near and far sounds regardless of whether or not sound level was roved to prevent it from providing a useful cue.

This suggests that infants use other cues than sound level when judging distance, in contrast to adults tested in the same study, who relied primarily on level cues and whose performance worsened when level was roved. Using a conditioned head turn technique, Morrongiello, Hewitt, and Gotowiec showed that by 6 months of age infants were better at discriminating approaching than receding stimuli. They also demonstrated that responses on trials where changes in distance occurred were greater than responses on trials using non-moving sounds that increased or decreased in sound level, suggesting that distance cues other than changes in level were utilized.

Freiberg et al. used a more direct auditory looming paradigm to assess relative distance perception using sound level cues. They hypothesized that if infants used changing sound level to perceive changes in distance to the sound source, then they would engage in more defensive avoidance behavior for auditory stimuli that increased rather than decreased in level.

Consistent with this hypothesis, avoidance behavior, as measured by the amount of backward body pressure exerted by the infant, was associated with level increases but not decreases. Two studies have investigated whether infants are able to coordinate auditory and visual distance information (Morrongiello & Fenwick; Walker-Andrews & Lennon). Walker-Andrews and Lennon showed 5-month-old infants two videos side by side of automobiles approaching or receding. The videos were paired with a soundtrack of a lawn mower either increasing or decreasing in level. Infants looked preferentially at the video that matched the soundtrack for approaching stimuli only. A second study, also using a preferential-looking procedure, showed that 9-month-old infants were able to coordinate visual and auditory depth information for both approaching and receding stimuli (Morrongiello & Fenwick).

Visual information of a drum-beating toy was presented on two screens with auditory information that matched one of the screens. The toy was shown moving horizontally in depth or stationary. Five-month-old infants only reliably looked preferentially at the stationary toy paired with the stationary sound stimulus, suggesting that they did not recognize that changes in sound level indicated that the distance of an object was changing. Nine-month-old infants preferentially looked at the screen that matched the auditory stimulus for which the depth changed.

The authors suggested that the extended time period between perceiving auditory distance at approximately 6 months (Morrongiello et al.) and coordinating it with visual depth was due to younger infants having difficulty recognizing that increases and decreases in sound level accompany an object moving in depth, possibly because sounds can vary in level independently of source distance. The discrepancy between their findings and those of Walker-Andrews and Lennon was attributed to the greater salience of the visual depth cues in the earlier study, which recruited the attention of 5-month-old infants and aided coordination of audiovisual depth. Overall, these studies suggest that by 9 months of age, infants are able to coordinate visual depth information with auditory distance cues and hence could use visual information to calibrate auditory space.

Infants younger than 11 months are able to discriminate increments in sound level of 3 dB and decrements of 6 dB (Bull, Eilers, & Oller; Sinnott & Aslin; Tarquinio, Zelazo, & Weiss). This suggests that changes in level are potentially usable as a distance cue.

By 3 years of age, children increase their vocal intensity as the distance from a listener increases (Johnson et al.), indicating that they have at least limited knowledge of the intensity losses due to sound propagation. Further work is needed to determine more closely how auditory distance perception develops, using conditions where individual distance cues are controlled and tested independently of other cues. One possible avenue for further research is to investigate how auditory distance is calibrated in normally sighted, early-onset blind, and late-onset blind individuals, by measuring the accuracy of absolute auditory distance judgments longitudinally from infancy to adulthood, to establish how internal representations of auditory space are generated and maintained when visual calibration cues are unavailable. The internal representation of auditory distance and its calibration are discussed in more detail in the following section.

Effect of visual loss on auditory distance perception

For totally blind individuals or those with light perception only, audition provides the primary source of information regarding distance to a target in extrapersonal space. Auditory distance information is also of paramount importance to those with partial visual losses, such as age-related macular degeneration (AMD), retinitis pigmentosa (RP), or glaucoma, which can severely reduce central or peripheral visual spatial information.

One study showed that both early- and late-onset blind participants could discriminate the distance of broadband white noise bursts presented 3-4 m away in a reverberant room, whereas normally sighted participants could not (Fig.). Using virtualization techniques to simulate anechoic and reverberant rooms, Kolarik, Cirstea, and Pardhan showed that blind participants used level and DRR cues more effectively than normally sighted or partially sighted participants to discriminate the distance of broadband white noises presented between 1 and 8 m from the participant. These studies suggest that significant auditory compensation occurs following full visual loss and that it provides measurable benefits across a range of acoustic environments for relative auditory distance perception.

Effects of hearing loss and hearing aid processing on auditory distance perception

In contrast to investigations of the effects of hearing loss on localization in azimuth (see Keating & King for a review), there are relatively few psychophysical studies and no neuronal studies of how auditory distance perception is affected by hearing loss.

Effects of hearing aid processing, which potentially may distort available auditory distance cues, also have received relatively little attention. Adverse effects of hearing impairment or hearing-aid processing on auditory distance perception may be compensated to some extent by visual depth information for normally sighted listeners. However, considerable difficulties may occur in situations where vision is degraded, and although hearing loss is an important consideration for blind listeners, this area of enquiry is currently under-researched.

The effects of sensory loss and hearing aid processing on auditory distance perception are especially important for older people, because visual and auditory losses are more prevalent in this group. Akeroyd et al. compared the effectiveness of combined level and DRR distance cues with that of DRR alone for normal-hearing and hearing-impaired participants.

They measured distance discrimination for sentence pairs at virtual distances between 1 and 8 m. Hearing-impaired participants generally performed as well as normally hearing participants when both cues were available, although hearing-impaired participants performed more poorly for simulated distances greater than 5 m.

Hearing-impaired participants performed more poorly when the level cue was made unavailable by fixing the overall level of the sounds, suggesting deficits in the ability to discriminate distance using DRR. The scores obtained with DRR cues alone were correlated with self-reported auditory distance perception abilities. Most modern hearing aids include amplitude compression, which applies high gain to low-level sounds and low gain to high-level sounds. This increases the audibility of low-level sounds without making intense sounds uncomfortably loud (Moore). However, alterations to sound level due to hearing aid processing may alter the cues used to perceive distance accurately (Simon & Levitt). Amplitude compression alters level cues and can affect DRR cues by reducing the gain for the high-level direct sound while providing high gain for the low-level reverberant sound.
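The potential effect on the DRR can be illustrated with a toy static compressor. The threshold, ratio, and component levels below are invented, and the sketch assumes the compressor acts on the direct and reverberant components at separate moments (as when a reverberant tail follows a transient); it is not a model of any particular hearing aid.

```python
def compressed_level_db(level_db, threshold_db=50.0, ratio=3.0):
    """Static compression: below threshold, levels pass through unchanged;
    above threshold, output level grows by only 1/ratio dB per input dB.
    Parameters are illustrative, not those of any real hearing aid."""
    if level_db <= threshold_db:
        return level_db
    return threshold_db + (level_db - threshold_db) / ratio

direct_db, reverb_db = 70.0, 55.0      # hypothetical component levels
drr_in = direct_db - reverb_db         # 15 dB direct-to-reverberant ratio
drr_out = compressed_level_db(direct_db) - compressed_level_db(reverb_db)
print(f"DRR before compression: {drr_in:.1f} dB; after: {drr_out:.1f} dB")
```

In this caricature the 15-dB DRR shrinks to 5 dB, mimicking a more distant source.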

However, for continuous speech, the reverberant tail occurs in isolation only during pauses in the speech. Thus, adverse effects of hearing aid processing might be expected to be small or negligible. This was found in a study by Akeroyd, who investigated distance discrimination for continuous speech using the design of Akeroyd et al. described above, with level and DRR cues both available.

Akeroyd did not find any adverse effects of hearing-aid compression produced by the participants' own hearing aids. As the participants were experienced hearing aid users, it is possible that they had acclimatized to the effects of their own hearing aids on sound level. It also is possible that no adverse effects were found because the amount of amplitude compression was small or because the gain changed too slowly to affect the DRR. Effects of hearing-aid compression might be observed for absolute distance judgments rather than the relative distance task used in the study (Akeroyd). Hearing aid compression may affect the use of ILD distance cues (Simon & Levitt) even for continuous speech, and although this has not yet been directly assessed, there is evidence that compression distorts ILD cues for localization in azimuth. Musa-Shufani et al. tested normally hearing and hearing-impaired participants in a localization task with narrowband, one-third-octave-wide noise signals centered at 500 and 4000 Hz.

Hearing aids were simulated with linear processing or with fixed amounts of fast or slow compression. Fast compression was found to increase just-noticeable differences (JNDs) in ILD compared with linear processing. In summary, hearing loss adversely affects the use of DRR cues, although the use of level cues remains relatively unaffected (Akeroyd et al.). Hearing aid compression does not appear to affect distance discrimination when level and DRR cues are both available (Akeroyd). Further work is needed to expand upon these findings, in particular to investigate how hearing loss affects absolute distance judgments, the use of spectral and binaural distance cues, and the effects of bilateral versus unilateral hearing aid fitting. Although the effect of hearing loss on the use of distance cues other than level and DRR has yet to be evaluated, it is likely that the use of high-frequency spectral content for estimating the distance of far sounds will be affected, because hearing loss often is greater at high than at low frequencies (Moore).

Use of binaural cues for near-distance judgments may be impaired due to the reduced frequency selectivity caused by the broadening of auditory filters with hearing loss, which reduces the ability to obtain ITD and ILD information from within narrow frequency bands (Moore). Given the importance of accurate spatial awareness for blind individuals, if detrimental effects of hearing-aid amplitude compression on distance perception were identified, blind individuals might derive greater benefit from hearing aids with linear processing. However, this has yet to be investigated.

Neuronal bases of auditory distance perception

It often is assumed that auditory processing occurs along functionally separate pathways within the auditory cortex (AC), organized along similar lines to the "what" and "where" pathways in the visual cortex, such that spatial information is processed in a posterior stream of the AC (Ahveninen et al.; Rauschecker & Scott; Rauschecker & Tian; Recanzone & Cohen). Changes in horizontal sound direction have been shown to activate posterior nonprimary AC regions, including the planum temporale (PT) and superior temporal gyrus (STG) (Ahveninen et al.). Auditory distance may be processed in areas including the PT and STG (Kopčo et al.), within a dedicated network that includes the ventral premotor cortex (Graziano et al.) and the anterior cingulate cortex (ACC) (Wisniewski et al.).

There is currently little neural data regarding how visual loss affects the processing of auditory distance. However, work involving learning of the distance to sounds suggests the recruitment of occipital areas (Chan et al.; Tao et al.). These results parallel findings for localization in azimuth, which have shown that visually deafferented areas are functionally recruited for spatial auditory processing in the event of visual loss (for reviews, see Collignon et al.; Voss & Zatorre). Mathiak et al. used magnetoencephalography (MEG) to investigate neural correlates of auditory distance processing. A series of white noise bursts with deviants in level and (as a control) duration were presented in various conditions, simulating sound sources at distances that included 0 m (i.e., within the head) and 2 m.

The results suggested that the right AC plays a dominant role in the processing of level and DRR distance cues. Deviants in level evoked a larger response over the right supratemporal plane than over the left, but this activation decreased when reverberation was present. In macaque monkeys, multimodal neurons in the ventral premotor cortex have been found to represent distances in peripersonal space (Graziano et al.), and it is possible that the purpose of spatial representation in this area is to guide head and arm movements in relation to targets that are within reaching and grasping distance (Graziano & Gross; Moore & King). Graziano et al. reported that many of the neurons they tested changed their response when the sound level was changed, but 59% of the neurons tested coded distance independently of level. The authors suggested that cues other than level, including reverberation and spectrum, are used to process distance in the near field, and these might determine the responses of the neurons.

Further experiments are needed to analyze the relative influence of these potential cues. In another study focusing on peripersonal space (Kopčo et al.), sensitivity to acoustic distance changes independent of level was observed in neural populations in the PT and posterior STG (Fig.).


Wisniewski et al. utilized electroencephalography (EEG) to investigate the neural mechanisms underlying distance perception for familiar or unfamiliar speech sounds under conditions where level cues were minimized. They reported that cortical regions responded in different ways depending on sound familiarity across a network that included the ACC, suggesting that working memory and attentional processing were implicated when familiarity was a factor in distance judgments.

Figure: Posterior nonprimary auditory cortex activations for hypothesis-based region-of-interest (ROI) analyses during auditory distance processing. Increases in left posterior auditory cortex ROI activity occurred when the distance of the sound source was varied versus when the sound level (intensity) was varied. This suggests the presence of neurons with distance representations that are independent of level in the posterior nonprimary auditory cortices. Increased activity was observed in both hemispheres during varying intensity versus constant sounds.

Error bars represent +1 standard error of the mean. From "Neuronal representations of distance in human auditory cortex," by N. Kopčo, S. Huang, J. W. Belliveau, T. Raij, C. Tengshe, and J. Ahveninen, Proceedings of the National Academy of Sciences of the United States of America, 109. Copyright 2012 by National Academy of Sciences. Reprinted with permission.

As described earlier, using monaurally presented noise at virtual distances between 10 and 160 cm, Kim et al. showed that the firing rates of inferior colliculus neurons in the rabbit either increased or decreased monotonically with increasing source distance, but only when the virtual environment was reverberant and the sounds were amplitude modulated (AM). These findings suggest that neural sensitivity to AM depth, combined with the distance-dependent reduction of AM depth in reverberant environments, may provide a mechanism for the neural coding of monaural distance.
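A minimal sketch of the proposed mechanism: if direct energy falls with the inverse square of distance while the diffuse reverberant energy stays roughly constant, and the reverberation is (as assumed here) effectively unmodulated, then the reverberant energy dilutes the modulation and the effective AM depth declines with distance. All values are illustrative.

```python
def effective_am_depth(distance_m, source_depth=1.0, reverb_energy=1.0):
    """Toy model: direct energy ~ 1/distance**2 (1.0 at 1 m, arbitrary
    units); reverberant energy is constant and assumed unmodulated, so it
    dilutes the source's modulation depth."""
    direct_energy = 1.0 / distance_m**2
    return source_depth * direct_energy / (direct_energy + reverb_energy)

for d in [0.1, 0.4, 0.8, 1.6]:
    print(f"{d:3.1f} m -> effective AM depth ~ {effective_am_depth(d):.2f}")
```

A neuron whose firing rate tracks AM depth would therefore carry distance information in reverberant rooms but not in anechoic space, consistent with the findings described above.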

Altmann et al. demonstrated that loudness constancy occurred in conditions where reverberation was comparatively strong, but not in weak reverberation conditions.

In contrast, the accuracy of auditory distance judgments was similar in the strong and weak reverberation conditions, suggesting a dissociation between loudness constancy and distance perception. MEG recordings suggested that the right middle temporal and inferior anterior temporal cortices play a role in the representation of loudness, reflecting perceptual loudness constancy, while superior temporal areas are implicated in the processing of sound source distance (Kopčo et al.). It has been speculated that the greater biological salience of approaching compared with retreating sounds may have a neural basis (Ghazanfar et al.).

A neural network involving the superior temporal sulcus, middle temporal gyrus, right premotor cortex, and right temporoparietal junction has been implicated in distance processing for dynamic sound sources (Seifritz et al.), and it is likely that the right PT plays a role in auditory distance processing for dynamic sounds (Hall & Moore). Several studies that used sound-to-distance learning paradigms involving SSDs have provided insight regarding the neural networks involved in auditory distance perception for the blind. Renier et al. trained normally sighted, blindfolded participants to use a visual-to-auditory SSD for perceiving distance to objects in peripersonal space. Using positron emission tomography (PET), they showed that occipitoparietal and occipitotemporal areas were activated while using the device, suggesting that areas of the visual cortex are somewhat multimodal and can be recruited for perceiving distance by audition. Another study tested early-blind and normally sighted controls using an echo-based SSD to judge the distance of objects placed 1-5 m from the SSD.

Functional magnetic resonance imaging (fMRI) showed that learning was mediated by a parieto-frontal network that involved the hippocampus and the cuneus in the striate cortex. The network for normally sighted individuals involved reduced activity in the occipital lobe and hippocampus, and increased activity in the frontal and temporal lobes. A further study measured fMRI responses from early- and late-onset blind groups using auditory spatial information from an SSD signal to locate objects at distances of 1.5, 2.5, or 3.5 m at various azimuths. It also evaluated participants' visuospatial working memory abilities. Both groups showed activation in the middle occipital gyrus, superior frontal gyrus, precuneus, and precentral gyrus when localizing sounds. However, the groups differed in activation of the right middle occipital gyrus and left superior frontal gyrus. In the early-blind group, sound localization performance was correlated with BOLD responses in the right middle occipital gyrus only.

In the late-onset group, BOLD responses in the left superior frontal gyrus were correlated with sound localization performance and visuospatial working memory abilities. The results suggest that early-onset visual loss results in cross-modal plasticity that recruits occipital areas for processing auditory spatial information, including distance, whereas spatial processing occurs in prefrontal areas involving visuospatial working memory for those with late-onset visual loss.