Studies have shown that bats are capable of using visual information for a variety of purposes, including navigation and foraging, but the relative contributions of visual and auditory modalities in obstacle avoidance have yet to be fully investigated, particularly in laryngeal echolocating bats. A first step requires the characterization of behavioral responses to different combinations of sensory cues. Here, we quantified the behavioral responses of the insectivorous big brown bat, Eptesicus fuscus, in an obstacle avoidance task offering different combinations of auditory and visual cues. To do so, we utilized a new method that eliminates the confounds typically associated with testing bat vision and precludes auditory cues. We found that the presence of visual and auditory cues together enhances bats' avoidance response to obstacles compared with cues requiring either vision or audition alone. Flight and echolocation behaviors, such as speed and call rate, did not vary significantly under different obstacle conditions, and thus were not informative indicators of a bat's response to obstacle stimulus type. These findings advance the understanding of the relative importance of visual and auditory sensory modalities in guiding obstacle avoidance behaviors.

Animals navigating and foraging in their natural environments must not only detect biologically relevant signals, but must also determine how to use that sensory information for a given task. An animal's surroundings are generally filled with noise and ambiguous signals; multisensory input can carry more information than signals from any single sensory modality, or the signals carried through one sensory modality can be more reliable than those in others. Therefore, being able to combine stimulus information across multiple sensory modalities, and to subsequently weight these different sensory inputs, is crucial to disambiguating information about the world, forming unified perceptions of objects and guiding probabilistic decision making. For example, both male and female big-clawed snapping shrimp (Alpheus heterochaelis) use the same visual display in aggressive and mating interactions, requiring individuals to assess sex via chemical cues in order to respond to the display appropriately (Hughes, 1996). Gray squirrels (Sciurus carolinensis) emit alarm signals to conspecifics that contain both a visual and an auditory component, and in populations inhabiting urban environments with more auditory noise, individuals rely more heavily on the visual component than their more rural counterparts (Partan et al., 2010).

Bats serve as an excellent model for studying multi-modal sensing, integration and decision processes. Over 1000 species of bats produce echolocation calls with the larynx and are well known to use auditory information (via passive listening or by active biosonar) for prey capture and foraging (Anderson and Racey, 1991; Bell, 1982; Faure and Barclay, 1994; Gomes et al., 2016; Marimuthu and Neuweiler, 1987; Russo et al., 2007; Ryan, 1987); a prevailing misconception is that echolocating bats are blind or have no use for visual information (Thiagavel et al., 2018). This can be attributed to their mostly nocturnal lifestyle and relatively small eyes. Although many bat species rely extensively on biosonar for many facets of life, echolocation is only functional over relatively short distances, owing to its reliance on high frequencies that attenuate rapidly in air (Jakobsen et al., 2013; Lawrence and Simmons, 1982). Thus, bats also use vision for tasks such as navigation (Davis, 1966; Griffin, 1970; Höller and Schmidt, 1996; Layne, 1967; Williams et al., 1966), escape behaviors (Chase, 1983; Mistry, 1990) and predator surveillance (Eklöf, 2003). Our goal is to quantify bat responses when multisensory cues in the auditory and visual domains are present.

Vision and hearing are closely coordinated senses, and, in many organisms, a major function of sound localization is to direct the eyes to the source of a sound (Heffner and Heffner, 1992; Heffner et al., 1999). However, when visual cues conflict with cues from other sensory modalities, visual input often dominates, and visual dominance over other senses has been demonstrated in a range of species (Bekoff, 1972; Posner et al., 1976; Uetake and Kudo, 1994; Ward and Mehner, 2010; Wilcoxon et al., 1971; Witten and Knudsen, 2005). Which, if any, sense dominates perception depends on the type of task being performed (Parker and Robinson, 2017). Generally speaking, spatial navigation tasks, including those that require obstacle avoidance, tend to depend largely on vision (Welch and Warren, 1980). This is an intuitive strategy for an organism whose primary sense is vision; however, for an organism whose primary sense is audition, such as echolocating bats, vision may not dominate.

Prior studies of bats have suggested that task performance using echolocation may be enhanced by the presence of visual cues, especially in dim-light conditions such as those found at dusk or dawn. Northern bats (Eptesicus nilsonii) in southern Sweden may use visual cues during prey search to locate bright white moths that are active just above and within tall grass (Eklöf et al., 2002; Jensen et al., 2001). Laboratory experiments report that lingual-echolocating Egyptian fruit bats exhibit visual dominance in some tasks, even when both auditory and visual cues are present and discriminable (Danilovich and Yovel, 2019). This species has large eyes and well-developed vision, and its sonar click rate depends on light level (Danilovich et al., 2015). Notably, laryngeal echolocating bats presented with apparent escape routes through transparent, rigid windows, through which light entered a darkened room or maze, tended to collide with these structures, despite the assumption that their echolocation should have alerted them to the obstacle (Chase, 1981, 1983; Davis and Barbour, 1965; Mistry, 1990). However, it has also been shown that bats presented with smooth vertical surfaces, such as glass windows, do not receive returning echoes until they are in very close proximity to the structure, owing to the angles at which sound is reflected (Greif et al., 2017), raising the possibility that, in the above-named studies, bats did not receive echoes from the surfaces in time to abort their escape attempts.

Other studies suggest that vision has a deleterious effect on task performance when combined with echolocation. Free-flying little brown bats (Myotis lucifugus) made a greater number of collisions with a trailer when its exterior lights were on (McGuire and Fenton, 2010). In situations in which bats appeared to be guided by sight, they often improved their performance when their eyes were covered (Griffin and Galambos, 1941). This may be because these experiments were conducted in light conditions that were too bright and impaired visual function. Indeed, normal levels of room illumination (∼377 lx; similar to lighting in an interior classroom) appear to impair obstacle avoidance in M. lucifugus, whereas performance is best in very dim conditions (∼1 lx; similar to a night with a full moon) (Bradbury and Nottebohm, 1969).

The extent to which bats utilize visual cues when echolocation cues are available and distinct is an ongoing topic of inquiry. As a whole, the literature on audiovisual integration in echolocating bats is incomplete and would benefit from additional studies examining this phenomenon under a variety of conditions and across many species. From the literature, it is clear that bats possess the capacity to see and that vision plays a role in their natural behaviors (Chase and Suthers, 1969; Curtis, 1952; Suthers et al., 1969), but which modality, echolocation or vision, is prioritized appears to depend on the environment, stimulus strength, species and task.

To understand the relative importance of visual and auditory sensory modalities in obstacle avoidance tasks, we conducted a set of behavioral experiments in which we quantified orientation of the laryngeal echolocating big brown bat (Eptesicus fuscus) in response to visual and sonar obstacles that resulted in different combinations of audiovisual cues. One of the challenges that studies have faced in the past is being able to present stimuli in the visual domain without providing information in the auditory domain, which is essential to determine which cues the bat is using. The present study exploited a novel method to display visual obstacles that yielded no echo returns. The goal of this study was to determine whether bats would rely solely on vision for performing obstacle avoidance and whether the presence of visual and auditory cues would differentially affect the behavioral responses observed during task performance. We hypothesized that the combination of visual and echolocation cues together would augment avoidance behavior compared with a single modality alone. Thus, we predicted that bats would evade an obstacle significantly more in the presence of visual and echolocation cues together than in the presence of visual or echolocation cues alone.

We would like to bring to the reader's attention the novelty of the methods employed in the experiments reported here. To our knowledge, the use of lasers as visual obstacles that carry no acoustic information has never before been implemented, and it presents an exciting opportunity for future experiments to explore whether bats perceive laser beams as solid objects and, if so, whether this perception depends on the diameter of the beam.

Animals and setup

We designed an obstacle avoidance task in which three wild-caught adult female big brown bats, Eptesicus fuscus (Beauvois 1796), were trained to fly into a box suspended from the ceiling (Fig. 1A) for a food reward. Experiments took place in a large room (6×6×2.5 m) under infrared illumination. Bats were captured in North Carolina under collecting permit 17-SC01070 issued by the North Carolina Wildlife Resources Commission, and were housed and trained at Johns Hopkins University according to all procedures set forth by the Institutional Animal Care and Use Committee (protocol number BA17A107). A food reward of mealworms (Tenebrio molitor larvae) was given when the animals entered the box (60×70×65 cm), requiring them to navigate past the obstacle, if present, and land on any one of the three enclosing walls. Bats that did not enter the box, or landed on the outside of the box, were not given rewards. A fan-operated mist-producing apparatus was used to create a column of water vapor in front of the box opening during training and testing. The opening of the box remained unobstructed for control trials and was partially obstructed with an obstacle during test trials.

Obstacles

We created obstacle conditions (Fig. 1B) in which the three bats (G20: N=33 trials; G40: N=60 trials; O90: N=45 trials) were presented with acoustic-only cues (A+V–, N=43 trials), vision-only cues (A–V+, N=38 trials), acoustic and vision cues (A+V+, N=35 trials) or no obstacles (A–V–, N=56 trials). A–V+ cues are challenging to create because this condition requires an acoustically transparent object that still serves as a visual obstacle in the flight path. To create this condition, a laser was used, in conjunction with the column of mist. The mist supplied additional airborne particles to increase the scattering of light, resulting in increased visibility of the entire laser projection. The result was a thin beam of solid green light (520 nm, 3 mm diameter), a wavelength that, according to electroretinograms, is near the peak spectral sensitivity of E. fuscus (Hope and Bhatnagar, 1979) and should therefore be easily detectable. The ability of E. fuscus to detect the 520 nm light was also verified in a separate behavioral experiment (see below). A+V– cues were constructed by placing thin (5 mm diameter), flexible pieces of rubber wrapped in a thin string of unlit light-emitting diodes (LEDs) in front of the opening. A+V+ cues were these same LED-wrapped rubber pieces with the LEDs illuminated. To prevent bats from relying on spatial memory to avoid obstacles, each obstacle was randomly positioned in either a horizontal or vertical configuration and placed in center or off-center locations at the box opening across trials. The entire box was covered in non-reflective black felt in order to minimize strong visual cues, even when partially illuminated by the light of the obstacles. The felt also served to attenuate echoes (Warnecke et al., 2018), but bats could still use echolocation to find the walls of the box. To enable infrared video recordings, experiments were conducted under long-wavelength ambient light, outside the visible range of E. fuscus (Hope and Bhatnagar, 1979).

Audio-video recordings

Each trial was recorded with two high-speed infrared Miro cameras (Phantom, Wayne, NJ, USA) sampling at 100 frames s−1, which permitted 3D reconstruction of the bat's flight trajectories using DLTdv5 digitizing software (Hedrick, 2008). Echolocation calls emitted by bats during the trials were recorded with a 24-channel wide-band ultrasound microphone array (Pettersson Elektronik, Uppsala, Sweden). The camera and microphone recordings were synchronized, triggered by a transistor–transistor logic pulse generated with custom hardware. The reconstructed flight trajectories, extracted from the digitized center of mass of the bat, and the audio recordings were further processed and analyzed using custom MATLAB (MathWorks, Natick, MA, USA) scripts to extract acoustic parameters of the fundamental harmonic of the bats' echolocation calls and kinematic parameters of the bats' flight, presented in Table 1.

Statistical analyses

All statistical analyses were conducted in R v. 3.6.3 (https://www.r-project.org) using the lme4 package (https://cran.r-project.org/web/packages/lme4/index.html) to generate linear mixed effects models (LMMs) or generalized linear mixed effects models (GLMMs), with individual bat identities used as random effects. For analyses of temporal and spectral echolocation call parameters, the individual trial was also used as a random effect. Each analysis is reported with the model used and the statistical results. Planned contrasts and post hoc analyses were carried out using the multcomp package (https://cran.r-project.org/web/packages/multcomp/index.html), adjusting P-values using the Bonferroni-based false discovery rate method (α=0.05).
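
As an illustration of the modeling approach described above, the following R sketch fits a binomial GLMM of box entry as a function of obstacle condition, with bat identity as a random effect, followed by pairwise contrasts adjusted for the false discovery rate. Variable and factor names (trials, entered, condition, bat) are placeholders, not the names used in our analysis scripts, and the adjustment shown ("BH") is one of several FDR options.

```r
# Sketch only: a binomial GLMM of box entry versus obstacle condition,
# with bat identity as a random effect (placeholder variable names).
library(lme4)
library(multcomp)

# trials: data frame with columns entered (0/1), condition (factor with
# levels such as "A-V-", "A+V-", "A-V+", "A+V+") and bat (factor)
m_entry <- glmer(entered ~ condition + (1 | bat),
                 data = trials, family = binomial)
summary(m_entry)

# Pairwise contrasts between obstacle conditions, with P-values adjusted
# by a false discovery rate procedure ("BH" shown here as one option)
summary(glht(m_entry, linfct = mcp(condition = "Tukey")),
        test = adjusted("BH"))
```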

Determination of 520 nm light-detection behavior

To confirm that the light generated by the laser was detectable by the bats, two additional E. fuscus were trained in a separate set of behavioral experiments. In this paradigm, bats were trained to crawl to one arm of a Y-platform, in front of which was a piece of black felt onto which the green laser stimulus was projected (Fig. 2A). The 520 nm laser beam was manually oriented to either the left or the right, in alignment with the arms of the platform. Bats were rewarded with a mealworm for crawling towards the side on which the laser was projected. One individual was tested in 75 trials and the other in 100 trials; a permutation test was used to estimate the expected percentage of correct responses and to determine whether the animals performed significantly above 50% in the visual detection task. This generates a cutoff percentage, or performance threshold, for ensuring that bats reliably detect the presence of the signal (i.e. the laser) at a rate that is statistically better than chance.
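
A minimal R sketch of how such a permutation test could be implemented (our exact implementation may differ): the rewarded side is shuffled relative to the bat's choices to build a null distribution of percent correct, and the 95th percentile of that distribution serves as the chance-level performance threshold. All names and the simulated data are illustrative.

```r
# Sketch of a permutation-based chance threshold (assumed implementation)
set.seed(1)

permutation_threshold <- function(choices, rewarded_side, n_perm = 10000) {
  null_pct <- replicate(n_perm, {
    shuffled <- sample(rewarded_side)       # break any real association
    100 * mean(choices == shuffled)         # percent "correct" by chance
  })
  quantile(null_pct, 0.95)                  # cutoff for above-chance performance
}

# Illustrative use with simulated data for a 75-trial session
rewarded <- sample(c("left", "right"), 75, replace = TRUE)
chosen   <- rewarded                        # a perfect performer, for illustration
permutation_threshold(chosen, rewarded)
```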

Behavioral detection of 520 nm light

In the two-choice laser detection task, the two bats crawled to the correct arm of the platform in 87% (Bat 1) and 95% (Bat 2) of trials, well above their respective chance-performance thresholds of 58% and 60% (Fig. 2B). This confirms that the laser stimulus is detectable by E. fuscus.

Obstacle avoidance performance

In the flight experiment, bats were required to use echo-acoustic and/or visual cues to steer around an obstacle placed at the opening of a box in order to receive a food reward. Performance was compared across four sensory conditions: A–V– (control), A+V–, A–V+ and A+V+. Obstacles were thin enough (≤5 mm) and the overall width of the box was large enough (70 cm wide) to accommodate the full wingspans of the individual bats (<35 cm) on at least one side of the obstacle. A chi-square test of proportions revealed no significant effect of obstacle orientation (χ2=5.42e-31, d.f.=1, N=118, P=1) or position (χ2=5.06, d.f.=4, N=118, P=0.28) on bat entrances to the box across sensory conditions, so we excluded these terms from subsequent models and grouped all trials by obstacle stimulus type. There was a significant difference in the number of trials in which the bats flew into the box across stimulus types (GLMM with binomial error distribution, F3,167.21=9.86, P<0.001; Fig. 3A). Bats almost always entered the box under unobstructed control conditions (98%). The percentage of flights into the box was significantly reduced in the A–V+ and A+V– conditions (84% and 83%, respectively) and even more so in the A+V+ condition (57%).

We also determined whether bats contacted one type of obstacle more than another. ‘Contacts’ were defined as either colliding with the obstacle or touching it with a wing as it was passed. In 60% of A–V+ trials in which the bat entered the box, bats ‘made contact’ with the laser beam, or the mist column onto which it was projected, without attempting to avoid it, as indicated by the lack of observable changes in the flight trajectory (Fig. 3B). This is significantly more than for the A+V– trials, in which contact was made with the echo-acoustic obstacle in 3% of trials (GLMM with binomial error distribution, F2,85.29=36.12, P<0.001). There was no significant difference between A+V– and A+V+ trials, the latter of which yielded no contacts in any trial.

Flight analysis

We used 3D reconstructions of each of the bats' flight trajectories (Fig. 4) to determine the animal's position during each recorded frame. Using this information, we calculated the mean speed of the bat during each trial and compared this across each obstacle condition and whether bats entered the box (LMM). There was no significant difference in speed across obstacle conditions (F3,163.04=0.46, P>0.05), but bats did fly significantly faster when they entered the box (F1,163.65=31.45, P<0.01) than when they avoided the box (Fig. 5A). There was no interaction effect between stimulus and the outcome.

Additionally, we decomposed the trajectories into 1.0 m bins to examine speed on a finer scale. These distance bins were compared across obstacle types (LMM). We observed a significant interaction between the main effects of distance bin and obstacle type (F19,174.33=8.80, P<0.01), and planned contrasts of each obstacle type within each distance bin revealed significant differences primarily when the bat was within 0–1 m and 1–2 m of the box opening (Fig. 5B).
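
For readers interested in the kinematic measures, the sketch below (written in R for consistency with the statistical examples, although our kinematic analyses were run in custom MATLAB scripts) shows one way to compute frame-by-frame flight speed from a 3D trajectory sampled at 100 frames s−1 and to average it within 1 m bins of distance to the box opening. The column names and box-position argument are illustrative assumptions.

```r
# Illustrative only: mean flight speed per 1 m distance bin from a 3D trajectory
frame_rate <- 100   # frames per second

speed_by_bin <- function(traj, box_pos, bin_width = 1) {
  # traj: data frame with columns x, y, z in metres; box_pos: numeric of length 3
  p <- as.matrix(traj[, c("x", "y", "z")])
  d <- sqrt(rowSums(sweep(p, 2, box_pos)^2))          # distance to box opening (m)
  v <- sqrt(rowSums(diff(p)^2)) * frame_rate          # speed between frames (m per s)
  bins <- cut(head(d, -1), breaks = seq(0, ceiling(max(d)), by = bin_width))
  tapply(v, bins, mean, na.rm = TRUE)                 # mean speed in each distance bin
}
```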

Angle of avoidance

We also analyzed the angle of avoidance in trials in which the bat did not enter the target box. The angle of avoidance is defined as the maximum angle between the bat–target vector and the bat tangent when the bat is initiating its avoidance turn, which occurs at or before the minimum distance between the bat and the opening of the box. This measure can be used as a proxy for when the bat makes the decision to not enter the box. There was no significant difference (LMM, two-way ANOVA) in angle of avoidance across obstacle types (F3,22.58=0.36, P>0.05), there was no significant difference in the distances at which the turn occurred (F1,22.99=0.0003, P>0.05), and we observed no interaction effects.
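
The sketch below illustrates (again in R; our analyses used MATLAB) how the avoidance angle defined above can be computed: the angle between the bat-to-target vector and the flight-direction (tangent) vector at each frame, with the avoidance angle taken as the maximum value up to the point of closest approach to the box opening. Variable names are placeholders.

```r
# Illustrative computation of the angle of avoidance (placeholder names)
avoidance_angle <- function(traj, target) {
  p <- as.matrix(traj[, c("x", "y", "z")])            # bat positions (m)
  heading <- diff(p)                                  # tangent (flight direction) vectors
  to_target <- matrix(target, nrow(p) - 1, 3, byrow = TRUE) - p[-nrow(p), ]
  cos_ang <- rowSums(heading * to_target) /
             (sqrt(rowSums(heading^2)) * sqrt(rowSums(to_target^2)))
  ang <- acos(pmin(pmax(cos_ang, -1), 1)) * 180 / pi  # angle in degrees
  d <- sqrt(rowSums(sweep(p, 2, target)^2))           # distance to box opening
  max(ang[seq_len(which.min(d))], na.rm = TRUE)       # max angle up to closest approach
}
```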

Echolocation calls

We determined the position of the bat along each flight trajectory at the time of each echolocation call emission. The trajectories were then binned into 0.5 m increments relative to the position of the center of the box opening, and we analyzed the number of calls produced in each distance bin under each stimulus condition (negative binomial regression). As expected, bats increased the number of calls as distance to the box decreased (Fig. 6), and we observed a significant interaction effect between distance bin and obstacle type (F117,6718=2.9, P<0.01). There was no significant difference in the number of calls emitted between trials in which bats did and did not enter the box (F1,259.32=2.68, P>0.05); on average, bats emitted two more calls when they entered the box than when they did not.
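
A minimal sketch, under assumed variable names, of a negative binomial mixed model of this kind in R: call counts per 0.5 m bin are modeled as a function of the obstacle condition by distance bin interaction, with bat identity and trial as random effects, mirroring the random-effects structure described in the statistical analyses.

```r
# Sketch only: negative binomial GLMM of call counts per distance bin
library(lme4)

# call_counts: one row per trial x 0.5 m bin, with columns n_calls (integer),
# condition (factor), dist_bin (factor), bat (factor) and trial (factor)
m_calls <- glmer.nb(n_calls ~ condition * dist_bin +
                      (1 | bat) + (1 | trial),
                    data = call_counts)
summary(m_calls)
```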

In many trials, audio and video recordings captured periods when the bat was not yet in flight and/or was initially out of view of one or both cameras, making it impossible to reconstruct the trajectory at those points. Calls emitted during these periods were not included in the previous analysis and instead were analyzed separately (negative binomial regression) to determine whether the bats called more frequently at the beginning of trials with different obstacle conditions, or whether these calls could be used to predict whether bats would enter the box. There was no significant effect associated with the outcome of whether bats entered the box or not (F1,556.27=2.18, P>0.05). There was a significant effect of the stimulus presented (F3,557.47=2.18, P<0.05), but post hoc comparisons were not statistically significant after adjusting for multiple comparisons. No interaction effect was observed.

Lastly, we analyzed several spectrotemporal acoustic parameters of bat sonar calls during each trial (see Table S1 for summaries of statistical results). There was no significant overall effect of obstacle stimulus type on spectral or temporal parameters. Because echolocation call interval (synonymous with pulse interval) is known to vary with distance to an object, an analysis of call interval across 1 m distance bins was conducted with respect to the different obstacle conditions (LMM). The data show a trend of decreasing call interval (i.e. increasing call rate) as bats approach the box (Fig. 7), and a significant interaction effect between distance bin and obstacle type (F15,5434=2.54, P<0.01). Post hoc comparisons yielded only a single significant contrast, at 1–2 m, between the A+V– and A+V+ obstacles. Additional significant differences in sonar sound groups, peak and start frequencies, and bandwidth were related to whether or not the bat entered the box. We observed an increase of ∼124 Hz in the start frequency of the frequency-modulated sweep in trials in which bats did not enter the box. Peak frequency of echolocation calls decreased by ∼647 Hz and total bandwidth decreased by ∼300 Hz when bats did not enter the box. On average, bats produced three more sonar sound groups and increased their call interval by ∼21 ms when they did not enter the box.

Discussion

Bats can use both visual and acoustic cues to navigate their environments. In some instances, such as short-range navigation under crepuscular light conditions, visual and acoustic cues may provide complementary information. However, there are frequently scenarios in which vision and echolocation do not provide complementary information. For example, when navigating long distances or detecting large objects at distances exceeding ∼100 m, echolocation no longer provides reliable information, owing to the high degree of atmospheric attenuation that limits the functional range of high-frequency sound transmission (Lawrence and Simmons, 1982; Holderied and von Helversen, 2003; Stilz and Schnitzler, 2012). In such cases, vision would likely provide reliable cues for identifying landmarks or large obstacles, whereas echolocation favors detection of small objects at close distances (Boonman et al., 2013). In complete or near-complete darkness, where visual cues are virtually absent, echolocation provides information about the location, size, texture and motion of objects around which the bat maneuvers (Fenton et al., 2016). The question of how vision and echolocation interact arises in situations in which both cues are available and provide useful information.

We employed an orientation paradigm to investigate the effects of multimodal cueing on obstacle avoidance in the laryngeal echolocating bat, E. fuscus. Specifically, we focused on the behavioral responses to stimuli that yielded cues within the visual and/or auditory domains, as these two senses are most often utilized in tandem and are both functional in distal sensing. We sought to determine whether bats demonstrate either auditory or visual dominance in the context of spatial orientation and obstacle avoidance. In nature, this situation might arise when dim-to-intermediate light levels facilitate bimodal sensing.

When presented with a task that required entering a box that was partially obstructed by an obstacle, E. fuscus demonstrated behavioral patterns that depended on the stimulus dimensions of the obstacle. Stimuli were constructed to provide echoic feedback, visual feedback or both, and the bats' performances were analyzed across several echolocation and flight kinematic parameters. When visual and echo acoustic stimuli were presented simultaneously, the two cues (A+V+ condition) were combined into a multimodal composite signal (MCS). MCSs can result in several potential outcomes that depend first on whether the individual component signals convey the same information (i.e. they elicit the same behavioral response) and second on how the conveyed information of the combined signal influences behavior (Partan and Marler, 1999). If the two component signals elicit the same behavioral response, then the two are said to convey redundant information, whereas different behavioral responses are the result of non-redundant information (Fig. 8).

We anticipated that MCSs consisting of visual and auditory stimuli in the obstacle avoidance task would offer redundant information and that, when presented together, they would elicit an equivalent or enhanced response (i.e. a reduced percentage of trials in which the bat entered the box). This is because redundancy is one of the simplest ways to counteract a noisy environment or to discriminate potentially ambiguous signals, by having multiple sensory modalities supply ‘backup’ information. Although our laboratory-based task did not introduce noise, environments frequented by bats in the wild often contain extraneous sounds, such as signals produced by nearby conspecifics or reverberant echoes from highly cluttered environments (Dusenbery, 1992; Schnitzler and Kalko, 2001). The results of our study suggest that visual and auditory cues provide redundant information to the bats performing the obstacle avoidance task reported here. Specifically, we observed that visual cues alone (A–V+) and acoustic cues alone (A+V–) resulted in similar reductions in successful entry to the box when compared with the unobstructed control condition. When the cues were combined into MCSs (A+V+), the bats showed an even further decrease in entry to the box. This suggests that our multimodal signal produced an enhancement effect, perhaps owing to the increased saliency of the A+V+ obstacle. Similar effects have been observed in the eastern gray squirrel (S. carolinensis), which displays enhanced responses to the multisensory, audiovisual components of a conspecific alarm signal compared with either unisensory component (Partan et al., 2009).

The position and the orientation of the obstacle had no impact on whether bats entered the box. When bats did enter the box, they rarely made contact with physical objects, and thus avoided potential physical discomfort associated with a collision in A+V– and A+V+ conditions. When navigating in proximity to the A–V+ obstacles, bats frequently flew through them, breaking the beam of the laser with their wings and, occasionally, their entire body. In psychophysical experiments that probe the detection, discrimination and scaling of physical stimuli (Munoz and Blumstein, 2012), behavioral responses are used to make inferences about perception (Shettleworth, 2009). Although we demonstrated that the laser stimulus was detectable by E. fuscus and that our obstacle conditions generated different behavioral responses, we do not yet have data to make inferences about the bats’ perception of the A–V+ obstacle.

It appears that bats did not treat the laser beam as a solid object, based on their high percentage of contact with the obstacle, but their decreased number of entrances suggests that they did respond to the laser beam as either an obstacle or another aversive stimulus. Several factors should be considered when interpreting these results, which motivate new lines of investigation. First, it is possible that the 3 mm diameter laser beam was too small to simulate a solid obstacle, and future experiments with wider-diameter laser obstacles could address this possibility. Second, the laser obstacle yielded no tactile feedback when it was contacted by the bat, in the way that the solid echo-acoustic and visual/echo-acoustic obstacles did. In fact, the absence of tactile feedback from contact with the laser may have informed the bat that the A–V+ obstacle was not a solid object, and this experience may have reduced attempts to avoid ‘collisions’. At this point, we cannot determine whether the bat learned from experience that the laser was not a solid object or whether it never perceived the laser as a solid object. In the future, these questions could be addressed by an experimental test on a 2D plane in which bats are rewarded for navigating around laser obstacles of varying diameter. This could result in one of three outcomes: (1) no change in behavior – bats continue to make contact with the obstacle, with no change in the percentage of avoidances; (2) the obstacle is treated the same as the echo-acoustic obstacle – no contact is made with the beam and there is no change in the percentage of avoidances; or (3) the obstacle is treated the same as the visual/echo-acoustic obstacle – bats further increase their percentage of avoidances. This approach would offer additional insight into visual obstacle avoidance in bats, but it would still not yield conclusive answers to the question of whether bats perceive laser beams as solid objects. To tackle this challenge, bats could be trained in a psychophysical task that excludes physical contact with visual stimuli. For example, bats might perform a match-to-sample task with a range of visual stimuli that include laser beams of varying diameters and solid objects. We wish to stress that the paradigm used in the present study was novel and exploratory in nature, and we encourage others to adopt laser stimuli to further investigate bats' use of purely visual information in obstacle avoidance and to learn what stimulus parameters may influence whether bats perceive laser beams as solid objects.

Often, echolocation and flight parameters are analyzed as indicators of the information bats are gathering about their environments. In this experimental setup, bats flew faster overall in trials in which they entered the box. When analyzing speed on a finer scale, we observed an interaction between distance to the box opening and obstacle condition on the speed at which the bat was traveling. When the bat was close to the opening of the box (1–2 m), speed decreased significantly when bats were presented with the visual-only and acoustic-visual obstacles, compared with the control and acoustic-only conditions. At 0–1 m, we observed a significant decrease in speed when bats were presented with the acoustic-visual obstacle compared with all other conditions. This suggests that bats make distance-dependent adjustments in flight speed that depend on obstacle modality.

Some acoustic parameters of the bats' echolocation calls showed similar distance-dependent relationships with the obstacle type being presented. The call interval significantly increased when bats navigated around the acoustic-visual obstacle, compared with the acoustic-only obstacle, at 1–2 m. This increase in call interval suggests that the visual obstacle influenced the bats' echolocation behavior at this distance. We also note the significant increase in the total number of calls at 0.5–1 m when presented with the acoustic-visual obstacle compared with all other conditions, followed by a significant decrease at 0–0.5 m. Overall, bats tended to emit slightly more total echolocation calls when they entered the box than when they did not.

Although the documented changes in flight and echolocation behaviors in this study do not offer direct insight into the bats' perception of the obstacles, we can conclude that the addition of visual information to the active sensing of echolocation alters the way individual bats choose to interact with their environment. This is consistent with a recent study by McGowan and Kloepper (2020), in which wild Brazilian free-tailed bats (Tadarida brasiliensis) were documented to exhibit different echolocation patterns when flying during the day compared with at night. Future iterations of the present experiment could introduce new behavioral paradigms to further test which environmental contexts influence multimodal sensory processing. Neurophysiological experiments may also contribute to our understanding of multimodal sensing by characterizing the underlying neural processes that mediate responses to different combinations of visual and acoustic stimuli.

Author contributions

Conceptualization: T.K.J.; Methodology: T.K.J.; Software: T.K.J.; Validation: T.K.J.; Formal analysis: T.K.J.; Investigation: T.K.J.; Resources: T.K.J., C.F.M.; Data curation: T.K.J.; Writing - original draft: T.K.J.; Writing - review & editing: T.K.J., C.F.M.; Visualization: T.K.J.; Supervision: T.K.J., C.F.M.; Project administration: T.K.J., C.F.M.; Funding acquisition: T.K.J., C.F.M.

Funding

Results reported here are based on work supported by the National Science Foundation Graduate Research Fellowship Program under grant 1452598 to T.K.J., and National Science Foundation Brain Initiative (NCS-FO 1734744), Air Force Office of Scientific Research (FA9550-14-1-0398NIFTI) and Office of Naval Research (N00014-17-1-2736) grants to C.F.M. Any opinions, findings, and conclusions or recommendations expressed in this publication are those of the authors and do not necessarily reflect the views of the funding agencies.

References

Anderson, E. M. and Racey, P. A. (1991). Feeding behaviour of captive brown long-eared bats, Plecotus auritus. Anim. Behav. 42, 489-493.

Bekoff, M. (1972). The development of social interaction, play, and metacommunication in mammals: an ethological perspective. Q. Rev. Biol. 47, 412-434.

Bell, G. P. (1982). Behavioral and ecological aspects of gleaning by a desert insectivorous bat Antrozous pallidus (Chiroptera: Vespertilionidae). Behav. Ecol. Sociobiol. 10, 217-223.

Boonman, A., Bar-On, Y., Cvikel, N. and Yovel, Y. (2013). It's not black or white—on the range of vision and echolocation in echolocating bats. Front. Physiol. 4, 248.

Bradbury, J. W. and Nottebohm, F. (1969). The use of vision by the little brown bat, Myotis lucifugus, under controlled conditions. Anim. Behav. 17, 480-485.

Chase, J. (1981). Visually guided escape responses of Microchiropteran bats. Anim. Behav. 29, 708-713.

Chase, J. (1983). Differential responses to visual and acoustic cues during escape in the bat Anoura geoffroyi: Cue preferences and behaviour. Anim. Behav. 31, 526-531.

Chase, J. and Suthers, R. A. (1969). Visual obstacle avoidance by echolocating bats. Anim. Behav. 17, 201-207.

Curtis, W. E. (1952). Quantitative studies of echolocation in bats (Myotis I. lucifugus); Studies of vision of bats (Myotis l. lucifugus and Eptesicus f. fuscus); and Quantitative studies of vision of owls (Tyto alba pratincola). PhD Thesis, Cornell University.

Danilovich, S. and Yovel, Y. (2019). Integrating vision and echolocation for navigation and perception in bats. Sci. Adv. 5, eaaw6503.

Danilovich, S., Krishnan, A., Lee, W.-J., Borrisov, I., Eitan, O., Kosa, G., Moss, C. F. and Yovel, Y. (2015). Bats regulate biosonar based on the availability of visual information. Curr. Biol. 25, R1124-R1125.

Davis, R. (1966). Homing performance and homing ability in bats. Ecol. Monogr. 36, 201-237.

Davis, W. H. and Barbour, R. W. (1965). The use of vision in flight by the bat Myotis sodalis. Am. Midl. Nat. 74, 497.

Dusenbery, D. B. (1992). Sensory Ecology: How Organisms Acquire and Respond to Information. New York: W.H. Freeman.

Eklöf, J. (2003). Vision in echolocating bats. PhD Thesis, Göteborg University.

Eklöf, J., Svensson, A. M. and Rydell, J. (2002). Northern bats, Eptesicus nilsonii, use vision but not flutter-detection when searching for prey in clutter. Oikos 99, 347-351.

Faure, P. A. and Barclay, R. M. R. (1994). Substrate-gleaning versus aerial-hawking: plasticity in the foraging and echolocation behaviour of the long-eared bat, Myotis evotis. J. Comp. Physiol. A 174, 651-660.

Fenton, M. B., Grinnell, A., Popper, A. N., Fay, R. R. and Acoustical Society of America (eds) (2016). Bat Bioacoustics. New York: ASA Press/Springer.

Gomes, D. G. E., Page, R. A., Geipel, I., Taylor, R. C., Ryan, M. J. and Halfwerk, W. (2016). Bats perceptually weight prey cues across sensory systems when hunting in noise. Science 353, 1277-1280.

Greif, S., Zsebők, S., Schmieder, D. and Siemers, B. M. (2017). Acoustic mirrors as sensory traps for bats. Science 357, 1045-1047.

Griffin, D. R. (1970). Migration and homing of bats. In Biology of Bats (ed. W. Wimsatt), pp. 233-264. Elsevier.

Griffin, D. R. and Galambos, R. (1941). The sensory basis of obstacle avoidance by flying bats. J. Exp. Zool. 86, 481-506.

Hedrick, T. L. (2008). Software techniques for two- and three-dimensional kinematic measurements of biological and biomimetic systems. Bioinspir. Biomim. 3, 034001.

Heffner, R. S. and Heffner, H. E. (1992). Visual factors in sound localization in mammals. J. Comp. Neurol. 317, 219-232.

Heffner, R. S., Koay, G. and Heffner, H. E. (1999). Sound localization in an Old-World fruit bat (Rousettus aegyptiacus): acuity, use of binaural cues, and relationship to vision. J. Comp. Psychol. 113, 297-306.

Holderied, M. W. and von Helversen, O. (2003). Echolocation range and wingbeat period match in aerial-hawking bats. Proc. R. Soc. B 270, 2293-2299.

Höller, P. and Schmidt, U. (1996). The orientation behaviour of the lesser spearnosed bat, Phyllostomus discolor (Chiroptera) in a model roost: Concurrence of visual, echoacoustical and endogenous spatial information. J. Comp. Physiol. A 179, 245-254.

Hope, G. M. and Bhatnagar, K. P. (1979). Electrical response of bat retina to spectral stimulation: comparison of four microchiropteran species. Experientia 35, 1189-1191.

Hughes, M. (1996). The function of concurrent signals: visual and chemical communication in snapping shrimp. Anim. Behav. 52, 247-257.

Jakobsen, L., Brinkløv, S. and Surlykke, A. (2013). Intensity and directionality of bat echolocation signals. Front. Physiol. 4, 89.

Jensen, M. E., Miller, L. A. and Rydell, J. (2001). Detection of prey in a cluttered environment by the northern bat Eptesicus nilssonii. J. Exp. Biol. 204, 199-208.

Lawrence, B. and Simmons, J. (1982). Echolocation in bats: the external ear and perception of the vertical positions of targets. Science 218, 481-483.

Layne, J. N. (1967). Evidence for the use of vision in diurnal orientation of the bat Myotis austroriparius. Anim. Behav. 15, 409-415.

Marimuthu, G. and Neuweiler, G. (1987). The use of acoustical cues for prey detection by the Indian False Vampire Bat, Megaderma lyra. J. Comp. Physiol. A 160, 509-515.

McGowan, K. A. and Kloepper, L. N. (2020). Different as night and day: wild bats modify echolocation in complex environments when visual cues are present. Anim. Behav. 168, 1-6.

McGuire, L. P. and Fenton, M. B. (2010). Hitting the wall: Light affects the obstacle avoidance ability of free-flying little brown bats (Myotis lucifugus). Acta Chiropt. 12, 247-250.

Mistry, S. (1990). Characteristics of the visually guided escape response of the Mexican free-tailed bat, Tadarida brasiliensis mexicana. Anim. Behav. 39, 314-320.

Munoz, N. E. and Blumstein, D. T. (2012). Multisensory perception in uncertain environments. Behav. Ecol. 23, 457-462.

Parker, J. L. and Robinson, C. W. (2017). Auditory and visual contributions to multisensory integration. CogSci 6, 2858-2863.

Partan, S. and Marler, P. (1999). Communication goes multimodal. Science 283, 1272-1273.

Partan, S. R., Fulmer, A. G., Gounard, M. A. M. and Redmond, J. E. (2010). Multimodal alarm behavior in urban and rural gray squirrels studied by means of observation and a mechanical robot. Curr. Zool. 56, 313-326.

Partan, S. R., Larco, C. P. and Owens, M. J. (2009). Wild tree squirrels respond with multisensory enhancement to conspecific robot alarm behaviour. Anim. Behav. 77, 1127-1135.

Posner, M. I., Nissen, M. J. and Klein, R. M. (1976). Visual dominance: an information-processing account of its origins and significance. Psychol. Rev. 83, 157-171.

Russo, D., Jones, G. and Arlettaz, R. (2007). Echolocation and passive listening by foraging mouse-eared bats Myotis myotis and M. blythii. J. Exp. Biol. 210, 166-176.

Ryan, M. J. (1987). The role of prey-generated sounds, vision, and echolocation in prey localization by the African bat Cardioderma cor (Megadermatidae). J. Comp. Physiol. A 161, 59-66.

Schnitzler, H.-U. and Kalko, E. K. (2001). Echolocation by insect-eating bats. Bioscience 51, 557-569.

Shettleworth, S. J. (2009). Cognition, Evolution, and Behavior. Oxford University Press.

Stilz, W.-P. and Schnitzler, H.-U. (2012). Estimation of the acoustic range of bat echolocation for extended targets. J. Acoust. Soc. Am. 132, 1765-1775.

Suthers, R., Chase, J. and Braford, B. (1969). Visual form discrimination by echolocating bats. Biol. Bull. 137, 535-546.

Thiagavel, J., Cechetto, C., Santana, S. E., Jakobsen, L., Warrant, E. J. and Ratcliffe, J. M. (2018). Auditory opportunity and visual constraint enabled the evolution of echolocation in bats. Nat. Commun. 9, 98.

Uetake, K. and Kudo, Y. (1994). Visual dominance over hearing in feed acquisition procedure of cattle. Appl. Anim. Behav. Sci. 42, 1-9.

Ward, A. J. W. and Mehner, T. (2010). Multimodal mixed messages: the use of multiple cues allows greater accuracy in social recognition and predator detection decisions in the mosquitofish, Gambusia holbrooki. Behav. Ecol. 21, 1315-1320.

Warnecke, M., Macías, S., Falk, B. and Moss, C. F. (2018). Echo interval and not echo intensity drives bat flight behavior in structured corridors. J. Exp. Biol. 221, jeb191155.

Welch, R. B. and Warren, D. H. (1980). Immediate perceptual response to intersensory discrepancy. Psychol. Bull. 88, 638-667.

Wilcoxon, H. C., Dragoin, W. B. and Kral, P. A. (1971). Illness-induced aversions in rat and quail: relative salience of visual and gustatory cues. Science 171, 826-828.

Williams, T. C., Williams, J. M. and Griffin, D. R. (1966). The homing ability of the neotropical bat Phyllostomus hastatus, with evidence for visual orientation. Anim. Behav. 14, 468-473.

Witten, I. B. and Knudsen, E. I. (2005). Why seeing is believing: merging auditory and visual worlds. Neuron 48, 489-496.

Competing interests

The authors declare no competing or financial interests.