To navigate well through three-dimensional environments, animals must in some way gauge the distances to objects and features around them. Humans use a variety of visual cues to do this, but insects, with their small size and rigid eyes, are constrained to a more limited range of possible depth cues. For example, insects attend to relative image motion when they move, but cannot change the optical power of their eyes to estimate distance. On clear days, the horizon is one of the most salient visual features in nature, offering clues about orientation, altitude and, for humans, distance to objects. We set out to determine whether flying fruit flies treat moving features as farther off when they are near the horizon. Tethered flies respond strongly to moving images they perceive as close. We measured the strength of steering responses while independently varying the elevation of moving stimuli and the elevation of a virtual horizon. We found that responses to vertical bars increased as their bases fell to more negative elevations relative to the horizon, a trend closely correlated with the inverse of apparent distance. In other words, a bar that dips far below the horizon elicits a strong response, consistent with using the horizon as a depth cue. Wide-field motion also had an enhanced effect below the horizon, but this emerged only when flies were additionally motivated by hunger. These responses may help flies tune behaviors to nearby objects and features when they are too far off for motion parallax.

To navigate with precision and speed, flying insects must approximate the distances of the visible features of their environment. But retinal images do not contain inherent distance information (Land and Nilsson, 2012), so animals infer distance with secondary cues. Every distance cue has limitations, and only provides good estimates when certain conditions are met (Howard, 2002). Humans use several depth cues, some of which are reliable in circumstances where others are not (Landy et al., 1995). The horizon is one of the most conspicuous features of the outdoor world, and both humans and insects use its image for a variety of purposes. Our goal was to determine whether fruit flies, like humans, are able to use angular distance from the horizon to judge features as closer.

Images are captured when light from the three-dimensional world focuses onto a two-dimensional surface. This encodes the relative angles between features, but not the distances to them, and images that appear close together on the retina may represent objects that are far apart in the world (Land and Nilsson, 2012). Depth cues allow animals some ability to reconstruct the distance information necessary to model the three-dimensional environment (Howard, 2002).

When scanning a landscape, catching a ball or watching a movie, we rely on an array of depth cues that give scenes an intrinsic quality of distance (Landy et al., 1995). Insects, with rigid exoskeletons, cannot use ocular convergence or accommodation of focal length to judge distance (Srinivasan, 1992). Further, although several insects have binocular overlap (Beersma et al., 1977), only a few groups have been shown to use stereopsis to determine depth (Rossel, 1983; Eriksson, 1985), and the tiny separation between their eyes limits its use to short distances (Collett, 1987).

For familiar objects, some insects use larger retinal image sizes to indicate proximity, such as bees searching for flowers (Cartwright and Collett, 1983). For insects in motion, the best depth cue is motion parallax (Lehrer et al., 1988), in which changing the viewing position displaces the images of nearby objects. This is related to stereopsis, but rather than a fixed offset between two eyes, the viewer's own motion provides the offset. Several insects generate motions specifically to produce parallax, such as locusts peering before a jump (Wallace, 1959; Collett, 1978), and hymenopterans performing intricate orientation flights before foraging (Wehner, 1981; Zeil, 1993). In other cases, the parallax generated from natural flight allows insects to gauge distances (Kirchner and Srinivasan, 1989; Srinivasan et al., 1991) and modulate flight speed (Preiss, 1987). Fruit flies use motion parallax while walking on the ground (Schuster et al., 2002) and during flight (Cabrera and Theobald, 2013).

However, when animals are not translating, or visual elements are far off, motion parallax provides no distance information. Humans retain depth perception through a range of minor depth cues (Landy et al., 1995), for example when viewing photographs (Benson and Yonas, 1973). One such indirect cue is an image's retinal elevation, or its angular offset from a visible horizon. Frogs use retinal elevation as a distance indicator when striking at prey (Collett and Udin, 1988); fiddler crabs use visual elevation above the horizon as the crucial test to distinguish conspecifics from predators (Layne et al., 1997; Zeil and Hemmi, 2006); tiger beetles prefer prey farther from the horizon, presumably judging it as closer (Layne et al., 2006). Insects additionally use the horizon to correct attitude deviations (Berry et al., 2007) and use nearby horizontal edges to regulate altitude (Straw et al., 2010). For humans, an image's angular offset from the horizon is an indication of distance, with objects near the horizon appearing farther off (Ooi et al., 2001). This is one common, though probably incomplete, explanation for the moon illusion, in which the full moon near the horizon seems substantially larger (presumably because it seems farther off) than it does high in the sky a couple of hours later (when its actual angular extent is slightly larger) (Rock and Kaufman, 1962; Kaufman and Kaufman, 2000).

We set out to determine whether flying fruit flies, like humans, perceive visual features as farther off when they appear near the horizon. Drosophila melanogaster steer in response to image motion in their frontal visual region, and are particularly adept at tracking vertical bars (Reichardt and Wenking, 1969; Maimon et al., 2008). Flies respond more strongly to images they perceive as close objects (Wehner and Horn, 1975; Cabrera and Theobald, 2013), probably because they represent the features most relevant to making good steering decisions. We hypothesized that, in the absence of other depth cues, flies would steer more strongly to moving images placed farther from the horizon. To test this, we measured steering effort of tethered fruit flies while independently varying the elevations of a virtual horizon and a moving visual feature.

We tethered fruit flies to the center of a projection-based visual flight arena (Fig. 1A) (Cabrera and Theobald, 2013) where they viewed perspective-corrected images of vertical bars and an artificial horizon (Fig. 1B). The figures show approximate intensities only, as actual intensities in the arena differ from those viewed on a monitor or printed on a page. Tethered flies respond to lateral bar motion by varying left and right wing beat amplitudes, producing a difference (ΔWBA), which generates yaw torque (Fig. 1C) (Götz, 1987; Tammero et al., 2004). Our bars were 11 deg wide, spanned 60 vertical degrees, and were shaded such that the contrast above and below the horizon was the same, but with opposite sign. The bar moved left and right 45 deg from the center line at 1 Hz. We took the correlation between the bar motion and ΔWBA as a measure of steering effort. In closed-loop tracking, flies can track light and dark bars with different dynamics (Reiser and Dickinson, 2008), but open-loop tracking of vertically centered bars in our arena against the light or dark backgrounds produced statistically indistinguishable correlations (dark background mean correlation=0.268±0.066, N=42, bright background mean correlation=0.267±0.028, N=49, t=0.199, P=0.842).
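
As a minimal illustration of this measure (not the analysis code used for the experiments), the sketch below correlates a hypothetical ΔWBA trace with a 1 Hz, ±45 deg triangle-wave bar trajectory sampled at the 200 Hz wing beat analyzer rate described in the Methods; the variable names and the simulated trace are ours.

```python
import numpy as np
from scipy import signal, stats

fs = 200.0                                    # DWBA samples per second (see Methods)
t = np.arange(0, 2.0, 1 / fs)                 # two cycles of 1 Hz bar motion
bar = 45.0 * signal.sawtooth(2 * np.pi * t + np.pi / 2, width=0.5)  # deg, +/-45 triangle wave

# Hypothetical DWBA trace: a scaled, slightly delayed, noisy copy of the bar motion
rng = np.random.default_rng(0)
dwba = 0.02 * np.roll(bar, int(0.05 * fs)) + 0.3 * rng.standard_normal(t.size)

r, p = stats.pearsonr(bar, dwba)              # tracking effort = correlation with bar motion
print(f'tracking correlation r = {r:.2f}')
```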

Bar tracking effort varies with the angle between the visual horizon and bar bottom

We varied the center elevation of a bar between −30 and 30 deg, both with and without a horizon at 0 deg (Fig. 2A), then varied the elevation of the horizon between −30 and 30 deg, with a bar whose center was at 0 deg (Fig. 2B). This created two analogous sets of stimuli, in which the bar and horizon elevation did not match, but their relative positions did. Responses to lateral bar motion were robust, and flies tracked multiple cycles with little delay (Fig. 2C,D). But both the actual and apparent elevation of the bar determined how strongly wing beat amplitudes correlated with bar motion (Fig. 2E,F). A bar elevated by 30 deg, such that its bottom met the horizon, produced relatively weak tracking, but when the bottom was 60 deg below the horizon, it generated much stronger tracking (Fig. 2E, white circles). Because both relative and actual elevation might affect responses, we removed the horizon to examine the effect of intrinsic elevation (Fig. 2E, gray circles). Elevated bars produced a response that was indistinguishable with or without a horizon, and lower bars still produced stronger tracking, but tracking at the lowest elevation was significantly stronger in the presence of a horizon (t=3.11, P=0.0015, N=49). This relatively minor effect of the visual horizon may imply that when a horizon is not visible, flies behave as if it were central. But the horizon effect is strongly apparent when it is shifted up or down. When we held the bar elevation to the middle and moved the horizon, a higher horizon elicited progressively stronger tracking responses (Fig. 2F).

Multiple wing beat changes accompany tracking correlation

Several factors could affect the correlation values we observed. To investigate the aspects of the raw traces (Fig. 2C,D) that produce the correlations (Fig. 2E,F), we extracted potentially relevant factors. We estimated phase and amplitude by fitting our data to the triangular waveform of the bar motion, optimizing with a Levenberg–Marquardt gradient method to minimize the least squares error. The amplitude had no direct effect on Pearson's correlation, which is scale free, but higher positions relative to the horizon corresponded to decreased tracking amplitude (Fig. 3A), and phase lag generally increased (Fig. 3B). Both corresponded to weaker tracking at greater apparent distances. The effects were more pronounced when the bar was elevated (white circles in Fig. 3A,B), and the phase lag effect did not hold for negatively elevated horizons (black circles). The residual variance in the ΔWBA, or the turns that did not correspond to bar motion, showed no clear correspondence to relative bar elevation (Fig. 3C), although the highest horizon elevation produced reduced tracking variation (left-most black marker in Fig. 3C). In free flight, high horizontal edges induce flies to increase altitude (Straw et al., 2010). To determine whether this affected tethered flies, we examined the sum of wing beat amplitudes (ΣWBA), which increases when flies compensate for upwards optic flow (Theobald et al., 2010). Mean ΣWBA increased slightly as the horizon appeared higher (Fig. 3D, N=49). To determine whether the increased ΣWBA might itself create stronger tracking correlations, we examined the ΣWBA and tracking responses of individual flies viewing a central horizon, the middle bar of Fig. 3D (Fig. 3E, N=49). ΣWBA appeared to be uncorrelated with tracking, implying these are separate effects. However, variation partly results from differences in tethering and fly placement, which affect the strength of the signal over the sensor. To account for this, we subtracted the initial ΣWBA value to plot only the change in the sum of wing beat amplitudes that followed a stimulus. This similarly showed that the change in ΣWBA (ΔΣWBA) from the onset of the stimulus had no clear link to tracking (Fig. 3F, N=49).
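
A minimal sketch of this fitting step, using simulated data: scipy's Levenberg–Marquardt solver (method='lm' in optimize.least_squares) stands in for whatever implementation was actually used, and the function names and parameter values are ours.

```python
import numpy as np
from scipy import signal, optimize

def triangle(t, amp, phase, freq=1.0):
    """Triangle wave matching the 1 Hz bar trajectory, with free amplitude and phase."""
    return amp * signal.sawtooth(2 * np.pi * freq * t + phase, width=0.5)

def fit_tracking(t, dwba):
    """Least-squares (Levenberg-Marquardt) fit of amplitude and phase to a DWBA trace."""
    residual = lambda p: triangle(t, p[0], p[1]) - dwba
    fit = optimize.least_squares(residual, x0=[1.0, 0.0], method='lm')
    amp, phase = fit.x
    lag_s = -phase / (2 * np.pi)               # phase lag in seconds at 1 Hz
    return amp, lag_s

# Example with a simulated trace: amplitude 0.8, 50 ms lag, plus noise
t = np.arange(0, 2.0, 1 / 200.0)
dwba = triangle(t, 0.8, -2 * np.pi * 0.05) \
       + 0.1 * np.random.default_rng(1).standard_normal(t.size)
print(fit_tracking(t, dwba))
```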

Angle below the horizon can indicate distance

If a few assumptions are met, a viewer can estimate the distance to an object that touches the ground simply by measuring the angle of its base below the horizon. If the ground is flat, the viewer is near the ground and the horizon is far off, then the horizontal distance to the object (d) is simply: d=a/tanθ, where a is the viewer's altitude and θ is the angular distance from the horizon down to the object's bottom (Fig. 4A).
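
As a worked example under the same assumptions (the altitude and angles below are ours, chosen only for illustration):

```latex
\[
d = \frac{a}{\tan\theta}, \qquad \text{e.g. } a = 1:\quad
\theta = 45^\circ \Rightarrow d = 1, \qquad
\theta = 30^\circ \Rightarrow d \approx 1.7, \qquad
\theta \to 0^\circ \Rightarrow d \to \infty .
\]
```

With a=1, the reciprocal 1/d equals tanθ, the term that appears in the inverse-distance fits considered below.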

To better evaluate the effect of the horizon and elevation on tracking, we examined the fits of a series of simple linear regressions. We first considered the angle of the bar bottom below the horizon to fit tracking responses (R) from Fig. 2E,F: R=mθ+b, where m and b are the slope and intercept of the regression. These produced good fits (Fig. 4B) for both the variable bar elevation (white circles, dotted line, r2=0.81, intercept=0.140, slope=−0.004, slope different from 0: t=5.56, P<0.001, N=49) and the variable horizon elevation (black circles, solid line, r2=0.83, intercept=0.172, slope=−0.003, slope different from 0: t=3.39, P<0.001, N=49). The intercepts here represent the tracking response expected when the bar bottom is on the horizon, and the slopes are the rate at which they increase as the angle becomes more negative.

However, the variable horizon data in Fig. 4B (black circles) noticeably curve, dipping below our straight line fit. To address this, we considered the apparent distance implied by a bar meeting the assumptions above: R=ma/tanθ+b. We used a constant value of 1 for a, and removed the condition of the bar bottom touching the horizon, as this implies an infinite distance (which produces an outlier effect). Even so, the fits were considerably worse (Fig. 4C) for both the variable bar elevation (white circles, dotted line, r2=0.44, intercept=0.324, slope=−0.020, slope different from 0: t=2.33, P=0.010, N=49) and the variable horizon elevation (black circles, solid line, r2=0.45, intercept=0.270, slope=−0.015, slope not statistically different from 0: t=1.62, P=0.053, N=49). Here, the intercepts represent the responses expected if the bar were at zero distance, and the slopes describe the decrease to bars farther off.

Some behaviors increase disproportionately with nearness, so we finally considered the fit to the inverse of apparent distance: R=ma tanθ+b. This produced better fits than apparent distance (Fig. 4D), with the variable bar elevation slightly lower than bar angle alone (white circles, dotted line, r2=0.76, intercept=0.163, slope=0.129, slope different from 0: t=5.35, P<0.001, N=49) but an excellent fit for the variable horizon elevation, much better than bar angle alone (black circles, solid line, r2=0.95, intercept=0.182, slope=0.112, slope different from 0: t=4.29, P<0.001, N=49). In this case, the intercepts represent the response to a bar at infinity, and the slopes describe how the response increases in proportion to the inverse of distance, that is, as bars appear closer. Because this produced the best fit for the variable horizon experiment, and the best fit overall, we illustrated these fits in Fig. 2E,F. The slopes of the variable bar and variable horizon fits were not statistically distinguishable (t=0.47, P=0.682).
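
A minimal sketch of this model comparison, using placeholder response values rather than the measured data. With a fixed at 1, the three predictors reduce to θ, 1/tanθ and tanθ; the bar-on-horizon condition is dropped from the apparent-distance fit as in the text, and θ here is taken as positive when the bar bottom is below the horizon, so slope signs differ from those reported above.

```python
import numpy as np

# Angle of the bar bottom below the horizon (deg, positive = below) and
# hypothetical mean tracking correlations -- placeholders, not the measured data.
theta_deg = np.array([0.0, 15.0, 30.0, 45.0, 60.0])
R = np.array([0.17, 0.21, 0.24, 0.28, 0.30])
theta = np.radians(theta_deg)

def line_fit(x, y):
    """Ordinary least squares for y = m*x + b; returns slope, intercept and r^2."""
    m, b = np.polyfit(x, y, 1)
    r2 = 1.0 - np.var(y - (m * x + b)) / np.var(y)
    return m, b, r2

below = theta > 0                                             # exclude the bar-on-horizon case
print('angle          ', line_fit(theta_deg, R))              # R = m*theta + b
print('apparent dist. ', line_fit(1 / np.tan(theta[below]), R[below]))  # R = m*(a/tan) + b
print('inverse dist.  ', line_fit(np.tan(theta), R))          # R = m*(tan/a) + b, with a = 1
```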

Reversing contrast decreases the horizon's effect on lateral tracking

On land, natural light generally arrives from the sky, then illuminates the ground, which absorbs some of it. Because of this, the sky above the horizon is frequently, but not always, brighter than the ground, even at night. To assess the importance of this arrangement to the horizon's effect on tracking responses, we reversed contrast, displaying darker regions above and brighter regions below the horizon. Although lower bars still produced greater responses, the reversed horizon reduced tracking at each bar elevation (Fig. 5A; intercept=0.233, slope=0.121, slope different from 0: t=3.349, P<0.001, N=42). The response to the highest bar could not be distinguished from zero (t=1.13, P=0.263, N=42). When varying the elevation of the reversed horizon, its effects on tracking were minimal, and possibly absent entirely, as the slope of the regression could not be distinguished from zero (Fig. 5B; intercept=0.137, slope=0.031, slope not statistically different from 0: t=0.898, P=0.815, N=42).

Fixing the head in place reduces the horizon's effect on lateral tracking

Changing head position is a straightforward behavior by which the horizon could modulate tracking. If head pitch angle were fixed by the horizon, then objects far from the horizon would produce retinal images far from the midline. This is how fiddler crabs identify predators (Layne, 1998). To determine the contribution of head motion, we fixed fly heads to the thorax during tethering. Importantly, this blocks lateral as well as vertical head motion, and will inhibit behaviors that depend on lateral gaze shifts. But recent studies have shown that if background motion is absent, head fixation does not prevent fruit flies from tracking vertical moving features (Fox and Frye, 2014). Consistent with this, bars elevated against the horizon produced comparable tracking in flies with free and glued heads, but for bars lower relative to the horizon, flies with glued heads showed a muted increase in tracking (Fig. 6A,B, compared with responses from Fig. 2E,F shown in gray). The linear fit for the bar elevation conditions produced an intercept and slope of 0.255 and 0.083, respectively (Fig. 6A; slope different from 0: t=2.81, P=0.002, N=62). The fit for the horizon elevation conditions had an intercept and slope of 0.207 and 0.044, and no significant support for a slope different from 0 (Fig. 6B; t=1.48, P=0.06, N=62). To determine whether retinal location explains the reduced response, we took videos of fly heads from the side during bar tracking with a variable-height horizon. We used custom-written software to estimate the angle of head tilt in each frame, at 100 frames s−1 (images shown in Fig. 6C). The results showed highly variable head angles that did not strongly correlate with horizon angle (Fig. 6D, N=15). Although a horizon below 0 deg was associated with a lower head tilt, so was a horizon above 0 deg, implying a complex relationship between horizon and head tilt.

Wide-field tracking responses can also vary with the visible horizon

Both small visual features, such as isolated spots or vertical bars, and wide visual fields, such as the optic flow produced by self-motion, elicit robust responses in flies (Krapp et al., 2012; Maimon et al., 2008). But natural scenes contain intermediate visual stimuli, and responses from flies with blocked motion-sensitive T4–T5 cells imply that the neural processing is more nuanced than simple small-field and wide-field motion detection (Bahl et al., 2013; Fenk et al., 2014). We set out to determine whether the visible horizon modulates wide-field optic flow responses differently from bar-tracking responses. Wide fields of dots that simulate translational or rotational motion produce changes in ΔWBA that work to correct the apparent deviation from a forward heading (Theobald et al., 2010). These responses are robust and repeatable, although the correlation between motion and tracking is often lower than for bars (Mazo and Theobald, 2014). Further, translational motions produce stronger responses than rotational motions, but provide the independent depth cue of motion parallax, and additionally generate responses that inherently vary with elevation (Mazo and Theobald, 2014). To avoid these effects, we presented rotational flow fields. We produced flow fields as described previously (Cabrera and Theobald, 2013), but limited them to a spherical segment of varying elevation that, if wrapped all the way around the azimuth, would subtend a solid angle of 2 steradians. Fitting non-overlapping, 2 steradian segments required us to modify the elevations tested compared with the bars. The dots simulated oscillation of 22 deg around a vertical axis. We measured the effect of displaying the horizontally moving segment in the upper, middle or lower region of the arena with a visible horizon (Fig. 7A,B). The dots were shaded like the bars, to have the same contrast against the light background above the horizon and the dark background below.
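
The 2 steradian constraint fixes how tall each segment's elevation band must be. The sketch below is our own illustration, using the standard solid-angle formula for a full-azimuth band, Ω = 2π(sin e_high − sin e_low); the example band is not necessarily one of the elevations actually tested.

```python
import numpy as np

def band_solid_angle(e_low, e_high):
    """Solid angle (sr) of a full-azimuth band between two elevations (deg)."""
    return 2 * np.pi * (np.sin(np.radians(e_high)) - np.sin(np.radians(e_low)))

def upper_edge(e_low, omega=2.0):
    """Upper elevation (deg) so the band starting at e_low spans omega steradians."""
    return np.degrees(np.arcsin(np.sin(np.radians(e_low)) + omega / (2 * np.pi)))

# A 2 sr band centred on the equator runs from roughly -9.2 to +9.2 deg of elevation
half = np.degrees(np.arcsin(1.0 / (2 * np.pi)))
print(round(half, 1), round(band_solid_angle(-half, half), 2))

# Stacking a second, non-overlapping 2 sr band on top pushes its upper edge to ~28.5 deg
print(round(upper_edge(half), 1))
```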

The dots generated weaker responses that initially varied little with horizon elevation. To enhance the responses, we removed flies from bottles 2 h before tethering and provided them with water but no food. The hungry flies tracked dot fields with amplitudes greater than those for tracking bars (Fig. 7C,D; N=39), and with comparable correlations (Fig. 7E,F). This contrasts with the unmotivated responses of flies that were on food immediately before tethering, which showed low correlations with the stimulus motion under all experimental conditions (Fig. 7E,F; N=156). The responses grew as the stimulus was lowered relative to the horizon, similar to bar tracking, but with differences: in both cases they were poorly fitted by an inverse of apparent distance equation. Our bars always extended down to the horizon, but dot segments were sometimes completely above it. We originally predicted that this positive angular distance from the horizon would induce stronger responses, just as distance below the horizon does, but this was not the case. Responses were stronger to dot fields below the horizon, and weaker to dot fields above, regardless of whether the dot field or the horizon had shifted. Altering the equation for fitting the line produced little improvement, because the responses seemed to saturate both above and below the horizon.

These results indicate that fruit flies track visual features with greater fidelity when they dip below a visual horizon (Fig. 2). As flies track apparently nearer features more strongly (Schuster et al., 2002; Cabrera and Theobald, 2013), this result is consistent with their using angular distance to the horizon as a minor depth cue, as humans do (Ooi et al., 2001). The tracking correlations are partly explained by variations in the phase and amplitude of bar tracking (Fig. 3), and are well modeled by an inverse response to apparent distance (Fig. 4). This modulation of tracking strength is reduced if the contrast between the sky and ground is reversed, or if the fly's head is glued in place (Figs 5, 6). Initial data showed that fixing the head position did not affect the base response to bars, but impaired the enhanced tracking to bars below the horizon, suggesting that flies might modulate responses by tilting their heads to alter retinal images (Fig. 6A,B). But examining head tilts directly showed a more complex relationship (Fig. 6D), in which flies lift their heads most for horizons at middle elevations. Another possibility is that flies execute brief head motions up to the horizon that are not easily seen in our analysis, but are important for steering modulation. Finally, when vertical bars were replaced with wide-field patterns, the responses were more nuanced. Initially, flies showed reduced tracking that was not modulated by elevation relative to the horizon, but motivating the flies with mild starvation strongly enhanced responses to dots below the horizon (Fig. 7). This response was qualitatively similar to bar tracking, but with subtly different aspects, and could imply a more complex role for the horizon when the stimulus is not a target. It additionally raises questions about the conditions under which tracking responses vary with internal physiological states. Bar tracking in the arena is already strong, but it may also be modified by hunger. Other states, such as thirst, virginal status or age, may further alter responses. This capacity to alter behavior, sometimes greatly, with internal state may be one key to the surprising success of insect brains, and elevation may offer a promising direction for further experiments.

In experimental settings the horizon may produce unintended effects

In laboratory settings, visual stimuli for flies are often delivered with cylinders, such as patterned drums or LED arrays. These arenas can generate, from the perspective of the fly, a line of contrast with the rest of the room at the upper and lower boundaries. Our results imply that this arrangement could alter steering responses. For example, if a moving bar were presented in a dark cylinder in a bright room, we might expect enhanced tracking, because the bar sinks below a bright horizon. If the same stimulus were delivered in a fully enclosed arena, or with room lights off, the tracking response might be muted. Most behavioral experiments will properly hold elements like room lighting constant between experiments, so the visual horizon is unlikely to affect the qualitative results. But in some designs this may be an artifact, and introduce unwanted variation. In certain situations it could provide a way to enhance otherwise subtle tracking responses.

In nature, horizon effects will depend on the characteristics of the visual scene

Our experimental flies viewed a highly simplified visual world. In natural environments, the coupling between distance and the horizon is more nuanced than a simple line of contrast. Except at sea, the visual horizon is rarely the Earth's horizon. Rather, we see mountain ranges, tree lines and haze that elevate, blur and obscure the line between ground and sky. In many cluttered visual scenes, such as in a forest, the horizon may be completely invisible. But such busy scenes may frequently be rich with parallax information, in which slight body motions will reveal distances with much more certainty than horizon cues could anyway. Fly brains may, like our own, somehow weigh the reliability of different depth cues depending on the characteristics of a visual scene.

Natural images have other characteristics that signal distance and may be discernible to a fly eye. Objects appear to have lower contrast and smaller size, and are more likely to be occluded, when they are far off. However, these properties are likely to reduce fundamental motion detection, and therefore inherently reduce tracking strength. A fly cannot robustly track an image that makes no contrast against the background, for example. But the effect measured here may function in situations where a visual feature is difficult to range by size or contrast. Trees are an obvious example of a vertical object that will motivate a fruit fly to approach, but they can vary in size by orders of magnitude, and often produce good visual contrast even when far off. In other words, unless a tree is near enough to determine its distance by motion parallax, it can be difficult, even for a human, to decide how far off it is. In addition to tracking, horizon cues could potentially affect tuning. The amplitude of fly steering responses is tuned to different oscillation frequencies (Duistermars et al., 2007a,b), which means visual features moving at different speeds. Object speed is a minor depth cue for humans, as in some circumstances slower objects seem more distant (especially objects in free fall). If the optimal tuning is adaptive, flies might benefit by shifting it to slower speeds when the horizon indicates objects are farther off. This hypothetical effect might be nullified, though, if object motion appears to be a result of self-rotation, in which case distance would not affect image speed.

Depth cues allow both animals and computer vision systems to model the three-dimensional world from two-dimensional images. They have potential relevance to nearly any sighted organism, but flying insects are a special case. They continue to pose a paradox by using small brains and poor vision to control seemingly sophisticated aerial navigation. They can navigate around obstacles with quick precision, or steadfastly head to far-off destinations. Even tiny fruit flies can travel great distances (Coyne et al., 1987), but to travel efficiently they must keep a constant heading as wind gusts alter their position and orientation. Fruit flies are now known to use the polarization pattern of the sky to maintain a constant direction over long distances (Weir and Dickinson, 2012). A depth cue that functions beyond the range of motion parallax could help them both choose distant targets and enhance flight stability in wide open spaces. Here, we have shown that visual elevation below the horizon could serve as a sensory depth cue for fruit flies when other cues are absent.

Subjects

We chose flies from a colony derived from wild-caught Drosophila melanogaster (Meigen), fed on standard media and kept on a 12 h:12 h light:dark cycle. We cold-anesthetized females, 3–6 days after eclosion, and attached a rigid tungsten tether, 0.02 mm in diameter, to the dorsal prothorax. We then placed flies upside down with a small piece of paper at their feet, which stopped them from beating their wings before an experiment began. After rewarming the flies for approximately an hour, we suspended them in the center of the flight arena, a cube of projection surfaces, and removed the paper, causing them to initiate tethered flight.

Visual stimulus

The flight arena was a Perspex cube, with sides of 200 mm, inlaid with back-projection screen material. Four first-surface mirrors, angled at 45 deg to each side, allowed a single projector (Lightspeed Designs DepthQ 360) to illuminate five sides of the cube (Fig. 1A). We used custom-written OpenGL code to drive a high-end graphics card (GeForce GTX 480) and animate the cube with perspective-corrected, three-dimensional scenes at 360 frames s−1. From the center of the cube, the scene appears continuous, even as shapes move from one surface to another (Fig. 1A). The four rendered scenes that reflect off the mirrors were mathematically reversed to account for the reflection. The front face displayed 229×229 pixels (2.5 pixels deg−1) and the side faces displayed 200×200 pixels (2.2 pixels deg−1). The small difference is due to the shorter path between the projector and the front surface, which displays an image of the same size but at a higher pixel density. These experiments were performed with the room lights on, producing a maximal contrast between dark and light areas in the arena of 85%.
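
These densities are consistent with the cube geometry if, as we assume here, each square face spans 90 deg along its central horizontal and vertical axes when viewed from the center of the cube (our arithmetic, not taken from the original):

```latex
\[
\frac{229\ \mathrm{px}}{90\ \mathrm{deg}} \approx 2.5\ \mathrm{px\ deg^{-1}},
\qquad
\frac{200\ \mathrm{px}}{90\ \mathrm{deg}} \approx 2.2\ \mathrm{px\ deg^{-1}}.
\]
```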

Each experiment consisted of open-loop presentations of bar motion, interspersed with 3 s bouts of closed-loop fixation of a striped bar. During fixation, the fly's wing beats controlled the position of a rotating vertical bar, which improved responsiveness to the experimental presentations (Heisenberg and Wolf, 1979; Reichardt and Wenking, 1969). During experimental presentations, the striped bar was replaced by a gray bar, usually 11 deg wide and 60 deg tall, and an artificial horizon appeared at some elevation, generally brighter above and dimmer below, to match natural conditions. The intensity of the bar was scaled to produce the same contrast against the bright background above the horizon and the dark background below. The bar moved left and right following a triangle waveform, at 1 Hz, for two full cycles.

Steering responses

At the center of the visual arena, a tethered fly was positioned between an infrared LED above and a pair of photodiodes below. Each wing cast an infrared shadow onto a photodiode through each beat cycle, 200 times s−1, with larger wing beats occluding more light (Tammero et al., 2004). These signals fed into a dedicated wing beat analyzer, which output a voltage proportional to the wing beat amplitude of each stroke. Although this did not capture the three-dimensional dynamics of a wing stroke, the difference between the left and right amplitudes (ΔWBA) is proportional to yaw torque when flies are connected to a torque meter (Götz, 1987; Tammero et al., 2004). The amplitude difference was visible to the naked eye when flies were exposed to imposed optic flow, and the resulting ΔWBA voltage signal was accurate enough that flies could precisely fixate the rotating stripe in closed-loop feedback between tests. In addition to wing stroke size, the brightness of the LED also affected the measured amplitude, and we increased the brightness for the experiments in Fig. 7, as initial trials yielded weaker responses. As absolute amplitude was comparable only within experiments, we used the Pearson correlation between bar motion and ΔWBA, which is scale independent, as our measure of tracking response.

Acknowledgements

The authors would like to thank Melissa Coates for helpful feedback and careful reading of the manuscript.

Author contributions

J.C.T., C.M. and J.C. planned the experiments and analysis. J.C. and C.M. performed the experiments. I.R.-P. performed the image analysis. J.C.T., J.C., C.M. and I.R.-P. wrote the manuscript.

Funding

This research received no specific grant from any funding agency in the public, commercial or not-for-profit sectors.

References

Bahl, A., Ammer, G., Schilling, T. and Borst, A. (2013). Object tracking in motion-blind flies. Nat. Neurosci. 16, 730-738.
Beersma, D. G. M., Stavenga, D. G. and Kuiper, J. W. (1977). Retinal lattice, visual field and binocularities in flies. J. Comp. Physiol. A 119, 207-220.
Benson, C. and Yonas, A. (1973). Development of sensitivity to static pictorial depth information. Percept. Psychophys. 13, 361-366.
Berry, R., van Kleef, J. and Stange, G. (2007). The mapping of visual space by dragonfly lateral ocelli. J. Comp. Physiol. A 193, 495-513.
Cabrera, S. and Theobald, J. C. (2013). Flying fruit flies correct for visual sideslip depending on relative speed of forward optic flow. Front. Behav. Neurosci. 7, 76.
Cartwright, B. A. and Collett, T. S. (1983). Landmark learning in bees: experiments and models. J. Comp. Physiol. A 151, 521-543.
Collett, T. S. (1978). Peering - a locust behaviour pattern for obtaining motion parallax information. J. Exp. Biol. 76, 237-241.
Collett, T. S. (1987). Binocular depth vision in arthropods. Trends Neurosci. 10, 1-2.
Collett, T. S. and Udin, S. B. (1988). Frogs use retinal elevation as a cue to distance. J. Comp. Physiol. A 163, 677-683.
Coyne, J. A., Bryant, S. H. and Turelli, M. (1987). Long-distance migration of Drosophila. 2. Presence in desolate sites and dispersal near a desert oasis. Am. Nat. 129, 847-861.
Duistermars, B. J., Reiser, M. B., Zhu, Y. and Frye, M. A. (2007a). Dynamic properties of large-field and small-field optomotor flight responses in Drosophila. J. Comp. Physiol. A 193, 787-799.
Duistermars, B. J., Chow, D. M., Condro, M. and Frye, M. A. (2007b). The spatial, temporal and contrast properties of expansion and rotation flight optomotor responses in Drosophila. J. Exp. Biol. 210, 3218-3227.
Eriksson, E. S. (1985). Attack behaviour and distance perception in the Australian bulldog ant Myrmecia nigriceps. J. Exp. Biol. 119, 115-131.
Fenk, L. M., Poehlmann, A. and Straw, A. D. (2014). Asymmetric processing of visual motion for simultaneous object and background responses. Curr. Biol. 24, 2913-2919.
Fox, J. L. and Frye, M. A. (2014). Figure-ground discrimination behavior in Drosophila. II. Visual influences on head movement behavior. J. Exp. Biol. 217, 570-579.
Götz, K. G. (1987). Course-control, metabolism and wing interference during ultralong tethered flight in Drosophila melanogaster. J. Exp. Biol. 128, 35-46.
Heisenberg, M. and Wolf, R. (1979). On the fine structure of yaw torque in visual flight orientation of Drosophila melanogaster. J. Comp. Physiol. A 130, 113-130.
Howard, I. P. (2002). Seeing in Depth, Vol. 1: Basic Mechanisms. Toronto: University of Toronto Press.
Kaufman, L. and Kaufman, J. H. (2000). Explaining the moon illusion. Proc. Natl. Acad. Sci. USA 97, 500-505.
Kirchner, W. H. and Srinivasan, M. V. (1989). Freely flying honeybees use image motion to estimate object distance. Naturwissenschaften 76, 281-282.
Krapp, H. G., Taylor, G. K. and Humbert, J. S. (2012). The mode-sensing hypothesis: matching sensors, actuators and flight dynamics. In Frontiers in Sensing (ed. F. G. Barth, J. A. C. Humphrey and M. V. Srinivasan), pp. 101-114. Vienna: Springer.
Land, M. F. and Nilsson, D.-E. (2012). Animal Eyes. Oxford: Oxford University Press.
Landy, M. S., Maloney, L. T., Johnston, E. B. and Young, M. (1995). Measurement and modeling of depth cue combination: in defense of weak fusion. Vision Res. 35, 389-412.
Layne, J. E. (1998). Retinal location is the key to identifying predators in fiddler crabs (Uca pugilator). J. Exp. Biol. 201, 2253-2261.
Layne, J., Land, M. and Zeil, J. (1997). Fiddler crabs use the visual horizon to distinguish predators from conspecifics: a review of the evidence. J. Mar. Biol. Assoc. UK 77, 43-54.
Layne, J. E., Chen, P. W. and Gilbert, C. (2006). The role of target elevation in prey selection by tiger beetles (Carabidae: Cicindela spp.). J. Exp. Biol. 209, 4295-4303.
Lehrer, M., Srinivasan, M. V., Zhang, S. W. and Horridge, G. A. (1988). Motion cues provide the bee's visual world with a third dimension. Nature 332, 356-357.
Maimon, G., Straw, A. D. and Dickinson, M. H. (2008). A simple vision-based algorithm for decision making in flying Drosophila. Curr. Biol. 18, 464-470.
Mazo, C. and Theobald, J. C. (2014). To keep on track during flight, fruitflies discount the skyward view. Biol. Lett. 10, 20131103.
Ooi, T. L., Wu, B. and He, Z. J. (2001). Distance determined by the angular declination below the horizon. Nature 414, 197-200.
Preiss, R. (1987). Motion parallax and figural properties of depth control flight speed in an insect. Biol. Cybern. 57, 1-9.
Reichardt, W. and Wenking, H. (1969). Optical detection and fixation of objects by fixed flying flies. Naturwissenschaften 56, 424.
Reiser, M. B. and Dickinson, M. H. (2008). A modular display system for insect behavioral neuroscience. J. Neurosci. Methods 167, 127-139.
Rock, I. and Kaufman, L. (1962). The moon illusion, II: The moon's apparent size is a function of the presence or absence of terrain. Science 136, 1023-1031.
Rossel, S. (1983). Binocular stereopsis in an insect. Nature 302, 821-822.
Schuster, S., Strauss, R. and Götz, K. G. (2002). Virtual-reality techniques resolve the visual cues used by fruit flies to evaluate object distances. Curr. Biol. 12, 1591-1594.
Srinivasan, M. V. (1992). Distance perception in insects. Curr. Dir. Psychol. Sci. 1, 22-26.
Srinivasan, M. V., Lehrer, M., Kirchner, W. H. and Zhang, S. W. (1991). Range perception through apparent image speed in freely flying honeybees. Vis. Neurosci. 6, 519-535.
Straw, A. D., Lee, S. and Dickinson, M. H. (2010). Visual control of altitude in flying Drosophila. Curr. Biol. 20, 1550-1556.
Tammero, L. F., Frye, M. A. and Dickinson, M. H. (2004). Spatial organization of visuomotor reflexes in Drosophila. J. Exp. Biol. 207, 113-122.
Theobald, J. C., Ringach, D. L. and Frye, M. A. (2010). Dynamics of optomotor responses in Drosophila to perturbations in optic flow. J. Exp. Biol. 213, 1366-1375.
Wallace, G. K. (1959). Visual scanning in the desert locust Schistocerca gregaria Forskål. J. Exp. Biol. 36, 512-525.
Wehner, R. (1981). Spatial vision in arthropods. In Handbook of Sensory Physiology (ed. H. Autrum), pp. 287-616. Berlin; Heidelberg; New York: Springer.
Wehner, R. and Horn, E. (1975). The effect of object distance on pattern preferences in the walking fly, Drosophila melanogaster. Experientia 31, 641-643.
Weir, P. T. and Dickinson, M. H. (2012). Flying Drosophila orient to sky polarization. Curr. Biol. 22, 21-27.
Zeil, J. (1993). Orientation flights of solitary wasps (Cerceris; Sphecidae; Hymenoptera). II: Similarities between orientation and return flights and the use of motion parallax. J. Comp. Physiol. A 172, 207-222.
Zeil, J. and Hemmi, J. M. (2006). The visual ecology of fiddler crabs. J. Comp. Physiol. A 192, 1-25.

Competing interests

The authors declare no competing or financial interests.