Technologies for Neuronal Mapping of Motion Detection and Color Vision Systems of Drosophila

Project Collaborators (sorted by Institute): 
Intramural Research Program, NIBIB
Section on Neuronal Connectivity, Cell Regulation and Development Affinity Group, NICHD
Project Brief: 

SPIS has been collaborating with NICHD's Section on Neuronal Connectivity and with NIBIB to develop one-of-a-kind virtual-reality (VR) behavioral systems to assess the functions of Drosophila visual circuits. These VR systems comprise custom electronics, opto-electronics, imaging systems, mechanical hardware, and software. Because vertebrates share similar visual functions and neural circuit architectures with Drosophila, the research will provide a better understanding of how these systems receive, process, and interpret visual stimuli associated with motion and color. SPIS developed custom instruments that use a battery of behavioral tests to identify the specific neural circuit elements (e.g., via manipulation of neuronal connections) that detect motion and distinguish colors. Another instrument, currently under development, generates ultraviolet visual stimuli while the activity of visual neurons is observed with two-photon calcium imaging. By synchronizing the stimuli with the calcium imaging, specific neural pathways can be detected and mapped.


VR motion detection prototype system and machine vision screenshot

(a) Vision system motion detection instrumentation.  The fly is suspended in the center of the arena while a visual pattern rotates around the center pedestal. (b) As the pattern rotates, a machine-vision matching algorithm measures the tilt angle of the fly's head to determine whether the fly is reacting to the motion.  Speed and direction are varied to confirm the behavior.
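The matching step above pairs the observed frame against candidate poses of a reference template. The actual SPIS algorithm is not published here, so the following is only a toy sketch of the idea: rotate a binary head template through candidate angles and report the angle whose rotation best overlaps the observed frame. Angles are restricted to 90-degree steps so that the rotation is exact without interpolation.

```python
# Hedged sketch of template matching over candidate rotation angles.
# The template shape, overlap score, and 90-degree angle grid are all
# illustrative assumptions, not the SPIS implementation.

def rotate90(grid, times):
    """Rotate a square binary grid counter-clockwise by 90 degrees, `times` times."""
    for _ in range(times % 4):
        grid = [list(row) for row in zip(*grid)][::-1]
    return grid

def match_score(a, b):
    """Count pixels where the rotated template and the observed frame agree."""
    return sum(pa == pb for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))

def estimate_tilt(frame, template):
    """Return the candidate angle (degrees) whose rotated template best matches the frame."""
    best = max(range(4), key=lambda k: match_score(rotate90(template, k), frame))
    return best * 90

# Toy asymmetric "head" template (an L-shape) and a frame rotated 90 degrees CCW.
template = [[1, 0, 0],
            [1, 0, 0],
            [1, 1, 0]]
frame = rotate90(template, 1)
print(estimate_tilt(frame, template))  # -> 90
```

A real tracker would search a fine angle grid with interpolated rotation (or use normalized cross-correlation), but the structure of the search is the same.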

VR color discrimination prototype system and machine vision screenshot

(a) Vision system color discrimination instrumentation. The fly is partially tethered (free to rotate about its center axis) in the center of an octagonal arena while LED array panels display blue and green colors.  An infrared camera mounted below the arena images the hovering fly. A laser is used as negative reinforcement during training. (b) The software displays the fly silhouette with quadrants defined by the LED color currently displayed.  An image-processing algorithm determines the point-of-view angle of the fly as it rotates in response to the color stimulation.
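The point-of-view angle above is extracted from the thresholded silhouette. The specific SPIS algorithm is not given, so this sketch shows one standard approach: the principal-axis angle computed from the second-order central moments of the foreground pixels.

```python
import math

# Hedged sketch: orientation of a binary silhouette from image moments.
# The pixel representation (a list of (x, y) foreground coordinates) is an
# illustrative assumption; this is not the SPIS implementation.

def silhouette_angle(pixels):
    """Return the principal-axis angle in degrees for a set of (x, y) foreground pixels."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    mu20 = sum((x - cx) ** 2 for x, _ in pixels)
    mu02 = sum((y - cy) ** 2 for _, y in pixels)
    mu11 = sum((x - cx) * (y - cy) for x, y in pixels)
    return math.degrees(0.5 * math.atan2(2 * mu11, mu20 - mu02))

# A diagonal streak of pixels yields a 45-degree principal axis.
print(silhouette_angle([(0, 0), (1, 1), (2, 2), (3, 3)]))
```

Note that the principal axis is ambiguous by 180 degrees; a real tracker would resolve head versus tail with an extra cue (e.g., silhouette asymmetry or frame-to-frame continuity).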

VR ultraviolet stimuli with confocal fluorescence microscope calcium imaging

(a) The instrument currently under development includes a 16x24 ultraviolet LED array.  The intensity of each LED is controlled with custom LED driver hardware and software.  A meta-language was developed to let the investigator automate complex stimulation sequences.  External triggers synchronize the visual stimuli with confocal microscope image acquisition. (b) 3D model of the LED array integrated with the confocal microscope.  The stage on the right will secure the fly, with the LED array stimulus hardware positioned on the left.
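The custom meta-language itself is unpublished, so the command set below (SET/WAIT/TRIG) is purely an illustrative assumption. The sketch shows the general shape of such a system: a script is compiled into a time-stamped event list that a driver loop could replay against the LED hardware, with trigger events reserved for the microscope sync line.

```python
# Hedged sketch of a minimal stimulation-sequence interpreter.
# Commands, syntax, and the 16x24 array bounds are hypothetical, not the
# actual SPIS meta-language.

ROWS, COLS = 16, 24

def compile_script(script):
    """Parse a stimulation script into a list of (time_ms, event, args) tuples."""
    t, events = 0, []
    for line in script.strip().splitlines():
        parts = line.split()
        if not parts or parts[0].startswith("#"):
            continue  # skip blank lines and comments
        cmd, args = parts[0].upper(), parts[1:]
        if cmd == "SET":        # SET <row> <col> <intensity 0-255>
            r, c, level = map(int, args)
            assert 0 <= r < ROWS and 0 <= c < COLS and 0 <= level <= 255
            events.append((t, "SET", (r, c, level)))
        elif cmd == "WAIT":     # WAIT <milliseconds>
            t += int(args[0])
        elif cmd == "TRIG":     # pulse the external sync line (e.g., to the microscope)
            events.append((t, "TRIG", ()))
        else:
            raise ValueError(f"unknown command: {cmd}")
    return events

demo = """
# flash one UV LED, then sync the microscope
SET 3 7 255
WAIT 100
TRIG
WAIT 50
SET 3 7 0
"""
for event in compile_script(demo):
    print(event)
```

Compiling to a timed event list (rather than executing commands inline) is what makes it straightforward to align stimulus timestamps with the image-acquisition triggers afterwards.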

Awards: 
  • 2016 NIH Director’s Award:  Using virtual-reality systems to dissect neural circuits of color-vision

  • 2012 NICHD Scientific Director’s Intramural Award: Integration of Chromatic Information in the Higher Visual Center of Drosophila