“The central focus of my laboratory is to understand the neural control of visual, olfactory, and multisensory behavior. We study the fruit fly Drosophila to leverage techniques in neurogenetics, systems identification, ‘virtual reality’ behavior, and in-vivo imaging.”

THE PROBLEM

Right now your brain is processing massive amounts of information about your surroundings – colors, shapes, smells, sounds – to determine what is worthy of your attention. Perception is so automatic that you don’t need to think about it. It is intuitive and unconscious. Yet even the most sophisticated human-engineered systems can’t replicate it. What if we could understand exactly how the brain processes reality, down to the level of individual cells? That knowledge could enable science fiction-style breakthroughs: prosthetic limbs that converse with the brain, search robots that smell as well as a bloodhound, an implantable chip that repairs blindness. The first step toward these transformative inventions is to ‘reverse-engineer’ the neural technology that evolution has honed over hundreds of millions of years.

THE APPROACH

Research in the Frye lab is shedding light on the brain’s astonishing skills of perception. Combining tools from biology and engineering – genetics, brain imaging, and virtual reality simulators for fruit flies – the lab is investigating how the brain perceives the world as it happens. Why fruit flies? Those little bugs that eat your old bananas actually perceive the world in much the same way humans do. Yet their brains have orders of magnitude fewer cells than ours, and scientists have invented ways to precisely control the genes and molecules underlying brain function. The lab’s research is unique in that it monitors the brain while the fly actively perceives its environment in real time. Researchers can watch neurons fire in response to, say, a new smell or the movement of an object in the animal’s visual field, and then directly connect the cellular response to changes in the animal’s attention. In doing so, the lab is discovering the cellular building blocks of perception.

by Emily Rose, Senior Strategy Officer, UCLA Health Sciences Development

TWO-PHOTON IMAGING · VISUAL FIXATION · FIGURE-GROUND DISCRIMINATION · WALKING OPTOMOTOR BEHAVIOR · ODOR TRACKING · MULTISENSORY INTEGRATION

FUNDING