What the Nose Knows

by Lindsay Borthwick

New insights into the complexity of olfaction, and more highlights from Kavli Institutes in Neuroscience

Two fascinating studies from the Kavli Institutes for Neuroscience underscore the complexity of olfaction, and how much brain researchers still have to learn about it. Researchers at Rockefeller have for the first time captured olfactory receptors and odor molecules in an atomic embrace. The snapshots are causing neuroscientists to rethink the lock-and-key model of olfaction found in most textbooks, which is based on the idea that the shape of an odor molecule determines the receptors into which it fits. Another study suggests that the populations of sensory neurons that respond to odors aren’t stable. Instead, the neurons that respond to a specific smell change over time. This finding is raising eyebrows among neuroscientists who are trying to explain how the brain maintains a consistent picture of the outside world—sweet (and not-so-sweet) smells and all.

Synthesizing Song

Researchers have translated a bird’s brain activity into song, a step toward the goal of recreating speech from the brain signals of people who have lost the ability to speak. The work is the result of a collaboration between neuroscientist Timothy Gentner and engineer Vikash Gilja, both based at the University of California, San Diego (UC San Diego), and was partly funded by the Kavli Institute for Brain and Mind. In the proof-of-concept study, the researchers used silicon electrodes implanted in the brain to record neural activity in the sensorimotor cortex of zebra finches as they sang. They then used machine learning algorithms to translate the birds’ brain activity patterns into patterns of sound. The results faithfully reproduced the birds’ vocalizations, including pitch, volume and timbre. Next, the researchers plan to adapt their system to recreate birdsong from brain activity in real time. “With our collaboration,” said Gentner in a news article, “we are leveraging 40 years of research in birds to build a speech prosthesis for humans: a device that would not simply convert a person’s brain signals into a rudimentary set of whole words but give them the ability to make any sound, and so any word, they can imagine, freeing them to communicate whatever they wish.”
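The study itself used a more sophisticated pipeline, but the core decoding idea can be illustrated with a minimal sketch: map a short window of binned neural activity onto the spectrogram frames of the sound produced at the same moments. Everything below is an assumption for illustration, with simulated data standing in for real recordings.

```python
# A minimal sketch of neural-to-song decoding, not the study's actual pipeline.
# All shapes, names, and the simulated data below are assumptions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-ins for real recordings: binned spike counts from 64 channels and
# the spectrogram frames of the song produced at the same time bins.
n_bins, n_channels, n_freqs = 6000, 64, 128
spikes = rng.poisson(lam=2.0, size=(n_bins, n_channels)).astype(float)
spectrogram = rng.random((n_bins, n_freqs))

# Give the decoder a short window of neural history before each sound frame.
window = 5
X = np.hstack([np.roll(spikes, shift, axis=0) for shift in range(window)])
X, y = X[window:], spectrogram[window:]  # drop wrap-around rows

# Hold out the end of the session so the test segment is truly unseen.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, shuffle=False)
decoder = Ridge(alpha=1.0).fit(X_train, y_train)
print("held-out R^2:", decoder.score(X_test, y_test))
# A vocoder step (e.g., Griffin-Lim) would turn predicted frames into audio.
```

A real-time version, the team’s stated next step, would apply the same mapping to each new window of activity as it streams in, rather than to a recorded session after the fact.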

Twist and Turn

To better understand the brain, researchers are rapidly developing new ways to image brain activity in live, freely behaving mice. One of the most promising technologies is a miniaturized two-photon microscope mounted on the head of an animal. However, this kind of device still has its limits—a key one being the cables that tether the hardware on the mouse’s head to a base instrument. These cables limit an animal’s freedom of movement, making the experiments less natural than they could be. Now, a research team led by Xingde Li, a professor of biomedical engineering at Johns Hopkins University who specializes in neurophotonics, has created a compact and lightweight fiberscope that can image the brain in freely rotating and walking rodents. The team recently published a paper in Optica reporting the breakthrough technology, which recorded the activity of more than 50 neurons simultaneously. Li is a member of the Kavli Neuroscience Discovery Institute at Hopkins.

Chemical Conundrum

The Atlantic’s Ed Yong, who recently won the Pulitzer Prize for Explanatory Reporting, covers surprising new research results that are causing neuroscientists to rethink how the brain recognizes objects and maintains a stable picture of the outside world in the face of constant change. In a new article, Yong describes a series of experiments, led by Columbia University researchers Carl Schoonover, Andrew Fink and colleagues, focused on odor perception in mice. In a study published in Nature, the team found that the neurons in a rodent’s piriform cortex that respond to a given smell, such as the scent of an apple, aren’t stable. Instead, they change over time. Yet a mouse can still recognize an apple, or grass, or many other odors it may encounter. This phenomenon has been observed by other researchers in other parts of the brain and even has a name: representational drift. It also has neuroscientists scratching their heads, wondering whether it is a generalized mechanism: “We have a hunch that this should be the rule rather than the exception,” Schoonover told Yong. “The onus now becomes finding the places where it doesn’t happen.” Schoonover and Fink are postdoctoral fellows in Richard Axel’s lab, which is affiliated with the Kavli Institute for Brain Science.
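One rough way to picture what drift looks like in the data: simulate a neural population whose odor tuning is partially re-randomized each recording session, then correlate every session’s response to the same odor against the first. This toy sketch is not the Columbia group’s analysis, and all the numbers in it are invented.

```python
# A toy model of representational drift, not the published analysis.
# Each "session", a fixed fraction of the population's odor tuning is
# replaced with fresh random tuning; we then correlate each session's
# response to the same odor against session 0.
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_sessions, drift_rate = 500, 8, 0.15  # invented numbers

responses = [rng.normal(size=n_neurons)]  # population response at session 0
for _ in range(n_sessions - 1):
    responses.append((1 - drift_rate) * responses[-1]
                     + drift_rate * rng.normal(size=n_neurons))

for s, r in enumerate(responses):
    corr = np.corrcoef(responses[0], r)[0, 1]
    print(f"session {s}: correlation with session 0 = {corr:.2f}")
```

The correlation decays steadily across sessions even though, at any single session, the population carries a perfectly usable odor code; the puzzle Yong describes is how downstream circuits keep reading that code as it moves.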

Molecular Embrace

What does an olfactory receptor bound to an odor molecule actually look like? A team of researchers from Rockefeller University, led by neurobiologist Vanessa Ruta, has taken the first snapshot of the interaction at the molecular level. Researchers have been awaiting such a picture for decades in order to understand olfaction at its earliest stages. Quanta Magazine covered the new research, posted earlier this year to the preprint server bioRxiv, in a new article, “Secret Workings of Smell Receptors Revealed for the First Time.” The study is helping to answer an age-old question: How does the brain distinguish among an almost unlimited number of scents using a limited number of odor receptors? Using cryo-electron microscopy, the Rockefeller team captured an insect’s olfactory receptor by itself or bound to one of two odor molecules: eugenol, which is found in cloves, nutmeg and cinnamon, or DEET. When the researchers compared the bound and unbound structures, they were surprised to find that both molecules slipped into a deep pocket in the receptor, suggesting it “is doing a more holistic recognition of the molecule, as opposed to just detecting any specific structural feature,” Ruta told Quanta. “It’s just a very different chemical logic.” Ruta is a member of the Kavli Neural Systems Institute at Rockefeller.

Rx for Depression

As rates of depression skyrocket, researchers are seeking to understand how the mental health condition differs between individuals and how to develop personalized treatments that go beyond medication. Among those researchers is Jyoti Mishra, an assistant professor in the Department of Psychiatry at the University of California, San Diego (UC San Diego). In a new study, Mishra and her team followed 14 people for a month and generated a personalized prediction of depressive mood for each one. The predictions were based on brain activity and lifestyle factors such as sleep, exercise, diet and stress, some of which were collected using widely available technologies like cell phone apps and smartwatches. The team used machine learning to analyze the data and identify the predictors of low mood in each individual, which could then be used to guide treatment. For example, if a person’s depressive symptoms were linked to poor sleep, interventions could target their sleep habits.

“We should not be approaching mental health as one size fits all. Patients will benefit by having more direct and quantified insight into how specific behaviors may be feeding their depression. Clinicians can leverage this data to understand how their patients might be feeling and better integrate medical and behavioral approaches for improving and sustaining mental health,” said Mishra, who is a member of the Kavli Institute for Brain and Mind at UC San Diego.
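The per-person modeling idea described above can be sketched in a few lines. The feature names, simulated data and choice of model below are assumptions for illustration, not the team’s actual code.

```python
# A hedged sketch of per-person mood modeling in the spirit of the study;
# the feature names, data, and model choice are assumptions, not the team's.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)

features = ["sleep_hours", "exercise_min", "diet_quality", "stress_level"]
n_days = 30  # one month of daily app/wearable measurements for one person

X = rng.random((n_days, len(features)))
# Simulate an individual whose mood tracks sleep most strongly (assumed).
mood = 0.7 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(scale=0.1, size=n_days)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, mood)
for name, importance in zip(features, model.feature_importances_):
    print(f"{name}: {importance:.2f}")
# If sleep dominates this person's importances, sleep-focused interventions
# would be the natural place to start, as the article describes.
```

Fitting one small model per patient, rather than one model for everyone, is what makes the resulting predictors individual, which is the point Mishra emphasizes above.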