(Originally published by NTNU)
November 2, 2018
Even the most basic moves in life, like getting out of bed in the morning, require far more coordination than one might think. Neuroscientists may have just uncovered key aspects of how the brain controls body posture during these kinds of everyday movements.
On October 31, 1905, a British surgeon named Sir Victor Horsley removed a 6-cm diameter ball of tissue and tumour from the brain of a man called George M.
George M.’s surgery gave him some relief from the seizures caused by the tumour. Yet he would probably have been forgotten had two curious neurologists, Henry Head and Gordon Holmes, not followed up on George M. and dozens of other patients like him.
Their goal was to learn how the brain worked by observing what didn’t work in patients who had had different parts of the brain damaged or removed.
As they examined patient after patient, the two doctors began to see that when a part of the brain called the parietal cortex was damaged, the patients could lose their sense of where certain body parts were in space.
The patients weren’t blind. They could see a body part, like an arm or a hand or a leg. But when they closed their eyes, they were unable to tell where it was.
This may seem trivial, yet it’s anything but. Your mostly unconscious sense of where you are in space is formed in your brain from the integration of all your senses. It allows you to touch type, scratch your head, get a beer from the fridge, and move from one position to the next.
Without this sense of the body in space, which Head and Holmes called “body schema,” the researchers wrote in a paper in 1911, “we could not probe with a stick, nor use a spoon unless our eyes were fixed upon the plate.”
More than 100 years after this idea of body schema was first described, researchers at NTNU’s Kavli Institute for Systems Neuroscience have found that the areas of the brain responsible for movement planning and spatial navigation — the posterior parietal cortex and the frontal motor cortex — are hugely responsive to the posture of the body.
In other words, the researchers believe the neurons they recorded are sending signals the brain uses to help create the body schema, that sense of self in space.
Their paper, “Efficient Cortical Coding of 3D Posture in Freely Behaving Rats,” has just been published in Science.
Six cameras and seven infrared tracking points
The researchers were interested in understanding more about what happens in the posterior parietal cortex and the frontal motor cortex.
They set up a 2-metre octagonal box with six cameras and outfitted 11 rats with special 3-D-printed drives, which allowed the researchers to record a total of 800 neurons in one region and 700 in the other. Each rat also carried seven infrared tracking points: four on the head and three spaced along the back towards the tail.
The rats spent 20 minutes in the box, where they roamed, explored and occasionally found a piece of chocolate cookie, all while their movements were recorded on camera and the probes on their heads reported which neurons were firing when.
The set-up allowed the researchers to measure the animal’s movement in 3-D — not only where it went in the box in its quest for treats, but whether it was turning its head, or rearing up, or twisting in a particular direction.
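The paper's own tracking pipeline is not reproduced here, but the kind of posture variable such markers make available can be sketched as follows. The marker coordinates and the `pitch_deg` helper are purely illustrative assumptions, not the study's code:

```python
import numpy as np

def pitch_deg(front, back):
    """Pitch angle (degrees) of the vector from a rear marker to a
    front marker, relative to the horizontal plane. Positive = tilted up."""
    v = np.asarray(front, float) - np.asarray(back, float)
    horiz = np.hypot(v[0], v[1])          # length of the x-y projection
    return np.degrees(np.arctan2(v[2], horiz))

# Hypothetical marker coordinates (x, y, z) in metres:
head_front = [0.10, 0.00, 0.08]   # marker near the nose
head_back  = [0.05, 0.00, 0.05]   # marker at the back of the head
print(round(pitch_deg(head_front, head_back), 1))   # prints 31.0
```

The same angle computed between back markers would distinguish, say, a rat on all fours from one rearing up, which is the sort of 3-D postural variable the neurons turned out to track.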
What they found was that when the rats were in a “default” position, roaming on all fours with the head lowered, relatively few neurons were engaged in keeping track of the animal’s body position. But when the rat moved out of this default position, such as rearing up on two legs to sniff something, many more neurons fired to take on the task.
A 3-D view of behaviour
“This experimental set-up allowed us to see for the first time how these neurons responded during 3-D behaviour in a freely moving animal,” said Jonathan Whitlock, senior author of the paper and head of a research group at the Kavli Institute. “The most detailed knowledge we have about these areas has come from head-fixed paradigms, in which animals make simple movements of their hand, arm or eyes. Here we could see for the first time how the brain responded during unrestrained movement of the body, in its native 3-D.”
The researchers also conducted some of the tests in darkness, wearing night-vision goggles (the rats, of course, did not) to see what the animals were doing. This was to make sure that the signals they were recording reflected not just what the rat saw, but how the rat actually moved.
“We wanted to be certain that what these cells were relying on was not just vision,” said Bartul Mimica, the first author of the paper and a PhD candidate in the Kavli Institute’s Whitlock group.
Statistical model helped clarify patterns
In the end, the researchers used the data to match the firing of the neurons to the movement and postures they had recorded with the infrared sensors and the six cameras. They then developed a statistical model that allowed them to sort through and interpret all the data.
The model “allowed us to see what the neurons were responding to,” said Benjamin Dunn, who developed the model and is starting as an associate professor in data science at NTNU’s Department of Mathematical Sciences. They tested the model’s robustness by turning it around: could the neural firing data they had collected “predict” what the rat was doing, without looking at the movements recorded by the cameras? It could.
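The authors’ actual statistical model is more sophisticated, but this “turn the data around” decoding test can be sketched with a minimal nearest-centroid decoder. All the numbers below (neuron count, firing rates, posture classes) are synthetic assumptions for illustration, not the study’s data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 3 posture classes, each with its own mean firing
# pattern across 20 neurons.
n_neurons, n_trials = 20, 300
class_means = rng.uniform(0.5, 5.0, size=(3, n_neurons))   # Hz
labels = rng.integers(0, 3, size=n_trials)                 # true posture
spikes = rng.poisson(class_means[labels])                  # counts per bin

# Fit: mean spike-count vector per posture on a training half.
train, test = np.arange(0, 150), np.arange(150, 300)
centroids = np.stack([spikes[train][labels[train] == k].mean(axis=0)
                      for k in range(3)])

# Decode: assign each held-out time bin to the nearest centroid.
dists = np.linalg.norm(spikes[test][:, None, :] - centroids[None], axis=2)
accuracy = np.mean(dists.argmin(axis=1) == labels[test])
print(f"decoding accuracy: {accuracy:.2f}")   # well above chance (0.33)
```

If the decoded postures match the camera record far better than chance, the neurons genuinely carry postural information, which is the logic of the robustness check described above.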
Whitlock said it was especially exciting to see their findings support the 100-year-old observations by early neuroscientists.
“It was a real ‘aha-moment’ reading over the work of Head & Holmes, Balint, and other neurologists, since to me it seemed very clear that the neural signals we were seeing in our rats were probably what was missing in these patients,” he said.
The analysis also allowed the researchers to discover an unexpected detail. Because neutral positions, such as walking around on all fours, required far fewer neurons to fire than uncommon postures, such as rearing up, the researchers interpreted this as a way for the brain to conserve energy. That matters because the brain can consume as much as 25 per cent of the body’s daily energy budget.
“It is very efficient for the metabolic consumption of the cells,” said Tuce Tombaz, one of the paper’s co-authors and also a PhD candidate at the Kavli Institute’s Whitlock group. “Why would we have cells firing all the time for postures that we occupy all the time? This way the neurons don’t have to expend a lot of energy to code for every posture.”
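A toy calculation, with entirely invented occupancy fractions and firing rates, shows why reserving strong firing for rare postures lowers the average metabolic cost:

```python
# Fraction of time spent in each posture (invented numbers):
occupancy = {"default": 0.80, "rearing": 0.15, "twisting": 0.05}

# Firing rates (Hz) under two coding schemes (also invented):
sparse = {"default": 1.0, "rearing": 15.0, "twisting": 15.0}
dense  = {posture: 15.0 for posture in occupancy}

def avg_rate(rates):
    """Time-weighted average firing rate across postures."""
    return sum(occupancy[p] * rates[p] for p in occupancy)

print(round(avg_rate(sparse), 2), round(avg_rate(dense), 2))   # 3.8 15.0
```

Because the default posture dominates the time budget, firing hard only for the rare postures cuts the average spike rate to roughly a quarter in this toy example, which is the intuition behind Tombaz’s remark.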
From basic research to learning and robotics
The Kavli researchers are interested in understanding how the brain works, without any specific clinical or applied goal in mind. But understanding how hard the brain works when it comes to posture as compared to movement might help inform a range of disciplines, the researchers said.
The clearest application could be in understanding how to better treat stroke patients whose strokes have damaged this area of the brain, Dunn said. “You can’t fix a problem if you don’t understand it,” he said.
Another application might be in robotics, he added.
“The brain has always been an inspiration for artificial intelligence,” Dunn said.
“Knowing how we represent the relationship of our bodies as we move around could push the next generation of robots closer to human-like learning of movement and interaction in an ever-changing environment.”
But then there’s also the larger question of why the brain is organized in this way, Mimica said.
“Why would this most sophisticated part of the brain care so much about this?” he said. “There must be a reason for this. Uncovering this reason, maybe by finding some little detail, we can learn some deeper truths about the brain and how it is organized. What we’re doing is homing in on a precise account of what is going on in the brain.”