A Skeleton Key for Our Emotions
Emotions are sometimes thought of as ethereal things. We know we have them, and we know they are real, but there isn’t any way to measure them. Emotions, however, originate in a very real, very physical part of ourselves: the brain. With the right technology, we might be able to “see” an emotion in the same way we use an X-ray to look at a broken bone. In fact, that technology might be a lot closer than you realize.
Maryam Shanechi, the Andrew and Erna Viterbi Early Career Chair and assistant professor of electrical and computer engineering, is a leader in brain-machine interfaces (BMIs). Often used to translate brain signals into actions and help restore motor function in injured patients, BMIs offer a direct pathway to the brain. By building a sophisticated, in-depth map of exactly where and how a patient’s movements are represented in the brain, a BMI can read out their movement intention to restore motor function.
Now imagine what it would be like if the same thing could be applied to emotions. After all, depression, pain, anxiety and a host of other disorders are, at least in part, disorders of the brain, just like lost motor function might be. Current treatments for severe depression, for example, don’t always target each individual’s needs, and too often rely on expensive or addictive medications.
Treating negative mood disorders requires a better understanding of where and how they express themselves in the brain. The challenge is that human emotions aren’t so simple. Think about it: You have the most complex piece of problem-solving hardware in the known universe, the human brain, combined with the most illogical, narcissistic, unfocused piece of software in the known universe, the human mind.
Shanechi and her team tackled the first part of the challenge earlier, developing the first “mood decoder”: a machine learning technique that reads brain activity and assesses a patient’s mood more effectively than anything before it.
But locating the problem was only half the battle. Shanechi then developed a novel electrical stimulation wave, combined with a second machine learning technique that predicts how different levels of deep brain stimulation will affect a patient’s mood. This was accomplished with an input-output algorithm that randomly varies the frequency and amplitude of the wave, which acts as an input to the brain much like a medication does. To put this two-step process more simply: the decoder provides a map of how emotional disorders are represented in the brain, and the prediction technique then determines how and where electrical stimulation should be applied to address the disorder.
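For illustration only, the two-step idea can be sketched in code. Everything below is invented for the sketch: the linear decoder weights, the response-model coefficients, and the simple grid search that stands in for the randomized input-output probing of frequency and amplitude.

```python
# Step 1: a stand-in "mood decoder" that maps brain-activity features
# to a single mood score (here just a linear model with given weights).
def decode_mood(brain_features, weights):
    return sum(f * w for f, w in zip(brain_features, weights))

# Step 2: a stand-in response model predicting how much a stimulation
# input (frequency in Hz, amplitude in mA) would improve mood.
# The coefficients are invented so the toy model has a single best dose.
def predict_mood_change(frequency_hz, amplitude_ma):
    return (0.05 * frequency_hz - 0.0003 * frequency_hz ** 2
            + 0.4 * amplitude_ma - 0.1 * amplitude_ma ** 2)

# A grid search over candidate settings stands in for the randomized
# probing described above: try inputs, keep the best predicted response.
def best_stimulation(frequencies, amplitudes):
    return max(
        ((f, a) for f in frequencies for a in amplitudes),
        key=lambda fa: predict_mood_change(*fa),
    )
```

For example, `best_stimulation(range(10, 200, 10), [1.0, 2.0, 3.0])` returns `(80, 2.0)`, the candidate setting with the largest predicted mood improvement under this toy model.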
Of course, no two brains are alike. Two heart disease patients might have similar test results and benefit from similar medication, but the same is not true for emotional disorders. And this brings us to the most exciting part of the whole process. The electrical wave Shanechi developed to predict how stimulation will affect a patient can be personalized. Think of it like a skeleton key for emotions: no matter what you’re feeling and where in the brain you’re feeling it, the technique can predict the right dose of electrical stimulation for you.
“By putting these two boxes together,” Shanechi explained, “we hope to build closed-loop brain-machine interfaces that adjust the dose of electrical stimulation therapy by tracking the symptoms in real time based on brain activity, and by predicting how a change in stimulation can change the activity and thus these symptoms.”
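The loop Shanechi describes has the shape of a textbook feedback controller: decode the symptom from brain activity, predict the effect of each candidate dose, apply the dose expected to best cancel the symptom, and repeat. A minimal sketch of one such iteration, with every model passed in as a placeholder function (nothing here reflects the actual system):

```python
def closed_loop_step(read_brain, decode, predict_effect, candidate_doses, apply_dose):
    """One iteration of a hypothetical closed-loop controller: estimate the
    symptom score from brain activity, then choose the dose whose predicted
    effect brings that score closest to zero."""
    symptom = decode(read_brain())
    best_dose = min(candidate_doses, key=lambda d: abs(symptom + predict_effect(d)))
    apply_dose(best_dose)
    return symptom, best_dose

# Toy usage: the symptom is read directly as a number, and each unit of
# dose is predicted to reduce the symptom score by one unit.
applied = []
state = closed_loop_step(
    read_brain=lambda: 5.0,          # pretend brain reading
    decode=lambda x: x,              # identity "decoder"
    predict_effect=lambda d: -d,     # each dose unit cancels one symptom unit
    candidate_doses=[0.0, 2.0, 4.0, 7.0],
    apply_dose=applied.append,
)
```

Run repeatedly, with the predicted effect updated as new brain activity comes in, this is the "closed loop" idea: stimulation is continuously re-dosed based on what the brain is doing right now.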
Shanechi’s work is a major step toward a form of personalized medicine for mental and emotional disorders that could one day change the face of mental health treatment.
So, what kinds of feelings does that give you?