Archive

Posts Tagged ‘brain mapping’

A Glance at the Brain’s Circuit Diagram

December 16, 2012

A new method facilitates the mapping of connections between neurons.

The human brain accomplishes its remarkable feats through the interplay of an unimaginable number of neurons that are interconnected in complex networks. A team of scientists from the Max Planck Institute for Dynamics and Self-Organization, the University of Göttingen and the Bernstein Center for Computational Neuroscience Göttingen has now developed a method for decoding neural circuit diagrams. Using measurements of total neuronal activity, they can determine the probability that two neurons are connected with each other.

The human brain consists of around 80 billion neurons, none of which lives or functions in isolation. The neurons form a tight-knit network that they use to exchange signals with each other. The arrangement of the connections between the neurons is far from arbitrary, and understanding which neurons connect with each other promises to provide valuable information about how the brain works. At this point, identifying the connection network directly from the tissue structure is practically impossible, even in cell cultures with only a few thousand neurons. In contrast, there are currently well-developed methods for recording dynamic neuronal activity patterns. Such patterns indicate which neuron transmitted a signal at what time, making them a kind of neuronal conversation log. The Göttingen-based team headed by Theo Geisel, Director at the Max Planck Institute for Dynamics and Self-Organization, has now made use of these activity patterns.
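The article does not spell out the team's decoding method, but the basic idea of inferring a likely connection from a "conversation log" of spike times can be sketched with a cross-correlogram: if neuron B reliably fires a few milliseconds after neuron A, the lag histogram shows a sharp peak at a short positive lag. The simulation and all parameters below are illustrative, not the published algorithm.

```python
import numpy as np

def cross_correlogram(spikes_a, spikes_b, window=50, bin_size=1):
    """Histogram of spike-time lags (b - a) within +/- `window` time steps."""
    lags = []
    for t in spikes_a:
        nearby = spikes_b[(spikes_b > t - window) & (spikes_b < t + window)]
        lags.extend(nearby - t)
    bins = np.arange(-window, window + bin_size, bin_size)
    counts, _ = np.histogram(lags, bins=bins)
    return counts, bins

rng = np.random.default_rng(0)
# Neuron A fires at random times; neuron B echoes A about 5 steps later,
# plus some background spikes -- a stand-in for a directed A -> B connection.
spikes_a = np.sort(rng.choice(10_000, size=300, replace=False))
echo = spikes_a + 5 + rng.integers(-1, 2, size=spikes_a.size)
background = rng.choice(10_000, size=100, replace=False)
spikes_b = np.unique(np.concatenate([echo, background]))

counts, bins = cross_correlogram(spikes_a, spikes_b)
peak_lag = bins[np.argmax(counts)]  # a sharp peak at a short positive lag
```

A peak that stands well above the flat baseline of the correlogram is evidence for a connection; comparing peak height to baseline gives a rough connection probability of the kind the article describes.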


Researchers Explore How the Brain Perceives Direction and Location

December 16, 2012

The Who asked “who are you?” but Dartmouth neurobiologist Jeffrey Taube asks “where are you?” and “where are you going?” Taube is not asking philosophical or theological questions. Rather, he is investigating nerve cells in the brain that function in establishing one’s location and direction.

Taube, a professor in the Department of Psychological and Brain Sciences, is using microelectrodes to record the activity of cells in a rat’s brain that make possible spatial navigation—how the rat gets from one place to another—from “here” to “there.” But before embarking to go “there,” you must first define “here.”

Survival Value

“Knowing what direction you are facing, where you are, and how to navigate are really fundamental to your survival,” says Taube. “For any animal that is preyed upon, you’d better know where your hole in the ground is and how you are going to get there quickly. And you also need to know direction and location to find food resources, water resources, and the like.”

Not only is this information fundamental to your survival, but knowing your spatial orientation at a given moment is important in other ways, as well. Taube points out that it is a sense or skill that you tend to take for granted, which you subconsciously keep track of. “It only comes to your attention when something goes wrong, like when you look for your car at the end of the day and you can’t find it in the parking lot,” says Taube.

Technique for Letting Brain Talk to Computers Now Tunes in Speech


Patients with a temporary surgical implant have used regions of the brain that control speech to “talk” to a computer for the first time, manipulating a cursor on a computer screen simply by saying or thinking of a particular sound.

“There are many directions we could take this, including development of technology to restore communication for patients who have lost speech due to brain injury or damage to their vocal cords or airway,” says author Eric C. Leuthardt, MD, of Washington University School of Medicine in St. Louis.

Scientists have typically programmed the temporary implants, known as brain-computer interfaces, to detect activity in the brain’s motor networks, which control muscle movements.

“That makes sense when you’re trying to use these devices to restore lost mobility — the user can potentially engage the implant to move a robotic arm through the same brain areas he or she once used to move an arm disabled by injury,” says Leuthardt, assistant professor of neurosurgery, of biomedical engineering and of neurobiology. “But that has the potential to be inefficient for restoration of a loss of communication.”

Patients might be able to learn to think about moving their arms in a particular way to say hello via a computer speaker, Leuthardt explains. But it would be much easier if they could say hello by using the same brain areas they once engaged to use their own voices.

The research appears April 7 in the Journal of Neural Engineering.

Image caption: Scientists at Washington University School of Medicine in St. Louis have adapted brain-computer interfaces like the one shown above to listen to regions of the brain that control speech. The development may help restore capabilities lost to brain injury or disability. Credit: Eric Leuthardt, MD; permission by Michael Purdy

The devices under study are temporarily installed directly on the surface of the brain in epilepsy patients. Surgeons like Leuthardt use them to identify the source of persistent, medication-resistant seizures and map those regions for surgical removal. Researchers hope one day to install the implants permanently to restore capabilities lost to injury and disease.

Leuthardt and his colleagues have recently revealed that the implants can be used to analyze the frequency of brain wave activity, allowing them to make finer distinctions about what the brain is doing. For the new study, Leuthardt and others applied this technique to detect when patients say or think of four sounds:

  • oo, as in few
  • e, as in see
  • a, as in say
  • a, as in hat

When scientists identified the brainwave patterns that represented these sounds and programmed the interface to recognize them, patients could quickly learn to control a computer cursor by thinking or saying the appropriate sound.

In the future, interfaces could be tuned to listen to just speech networks or both motor and speech networks, Leuthardt says. As an example, he suggests that it might one day be possible to let a disabled patient both use his or her motor regions to control a cursor on a computer screen and imagine saying “click” when he or she wants to click on the screen.

“We can distinguish both spoken sounds and the patient imagining saying a sound, so that means we are truly starting to read the language of thought,” he says. “This is one of the earliest examples, to a very, very small extent, of what is called ‘reading minds’ — detecting what people are saying to themselves in their internal dialogue.”

The next step, which Leuthardt and his colleagues are working on, is to find ways to distinguish what they call “higher levels of conceptual information.”

“We want to see if we can not just detect when you’re saying dog, tree, tool or some other word, but also learn what the pure idea of that looks like in your mind,” he says. “It’s exciting and a little scary to think of reading minds, but it has incredible potential for people who can’t communicate or are suffering from other disabilities.”


Notes about this brain-computer interface research article

A portion of this research was funded in collaboration with Gerwin Schalk, PhD, of the New York State Department of Health’s Wadsworth Center. The goal is to explore additional potential uses of this technology in people who are disabled and in those who are not.

Leuthardt et al. (2011). Journal of Neural Engineering, 8, 036004. Online at: http://iopscience.iop.org/1741-2552/8/3/036004

Funding from the National Institutes of Health and the Department of Defense supported this research.

Contact: Michael Purdy – Senior Medical Science Writer

Source: Washington University in St. Louis Newsroom article – permission given by Michael Purdy

Image Source: Image adapted from image in article above. Permission given by Michael Purdy