Archive

Posts Tagged ‘brain’

Researchers create artificial link between unrelated memories


The ability to learn associations between events is critical for survival, but it has not been clear how different pieces of information stored in memory may be linked together by populations of neurons. In a study published April 2nd in Cell Reports, synchronous activation of distinct neuronal ensembles caused mice to artificially associate the memory of a foot shock with the unrelated memory of exploring a safe environment, triggering an increase in fear-related behavior when the mice were re-exposed to the non-threatening environment. The findings suggest that co-activated cell ensembles become wired together to link two distinct memories that were previously stored independently in the brain.

“Memory is the basis of all higher brain functions, including consciousness, and it also plays an important role in psychiatric diseases such as post-traumatic stress disorder,” says senior study author Kaoru Inokuchi of the University of Toyama. “By showing how the brain associates different types of information to generate a qualitatively new memory that leads to enduring changes in behavior, our findings could have important implications for the treatment of these debilitating conditions.”

Recent studies have shown that subpopulations of neurons activated during learning are reactivated during subsequent memory retrieval, and reactivation of a cell ensemble triggers the retrieval of the corresponding memory. Moreover, artificial reactivation of a specific neuronal ensemble corresponding to a pre-stored memory can modify the acquisition of a new memory, thereby generating false or synthetic memories. However, these studies employed a combination of sensory input and artificial stimulation of cell ensembles. Until now, researchers had not linked two distinct memories using completely artificial means.

With that goal in mind, Inokuchi and Noriaki Ohkawa of the University of Toyama used a fear-learning paradigm in mice followed by a technique called optogenetics, which involves genetically modifying specific populations of neurons to express light-sensitive proteins that control neuronal excitability, and then delivering blue light through an optic fiber to activate those cells. In the behavioral paradigm, one group of mice spent six minutes in a cylindrical enclosure while another group explored a cube-shaped enclosure, and 30 minutes later, both groups of mice were placed in the cube-shaped enclosure, where a foot shock was immediately delivered. Two days later, mice that were re-exposed to the cube-shaped enclosure spent more time frozen in fear than mice that were placed back in the cylindrical enclosure.

The researchers then used optogenetics to reactivate the unrelated memories of the safe cylinder-shaped environment and the foot shock. Synchronous stimulation of the neuronal populations that had been activated during the learning phase in two memory-related brain regions, the hippocampus and the amygdala, caused mice to spend more time frozen in fear when they were later placed back in the cylindrical enclosure, as compared with stimulation of either region alone, or no stimulation at all.
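The wiring logic suggested by these results is essentially Hebbian: cells that fire together wire together. The toy network below is a deliberately simplified sketch, not the study's model – the ensemble sizes, weight increment, and firing threshold are all invented – but it shows how synchronously activating two unrelated ensembles can link them so that reactivating one later recruits the other:

```python
# A toy Hebbian network (not the study's model): co-activating two
# unrelated ensembles wires them together, so that reactivating one
# later recruits the other. All sizes and thresholds are assumptions.
import numpy as np

n = 100                                    # neurons in the toy network
w = np.zeros((n, n))                       # synaptic weight matrix

context = np.zeros(n); context[:20] = 1    # "safe cylinder" ensemble
shock = np.zeros(n); shock[50:70] = 1      # "foot shock" ensemble

# Synchronous activation: Hebbian learning strengthens every synapse
# between co-active cells ("fire together, wire together").
co_active = context + shock
w += 0.1 * np.outer(co_active, co_active)
np.fill_diagonal(w, 0.0)                   # no self-connections

# Re-exposure to the cylinder later activates its ensemble alone...
drive = w @ context
recruited = drive > 1.0

# ...yet the shock ensemble now fires too: an artificial association.
print("shock cells recruited:", int(recruited[50:70].sum()), "of 20")
```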

The findings show that synchronous activation of distinct cell ensembles can generate artificial links between unrelated pieces of information stored in memory, resulting in long-lasting changes in behavior. “By modifying this technique, we will next attempt to artificially dissociate memories that are physiologically connected,” Inokuchi says. “This may contribute to the development of new treatments for psychiatric disorders such as post-traumatic stress disorder, whose main symptoms arise from unnecessary associations between unrelated memories.”

The above story is reprinted from materials provided by MedicalXpress.

More information: Cell Reports, Ohkawa et al.: “Artificial Association of Pre-Stored Information to Generate a Qualitatively New Memory” www.cell.com/cell-reports/abst… 2211-1247(15)00270-3


New Alzheimer’s treatment fully restores memory function


Of the mice that received the treatment, 75 percent got their memory functions back.

Australian researchers have come up with a non-invasive ultrasound technology that clears the brain of neurotoxic amyloid plaques – structures that are responsible for memory loss and a decline in cognitive function in Alzheimer’s patients.

If a person has Alzheimer’s disease, it’s usually the result of a build-up of two types of lesions – amyloid plaques, and neurofibrillary tangles. Amyloid plaques sit between the neurons and end up as dense clusters of beta-amyloid molecules, a sticky type of protein that clumps together and forms plaques.

Neurofibrillary tangles are found inside the neurons of the brain, and they’re caused by defective tau proteins that clump up into a thick, insoluble mass. This causes tiny filaments called microtubules to get all twisted, which disrupts the transportation of essential materials such as nutrients and organelles along them, just like when you twist up the vacuum cleaner tube.

As we don’t have any kind of vaccine or preventative measure for Alzheimer’s – a disease that affects 343,000 people in Australia, and 50 million worldwide – it’s been a race to figure out how best to treat it, starting with how to clear the build-up of defective beta-amyloid and tau proteins from a patient’s brain. Now a team from the Queensland Brain Institute (QBI) at the University of Queensland have come up with a pretty promising solution for removing the former.

Publishing in Science Translational Medicine, the team describes the technique, which uses a particular type of ultrasound called focused therapeutic ultrasound to non-invasively beam sound waves into the brain tissue. By oscillating super-fast, these sound waves gently open up the blood-brain barrier – the layer that protects the brain against bacteria – and stimulate the brain’s microglial cells to activate. Microglial cells are basically waste-removal cells, so they’re able to clear out the toxic beta-amyloid clumps that are responsible for the worst symptoms of Alzheimer’s.

The team reports fully restoring the memory function of 75 percent of the mice they tested it on, with zero damage to the surrounding brain tissue. They found that the treated mice displayed improved performance in three memory tasks – a maze, a test to get them to recognise new objects, and one to get them to remember the places they should avoid.

“We’re extremely excited by this innovation of treating Alzheimer’s without using drug therapeutics,” one of the team, Jürgen Götz, said in a press release. “The word ‘breakthrough’ is often misused, but in this case I think this really does fundamentally change our understanding of how to treat this disease, and I foresee a great future for this approach.”

The team says they’re planning on starting trials with higher animal models, such as sheep, and hope to get their human trials underway in 2017.


The above story is reprinted from materials provided by ScienceAlert.

Neuroscientist debunks one of the most popular myths about the brain


At some point in your life, you’ve probably been labeled a “right-brain thinker” (you’re so creative!) or a “left-brain thinker” (you’re so logical). Maybe this has shaped the way you see yourself or view the world.

Well, either way it’s bogus science, says Sarah-Jayne Blakemore, a University College London professor of cognitive neuroscience, in the latest episode of the Freakonomics Radio podcast.

“This is an idea that makes no physiological sense,” she says.

The popular “right brain-left brain” theory for explaining people’s personalities is not actually backed by science.

Blakemore believes that the concept of “logical, analytical, and accurate” thinkers favoring their left hemisphere and “creative, intuitive, and emotional” thinkers favoring their right hemisphere is the misinterpretation of valuable science. She thinks it entered pop culture because it makes for snappy self-help books. And of course people love categorizing themselves.

In the ’60s, ’70s, and ’80s, the renowned cognitive neuroscientist Michael Gazzaniga led breakthrough studies on how the brain works. He studied patients who — and here’s the key — lacked a corpus callosum, the tract that connects the brain’s hemispheres. At the time, doctors treated patients suffering from constant seizures due to intractable epilepsy by surgically disconnecting the hemispheres.

By monitoring the brains of these patients, Gazzaniga could determine where in the brain certain cognitive and motor functions originated.

He found, for example, that a part of the left brain he dubbed “The Interpreter” handled the process of explaining actions that may have begun in the right brain.

He discovered “that each hemisphere played a role in different tasks and different cognitive functions, and that normally one hemisphere dominated over the other,” Blakemore explains.

This was breakthrough research on how parts of the brain worked. But in a normal human being, the corpus callosum is constantly transmitting information between both halves. It’s physically impossible to favor one side.

Blakemore thinks that this misinterpretation of the research is actually harmful, because the dichotomous labels convince people that their way of thinking is largely fixed by their genes.

“I mean, there are huge individual differences in cognitive strengths,” Blakemore says. “Some people are more creative; others are more analytical than others. But the idea that this has something to do with being left-brained or right-brained is completely untrue and needs to be retired.”

You can listen to Blakemore and many other experts taking down their least favorite ideas in the Freakonomics Radio episode “This Idea Must Die,” hosted by “Freakonomics” co-author Stephen J. Dubner.

The above story is reprinted from materials provided by Business Insider.

Here’s what happens to your brain when you give up sugar

February 22, 2015

Anyone who knows me also knows that I have a huge sweet tooth. I always have. My friend and fellow graduate student Andrew is equally afflicted, and living in Hershey, Pennsylvania – the “Chocolate Capital of the World” – doesn’t help either of us.

But Andrew is braver than I am. Last year, he gave up sweets for Lent. I can’t say that I’m following in his footsteps this year, but if you are abstaining from sweets for Lent this year, here’s what you can expect over the next 40 days.

Sugar: natural reward, unnatural fix

In neuroscience, food is something we call a “natural reward.” In order for us to survive as a species, things like eating, having sex and nurturing others must be pleasurable to the brain so that these behaviours are reinforced and repeated.

[Image: The nucleus accumbens. Credit: Geoff B Hall]

Evolution has resulted in the mesolimbic pathway, a brain system that deciphers these natural rewards for us. When we do something pleasurable, a bundle of neurons called the ventral tegmental area uses the neurotransmitter dopamine to signal to a part of the brain called the nucleus accumbens. The connection between the nucleus accumbens and our prefrontal cortex dictates our motor movement, such as deciding whether or not to take another bite of that delicious chocolate cake. The prefrontal cortex also activates hormones that tell our body: “Hey, this cake is really good. And I’m going to remember that for the future.”
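To make that chain of events concrete, here is a deliberately cartoonish Python sketch of the signal flow just described; the function names, threshold, and numbers are illustrative inventions, not a neural model:

```python
# A cartoon of the mesolimbic reward pathway described above.
# Everything here is an illustrative assumption, not measured biology.
def ventral_tegmental_area(pleasure):
    # Pleasurable experiences trigger dopamine release toward the accumbens.
    return pleasure * 1.0

def nucleus_accumbens(dopamine):
    # Enough dopamine flags the experience as rewarding (assumed threshold).
    return dopamine > 0.5

def prefrontal_cortex(rewarding):
    # The accumbens-prefrontal connection shapes the next motor decision.
    return "take another bite" if rewarding else "stop eating"

signal = ventral_tegmental_area(0.9)       # e.g. a bite of chocolate cake
print(prefrontal_cortex(nucleus_accumbens(signal)))
```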

Not all foods are equally rewarding, of course. Most of us prefer sweets over sour and bitter foods because, evolutionarily, our mesolimbic pathway reinforces that sweet things provide a healthy source of carbohydrates for our bodies. When our ancestors went scavenging for berries, for example, sour meant “not yet ripe,” while bitter meant “alert – poison!”

Fruit is one thing, but modern diets have taken on a life of their own. A decade ago, it was estimated that the average American consumed 22 teaspoons of added sugar per day, amounting to an extra 350 calories; it may well have risen since then. A few months ago, one expert suggested that the average Briton consumes 238 teaspoons of sugar each week.

Today, with convenience more important than ever in our food selections, it’s almost impossible to come across processed and prepared foods that don’t have added sugars for flavour, preservation, or both.

These added sugars are sneaky – and unbeknown to many of us, we’ve become hooked. Just as drugs of abuse – such as nicotine, cocaine and heroin – hijack the brain’s reward pathway and make users dependent, increasing neuro-chemical and behavioural evidence suggests that sugar is addictive in the same way, too.

Sugar addiction is real

“The first few days are a little rough,” Andrew told me about his sugar-free adventure last year. “It almost feels like you’re detoxing from drugs. I found myself eating a lot of carbs to compensate for the lack of sugar.”

There are four major components of addiction: bingeing, withdrawal, craving, and cross-sensitisation (the notion that one addictive substance predisposes someone to becoming addicted to another). All of these components have been observed in animal models of addiction – for sugar, as well as drugs of abuse.

A typical experiment goes like this: rats are deprived of food for 12 hours each day, then given 12 hours of access to a sugary solution and regular chow. After a month of following this daily pattern, rats display behaviours similar to those on drugs of abuse. They’ll binge on the sugar solution in a short period of time, consuming much more of it than of their regular food. They also show signs of anxiety and depression during the food deprivation period. And many sugar-treated rats that are later exposed to drugs such as cocaine and opiates demonstrate dependent behaviours towards those drugs, compared with rats that did not consume sugar beforehand.

Like drugs, sugar spikes dopamine release in the nucleus accumbens. Over the long term, regular sugar consumption actually changes the gene expression and availability of dopamine receptors in both the midbrain and frontal cortex. Specifically, sugar increases the concentration of a type of excitatory receptor called D1, but decreases another receptor type called D2, which is inhibitory. Regular sugar consumption also inhibits the action of the dopamine transporter, a protein which pumps dopamine out of the synapse and back into the neuron after firing.

In short, this means that repeated access to sugar over time leads to prolonged dopamine signalling, greater excitation of the brain’s reward pathways and a need for even more sugar to activate all of the midbrain dopamine receptors like before. The brain becomes tolerant to sugar – and more is needed to attain the same “sugar high.”
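The transporter effect is the easiest piece of this to see in a toy simulation. The sketch below is purely illustrative – the release amount and clearance rates are assumptions, not measurements from these studies – but it shows how a blunted dopamine transporter leaves the synaptic signal elevated for longer:

```python
# A toy simulation of synaptic dopamine clearance. The release amount
# and clearance rates are illustrative assumptions, not measured values.
def dopamine_trace(clearance_rate, steps=50, release=1.0):
    level, trace = release, []
    for _ in range(steps):
        trace.append(level)
        level -= clearance_rate * level    # transporter pumps dopamine out
    return trace

def steps_above_half_max(trace):
    return sum(1 for level in trace if level > 0.5)

normal = dopamine_trace(clearance_rate=0.3)    # healthy transporter
blunted = dopamine_trace(clearance_rate=0.1)   # sugar-inhibited transporter

# The signal stays near its peak far longer when clearance is slowed.
print("steps above half-max:",
      steps_above_half_max(normal), "vs", steps_above_half_max(blunted))
```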

Sugar withdrawal is also real

Although these studies were conducted in rodents, it’s not far-fetched to say that the same primitive processes are occurring in the human brain, too. “The cravings never stopped, [but that was] probably psychological,” Andrew told me. “But it got easier after the first week or so.”

In a 2002 study by Carlo Colantuoni and colleagues of Princeton University, rats that had undergone a typical sugar dependence protocol then underwent “sugar withdrawal.” This was induced either by food deprivation or by treatment with naloxone, an opioid-receptor-blocking drug used in treating opiate overdose, which binds to receptors in the brain’s reward system. Both withdrawal methods led to physical problems, including teeth chattering, paw tremors, and head shaking. Naloxone treatment also appeared to make the rats more anxious, as they spent less time on an elevated apparatus that lacked walls on either side.

Similar withdrawal experiments by others also report behaviour similar to depression in tasks such as the forced swim test. Rats in sugar withdrawal are more likely to show passive behaviours (like floating) than active behaviours (like trying to escape) when placed in water, suggesting feelings of helplessness.

A new study published by Victor Mangabeira and colleagues in this month’s Physiology & Behavior reports that sugar withdrawal is also linked to impulsive behaviour. Initially, rats were trained to receive water by pushing a lever. After training, the animals returned to their home cages and had access to a sugar solution and water, or just water alone. After 30 days, when rats were again given the opportunity to press a lever for water, those who had become dependent on sugar pressed the lever significantly more times than control animals, suggesting impulsive behaviour.

These are extreme experiments, of course. We humans aren’t depriving ourselves of food for 12 hours and then allowing ourselves to binge on soda and doughnuts at the end of the day. But these rodent studies certainly give us insight into the neuro-chemical underpinnings of sugar dependence, withdrawal, and behaviour.

Through decades of diet programmes and best-selling books, we’ve toyed with the notion of “sugar addiction” for a long time. There are accounts of those in “sugar withdrawal” describing food cravings, which can trigger relapse and impulsive eating. There are also countless articles and books about the boundless energy and new-found happiness in those who have sworn off sugar for good. But despite the ubiquity of sugar in our diets, the notion of sugar addiction is still a rather taboo topic.

Are you still motivated to give up sugar for Lent? You might wonder how long it will take until you’re free of cravings and side-effects, but there’s no answer – everyone is different and no human studies have been done on this. But after 40 days, it’s clear that Andrew had overcome the worst, likely even reversing some of his altered dopamine signalling. “I remember eating my first sweet and thinking it was too sweet,” he said. “I had to rebuild my tolerance.”

And as regulars of a local bakery in Hershey, I can assure you, readers, that he has done just that.

The above story is reprinted from materials provided by The Conversation.

Increasing Brain Acidity May Reduce Anxiety


Animal study highlights potential new target for treating anxiety disorders

Increasing acidity in the brain’s emotional control center reduces anxiety, according to an animal study published February 26 in The Journal of Neuroscience. The findings suggest a new mechanism for the body’s control of fear and anxiety, and point to a new target for the treatment of anxiety disorders.

Anxiety disorders, which are characterized by an inability to control feelings of fear and uncertainty, are the most prevalent group of psychiatric diseases. At the cellular level, these disorders are associated with heightened activity in the basolateral amygdala (BLA), which is known to play a central role in emotional behavior.

Many cells in the BLA possess acid-sensing ion channels called ASIC1a, which respond to pH changes in the environment outside of the cell. Maria Braga, DDS, PhD, and colleagues at the Uniformed Services University of the Health Sciences, F. Edward Hébert School of Medicine, found that activating ASIC1a decreased the activity of nearby cells and reduced anxiety-like behavior in animals. These findings add to previous evidence implicating the role of ASIC1a in anxiety.

“These findings suggest that activating these channels, specifically in fear-related areas such as the amygdala, may be a key to regulating anxiety,” explained Anantha Shekhar, MD, PhD, who studies panic disorders at Indiana University and was not involved in this study. “Developing specific drugs that can stimulate these channels could provide a new way to treat anxiety and fear disorders such as post-traumatic stress and panic disorders.”

To determine the effect ASIC1a activation has on neighboring cells, Braga’s group bathed BLA cells in an acidic solution in the laboratory and measured the signals sent to nearby cells. Lowering the pH of the solution decreased the activity of cells in the BLA.

Activating ASIC1a also affected animal behavior. When the researchers administered a drug that blocks ASIC1a directly into the BLA of rats, the rats displayed more anxiety-like behavior than animals that did not receive the drug. In contrast, when rats received a drug designed to increase the activity of ASIC1a channels in the BLA, the animals displayed less anxiety-like behavior.

“Our study emphasizes the importance of identifying and elucidating mechanisms involved in the regulation of brain function for the development of more efficacious therapies for treating psychiatric and neurological illnesses,” Braga said. While the findings suggest that drugs targeting ASICs may one day lead to novel therapies for anxiety disorders, Braga noted that “more research is needed to understand the roles that ASIC1a channels play in the brain.”

Link to the article: The Journal of Neuroscience

Scientists Map Process by Which Brain Cells Form Long-Term Memories

July 2, 2013

Scientists at the Gladstone Institutes have deciphered how a protein called Arc regulates the activity of neurons – providing much-needed clues into the brain’s ability to form long-lasting memories.

These findings, reported in Nature Neuroscience, also offer newfound understanding as to what goes on at the molecular level when this process becomes disrupted.

Led by Gladstone senior investigator Steve Finkbeiner, MD, PhD, this research delved deep into the inner workings of synapses. Synapses are the highly specialized junctions that process and transmit information between neurons. Most of the synapses our brain will ever have are formed during early brain development, but throughout our lifetimes these synapses can be made, broken and strengthened. Synapses that are more active become stronger, a process that is essential for forming new memories.

However, this process is also dangerous, as it can overstimulate the neurons and lead to epileptic seizures. It must therefore be kept in check.

[Image: Arc immunohistochemical staining of a rat dentate gyrus, shown for illustrative purposes only; it is not connected to this research.]

Neuroscientists recently discovered one important mechanism that the brain uses to maintain this important balance: a process called “homeostatic scaling.” Homeostatic scaling allows individual neurons to strengthen the new synaptic connections they’ve made to form memories, while at the same time protecting the neurons from becoming overly excited. Exactly how the neurons pull this off has eluded researchers, but they suspected that the Arc protein played a key role.
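The arithmetic behind homeostatic scaling is easy to sketch. The toy example below uses a textbook multiplicative-scaling rule – the weights and set-point are invented, and this is not a model from the paper – to show how a neuron can keep its total input constant while preserving which synapse was strengthened by learning:

```python
# A toy illustration of multiplicative homeostatic scaling; the weights
# and set-point are invented, and this is not the paper's own model.
weights = [0.2, 0.5, 1.5, 0.8]        # synaptic strengths onto one neuron
set_point = sum(weights)              # total drive the neuron defends

# Learning potentiates one synapse, pushing total input above set-point.
weights[2] *= 2.0

# Homeostatic scaling multiplies *every* synapse by the same factor, so
# total drive returns to baseline while the potentiated synapse stays
# relatively stronger than its neighbours: learning is preserved.
factor = set_point / sum(weights)
weights = [w * factor for w in weights]

print([round(w, 2) for w in weights], "sum:", round(sum(weights), 2))
```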

“Scientists knew that Arc was involved in long-term memory, because mice lacking the Arc protein could learn new tasks, but failed to remember them the next day,” said Finkbeiner, who is also a professor of neurology and physiology at UC San Francisco, with which Gladstone is affiliated. “Because initial observations showed Arc accumulating at the synapses during learning, researchers thought that Arc’s presence at these synapses was driving the formation of long-lasting memories.”

But Finkbeiner and his team thought there was something else in play.

The Role of Arc in Homeostatic Scaling

In laboratory experiments, first in animal models and then in greater detail in the petri dish, the researchers tracked Arc’s movements. And what they found was surprising.

“When individual neurons are stimulated during learning, Arc begins to accumulate at the synapses – but what we discovered was that soon after, the majority of Arc gets shuttled into the nucleus,” said Erica Korb, PhD, the paper’s lead author who completed her graduate work at Gladstone and UCSF.

“A closer look revealed three regions within the Arc protein itself that direct its movements: one exports Arc from the nucleus, a second transports it into the nucleus, and a third keeps it there,” she said. “The presence of this complex and tightly regulated system is strong evidence that this process is biologically important.”

In fact, the team’s experiments revealed that Arc acted as a master regulator of the entire homeostatic scaling process. During memory formation, certain genes must be switched on and off at very specific times in order to generate proteins that help neurons lay down new memories. From inside the nucleus, the authors found that it was Arc that directed this process required for homeostatic scaling to occur. This strengthened the synaptic connections without overstimulating them – thus translating learning into long-term memories.

Implications for a Variety of Neurological Diseases

“This discovery is important not only because it solves a long-standing mystery on the role of Arc in long-term memory formation, but also gives new insight into the homeostatic scaling process itself – disruptions in which have already been implicated in a whole host of neurological diseases,” said Finkbeiner. “For example, scientists recently discovered that Arc is depleted in the hippocampus, the brain’s memory center, in Alzheimer’s disease patients. It’s possible that disruptions to the homeostatic scaling process may contribute to the learning and memory deficits seen in Alzheimer’s.”

Dysfunctions in Arc production and transport may also play a vital role in autism. For example, the genetic disorder Fragile X syndrome – a common cause of both mental retardation and autism – directly affects the production of Arc in neurons.

“In the future,” added Dr. Korb, “we hope further research into Arc’s role in human health and disease can provide even deeper insight into these and other disorders, and also lay the groundwork for new therapeutic strategies to fight them.”

Journal reference: “Arc in the nucleus regulates PML-dependent GluA1 transcription and homeostatic plasticity” by Erica Korb, Carol L. Wilkinson, Ryan N. Delgado, Kathryn L. Lovero and Steven Finkbeiner, in Nature Neuroscience. Published online June 9, 2013. doi:10.1038/nn.3429

The above story is reprinted from materials provided by UCSF.

How does short-term memory work in relation to long-term memory? Are short-term daily memories somehow transferred to long-term storage while we sleep?


Alison Preston, an assistant professor at the University of Texas at Austin’s Center for Learning and Memory, recalls and offers an answer to this question.

A short-term memory’s conversion to long-term memory requires the passage of time, which allows it to become resistant to interference from competing stimuli or disrupting factors such as injury or disease. This time-dependent process of stabilization, whereby our experiences achieve a permanent record in our memory, is referred to as “consolidation.”

Memory consolidation can occur at many organizational levels in the brain. Cellular and molecular changes typically take place within the first minutes or hours of learning and result in structural and functional changes to neurons (nerve cells) or sets of neurons. Systems-level consolidation, involving the reorganization of brain networks that handle the processing of individual memories, may then happen, but on a much slower time frame that can take several days or years.

Memory does not refer to a single aspect of our experience but rather encompasses a myriad of learned information, such as knowing the identity of the 16th president of the United States, what we had for dinner last Tuesday or how to drive a car. The processes and brain regions involved in consolidation may vary depending on the particular characteristics of the memory to be formed.

Let’s consider the consolidation process that affects the category of declarative memory—that of general facts and specific events. This type of memory relies on the function of a brain region called the hippocampus and other surrounding medial temporal lobe structures. At the cellular level, memory is expressed as changes to the structure and function of neurons. For example, new synapses—the connections between cells through which they exchange information—can form to allow for communication between new networks of cells. Alternately, existing synapses can be strengthened to allow for increased sensitivity in the communication between two neurons.

Consolidating such synaptic changes requires the synthesis of new RNA and proteins in the hippocampus, which transform temporary alterations in synaptic transmission into persistent modifications of synaptic architecture. For example, blocking protein synthesis in the brains of mice does not affect the short-term memory or recall of newly learned spatial environments in hippocampal neurons. Inhibiting protein synthesis, however, does abolish the formation of new long-term representations of space in hippocampal neurons, thus impairing the consolidation of spatial memories.

Over time, the brain systems that support individual, declarative memories also change as a result of systems-level consolidation processes. Initially, the hippocampus works in concert with sensory processing regions distributed in the neocortex (the outermost layer of the brain) to form the new memories. Within the neocortex, representations of the elements that constitute an event in our life are distributed across multiple brain regions according to their content. For example, visual information is processed by primary visual cortex in the occipital lobe at the rear of the brain, while auditory information is processed by primary auditory cortex located in the temporal lobes, which lie on the side of the brain.

When a memory is initially formed, the hippocampus rapidly associates this distributed information into a single memory, thus acting as an index to representations in the sensory processing regions. As time passes, cellular and molecular changes allow for the strengthening of direct connections between neocortical regions, enabling the memory of an event to be accessed independently of the hippocampus. Damage to the hippocampus by injury or neurodegenerative disorder (Alzheimer’s disease, for instance) produces anterograde amnesia—the inability to form new declarative memories—because the hippocampus is no longer able to connect mnemonic information distributed in the neocortex before the data has been consolidated. Interestingly, such a disruption does not impair memory for facts and events that have already been consolidated. Thus, an amnesiac with hippocampal damage would not be able to learn the names of current presidential candidates but would be able to recall the identity of the 16th US president (Abraham Lincoln, of course!).
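One way to picture the hippocampus “acting as an index” is as a lookup table of pointers into content stored across the cortex. The Python sketch below is a loose analogy, not a claim about neural implementation – the region names, event keys, and links are invented for illustration. Recent memories need the index, while consolidated memories can be reassembled through direct cortico-cortical links:

```python
# A loose computational analogy for hippocampal indexing; the region
# names, events and links are invented for illustration only.
neocortex = {
    "visual":   {"e1": "birthday cake"},
    "auditory": {"e1": "everyone singing"},
    "place":    {"e1": "grandmother's kitchen"},
}

# Initially, the hippocampus holds only an index binding the pieces.
hippocampus = {"e1": [("visual", "e1"), ("auditory", "e1"), ("place", "e1")]}

# Consolidation strengthens direct cortico-cortical links, so the event
# can later be reassembled without the hippocampal index.
cortical_links = {("visual", "e1"): [("auditory", "e1"), ("place", "e1")]}

def recall(event, hippocampus_intact=True, cue_region="visual"):
    if hippocampus_intact:                 # recent memory: use the index
        pointers = hippocampus[event]
    else:                                  # remote memory: follow direct links
        pointers = [(cue_region, event)] + cortical_links[(cue_region, event)]
    return [neocortex[region][ev] for region, ev in pointers]

print(recall("e1"))                            # via the hippocampal index
print(recall("e1", hippocampus_intact=False))  # survives hippocampal damage
```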

The role of sleep in memory consolidation is an ancient question dating back to the Roman rhetorician Quintilian in the first century A.D. Much research in the past decade has been dedicated to better understanding the interaction between sleep and memory. Yet little is understood.

At the molecular level, gene expression responsible for protein synthesis is increased during sleep in rats exposed to enriched environments, suggesting memory consolidation processes are enhanced, or may essentially rely, on sleep. Further, patterns of activity observed in rats during spatial learning are replayed in hippocampal neurons during subsequent sleep, further suggesting that learning may continue in sleep.

In humans, recent studies have demonstrated the benefits of sleep on declarative memory performance, thus giving a neurological basis to the old adage, “sleep on it.” A night of sleep reportedly enhances memory for associations between word pairs. Similar overnight improvements on virtual navigation tasks have been observed, which correlate with hippocampal activation during sleep. Sleep deprivation, on the other hand, is known to produce deficits in hippocampal activation during declarative memory formation, resulting in poor subsequent retention. Thus, the absence of prior sleep compromises our capacity for committing new experiences to memory. These initial findings suggest an important, if not essential, role for sleep in the consolidation of newly formed memories.

Story Source:

The above story is reprinted from materials provided by Scientific American magazine.