Blue light can help heal mild traumatic brain injury


“Daily exposure to blue wavelength light each morning helps to re-entrain the circadian rhythm so that people get better, more regular sleep. This is likely true for everybody, but we recently demonstrated it in people recovering from mild traumatic brain injury, or mTBI. That improvement in sleep was translated into improvements in cognitive function, reduced daytime sleepiness and actual brain repair,” said William D. “Scott” Killgore, the study’s lead author.

William D.S. Killgore, John R. Vanuk, Bradley R. Shane, Mareen Weber, Sahil Bajaj. A randomized, double-blind, placebo-controlled trial of blue wavelength light exposure on sleep and recovery of brain structure, function, and cognition following mild traumatic brain injury. Neurobiology of Disease, 2020; 134: 104679. DOI: 10.1016/j.nbd.2019.104679

#mtbi #bluelight #braininjury

https://www.sciencedirect.com/science/article/pii/S0969996119303547?via%3Dihub#f0035


Drinking tea improves brain structure, study suggests


A recent study led by researchers from the National University of Singapore (NUS) revealed that regular tea drinkers have better organised brain regions — and this is associated with healthy cognitive function — compared to non-tea drinkers. The research team made this discovery after examining neuroimaging data of 36 older adults.

Junhua Li, Rafael Romero-Garcia, John Suckling, Lei Feng. Habitual tea drinking modulates brain efficiency: evidence from brain connectivity evaluation. Aging, 2019; 11 (11): 3876 DOI: 10.18632/aging.102023

#tea #brainhealth #brainstructure

 

https://www.aging-us.com/article/102023/text

Scientists discover Virus that Makes People More Stupid – 43% of those tested were infected

Algal viruses attach, enter, and infect green alga (seen in series here).


By Elizabeth Pennisi, staff writer for Science

27 October 2014 3:30 pm

It’s not such a stretch to think that humans can catch the Ebola virus from monkeys and the flu virus from pigs. After all, they are all mammals with fundamentally similar physiologies. But now researchers have discovered that even a virus found in the lowly algae can make mammals its home. The invader doesn’t make people or mice sick, but it does seem to slow specific brain activities.

The virus, called ATCV-1, showed up in human brain tissue several years ago, but at the time researchers could not be sure whether it had entered the tissue before or after the people died. Then, it showed up again in a survey of microbes and viruses in the throats of people with psychiatric disease. Pediatric infectious disease expert Robert Yolken from Johns Hopkins University School of Medicine in Baltimore, Maryland, and his colleagues were trying to see if pathogens play a role in these conditions. At first, they didn’t know what ATCV-1 was, but a database search revealed its identity as a virus that typically infects a species of green algae found in lakes and rivers.

The researchers wanted to find out if the virus was in healthy people as well as sick people. They checked for it in 92 healthy people participating in a study of cognitive function and found it in 43% of them. What’s more, those infected with the virus performed 10% worse than uninfected people on tests requiring visual processing. They were slower in drawing a line connecting a sequence of numbers randomly placed on a page, for example. And they seemed to have shorter attention spans, the researchers report online today in the Proceedings of the National Academy of Sciences. The effects were modest, but significant.

The slower brain function was not associated with any differences in sex, income or education level, race, place of birth, or cigarette smoking. But that doesn’t necessarily mean the virus causes cognitive decline; it might just benefit from some other factor that impairs the brain in some people, such as other infectious agents, heavy metals, or pollutants, the researchers say.

Chips that mimic the brain

Contact: Giacomo Indiveri, University of Zurich, giacomo.indiveri@ini.phys.ethz.ch, +41 44 635 30 24

No computer works as efficiently as the human brain – so much so that building an artificial brain is the goal of many scientists. Neuroinformatics researchers from the University of Zurich and ETH Zurich have now made a breakthrough in this direction by understanding how to configure so-called neuromorphic chips to imitate the brain’s information processing abilities in real-time. They demonstrated this by building an artificial sensory processing system that exhibits cognitive abilities.

New approach: simulating biological neurons

Most approaches in neuroinformatics are limited to the development of neural network models on conventional computers or aim to simulate complex nerve networks on supercomputers. Few pursue the Zurich researchers’ approach to develop electronic circuits that are comparable to a real brain in terms of size, speed, and energy consumption. “Our goal is to emulate the properties of biological neurons and synapses directly on microchips,” explains Giacomo Indiveri, a professor at the Institute of Neuroinformatics (INI), of the University of Zurich and ETH Zurich.

The major challenge was to configure networks made of artificial, i.e. neuromorphic, neurons in such a way that they can perform particular tasks, which the researchers have now succeeded in doing: They developed a neuromorphic system that can carry out complex sensorimotor tasks in real time. They demonstrate a task that requires a short-term memory and context-dependent decision-making – typical traits that are necessary for cognitive tests. In doing so, the INI team combined neuromorphic neurons into networks that implemented neural processing modules equivalent to so-called “finite-state machines” – a mathematical concept to describe logical processes or computer programs. Behavior can be formulated as a “finite-state machine” and thus transferred to the neuromorphic hardware in an automated manner. “The network connectivity patterns closely resemble structures that are also found in mammalian brains,” says Indiveri.
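The mapping from a finite-state machine to behavior can be sketched in ordinary code. The toy machine below is a hypothetical context-dependent go/no-go task invented for illustration, not the researchers’ actual chip configuration: the machine keeps a short-term memory of a cue in its state, so the same stimulus triggers different responses depending on context.

```python
# Schematic finite-state machine for a context-dependent decision task.
# Illustrative sketch only: the INI hardware realizes such state machines
# with networks of spiking neuromorphic neurons, not lookup tables.

# States: "wait" (idle); "saw_A"/"saw_B" remember which cue appeared,
# so the next stimulus is interpreted in the context of that cue.
TRANSITIONS = {
    ("wait", "A"): ("saw_A", None),
    ("wait", "B"): ("saw_B", None),
    ("saw_A", "X"): ("wait", "go"),      # X after cue A -> respond "go"
    ("saw_A", "Y"): ("wait", "no-go"),
    ("saw_B", "X"): ("wait", "no-go"),   # same stimulus X, other context
    ("saw_B", "Y"): ("wait", "go"),
}

def run(inputs):
    """Feed a stimulus sequence through the machine; collect responses."""
    state, outputs = "wait", []
    for symbol in inputs:
        state, action = TRANSITIONS.get((state, symbol), ("wait", None))
        if action is not None:
            outputs.append(action)
    return outputs
```

Because the whole behavior is captured by the transition table, translating it to hardware reduces to wiring one neural processing module per state and letting the input-dependent transitions gate which module is active, which is what makes the automated transfer described above possible.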

Chips can be configured for any behavior modes

The scientists thus demonstrate for the first time how a real-time hardware neural-processing system where the user dictates the behavior can be constructed. “Thanks to our method, neuromorphic chips can be configured for a large class of behavior modes. Our results are pivotal for the development of new brain-inspired technologies,” Indiveri sums up. One application, for instance, might be to combine the chips with sensory neuromorphic components, such as an artificial cochlea or retina, to create complex cognitive systems that interact with their surroundings in real time.

###

Literature:

E. Neftci, J. Binas, U. Rutishauser, E. Chicca, G. Indiveri, R. J. Douglas. Synthesizing Cognition in Neuromorphic VLSI Systems. PNAS, July 22, 2013. DOI: 10.1073/pnas.0709640104

Contacts:

Prof. Giacomo Indiveri Institute of Neuroinformatics University of Zurich / ETH Zurich Tel. +41 44 635 30 24  E-Mail: giacomo.indiveri@ini.phys.ethz.ch

Davos 2013: world leaders to discuss aliens, super-humans, and immortals

Jan 25, 2013 15:01 Moscow Time


This year, apart from the traditional economic concerns, the program of the World Economic Forum in Davos is scheduled to address a number of highly controversial issues which have been kept classified for decades. Called the ‘X factors’, these issues include the potential risks of medically induced enhancement of cognitive abilities, prolongation of human life, and discovery of extraterrestrial life.

After reading the Executive Summary of the WEF 2013, one is left with the impression of having just read the script for the next ‘X Files’ episode. Runaway climate change, rogue deployment of geoengineering, and digital wildfires are just a few issues that readers of the Executive Summary may find not only unconventional but also futuristic. Nonetheless, all of these themes are due to be discussed under the rubric of the ‘X Factors’.

Developed in partnership with the editors of Nature, a leading science journal, the ‘X Factors’ category looks well beyond the landscape of 50 traditional global risks and identifies the most significant game-changers of the next decade. Apart from the already mentioned runaway climate change, digital wildfires, and rogue geoengineering, which seem at least minimally realistic, the list of ‘X Factors’ also includes the possible implications of people living longer, getting smarter, and meeting extra-terrestrial ‘Others’. While some remain highly skeptical regarding these issues, the editors of Nature together with the WEF team seem convinced that in the very near future these risks will not only become very real but will also profoundly challenge existing social and scientific paradigms.

In the WEF team’s opinion, super-human abilities are no longer the preserve of science fiction; the time of human prodigies is fast approaching the horizon of plausibility. At a time when researchers all over the world are working hard to develop cures for mental illnesses such as Alzheimer’s and schizophrenia, it is conceivable that in the not-too-distant future scientists will identify compounds more effective than existing cognitive pharmaceutical enhancers such as Ritalin and modafinil. While these new compounds would be prescribed only for the treatment of severe neurological diseases, it is highly likely that they would also be used off-label by healthy people seeking an edge in their everyday endeavors at work or school.

Interestingly, WEF experts believe that significant enhancement of cognitive abilities can be attained through hardware as well as drugs. Laboratory studies indicate that direct electrical stimulation through implanted electrodes can significantly improve memory. Unlike drugs, such cognitive enhancement therapy is less easily available and thus less likely to be adopted by healthy people. Nonetheless, the scientists suggest that within 10 years’ time intra-brain devices and sensors will open a new realm of enhanced neurobiology for those who can afford it. In this context, the scientists wonder whether it would be ethically acceptable for the world to be divided into the cognitively enhanced and unenhanced. Will humanity accept the idea that significant cognitive enhancement should be available for purchase on the open market, or will there be a push for legislation to maintain a more level playing field?

The other question the experts are asking is what happens if a cognitive enhancement program goes awry, or if it falls into the wrong hands. Cognitive enhancement drugs and devices have wide-ranging effects on various systems of the human body, since they work by targeting neurotransmitter systems. In this respect, WEF scientists argue that “there is a significant possibility of (un)intended effects on other systems – for example, drugs to enhance learning may lead to a greater willingness to take risks; drugs to enhance working memory may lead to increased impulsive behaviour”. Indeed, recent research in the field already suggests that, in addition to improving long-term memory, it is possible to use TMS to manipulate or even suspend a person’s moral judgement of right versus wrong. The technology can also be used to “erase” memory and deliberately cause permanent brain damage. In this sense, it is not difficult to see how new cognitive enhancement drugs and technologies could open up a space for misuse by criminal organizations and terrorist networks.

Another issue that the WEF experts decided to present for discussion this year is the implications of a longer life-span among humans. The WEF team suggests that while “medical advances are prolonging life, long-term palliative care is expensive. Covering the costs associated with old age could be a struggle”. Indeed, according to official statistics, people all over the globe now live up to 35 percent longer than a hundred years ago, and more funds are needed to provide adequate care for the millions of elderly. However, funding is not the only concern related to a longer life-span. The risk of over-populating the planet is yet another issue the world will soon face.

In this respect, the most radical commentators were quick to suggest that the only solution to the problem of longer-living humans is euthanasia. Proponents of this view contend that with medical advancements even the weakest and sickest people will survive into their late 90s and possibly 100s, which will not only lead to a significant increase in global population but will also negate the fundamental law of the survival of the fittest. In this context, some suggest that euthanasia might be the only way out of the vicious circle of artificially healthy individuals living unnaturally long lives.

The last and probably the most controversial X Factor that will be discussed during the Davos Forum is the possible discovery of extraterrestrial life. While this is the first time the Forum has addressed aliens, the issue has recently become a frequent theme of discussion among the world’s leading politicians and military officials. In December 2012, Russian Prime Minister Dmitry Medvedev mused on the topic of aliens after completing an on-camera interview with international reporters in Moscow. Back then, Mr Medvedev jokingly claimed that “I will not tell you how many of them [aliens] are among us because it may cause panic”. It turns out, however, that Mr Medvedev’s concern with aliens did not end last December. A shocking Davos Forum agenda aims to bring the topic of aliens beyond the realm of jokes.

WEF experts contend that “given the pace of space exploration, it is increasingly conceivable that we may discover the existence of alien life or other planets that could support human life. In 10 years’ time we may have evidence not only that Earth is not unique but also that life exists elsewhere in the universe.” In this context, the WEF team urges the global elite to prepare themselves and their nations for such a discovery. The scientists suggest that new funding and new brain power will be needed to overcome the challenges that humanity will face as a result of an encounter with an extra-terrestrial civilization. The world might even need to create artificial-intelligence emissaries to survive an inter-stellar crossing. The discovery of an Earth 2.0 or life beyond our planet might also inspire new generations of space entrepreneurs to meet the challenge of taking human exploration of the galaxy from the realm of fiction to fact.

At the same time, WEF experts do not believe that the discovery of alien life will change the fabric of human society in the short-run. While the discovery would certainly be one of the biggest news stories of the year and interest would be intense, it would not change the world immediately. Over the long run, however, the psychological and philosophical implications of the discovery could be profound. In the opinion of WEF scientists, “the discovery of even simple life would fuel speculation about the existence of other intelligent beings and challenge many assumptions that underpin human philosophy and religion.”

All in all, it seems that humanity is heading into exciting times, and Davos may be the first trigger to unleash a series of most extraordinary worldwide revelations.

http://english.ruvr.ru/2013_01_25/Davos-2013-world-leaders-to-discuss-aliens-super-humans-and-immortals/

Can your body sense future events without any external clue?

Contact: Hilary Hurd Anyaso
h-anyaso@northwestern.edu
847-491-4887
Northwestern University

New Northwestern analysis focuses on ‘pre-feelings’ and ability to anticipate the near future

EVANSTON, Ill. — Wouldn’t it be amazing if our bodies prepared us for future events that could be very important to us, even if there’s no clue about what those events will be?

Presentiment without any external clues may, in fact, exist, according to new Northwestern University research that analyzes the results of 26 studies published between 1978 and 2010.

Researchers already know that our subconscious minds sometimes know more than our conscious minds. Physiological measures of subconscious arousal, for instance, tend to show up before conscious awareness that a deck of cards is stacked against us.

“What hasn’t been clear is whether humans have the ability to predict future important events even without any clues as to what might happen,” said Julia Mossbridge, lead author of the study and research associate in the Visual Perception, Cognition and Neuroscience Laboratory at Northwestern.

A person playing a video game at work while wearing headphones, for example, can’t hear when his or her boss is coming around the corner.

“But our analysis suggests that if you were tuned into your body, you might be able to detect these anticipatory changes between two and 10 seconds beforehand and close your video game,” Mossbridge said. “You might even have a chance to open that spreadsheet you were supposed to be working on. And if you were lucky, you could do all this before your boss entered the room.”

This phenomenon is sometimes called “presentiment,” as in “sensing the future,” but Mossbridge said she and other researchers are not sure whether people are really sensing the future.

“I like to call the phenomenon ‘anomalous anticipatory activity,'” she said. “The phenomenon is anomalous, some scientists argue, because we can’t explain it using present-day understanding about how biology works; though explanations related to recent quantum biological findings could potentially make sense. It’s anticipatory because it seems to predict future physiological changes in response to an important event without any known clues, and it’s an activity because it consists of changes in the cardiopulmonary, skin and nervous systems.”

###

The study, “Predictive Physiological Anticipation Preceding Seemingly Unpredictable Stimuli: A Meta-Analysis,” is in the current edition of Frontiers in Perception Science. In addition to Mossbridge, co-authors of the study include Patrizio Tressoldi of the Università di Padova, Padova, Italy, and Jessica Utts of the University of California, Irvine.

NORTHWESTERN NEWS: www.northwestern.edu/newscenter/

Robots That Perceive the World Like Humans

ScienceDaily (Oct. 18, 2012) — Perceive first, act afterwards. The architecture of most of today’s robots is underpinned by this control strategy. The eSMCs project has set itself the aim of changing the paradigm and generating more dynamic computer models in which action is not a mere consequence of perception but an integral part of the perception process. It is about improving robot behaviour by means of perception models closer to those of humans. Philosophers at the UPV/EHU-University of the Basque Country are working to improve the perception systems of robots by applying human models.

“The concept of how science understands the mind when it comes to building a robot or looking at the brain is that you take a photo, which is then processed as if the mind were a computer, and a recognition of patterns is carried out. There are various types of algorithms and techniques for identifying an object, scenes, etc. However, organic perception, that of human beings, is much more active. The eye, for example, carries out a whole host of saccadic movements — small rapid ocular movements — that we do not see. Seeing is establishing and recognising objects through this visual action, knowing how the relationship and sensation of my body changes with respect to movement,” explains Xabier Barandiaran, a PhD-holder in Philosophy and researcher at IAS-Research (UPV/EHU), which, under the leadership of Ikerbasque researcher Ezequiel di Paolo, is part of the European project eSMCs (Extending Sensorimotor Contingencies to Cognition).

Until now, the belief has been that sensations were processed, perception was created, and this in turn led to reasoning and action. As Barandiaran sees it, action is an integral part of perception: “Our basic idea is that when we perceive, what is there is active exploration, a particular co-ordination with the surroundings, like a kind of invisible dance that makes vision possible.”

The eSMCs project aims to apply this idea to the computer models used in robots, improve their behaviour and thus understand the nature of the animal and human mind. For this purpose, the researchers are working on sensorimotor contingencies: regular relationships existing between actions and changes in the sensory variations associated with these actions.

An example of this kind of contingency is when you drink water and speak at the same time, almost without realising it. Interaction with the surroundings has taken place “without any need to internally represent that this is a glass and then compute needs and plan an action,” explains Barandiaran. “Seeing the glass draws one’s attention; it is coordinated with thirst, while the presence of the water itself on the table is enough for me to coordinate the visual-motor cycle that ends up with the glass at my lips.” The same thing happens in the robots in the eSMCs project: “they are moving the whole time, they don’t stop to think; they think about the act using the body and the surroundings,” he adds.

The researchers in the eSMCs project maintain that actions play a key role not only in perception, but also in the development of more complex cognitive capacities. That is why they believe that sensorimotor contingencies can be used to specify habits, intentions, tendencies and mental structures, thus providing the robot with a more complex, fluid behaviour.

So one of the experiments involves a robot simulation (developed by Thomas Buhrmann, who is also a member of this team at the UPV/EHU) in which an agent has to discriminate between what we could call an acne pimple and a bite or lump on the skin. “The acne has a tip, the bite doesn’t. Just as people do, our agent stays with the tip and recognises the acne, and when it goes on to touch the lump, it ignores it. What we are seeking to model and explain is that moment of perception that is built with the active exploration of the skin, when you feel ‘ah! I’ve found the acne pimple’ and you go on sliding your finger across it,” says Barandiaran. The model tries to identify what kind of relationship is established between the movement and sensation cycles and the neurodynamic patterns that are simulated in the robot’s “mini brain.”

In another robot, built at the Artificial Intelligence Laboratory of Zürich University, Puppy, a robot dog, is capable of adapting and “feeling” the texture of the terrain on which it is moving (slippery, viscous, rough, etc.) by exploring the sensorimotor contingencies that take place when walking.

The work of the UPV/EHU research team is focusing on the theoretical part of the models to be developed. “As philosophers, what we mostly do is define concepts. Our main aim is to be able to define technical concepts like the sensorimotor habitat, or that of the pattern of sensorimotor co-ordination, as well as that of habit or of mental life as a whole.” Defining concepts and giving them a mathematical form is essential so that scientists can apply them to specific experiments, not only with robots but also with human beings. The partners at the University Medical Centre Hamburg-Eppendorf, for example, are studying, in dialogue with the theoretical work of the UPV/EHU team, how the perception of time and space changes in Parkinson’s patients.

http://www.sciencedaily.com/releases/2012/10/121018100131.htm

 

Eating lots of carbs, sugar may raise risk of cognitive impairment, Mayo Clinic study finds

Contact: Nick Hanson, Mayo Clinic, newsbureau@mayo.edu, 507-284-5005

Those 70-plus who ate food high in fat and protein fared better cognitively, research showed

ROCHESTER, Minn. — People 70 and older who eat food high in carbohydrates have nearly four times the risk of developing mild cognitive impairment, and the danger also rises with a diet heavy in sugar, Mayo Clinic researchers have found. Those who consume a lot of protein and fat relative to carbohydrates are less likely to become cognitively impaired, the study found. The findings are published in the Journal of Alzheimer’s Disease.

The research highlights the importance of a well-rounded diet, says lead author Rosebud Roberts, M.B., Ch.B., a Mayo Clinic epidemiologist.

“We think it’s important that you eat a healthy balance of protein, carbohydrates and fat, because each of these nutrients has an important role in the body,” Dr. Roberts says.

Researchers tracked 1,230 people ages 70 to 89 who provided information on what they ate during the previous year. At that time, their cognitive function was evaluated by an expert panel of physicians, nurses and neuropsychologists. Of those participants, only the roughly 940 who showed no signs of cognitive impairment were asked to return for follow-up evaluations of their cognitive function. About four years into the study, 200 of those 940 were beginning to show mild cognitive impairment, problems with memory, language, thinking and judgment that are greater than normal age-related changes.

Those who reported the highest carbohydrate intake at the beginning of the study were 1.9 times likelier to develop mild cognitive impairment than those with the lowest intake of carbohydrates. Participants with the highest sugar intake were 1.5 times likelier to experience mild cognitive impairment than those with the lowest levels.

But those whose diets were highest in fat — compared to the lowest — were 42 percent less likely to face cognitive impairment, and those who had the highest intake of protein had a reduced risk of 21 percent.

When total fat and protein intake were taken into account, people with the highest carbohydrate intake were 3.6 times likelier to develop mild cognitive impairment.

“A high carbohydrate intake could be bad for you because carbohydrates impact your glucose and insulin metabolism,” Dr. Roberts says. “Sugar fuels the brain — so moderate intake is good. However, high levels of sugar may actually prevent the brain from using the sugar — similar to what we see with type 2 diabetes.”

###

The study was funded by the National Institute on Aging.

About Mayo Clinic

Mayo Clinic is a nonprofit worldwide leader in medical care, research, and education for people from all walks of life. For more information, visit www.mayoclinic.org/about/ and www.mayoclinic.org/news.

Nick Hanson 507-284-5005 (days) 507-284-2511 (evenings) Email: newsbureau@mayo.edu

MULTIMEDIA ALERT: For audio and video of Dr. Roberts talking about the study, visit Mayo Clinic News Network.

Cognitive Decline Begins in Late 20s, U.Va. Study Suggests

2009 study posted for filing

 

March 18, 2009 — A new study indicates that some aspects of peoples’ cognitive skills — such as the ability to make rapid comparisons, remember unrelated information and detect relationships — peak at about the age of 22, and then begin a slow decline starting around age 27.

 

“This research suggests that some aspects of age-related cognitive decline begin in healthy, educated adults when they are in their 20s and 30s,” said Timothy Salthouse, a University of Virginia professor of psychology and the study’s lead investigator.

 

His findings appear in the current issue of the journal Neurobiology of Aging.

 

Salthouse and his team conducted the study during a seven-year period, working with 2,000 healthy participants between the ages of 18 and 60.

 

Participants were asked to solve various puzzles, remember words and details from stories, and identify patterns in an assortment of letters and symbols.

 

Many of the participants in Salthouse’s study were tested several times during the course of years, allowing researchers to detect subtle declines in cognitive ability.

 

Top performances in some of the tests were accomplished at the age of 22. A notable decline in certain measures of abstract reasoning, brain speed and in puzzle-solving became apparent at 27.

 

Salthouse found that average memory declines can be detected by about age 37. However, accumulated knowledge skills, such as improvement of vocabulary and general knowledge, actually increase at least until the age of 60.

 

“These patterns suggest that some types of mental flexibility decrease relatively early in adulthood, but that how much knowledge one has, and the effectiveness of integrating it with one’s abilities, may increase throughout all of adulthood if there are no pathological diseases,” Salthouse said.

 

However, Salthouse points out that there is a great deal of variance from person to person, and, he added, most people function at a highly effective level well into their final years.

 

One of the unique features of this project in the University of Virginia Cognitive Aging Laboratory is that some of the participants return to the laboratory for repeated assessments after intervals of one to seven years.

 

“By following individuals over time, we gain insight to cognition changes, and may possibly discover ways to alleviate or slow the rate of decline,” Salthouse said. “And by better understanding the processes of cognitive impairment, we may become better at predicting the onset of dementias such as Alzheimer’s disease.”

 

Salthouse’s team also is surveying participants’ health and lifestyles to see if certain characteristics, such as social relationships, serve to moderate age-related cognitive changes.

 

They hope to continue their studies over many more years, with many of the same participants, to gain a long-term understanding of how the brain changes over time.

 

— By Fariss Samarrai

Long-term methadone treatment can affect nerve cells in brain

Long-term methadone treatment can cause changes in the brain, according to recent studies from the Norwegian Institute of Public Health. The results show that treatment may affect nerve cells in the brain. The studies follow previous work in which methadone was seen to affect cognitive functioning, such as learning and memory.

 

Since it is difficult to perform controlled studies of methadone patients and unethical to attempt in healthy volunteers, rats were used in the studies. Previous research has shown that methadone can affect cognitive functioning in both humans and experimental animals.

Sharp decrease in key signaling molecule

Rats were given a daily dose of methadone for three weeks. Once treatment was completed, brain areas which are central for learning and memory were removed and examined for possible neurobiological changes or damage.

In one study, on the day after the last exposure to methadone, there was a significant reduction (around 70 per cent) in the level of a signalling molecule that is important in learning and memory, in both the hippocampus and the frontal area of the brain. This reduction supports findings from a previous study (Andersen et al., 2011) in which impaired attention in rats was found at the same time point. By then, methadone is no longer present in the brain. This indicates that methadone can lead to cellular changes that affect cognitive functioning after the drug has left the body, which may be cause for concern.

No effect on cell generation

The second study, a joint project with Southwestern University in Texas, investigated whether methadone affects the formation of nerve cells in the hippocampus. Previous research has shown that new nerve cells are generated in the hippocampus in both adult humans and rats, and that this formation is probably important for learning and memory. Furthermore, it has been shown that other opiates such as morphine and heroin can inhibit this formation. It was therefore reasonable to assume that methadone, which is also an opiate, would have the same effect.
However, the researchers did not find any change in the generation of new nerve cells after long-term methadone treatment. If the same is true in humans, this is probably good news for methadone patients compared with continued heroin use, since heroin is known to inhibit this formation. However, the researchers do not know what effect methadone has on nerve cells that have previously been exposed to heroin.

Large gaps in knowledge

Since the mid-1960s, methadone has been used to treat heroin addiction. This is considered to be a successful treatment but, despite extensive and prolonged use, little is known about possible side effects. There are large knowledge gaps in this field.
Our studies show that prolonged methadone treatment can affect nerve cells, and thus behaviour, but the results are not always as expected. Many more pre-clinical and clinical studies are needed to understand methadone’s effects on the brain, whether and how these result in altered cognitive function, and how long any such changes last. This knowledge is important both for the individual methadone patient and for the outcome of treatment.

References

  • Andersen JM, Klykken C, Mørland J. (2012) Long-term methadone treatment reduces phosphorylation of CaMKII in rat brain. J Pharm Pharmacol. 64(6):843-7.
  • Sankararaman A, Masiulis I, Richardson DR, Andersen JM, Mørland J, Eisch AJ. (2012) Methadone does not alter key parameters of adult hippocampal neurogenesis in the heroin-naïve rat. Neurosci Lett. 516(1):99-104.

Study pinpoints effects of different doses of an ADHD drug; Finds higher doses may harm learning

MADISON – New research with monkeys sheds light on how the drug methylphenidate may affect learning and memory in children with attention deficit hyperactivity disorder.

The results parallel a 1977 finding that a low dose of the drug boosted the cognitive performance of children with ADHD, but that a higher dose, one that reduced their hyperactivity, also impaired their performance on a memory test.

“Many people were intrigued by that result, but their attempts to repeat the study did not yield clear-cut results,” says Luis Populin, an associate professor of neuroscience at the University of Wisconsin-Madison School of Medicine and Public Health.

Populin was senior author of the new study exploring the same topic, published last week in the early access section of the Journal of Cognitive Neuroscience. In the study, three monkeys were taught to focus on a central dot on a screen while a “target” dot flashed nearby. The monkeys learned that they could earn a sip of water by waiting until the central dot switched off and then looking at the location of the now-vanished target dot.

The system tests working (short-term) memory, impulsiveness and willingness to stick with the task, as the monkeys could quit “working” at any time, says Populin. The study used different doses of methylphenidate — the generic name for Ritalin — that were comparable to the range of clinical prescriptions for ADHD.
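The trial logic described above can be sketched as a small scoring routine. This is an illustrative reconstruction, not the authors' code: the real experiment used eye tracking, and all function and variable names here are hypothetical.

```python
# Sketch of one trial of the delayed-response task described above:
# the monkey must hold fixation until the central dot goes off, then
# look at the remembered location of the vanished target dot.

def run_trial(waits_for_cue: bool, remembers_target: bool) -> str:
    """Score one trial of the (hypothetical) task model.

    waits_for_cue    -- held fixation until the central dot switched off
    remembers_target -- looked at the remembered target location
    """
    if not waits_for_cue:
        return "premature"      # broke fixation early: impulsive, no reward
    if remembers_target:
        return "correct"        # saccade to remembered location: water reward
    return "memory_error"       # waited, but looked to the wrong place


def session_accuracy(trials: list[tuple[bool, bool]]) -> float:
    """Fraction of completed (non-premature) trials answered correctly."""
    outcomes = [run_trial(w, r) for w, r in trials]
    completed = [o for o in outcomes if o != "premature"]
    if not completed:
        return 0.0
    return sum(o == "correct" for o in completed) / len(completed)
```

Separating premature breaks from memory errors is what lets the task distinguish impulsiveness from working-memory accuracy, the two measures the dose affected differently in the study.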

According to the Centers for Disease Control, almost 5 percent of American children are taking medications for ADHD.

Strikingly, dosage had a major and unexpected impact. “At a low dose, the performance scores improved because the monkeys could control their impulses and wait long enough to focus their eyes on the target. All three were calmer and could complete a significantly larger number of trials,” says Populin, who collaborated with Jeffrey Henriques and graduate student Abigail Rajala on the study.

At the higher dose, “performance on the task is impaired,” Populin says, “but the subjects don’t seem to care; all three monkeys continued making the same errors over and over.” The monkeys stayed on task more than twice as long at the higher dose, even though they had much more trouble performing it.

Although ADHD drugs are commonly thought to improve memory, “If we take the accuracy of their eye movements as a gauge of working memory, memory was not helped by either dose,” says Populin. “It did not get better at the lower dose, and there actually was a small negative effect at the higher dose.”

Memory is at the root of many intellectual abilities, but it can be affected by many factors, says Bradley Postle, a professor of psychology at UW-Madison.

Postle, an expert on working memory who was not involved in the study, says methylphenidate affects the brain’s executive function, “which can create an internal environment that, depending on the dose, is either more or less amenable to memory formation and/or retention. If you can concentrate, and are able to process information without being interrupted by distracting thoughts or distractions in your environment, you will perform much better on a memory test. Apparently, the lower dose of methylphenidate helped create the conditions for success without actually improving memory itself.”

Monkeys are not people, but the monkeys in the study still reminded Populin of schoolchildren, he says.

“They made premature movements, could not wait to look at the target before they could be rewarded for doing so. It’s like a kid where the teacher says, ‘When you complete the task, raise your hand.’ But he can’t wait, even if he knows that by responding prematurely he will not get rewarded,” he says.

The study results had another parallel with daily life, Populin says. Drug dosages may be set high enough to reduce the characteristic hyperactivity of ADHD, “but some children say that makes them feel less creative and spontaneous; more like a robot. If learning drops off as it did in our study, that dose may not be best for them. Our monkeys actually did act like robots at the higher doses, keeping at it for up to seven hours even though their performance was so low.”

The logical way forward would involve a similar study with people diagnosed with ADHD, Populin says. With millions of children, and an increasing number of adults, taking medicines for the condition, “We have to be very careful about finding the right spot on the dose curve, or we may get changes in behavior that we don’t want. People think these drugs help improve memory, but our data say, ‘No, your memory is not getting better.’ At the higher dose, you get a behavioral improvement at a price, and that price is cognitive ability.”