Consciousness is partly preserved during general anesthesia

Public Release: 3-Jul-2018

University of Turku

When people are administered an anaesthetic, they seem to lose consciousness – or at least they stop reacting to their environment. But is consciousness lost fully during anaesthesia or does consciousness persist in the brain but in an altered state? This question has been explored in the joint research project “The Conscious Mind: Integrating subjective phenomenology with objective measurements” of the University of Turku and the Hospital District of Southwest Finland studying neural mechanisms of human consciousness. In the study, the changes caused by the anaesthetics were monitored with electroencephalogram (EEG) and positron emission tomography (PET).

The study is a joint project between the research group of Adjunct Professor of Pharmacology and Anaesthesiologist Harry Scheinin studying anaesthesia mechanisms, and the research group of Professor of Psychology Antti Revonsuo studying human consciousness and brain from the point of view of philosophy and psychology. The study was conducted in collaboration with investigators from the University of Michigan, Ann Arbor, and the University of California, Irvine, USA. The latest research findings in the project have been published as four different publications in the July issues of the two leading journals in anaesthesiology. The main funders of the project are the Academy of Finland and Jane and Aatos Erkko Foundation.

Brain dreams and processes words during anaesthesia

In the first part of the study, healthy voluntary participants were anaesthetised either with dexmedetomidine or propofol. The drugs were administered with computer-driven target-controlled infusions until the subject just barely lost responsiveness. From this state, the subjects could be woken up with light shaking or a loud voice without changing the drug infusion. Immediately after the subjects regained responsiveness, they were asked whether they experienced anything during the anaesthesia period.

Nearly all participants reported dream-like experiences that sometimes mixed with reality, says Professor Revonsuo.

The subjects were played Finnish sentences during the anaesthesia, half of which ended as expected (congruent) and half in an unexpected (incongruent) word, such as “The night sky was filled with shimmering tomatoes”. Normally, when a person is awake, the unexpected word causes a response in the EEG, which reflects how the brain processes the meaning of the sentence and word. The researchers tested whether the subjects detected and understood words or entire sentences while under anaesthesia.

The responses in the EEG showed that the brain cannot differentiate between normal and bizarre sentences when under anaesthesia. With dexmedetomidine, even the expected words created a significant response, meaning that the brain was trying to interpret the meaning of the words. However, after the participants woke from the anaesthesia, they did not remember the sentences they had heard, and the results were the same with both drugs, says Senior Researcher, Adjunct Professor Katja Valli, who participated in the study.
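For readers unfamiliar with this kind of EEG analysis, the sketch below outlines how a response to unexpected sentence endings is typically quantified: epochs time-locked to the sentence-final word are averaged separately for congruent and incongruent sentences, and the difference is measured in a late time window (often around 300-500 ms in such paradigms). This is a generic illustration with placeholder data and variable names, not the study's actual analysis pipeline.

```python
import numpy as np

# Hypothetical inputs: EEG epochs time-locked to the sentence-final word,
# shape (n_trials, n_samples), sampled at 500 Hz from -0.2 s to 0.8 s.
fs = 500
times = np.arange(-0.2, 0.8, 1 / fs)

rng = np.random.default_rng(0)
congruent_epochs = rng.normal(0.0, 1.0, size=(80, times.size))    # placeholder data
incongruent_epochs = rng.normal(0.0, 1.0, size=(80, times.size))  # placeholder data

# Average across trials to get the event-related potential (ERP) per condition.
erp_congruent = congruent_epochs.mean(axis=0)
erp_incongruent = incongruent_epochs.mean(axis=0)

# Quantify the effect as the mean amplitude difference in a late time window,
# where responses to unexpected words typically appear in awake listeners.
window = (times >= 0.3) & (times <= 0.5)
difference = (erp_incongruent - erp_congruent)[window].mean()
print(f"incongruent - congruent difference, 300-500 ms window: {difference:.3f} (a.u.)")
```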

The subjects were also played unpleasant sounds during the anaesthesia. After the subjects woke up, the sounds were played again and, surprisingly, they reacted faster to these sounds than to new sounds they had not heard before. The subjects who were given dexmedetomidine also recognised the played sounds better than by chance, even though they could not recall them spontaneously.

In other words, the brain can process sounds and words even though the subjects did not recall them afterwards. Contrary to common belief, anaesthesia does not require full loss of consciousness, as it is sufficient simply to disconnect the patient from the environment, explains Dr. Scheinin.

The applied study design enabled separation of consciousness from other drug effects

The observed changes in the EEG were mostly similar to those reported in earlier studies. However, the current study kept the drug infusion constant both while the participants were unresponsive and after they regained responsiveness, which enabled the researchers to differentiate the effects of the drugs on consciousness from other possible direct or indirect effects. Partly because these effects are intertwined, estimating the depth of anaesthesia during surgery remains a great challenge.

The project also studied the effects of four different anaesthetics on regional cerebral glucose metabolism with PET imaging. The findings alleviated the concern for potential harmful effects of dexmedetomidine on the ratio of cerebral blood flow and metabolism. In the future, the project will further analyse the association between cerebral blood flow or metabolism and the state of consciousness.

Consciousness is in a dream-like state during anaesthesia

All in all, the findings indicate that consciousness is not necessarily fully lost during anaesthesia, even though the person is no longer reacting to their environment. Instead, dream-like experiences and thoughts might still float in consciousness. The brain might still register speech and try to decipher words, but the person will not understand or remember them consciously, and the brain cannot construct full sentences from them.

The state of consciousness induced by anaesthetics can be similar to natural sleep. While sleeping, people dream and the brain observes the occurrences and stimuli in their environment subconsciously, summarises Professor Revonsuo.

Anaesthesia could resemble normal sleep more than we have previously thought, adds Dr. Scheinin.

###

The research articles were published in the July issues of the Anesthesiology and the British Journal of Anaesthesia. Based on their impact factors, these journals are the best anaesthesiology journals in the world. All the articles are open access publications and can be freely downloaded.

Why being left-handed matters for mental health treatment

Public Release: 18-Jun-2018

Cornell University

ITHACA, N.Y. – Treatment for the most common mental health problems could be ineffective or even detrimental to about 50 percent of the population, according to a radical new model of emotion in the brain.

Since the 1970s, hundreds of studies have suggested that each hemisphere of the brain is home to a specific type of emotion. Emotions linked to approaching and engaging with the world – like happiness, pride and anger – live in the left side of the brain, while emotions associated with avoidance – like disgust and fear – are housed in the right.

But those studies were done almost exclusively on right-handed people. That simple fact has given us a skewed understanding of how emotion works in the brain, according to Daniel Casasanto, associate professor of human development and psychology at Cornell University.

That longstanding model is, in fact, reversed in left-handed people, whose emotions like alertness and determination are housed in the right side of their brains, Casasanto suggests in a new study. Even more radical: The location of a person’s neural systems for emotion depends on whether they are left-handed, right-handed or somewhere in between, the research shows.

The study, “Approach motivation in human cerebral cortex,” is published in Philosophical Transactions of the Royal Society B: Biological Sciences.

According to the new theory, called the “sword and shield hypothesis,” the way we perform actions with our hands determines how emotions are organized in our brains. Sword fighters of old would wield their swords in their dominant hand to attack the enemy — an approach action — and raise their shields with their non-dominant hand to fend off attack — an avoidance action. Consistent with these action habits, results show that approach emotions depend on the hemisphere of the brain that controls the dominant “sword” hand, and avoidance emotions on the hemisphere that controls the non-dominant “shield” hand.

The work has implications for a current treatment for recalcitrant anxiety and depression called neural therapy. That treatment, which is similar to the technique used in the study and is approved by the Food and Drug Administration, involves mild electrical or magnetic stimulation of the left side of the brain to encourage approach-related emotions.

But Casasanto’s work suggests the treatment could be damaging for left-handed patients. Stimulation on the left would decrease life-affirming approach emotions. “If you give left-handers the standard treatment, you’re probably going to make them worse,” Casasanto said.

“And because many people are neither strongly right- nor left-handed, the stimulation won’t make any difference for them, because their approach emotions are distributed across both hemispheres,” he said.

“This suggests strong righties should get the normal treatment, but they make up only 50 percent of the population. Strong lefties should get the opposite treatment, and people in the middle shouldn’t get the treatment at all.”

However, Casasanto cautions that this research studied only healthy participants and more work is needed to extend these findings to a clinical setting.

###

The research was funded by a James S. McDonnell Foundation Scholar Award and the National Science Foundation.

Cornell University has television, ISDN and dedicated Skype/Google+ Hangout studios available for media interviews.

Genes found only in humans influence brain size

Public Release: 31-May-2018

New genes arose in human ancestors just before a dramatic increase in brain size and are involved in genetic defects associated with neurological disorders

University of California – Santa Cruz

IMAGE: Researchers studied the effects of NOTCH2NL genes in cortical organoids grown from human embryonic stem cells. Immunofluorescence staining shows markers for radial glia (green) and cortical neurons (red).

A set of three nearly identical genes found only in humans appear to play a critical role in the development of our large brains, according to a study led by researchers at the University of California, Santa Cruz.

The genes appeared between 3 and 4 million years ago, just before the period when fossils show a dramatic increase in the brain sizes of human ancestors. In modern humans, the genes are involved in genetic defects associated with neurological disorders.

Published May 31 in Cell, the study represents more than five years of work to characterize the genes, their role in neurological development, and their evolutionary origins. They belong to an ancient family of genes known as Notch genes, first discovered in fruit flies and named for a genetic defect causing notched wings.

“This is a family of genes that goes back hundreds of millions of years in evolutionary history and is known to play important roles in embryonic development. To find that humans have a new member of this family that is involved in brain development is extremely exciting,” said senior author David Haussler, professor of biomolecular engineering and scientific director of the UC Santa Cruz Genomics Institute.

The site of the genes on the long arm of chromosome 1 is involved in genetic defects in which large segments of DNA are either duplicated or deleted, leading to neurological disorders known collectively as 1q21.1 deletion/duplication syndrome. Deletions are often associated with microcephaly (abnormally small head size) and autism, while duplications are often associated with macrocephaly (abnormally large head size) and schizophrenia.

The new human-specific Notch genes were derived from NOTCH2, one of four previously known mammalian Notch genes, through a duplication event that inserted an extra partial copy of NOTCH2 into the genome. This happened in an ancient ape species that was a common ancestor of humans, chimpanzees, and gorillas. The partial duplicate was a nonfunctional “pseudogene,” versions of which are still found in chimp and gorilla genomes. In the human lineage, however, this pseudogene was “revived” when additional NOTCH2 DNA was copied into its place, creating a functional gene. This new gene was then duplicated several more times, resulting in four related genes, called NOTCH2NL genes, found only in humans.

One of the four NOTCH2NL genes appears to be a nonfunctional pseudogene, but the other three (NOTCH2NLA, NOTCH2NLB, and NOTCH2NLC) are active genes that direct the production of truncated versions of the original NOTCH2 protein. Notch proteins are involved in signaling between and within cells. In many cases, the Notch signaling pathway regulates the differentiation of stem cells in developing organs throughout the body, telling stem cells when to become, for example, mature heart cells or neurons.

“Notch signaling was already known to be important in the developing nervous system,” said senior author Sofie Salama, a research scientist at UC Santa Cruz. “NOTCH2NL seems to amplify Notch signaling, which leads to increased proliferation of neural stem cells and delayed neural maturation.”

The NOTCH2NL genes are especially active in the pool of neural stem cells thought to generate most of the cortical neurons. By delaying their maturation, the genes allow a larger pool of these stem cells (called “radial glia”) to build up in the developing brain, ultimately leading to a larger number of mature neurons in the neocortex (the outer layer of the brain in mammals; in humans, it hosts higher cognitive functions such as language and reasoning).
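To see why delaying maturation can enlarge the final neuron count, consider a toy model in which each progenitor in the pool either divides (staying in the pool) or exits as a mature neuron in each cycle. Lowering the per-cycle exit probability, which is roughly what amplified Notch signaling is proposed to do, leaves a larger pool that ultimately yields more neurons. The numbers below are arbitrary and purely illustrative, not parameters from the study.

```python
def final_neuron_count(n_cycles, p_differentiate, start_pool=100):
    """Toy model: each cycle, a progenitor either exits the pool as one mature
    neuron (with probability p_differentiate) or divides into two progenitors."""
    pool, neurons = float(start_pool), 0.0
    for _ in range(n_cycles):
        exiting = pool * p_differentiate
        neurons += exiting
        pool = (pool - exiting) * 2  # remaining progenitors divide
    return neurons + pool  # assume the remaining pool eventually matures too

# Faster maturation vs. delayed maturation (lower exit probability per cycle).
print(final_neuron_count(n_cycles=10, p_differentiate=0.6))  # "fast-maturing"
print(final_neuron_count(n_cycles=10, p_differentiate=0.4))  # "delayed" maturation
```

With the lower exit probability the progenitor pool grows rather than shrinks over the same number of cycles, so this toy model ends up with several times more neurons.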

This delayed development of cortical neurons fits a pattern of delayed maturation characteristic of human development, Haussler said. “One of our most distinguishing features is larger brains and delayed brain development, and now we’re seeing molecular mechanisms supporting this evolutionary trend even at a very early stage of brain development,” he said.

Salama noted that the new genes are just one of many factors that contribute to cortical development in humans. “NOTCH2NL doesn’t act in a vacuum, but it arose at a provocative time in human evolution, and it is associated with neural developmental disorders. That combination makes it especially interesting,” she said.

The DNA copying errors that created the NOTCH2NL genes in the first place are the same type of errors that cause the 1q21.1 deletion/duplication syndrome. These errors tend to occur in places on the chromosomes where there are long stretches of nearly identical DNA sequences.

“These long segments of DNA that are almost identical can confuse the replication machinery and cause instability in the genome,” Haussler explained. “We may have gained our larger brains in part through the duplication of these genes, but at the expense of greater instability in that region of chromosome 1, which makes us susceptible to the deletion/duplication syndrome.”

Long stretches of repetitive DNA also present challenges for DNA sequencing technologies. In fact, the location of NOTCH2NL in the human reference genome was not accurate when Haussler’s team first started investigating it.

“When we looked at the reference genome to see where NOTCH2NL was, we found that it was near the area involved in the 1q21.1 syndrome, but not part of the region that was deleted or duplicated,” Haussler said. “This explains why the gene was not looked at before by geneticists studying the syndrome.”

After checking other genome data and contacting the team working on the next iteration of the reference genome, however, Haussler found that NOTCH2NL is in fact located in the interval where the defects occur. The new reference genome (the 38th version, released later in 2013) also shows the additional copies of the gene. Haussler’s team subsequently showed that the duplications or deletions in the syndrome result in an increase or decrease (respectively) in the number of copies of NOTCH2NL genes in the affected person’s genome. Other genes are also duplicated or deleted and may also be involved in the syndrome.

Interestingly, these genetic changes do not always result in neurological disorders. In about 20 to 50 percent of affected children, the syndrome is the result of a new genetic mistake, but in many cases one of the parents is found to also carry the genetic defect, without showing any apparent symptoms. According to Haussler, this is not uncommon in genetic diseases and underscores the importance of multiple factors in the development of disease.

“It’s amazing how often we find people with what seem to be serious genetic conditions, yet something else compensates for it,” he said.

The investigation of these genes began in 2012 when Frank Jacobs, now at the University of Amsterdam and the third senior author of the paper, was working with Haussler and Salama at UC Santa Cruz as a postdoctoral researcher. His project involved coaxing human embryonic stem cells to differentiate into neurons and studying the genes that are expressed during this process. As the cells develop into cortical neurons in the petri dish, they self-organize into a layered structure like a miniature version of the brain’s cortex, which researchers call a “cortical organoid.”

Jacobs was comparing gene expression patterns in cortical organoids grown from embryonic stem cells of humans and rhesus monkeys. Many genes showed differences in the timing and amount of expression, but NOTCH2NL was exceptional. “It was screaming hot in human cells and zero in rhesus. Rhesus cells just don’t have this gene,” Salama said. “Finding a new Notch gene in humans set us off on a long journey.”

Haussler, a Howard Hughes Medical Institute investigator, said he remembers presenting their initial findings in 2013 to scientists at HHMI. “Their general reaction was, ‘Well, it’s amazing if it’s true, but we’re not convinced yet.’ So we spent the next five years working to convince everybody.”

The development of the CRISPR/Cas9 system for making genetic modifications provided a crucial tool for their work. Salama’s team used it to delete the NOTCH2NL genes from human embryonic stem cells. Cortical organoids grown from these cells showed an acceleration of neural maturation and were smaller in size than organoids from normal cells. The researchers also inserted NOTCH2NL genes into mouse embryonic stem cells and showed that the genes promote Notch signaling and delay neural maturation in mouse cortical organoids.

“The fact that we can genetically manipulate stem cells with CRISPR and then grow them into cortical organoids in the lab is extremely powerful,” Haussler said. “My dream for decades has been to peer into human evolution at the level of individual genes and gene functions. It’s incredibly exciting that we’re able to do that now.”

A major part of the research involved careful and precise sequencing of the region of chromosome 1 where the NOTCH2NL genes are located in 8 normal individuals and 6 patients with 1q21.1 deletion/duplication syndrome. (The researchers also analyzed the genomes of three archaic humans, two Neanderthals and one Denisovan, and found in all of them the same three active NOTCH2NL genes that are present in modern humans.)

The sequencing results showed that the NOTCH2NL genes are variable in modern humans. The researchers identified eight different versions of NOTCH2NL, and Haussler said there are probably more. Each version has a slightly different DNA sequence, but it remains unclear what effects these differences may have.

“We’ve found that all of them can promote Notch signaling. They behaved in subtly different ways when we tested them in cell cultures, but we have a lot more work to do before we can start to get a handle on what this means,” Salama said.

Other genes involved in human brain development seem to have arisen through a duplication process similar to the creation of NOTCH2NL. A notable example is the gene SRGAP2C, which is thought to increase the number of connections between neurons. Locations in the genome where such duplications and rearrangements occur repeatedly, known as “duplication hubs,” make up about 5 percent of the human genome and seem to have been important in human evolution, Haussler said.

###

The first authors of the paper are Ian Fiddes, a graduate student working with Haussler at UC Santa Cruz, and Gerrald Lodewijk, a graduate student working with Jacobs at the University of Amsterdam. Other coauthors include researchers at Stanford University, UC San Francisco, University of Washington, Broad Institute of MIT and Harvard, Medical Genetics Service in Lausanne, Switzerland, and Institute of Genetic Medicine in Newcastle upon Tyne, U.K. This work was supported by the Howard Hughes Medical Institute, U.S. National Institutes of Health, European Research Council, California Institute for Regenerative Medicine, Netherlands Organization for Scientific Research (NWO), and European Molecular Biology Organization.

Smarter brains run on sparsely connected neurons

Public Release: 17-May-2018

Princeton researchers crowdsource brain mapping with gamers, discover 6 new neuron types

Caption: By turning a time-intensive research problem into an interactive game, Princeton neuroscientist Sebastian Seung has built an unprecedented data set of neurons, which he is now turning over to the public via the Eyewire Museum. These 17 retinal neurons, mapped by Eyewire gamers, include ganglion cell types in blue and green and amacrine cells in yellow and red.

Credit: Image by Alex Norton, Eyewire

With the help of a quarter-million video game players, Princeton researchers have created and shared detailed maps of more than 1,000 neurons — and they’re just getting started.

“Working with Eyewirers around the world, we’ve made a digital museum that shows off the intricate beauty of the retina’s neural circuits,” said Sebastian Seung, the Evnin Professor in Neuroscience and a professor of computer science and the Princeton Neuroscience Institute (PNI). The related paper is publishing May 17 in the journal Cell.

Seung is unveiling the Eyewire Museum, an interactive archive of neurons available to the general public and neuroscientists around the world, including the hundreds of researchers involved in the federal Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative.

“This interactive viewer is a huge asset for these larger collaborations, especially among people who are not physically in the same lab,” said Amy Robinson Sterling, a crowdsourcing specialist with PNI and the executive director of Eyewire, the online gaming platform for the citizen scientists who have created this data set.

“This museum is something like a brain atlas,” said Alexander Bae, a graduate student in electrical engineering and one of four co-first authors on the paper. “Previous brain atlases didn’t have a function where you could visualize by individual cell, or a subset of cells, and interact with them. Another novelty: Not only do we have the morphology of each cell, but we also have the functional data, too.”

The neural maps were developed by Eyewirers, members of an online community of video game players who have devoted hundreds of thousands of hours to painstakingly piecing together these neural cells, using data from a mouse retina gathered in 2009.

Eyewire pairs machine learning with gamers who trace the twisting and branching paths of each neuron. Humans are better at visually identifying the patterns of neurons, so every player’s moves are recorded and checked against each other by advanced players and Eyewire staffers, as well as by software that is improving its own pattern recognition skills.

Since Eyewire’s launch in 2012, more than 265,000 people have signed onto the game, and they’ve collectively colored in more than 10 million 3-D “cubes,” resulting in the mapping of more than 3,000 neural cells, of which about a thousand are displayed in the museum.

Each cube is a tiny subset of a single cell, about 4.5 microns across, so a 10-by-10 block of cubes would be the width of a human hair. Every cell is reviewed by between 5 and 25 gamers before it is accepted into the system as complete.
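The review process described above, in which many players' tracings are reconciled before a cell is accepted, can be pictured as a weighted vote over each small volume ("cube"). The sketch below is a simplified, hypothetical version of that kind of consensus logic, with made-up player names and voxel IDs; it is not Eyewire's actual aggregation algorithm.

```python
from collections import defaultdict

def consensus_segmentation(player_traces, weights=None, threshold=0.5):
    """player_traces: dict player_id -> set of voxel ids marked as part of the neuron.
    weights: optional dict player_id -> accuracy weight (e.g. higher for experienced players).
    Returns the set of voxels whose weighted vote share reaches the threshold."""
    weights = weights or {p: 1.0 for p in player_traces}
    total = sum(weights.values())
    votes = defaultdict(float)
    for player, voxels in player_traces.items():
        for v in voxels:
            votes[v] += weights[player]
    return {v for v, w in votes.items() if w / total >= threshold}

# Example: three players trace the same cube; voxel 3 is disputed.
traces = {"alice": {1, 2, 3}, "bob": {1, 2}, "carol": {1, 2, 3}}
print(sorted(consensus_segmentation(traces)))  # -> [1, 2, 3]
```

Raising the threshold, or weighting experienced players and staff reviews more heavily, makes the accepted reconstruction more conservative.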

“Back in the early years it took weeks to finish a single cell,” said Sterling. “Now players complete multiple neurons per day.” The Eyewire user experience stays focused on the larger mission — “For science!” is a common refrain — but it also replicates a typical gaming environment, with achievement badges, a chat feature to connect with other players and technical support, and the ability to unlock privileges with increasing skill. “Our top players are online all the time — easily 30 hours a week,” Sterling said.

Dedicated Eyewirers have also contributed in other ways, including donating the swag that gamers win during competitions and writing program extensions “to make game play more efficient and more fun,” said Sterling, including profile histories, maps of player activity, a top 100 leaderboard and ever-increasing levels of customizability.

“The community has really been the driving force behind why Eyewire has been successful,” Sterling said. “You come in, and you’re not alone. Right now, there are 43 people online. Some of them will be admins from Boston or Princeton, but most are just playing — now it’s 46.”

For science!

With 100 billion neurons linked together via trillions of connections, the brain is immeasurably complex, and neuroscientists are still assembling its “parts list,” said Nicholas Turner, a graduate student in computer science and another of the co-first authors. “If you know what parts make up the machine you’re trying to break apart, you’re set to figure out how it all works,” he said.

The researchers have started by tackling Eyewire-mapped ganglion cells from the retina of a mouse. “The retina doesn’t just sense light,” Seung said. “Neural circuits in the retina perform the first steps of visual perception.”

The retina grows from the same embryonic tissue as the brain, and while much simpler than the brain, it is still surprisingly complex, Turner said. “Hammering out these details is a really valuable effort,” he said, “showing the depth and complexity that exists in circuits that we naively believe are simple.”

The researchers’ fundamental question is identifying exactly how the retina works, said Bae. “In our case, we focus on the structural morphology of the retinal ganglion cells.”

“Why the ganglion cells of the eye?” asked Shang Mu, an associate research scholar in PNI and fellow first author. “Because they’re the connection between the retina and the brain. They’re the only cell class that goes back into the brain.” Different types of ganglion cells are known to compute different types of visual features, which is one reason the museum has linked shape to functional data.

Using Eyewire-produced maps of 396 ganglion cells, the researchers in Seung’s lab successfully classified these cells more thoroughly than has ever been done before.

“The number of different cell types was a surprise,” said Mu. “Just a few years ago, people thought there were only 15 to 20 ganglion cell types, but we found more than 35 — we estimate between 35 and 50 types.”

Of those, six appear to be novel, in that the researchers could not find any matching descriptions in a literature search.

A brief scroll through the digital museum reveals just how remarkably flat the neurons are — nearly all of the branching takes place along a two-dimensional plane. Seung’s team discovered that different cells grow along different planes, with some reaching high above the nucleus before branching out, while others spread out close to the nucleus. Their resulting diagrams resemble a rainforest, with ground cover, an understory, a canopy and an emergent layer overtopping the rest.

All of these are subdivisions of the inner plexiform layer, one of the five previously recognized layers of the retina. The researchers also identified a “density conservation principle” that they used to distinguish types of neurons.

One of the biggest surprises of the research project has been the extraordinary richness of the original sample, said Seung. “There’s a little sliver of a mouse retina, and almost 10 years later, we’re still learning things from it.”

###

“Digital museum of retinal ganglion cells with dense anatomy and physiology,” by Alexander Bae, Shang Mu, Jinseop Kim, Nicholas Turner, Ignacio Tartavull, Nico Kemnitz, Chris Jordan, Alex Norton, William Silversmith, Rachel Prentki, Marissa Sorek, Celia David, Devon Jones, Doug Bland, Amy Sterling, Jungman Park, Kevin Briggman, Sebastian Seung and the Eyewirers, was published May 17 in the journal Cell with DOI 10.1016/j.cell.2018.04.040. The research was supported by the Gatsby Charitable Foundation, National Institutes of Health/National Institute of Neurological Disorders and Stroke (U01NS090562 and 5R01NS076467), Defense Advanced Research Projects Agency (HR0011-14-2-0004), Army Research Office (W911NF-12-1-0594), Intelligence Advanced Research Projects Activity (IARPA) (D16PC00005), KT Corporation, Amazon Web Services Research Grants, Korea Brain Research Institute (2231-415) and Korea National Research Foundation Brain Research Program (2017M3C7A1048086).

Skin cancers linked with up to a 92% reduced risk of Alzheimer’s disease

Public Release: 19-Apr-2018

 

Wiley

Previous studies have demonstrated a decreased risk of Alzheimer’s disease (AD) in individuals with various cancers, including non-melanoma skin cancers (including squamous cell cancers and basal cell cancers). A new Journal of the European Academy of Dermatology & Venereology study finds that this inverse relationship also holds true for malignant melanoma.

The study included patients aged 60-88 years with a clinic follow-up of at least 1 year and no diagnosis of AD or skin cancer at the beginning of the study. Of 1147 patients who were later diagnosed with malignant melanoma, 5 were diagnosed with subsequent AD. Of 2506 who were diagnosed with basal cell cancer, 5 had a subsequent AD diagnosis, and of 967 who were diagnosed with squamous cell cancer, only 1 had a subsequent AD diagnosis.

After adjustments, a diagnosis of malignant melanoma was associated with a 61% reduced risk of developing AD. For basal cell and squamous cell carcinomas, the reduced risks were 82% and 92%, respectively.
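The percentages above come from adjusted analyses in the paper, but the raw counts already show how small the crude proportions are. The snippet below simply computes the proportion of subsequent AD diagnoses from the counts quoted in the release; it is illustrative arithmetic only, since the reported 61-92% reductions are adjusted estimates relative to a comparison group not detailed here.

```python
# Crude proportion of patients with a subsequent AD diagnosis, from the counts above.
cohorts = {
    "malignant melanoma": (5, 1147),
    "basal cell cancer": (5, 2506),
    "squamous cell cancer": (1, 967),
}
for name, (ad_cases, n) in cohorts.items():
    print(f"{name}: {ad_cases}/{n} = {100 * ad_cases / n:.2f}% with subsequent AD")
```

These crude proportions say nothing by themselves about relative risk; the adjusted reductions come from models that compare skin cancer patients with patients who did not develop skin cancer and account for other factors.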

We’ll pay more for unhealthy foods we crave, neuroscience research finds

Public Release: 2-Apr-2018

 

New York University

We’ll pay more for unhealthy foods when we crave them, new neuroscience research finds. The study also shows that we’re willing to pay disproportionately more for higher portion sizes of craved food items.

The research, which appears in the journal Proceedings of the National Academy of Sciences (PNAS), identifies an obstacle to healthy living.

“Our results indicate that even if people strive to eat healthier, craving could overshadow the importance of health by boosting the value of tempting, unhealthy foods relative to healthier options,” explains Anna Konova, a postdoctoral researcher in NYU’s Center for Neural Science and the paper’s lead author. “Craving, which is pervasive in daily life, may nudge our choices in very specific ways that help us acquire those things that made us feel good in the past–even if those things may not be consistent with our current health goals.”

The study’s other co-authors were Kenway Louie, an NYU research assistant professor, and Paul Glimcher, an NYU professor and director of NYU’s Institute for the Interdisciplinary Study of Decision Making.

There is growing interest across several sectors–marketing, psychology, economics, and medicine–in understanding how our psychological states and physiological needs affect our behavior as consumers. Of particular concern is craving, which has long been recognized as a state of mind that contributes to addiction and, in recent years, to eating disorders and obesity.

Yet, the researchers note, little is known about the nature of craving and its impact on our choices and behavior.

In their PNAS study, the scientists conducted a series of experiments that asked subjects to indicate how much they would pay for certain snack foods after they developed a craving for one of them; a significant increase in desire for a specific food item (e.g., a Snickers or granola bar) after exposure to the item, relative to before, was taken to constitute a craving.
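Concretely, the key quantity is each subject's willingness to pay (WTP) for the same item, elicited before and after the craving induction and compared within subjects. The snippet below is a minimal, hypothetical version of that paired comparison with invented dollar amounts; the study's actual elicitation procedure and analysis were more elaborate.

```python
import numpy as np
from scipy import stats

# Hypothetical WTP (in dollars) for the same snack, per subject, before and
# after the craving-induction procedure. All values are made up for illustration.
wtp_before = np.array([1.20, 0.80, 1.50, 1.00, 0.60, 1.10, 0.90, 1.30])
wtp_after  = np.array([1.60, 1.10, 1.70, 1.40, 0.90, 1.20, 1.30, 1.50])

# Paired t-test: did WTP for the craved item rise within subjects?
t_stat, p_value = stats.ttest_rel(wtp_after, wtp_before)
print(f"mean change = ${np.mean(wtp_after - wtp_before):.2f}, "
      f"t = {t_stat:.2f}, p = {p_value:.3f}")
```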

The results showed that people were willing to pay more for the same exact snack food item if they were just exposed to it and asked to recall specific memories of consumption of this item, relative to before this exposure. Notably, this occurred even if the study’s subjects were hungry before and after the exposure, suggesting that craving and hunger are partly distinct experiences.

“In other words, craving Snickers does not make you hungrier; it makes you desire Snickers specifically,” explains Louie, who adds that there was also a spillover effect as it applied, to some degree, to similar food items that subjects were never exposed to (e.g., other chocolate, nut, and caramel candy bars).

Moreover, the researchers found stronger effects–bigger changes in the willingness to pay for an item the subjects craved–when the items were higher-calorie, higher-fat/sugar content foods, such as a chocolate bar or cheese puffs, relative to healthier options (e.g., a granola bar).

Finally, the experiments revealed a connection among craving, portion, and price. That is, people were willing to pay disproportionately more for higher portion sizes of the craved items.

“It appears that craving boosts or multiplies the economic value of the craved food,” says Konova.

###

This work was supported by grants from the National Institute on Drug Abuse (R01DA038063 and F32DA039648), part of the National Institutes of Health, and the Brain and Behavior Research Foundation (NARSAD Young Investigator Grant #25387).

DOI: 10.1073/pnas.1714443115

High total cholesterol in late old age may be marker of protective factor

Public Release: 5-Mar-2018

Risk of cognitive decline reduced for people 85 and older with high cholesterol

The Mount Sinai Hospital / Mount Sinai School of Medicine

 

People aged 85 and older whose total cholesterol had increased from their levels at midlife had a reduced risk for marked cognitive decline, compared with those a decade younger whose cholesterol was similarly elevated, Mount Sinai researchers report in a new study.

The results of the study will be published online in Alzheimer’s & Dementia: The Journal of the Alzheimer’s Association as an article-in-press corrected proof on Monday, March 5, at 10 a.m.

The researchers found that people aged 85-94 with good cognitive function whose total cholesterol increased from midlife had a 32 percent reduced risk for marked cognitive decline over the next ten years, compared with people aged 75-84, who had a 50 percent increased risk.

The researchers said that the results did not suggest that those 85 and older should increase their cholesterol for better cognitive health, but rather that those in that age cohort with good cognition and high cholesterol probably also had some protective factor that someday could be identified and studied.

The research team evaluated the association of five total cholesterol values with a substantial decline in cognitive function from normal function, called marked cognitive decline. The five values were midlife (average age 40) total cholesterol, late-life (average age 77) total cholesterol, mean total cholesterol since midlife, linear change since midlife (in other words, whether it was increasing or decreasing), and quadratic change since midlife (whether the linear change was accelerating or decelerating). Data were obtained from the original Framingham Heart Study, a long-term, ongoing cardiovascular cohort study on residents of Framingham, Massachusetts. That study began in 1948 with 5,209 adult subjects and is now on its third generation of participants.
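To make the five predictors concrete, the sketch below computes analogous summary values from a hypothetical series of one person's total cholesterol measurements: the midlife and late-life values, the mean since midlife, and the linear and quadratic components of change taken from a quadratic fit over age. The numbers and the fitting choice are illustrative assumptions, not the study's exact modelling approach.

```python
import numpy as np

# Hypothetical total cholesterol (mg/dL) for one participant at several ages.
ages = np.array([40, 50, 60, 70, 77])
tc   = np.array([210, 225, 235, 240, 238])

midlife_tc = tc[ages == 40][0]          # value at (average) age 40
late_life_tc = tc[ages == 77][0]        # value at (average) age 77
mean_tc_since_midlife = tc.mean()

# Quadratic fit TC ~ a*age^2 + b*age + c: b tracks the linear change,
# a tracks whether that change is accelerating or decelerating.
a, b, c = np.polyfit(ages, tc, deg=2)
print(midlife_tc, late_life_tc, round(mean_tc_since_midlife, 1))
print(f"linear change coefficient: {b:.2f}, quadratic (acceleration): {a:.4f}")
```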

The team assessed whether marked cognitive decline was associated with the five cholesterol values, and whether those associations changed depending on the age of cognitive assessment. They found that several cholesterol values, including a high last cholesterol measurement, increasing levels, and decreasing acceleration, were associated with increased risk of a marked cognitive decline. However, as the outcome age increased, some associations were reduced, or even reversed. Furthermore, in the subgroup of cognitively healthy 85-94 year olds, a high midlife cholesterol level was associated with a reduced risk for marked cognitive decline. This contrasts with other studies, which have focused on elderly subjects primarily below age 75 and found that midlife cholesterol was associated with increased risk of cognitive decline.

“Our results have important implications for researching genetic and other factors associated with successful cognitive aging,” said the study’s first author, Jeremy Silverman, PhD, Professor of Psychiatry, Icahn School of Medicine at Mount Sinai. “The data are consistent with our protected survivor model – among individuals who survive to very old age with intact cognition, those with high risk factor levels are more likely to possess protective factors than those with lower risk factor levels. Long-lived individuals who are cognitively intact despite high risk should be targeted in research studies seeking protective factors, which could help identify future drugs and therapies to treat dementia and Alzheimer’s disease.”

Dr. Silverman notes that these results do not imply that those 85 and older should increase their cholesterol. His research team will next study other risk factors for cognitive decline, including body mass index and blood pressure.

“We don’t think high cholesterol is good for cognition at 85, but its presence might help us identify those who are less affected by it. We hope to identify genes or other protective factors for cognitive decline by focusing on cognitively healthy very old people who are more likely to carry protective factors.”

###

The Mount Sinai Health System is New York City’s largest integrated delivery system encompassing seven hospital campuses, a leading medical school, and a vast network of ambulatory practices throughout the greater New York region. Mount Sinai’s vision is to produce the safest care, the highest quality, the highest satisfaction, the best access and the best value of any health system in the nation. The System includes approximately 7,100 primary and specialty care physicians; 10 joint-venture ambulatory surgery centers; more than 140 ambulatory practices throughout the five boroughs of New York City, Westchester, Long Island, and Florida; and 31 affiliated community health centers. Physicians are affiliated with the renowned Icahn School of Medicine at Mount Sinai, which is ranked among the highest in the nation in National Institutes of Health funding per investigator. The Mount Sinai Hospital is ranked No. 18 on U.S. News & World Report’s “Honor Roll” of top U.S. hospitals; it is one of the nation’s top 20 hospitals in Cardiology/Heart Surgery, Diabetes/Endocrinology, Gastroenterology/GI Surgery, Geriatrics, Nephrology, and Neurology/Neurosurgery, and in the top 50 in four other specialties in the 2017-2018 “Best Hospitals” issue. Mount Sinai’s Kravis Children’s Hospital also is ranked in six out of ten pediatric specialties by U.S. News & World Report. The New York Eye and Ear Infirmary of Mount Sinai is ranked 12th nationally for Ophthalmology and 50th for Ear, Nose, and Throat, while Mount Sinai Beth Israel, Mount Sinai St. Luke’s and Mount Sinai West are ranked regionally. For more information, visit http://www.mountsinai.org/, or find Mount Sinai on Facebook, Twitter and YouTube.

Food abundance driving conflict in Africa, not food scarcity

Public Release: 1-Mar-2018

 

Dartmouth College

In Africa, food abundance may be driving violent conflict rather than food scarcity, according to a study published in the American Journal of Agricultural Economics, a publication of the Agricultural & Applied Economics Association.

The study refutes the notion that climate change will increase the frequency of civil war in Africa as a result of food scarcity triggered by rising temperatures and drought. Most troops in Africa are unable to sustain themselves due to limited access to logistics and state support, and must live off locally sourced food. The findings reveal that armed actors are often drawn to areas with abundant food resources, where they aim to exert control over those resources.

To examine how the availability of food may have affected armed conflict in Africa, the study relies on PRIO-Grid data from over 10,600 grid cells in Africa from 1998 to 2008, new agricultural yield data from EarthStat, and the Armed Conflict Location and Event Dataset, which documents incidents of political violence, including those with and without casualties. The data were used to estimate how annual local wheat and maize yields (two staple crops) at the village/town level may have affected the frequency of conflict. To capture only the effects of agricultural productivity on conflict, rather than the reverse, the analysis incorporates the role of droughts using the Standardized Precipitation Index, which aggregates monthly precipitation by cell year.
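In outline, this estimation strategy amounts to a panel regression of local conflict counts on local crop yields, with grid-cell and year fixed effects and the drought index as a control. The sketch below builds a small synthetic stand-in panel and fits such a model; the column names, numbers, and specification are illustrative assumptions, not the study's actual data or econometric design.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for a PRIO-Grid-style cell-by-year panel (made-up numbers).
rng = np.random.default_rng(1)
cells, years = range(50), range(1998, 2009)
df = pd.DataFrame([(c, y) for c in cells for y in years], columns=["cell_id", "year"])
df["spi"] = rng.normal(0, 1, len(df))                                   # drought index
df["maize_yield"] = 2 + 0.5 * df["spi"] + rng.normal(0, 0.3, len(df))    # local yield
df["conflict_events"] = rng.poisson(np.exp(-1 + 0.4 * df["maize_yield"]))

# Conflict frequency on local yields, with cell and year fixed effects,
# and standard errors clustered by grid cell.
model = smf.ols(
    "conflict_events ~ maize_yield + spi + C(cell_id) + C(year)", data=df
).fit(cov_type="cluster", cov_kwds={"groups": df["cell_id"]})
print(model.params[["maize_yield", "spi"]])
```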

The study identifies four categories in which conflicts may arise over food resources in Africa, which reflect the interests and motivations of the respective group:

  • State and military forces that do not receive regular support from the state are likely to gravitate towards areas where food resources are abundant, in order to feed themselves.
  • Rebel groups and non-state actors opposing the government may be drawn to food rich areas, where they can exploit the resources for profit.
  • Self-defense militias and civil defense forces representing agricultural communities in rural regions may protect their communities against raiders and expand their control into other areas with arable land and food resources.
  • Militias representing pastoralists communities live in mainly arid regions and are highly mobile, following their cattle or other livestock, rather than relying on crops. To replenish herds or obtain food crops, they may raid other agriculturalist communities.

These actors may resort to violence to seek access to food, as the communities that they represent may not have enough food resources or the economic means to purchase livestock or drought-resistant seeds. Although droughts can lead to violence in urban areas, this was found not to be the case for rural areas, where the majority of armed conflicts occurred in places where food crops were abundant. Food scarcity can actually have a pacifying effect.

“Examining food availability and the competition over such resources, especially where food is abundant, is essential to understanding the frequency of civil war in Africa,” says Ore Koren, a U.S. foreign policy and international security fellow at Dartmouth College and Ph.D. candidate in political science at the University of Minnesota. “Understanding how climate change will affect food productivity and access is vital; yet, predictions of how drought may affect conflict may be overstated in Africa and do not get to the root of the problem. Instead, we should focus on reducing inequality and improving local infrastructure, alongside traditional conflict resolution and peace building initiatives,” explains Koren.

###

Koren is available for comment at: ore.david.koren@dartmouth.edu.

Broadcast studios: Dartmouth has TV and radio studios available for interviews. For more information, visit: http://communications.dartmouth.edu/media/broadcast-studios

Our mitochondria are optimized to run at 122 degrees Fahrenheit?

Public Release: 25-Jan-2018

Do our mitochondria run at 50 degrees C?

PLOS

Caption: Left: Mitochondria of human cells illuminated by the thermo-sensitive probe. Four human cells, each with its nucleus (N) and its numerous hot filamentous mitochondria (yellow-red). Right: Mitochondria as radiators. A high-magnification rendering of one such filament reveals parallel arrays of closely juxtaposed membranes that could heat the mitochondrial interior.

Credit: Left: Malgorzata Rak; Right: Terrence G. Frey

Our body temperature is held at a fairly steady 37.5°C, and the assumption has always been that most of our physiological processes take place at this temperature. The heat needed to maintain this temperature in the face of a colder environment is generated by tiny subcellular structures called mitochondria. But a new study publishing January 25 in the open access journal PLOS Biology by INSERM and CNRS researchers at Hôpital Robert Debré in Paris led by Dr Pierre Rustin (and their international collaborators from Finland, South Korea, Lebanon and Germany) presents surprising evidence that mitochondria can run more than 10°C hotter than the body’s bulk temperature, and indeed are optimized to do so. Because of the extraordinary nature of these claims, PLOS Biology has commissioned a cautionary accompanying article by Professor Nick Lane from University College, London, an expert on evolutionary bioenergetics.

To ensure a stable internal temperature, the human body makes use of the heat produced by the last stage of food consumption: combustion of nutrients in structures known as mitochondria, of which there are tens or hundreds in each cell. Mitochondria form a complex network within the cell, and their contents are isolated from the rest of the cell by two membranes. A considerable number of biologically catalyzed chemical reactions take place in their interior; 40% of the energy that they release is captured in the form of a chemical compound, ATP, which is used to drive functions of the body such as heart beats, brain activity or muscle contraction. The remaining 60%, however, is dissipated as heat.

The authors’ results appear to show that, in maintaining our body at a constant temperature of 37.5°C, mitochondria operate much like thermostatic radiators in a poorly insulated room, running at a much higher temperature than their surroundings.

This work was made possible by the use of a chemical probe whose fluorescence is particularly sensitive to temperature. When this “molecular thermometer” (Mito Thermo Yellow) was introduced into the heart of the mitochondria, the researchers were able to demonstrate a stabilized temperature of about 50°C. Specifically, the probe’s fluorescence suggested that the temperature of the mitochondria in living and intact cells, themselves placed in a culture medium maintained at 38°C, is more than 10°C higher, as long as the mitochondria are functional. This elevated temperature is abolished when the mitochondria are inactivated by various means. The researchers also showed that several human mitochondrial enzymes have evolved an optimum temperature close to 50°C, which helps to support their interpretation of the molecular thermometer data.
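The core measurement idea, converting a temperature-sensitive fluorescence signal into an estimated temperature, can be sketched as a simple calibration-curve inversion. The code below uses invented calibration values and a linear fit purely for illustration; it is not the authors' protocol for Mito Thermo Yellow, whose response and calibration are more involved.

```python
import numpy as np

# Hypothetical calibration: probe fluorescence measured at known bath temperatures.
calib_temp_c = np.array([30, 34, 38, 42, 46, 50])
calib_fluor  = np.array([1.00, 0.90, 0.81, 0.73, 0.66, 0.60])  # made-up, decreasing

# Fit a simple linear calibration (fluorescence vs. temperature) and invert it.
slope, intercept = np.polyfit(calib_temp_c, calib_fluor, deg=1)

def fluorescence_to_temperature(f):
    return (f - intercept) / slope

# A reading well below the fluorescence seen at the 38 C bath temperature
# maps to a hotter estimated compartment temperature.
print(f"estimated temperature: {fluorescence_to_temperature(0.58):.1f} C")
```

In this toy calibration, a fluorescence reading noticeably below the value recorded at the 38°C bath maps to an estimated temperature near 50°C, which is the shape of the argument made with the real probe.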

Nick Lane, who was not involved in the study, but helped the journal to assess the manuscript, finds the results potentially exciting, but warns that further work needs to be done. In his accompanying Primer, he says “This is a radical claim, and if it is true, how come we didn’t know something so important long ago?”

Lane asks a battery of questions about the Mito Thermo Yellow probe, about the plausibility of the extreme temperature gradients which the authors’ interpretation imply, and about the meaning of the very concept of “temperature” at such microscopic scales. “We need to know a lot more about both the specific behaviour of Mito Thermo Yellow and its exact location within the mitochondrion before we can come to any firm conclusions about ‘temperature’. In the meantime, I doubt that the 10°C temperature difference should be taken literally. But it should be taken seriously.”

The authors acknowledge that these high temperatures at the core of the micro-space inside mitochondria are unexpected but emphasize that this revelation should lead to a reassessment of our vision of how mitochondria function and their role in cells. “Much of our knowledge about mitochondria, the activity of their enzymes, the permeability of their membranes, the consequences of genetic defects that impair their activity, the effect of toxins or drugs, have all been established at 37.5°C; the temperature of the human body, certainly, but apparently not that of the mitochondria,” they say.

“Heat has fallen out of fashion in biology. Whether or not all these ideas are correct, the distribution and heat generation of mitochondria within cells should be taken much more seriously. These researchers bring this important subject back to centre stage, which is exactly where it should be,” concludes Lane.

###

In your coverage please use this URL to provide access to the freely available article in PLOS Biology:

Article: http://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.2003992

Primer: http://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.2005113

Citation:

Article: Chrétien D, Bénit P, Ha H-H, Keipert S, El-Khoury R, Chang Y-T, et al. (2018) Mitochondria are physiologically maintained at close to 50°C. PLoS Biol 16(1): e2003992. https://doi.org/10.1371/journal.pbio.2003992

Primer: Lane N (2018) Hot mitochondria? PLoS Biol 16(1): e2005113. https://doi.org/10.1371/journal.pbio.2005113

Funding:

Article: European Research Council (grant number 232738). Received by HTJ. The funder had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. Academy Professorship (grant number 256615). Received by HTJ. The funder had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. Academy of Finland (grant number FinMIT CoE 272376). Received by HTJ. The funder had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. Ouvrir Les Yeux (OLY). To PB and PR. The funder had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. Association Française contre l’Ataxie de Friedreich (AFAF) (grant number). To PB and PR. The funder had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. Association contre les Maladies Mitochondriales (AMMi). To DC, PB, MR, and PR. The funder had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. Association d’Aide aux Jeunes Infirmes (AAJI). To PB and PR. The funder had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. E-Rare (grant number E-rare Genomit). To DC, PB, MR, and PR. The funder had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. ANR (grant number ANR MITOXDRUGS-DS0403). To DC, PB, MR, and PR. The funder had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. ANR (grant number ANR FIFA2-12-BSV1-0010). To DC, PB, MR, and PR. The funder had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Primer: The author(s) received no specific funding for this work.

Viruses are everywhere, maybe even in space

Public Release: 18-Jan-2018

Portland State University

Viruses are the most abundant and one of the least understood biological entities on Earth. They might also exist in space, but as of yet scientists have done almost no research into this possibility.

Portland State University biology professor Ken Stedman and colleagues are trying to change this through their article “Astrovirology: Viruses at Large in the Universe,” published in the February 2018 issue of the journal Astrobiology. In this call to arms, the authors state that NASA and other space agencies should be looking for viruses in liquid samples from Saturn and Jupiter’s moons, develop technology to detect viruses in ancient deposits on Earth and Mars, and determine if Earth viruses could survive in space.

“More than a century has passed since the discovery of the first viruses,” said Stedman, who teaches at PSU’s College of Liberal Arts & Sciences. “Entering the second century of virology, we can finally start focusing beyond our own planet.”

Stedman argues that since viruses on Earth outnumber cellular organisms by a factor of 10 to 100, the same could be true on other planets and moons. Viruses also appear to be extremely ancient, may have been involved in the origin of life and have probably been involved in major evolutionary transitions on Earth.

“With this paper, we hope to inspire integration of virus research into astrobiology and also point out pressing unanswered questions in astrovirology, particularly regarding the detection of virus biosignatures and whether viruses could be spread extraterrestrially,” Stedman said.

###

Stedman, co-founder of PSU’s Center for Life in Extreme Environments, wrote the article with colleagues Aaron Berliner from the Center for the Utilization of Biological Engineering in Space at the UC Berkeley, and Tomohiro Moichizuki from the Earth-Life Science Institute at the Tokyo Institute of Technology.

Stedman has funding from NASA for virus evolution research. He was previously funded by NASA to work on virus preservation. These results led to him founding a vaccine-stabilization company based on NASA-funded research.

1,500 years ago, life expectancy was about 70, not 40

Public Release: 3-Jan-2018

Redefining knowledge of elderly people throughout history

Australian National University

An archaeologist from The Australian National University (ANU) is set to redefine what we know about elderly people in cultures throughout history, and dispel the myth that most people didn’t live much past 40 prior to modern medicine.

Christine Cave, a PhD Scholar with the ANU School of Archaeology and Anthropology, has developed a new method for determining the age-of-death for skeletal remains based on how worn the teeth are.

Using her method, which she developed by analysing tooth wear and comparing it with that of living populations from comparable cultures, she examined skeletal remains from three Anglo-Saxon English cemeteries, of people buried between the years 475 and 625.
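In outline, this kind of approach calibrates a tooth-wear score against known ages in a living reference population and then applies that calibration to archaeological dentitions. The sketch below uses invented wear scores, invented ages, and a simple linear fit; it is a schematic illustration only, not Cave's published method.

```python
import numpy as np

# Hypothetical reference data: tooth-wear scores and known ages from a living
# population with a comparable diet (all values invented for illustration).
wear_scores = np.array([2, 3, 4, 5, 6, 7, 8, 9])
known_ages  = np.array([18, 25, 33, 42, 51, 60, 70, 82])

# Simple linear calibration of age on wear score.
slope, intercept = np.polyfit(wear_scores, known_ages, deg=1)

def estimate_age(wear_score):
    return slope * wear_score + intercept

# Apply the calibration to a skeletal individual with a heavily worn dentition.
print(f"estimated age at death: {estimate_age(8.5):.0f} years")
```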

Her research determined that it was not uncommon for people to live to old age.

“People sometimes think that in those days if you lived to 40 that was about as good as it got. But that’s not true.

“For people living traditional lives without modern medicine or markets the most common age of death is about 70, and that is remarkably similar across all different cultures.”

Ms Cave said the myth has been built up due to deficiencies in the way older people are categorised in archaeological studies.

“Older people have been very much ignored in archaeological studies and part of the reason for that has been the inability to identify them,” she said.

“When you are determining the age of children you use developmental points like tooth eruption or the fusion of bones that all happen at a certain age.

“Once people are fully grown it becomes increasingly difficult to determine their age from skeletal remains, which is why most studies just have a highest age category of 40 plus or 45 plus.

“So effectively they don’t distinguish between a fit and healthy 40 year old and a frail 95 year old.

“It’s meaningless if you are trying to study elderly people.”

Ms Cave said the new method will give archaeologists a more accurate view of past societies and what life was like for older people.

In the three cemeteries she studied (Greater Chesterford in Essex, Mill Hill in Kent, and Worthy Park in Hampshire), she found a marked difference in the way elderly men and women were buried.

“Women were more likely to be given prominent burials if they died young, but were much less likely to be given one if they were old,” she said.

“The higher status men are generally buried with weapons, like a spear and a shield or occasionally a sword.

“Women were buried with jewellery, like brooches, beads and pins. This highlights their beauty which helps explain why most of the high-status burials for women were for those who were quite young.”

Ms Cave’s study “Sex and the Elderly” was published in the Journal of Anthropological Archaeology.

###

FOR INTERVIEW:

Christine Cave
PhD Scholar
ANU School of Archaeology and Anthropology
M: 0417 284 191
E: christine.cave@anu.edu.au

MEDIA CONTACT:

Aaron Walker
ANU Media Team
T: 62125 7979
M: 0418 307 213
E: media@anu.edu.au

Gamers have an advantage in learning

Public Release: 29-Sep-2017

 

Ruhr-University Bochum

Neuropsychologists at Ruhr-Universität Bochum had video gamers compete against non-gamers in a learning task. During the test, the video gamers performed significantly better and showed increased activity in the brain areas relevant for learning. Prof Dr Boris Suchan, Sabrina Schenk and Robert Lech report their findings in the journal Behavioural Brain Research.

The weather prediction task

The research team studied 17 volunteers who – according to their own statement – played action-based games on the computer or a console for more than 15 hours a week. The control group consisted of 17 volunteers who didn’t play video games on a regular basis. Both groups completed the so-called weather prediction task, a well-established test for investigating how people learn probabilities. The researchers simultaneously recorded the brain activity of the participants via magnetic resonance imaging.

The participants were shown combinations of three cue cards with different symbols and had to estimate whether each combination predicted sun or rain; they received immediate feedback on whether their choice was right or wrong. On the basis of this feedback, the volunteers gradually learned which card combinations stood for which weather outcome. The combinations were linked to higher or lower probabilities of sun or rain. After completing the task, the study participants filled out a questionnaire to sample their acquired knowledge about the cue card combinations.
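
For readers unfamiliar with the paradigm, the short Python sketch below simulates a single trial of such a probabilistic feedback task; the cue symbols and probabilities are hypothetical placeholders, not the stimuli actually used in the Bochum study.

```python
import random

# Minimal sketch of a single weather prediction trial. The cue symbols and
# rain probabilities below are hypothetical placeholders; the article only
# states that combinations carried higher or lower probabilities of sun or
# rain (e.g., 60 percent rain / 40 percent sun).
CUE_RAIN_PROBABILITY = {
    ("circle", "square", "triangle"): 0.60,
    ("circle", "diamond", "cross"): 0.80,
}

def run_trial(cues, guess):
    """Show a cue combination, draw the probabilistic outcome, give feedback."""
    outcome = "rain" if random.random() < CUE_RAIN_PROBABILITY[cues] else "sun"
    return outcome, guess == outcome   # immediate right/wrong feedback

outcome, correct = run_trial(("circle", "square", "triangle"), guess="rain")
print(f"Outcome: {outcome} - guess correct: {correct}")
```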

Video gamers better with high uncertainties

The gamers were notably better in combining the cue cards with the weather predictions than the control group. They fared even better with cue card combinations that had a high uncertainty such as a combination that predicted 60 percent rain and 40 percent sunshine.

The analysis of the questionnaire revealed that the gamers had acquired more knowledge about the meaning of the card combinations than the control group. “Our study shows that gamers are better at analysing a situation quickly, generating new knowledge and categorising facts – especially in situations with high uncertainties,” says first author Sabrina Schenk.

This kind of learning is linked to an increased activity in the hippocampus, a brain region that plays a key role in learning and memory. “We think that playing video games trains certain brain regions like the hippocampus”, says Schenk. “That is not only important for young people, but also for older people; this is because changes in the hippocampus can lead to a decrease in memory performance. Maybe we can treat that with video games in the future.”

Hold the phone: An ambulance might lower your chances of surviving some injuries

Public Release: 20-Sep-2017

Gunshot and stabbing victims more likely to die if transported to the trauma center by ambulance

Johns Hopkins Medicine

 

IMAGE

A new study finds that victims of gunshots and stabbings are significantly less likely to die if they’re taken to the trauma center by a private vehicle than ground emergency medical services (EMS).

Credit: Johns Hopkins Medicine

Victims of gunshots and stabbings are significantly less likely to die if they’re taken to the trauma center by a private vehicle than ground emergency medical services (EMS), according to results of a new analysis.

A report on the study’s findings, published Sept. 20 in JAMA Surgery, highlights the importance of studying the effects of transport, EMS services and other prehospital interventions by specific injury type.

“Time is truly of the essence when it comes to certain kinds of injuries and our analysis suggests that, for penetrating injuries such as knife and gun wounds, it might be better to just get to a trauma center as soon as possible in whatever way possible,” says Elliott Haut, M.D., Ph.D., an associate professor of surgery and emergency medicine at the Johns Hopkins University School of Medicine and the paper’s senior author.

While Haut acknowledges that more research needs to be conducted before declaring one type of transport superior to another, he says, “For certain types of injury, it might be best to call the police, Uber or a cab — however you can get to the trauma center fastest.”

Typically, policies and protocols for prehospital interventions are established at a regional or statewide trauma system level, which allows first responders such as EMTs and paramedics to determine what, if any, medical procedures should be performed prior to and during transport to the hospital, says Haut. But research studies have rarely, if ever, evaluated or compared all of the effects of system-driven prehospitalization policies, leaving ideal prehospital care strategies undefined, Haut adds.

In what the researchers believe is the first effort of its kind to do that evaluation for ambulance versus private vehicle transportation, they elected to analyze the relationship between transport mode and in-hospital mortality (death in the emergency department and prior to discharge) on a city-by-city, trauma system level.

For the study, Haut and colleagues examined data from the American College of Surgeons National Trauma Data Bank, the largest available collection of United States trauma registry data. Specifically, the team examined information already gathered on 103,029 patients at least 16 years old who entered a U.S. trauma center between Jan. 1, 2010, and Dec. 31, 2012, for a gunshot or stab wound and were transported to the trauma center by ground EMS (ambulance) or private vehicle. The data were gathered from 298 level I and level II trauma centers within the 100 most populous U.S. metro areas.

Approximately 16.4 percent of all patients were transported by private vehicle. The analysis found an overall 2.2 percent mortality rate for patients transported via private vehicle, compared to 11.6 percent for ground EMS.

Gunshot victims transported by private vehicle saw a lower mortality rate (4.5 percent versus 19.3 percent), as did stab victims (0.2 percent versus 2.9 percent). When adjusting for differences in injury severity, patients with penetrating injuries were 62 percent less likely to die when transported by private vehicle compared to EMS.
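
As a rough back-of-the-envelope illustration of the unadjusted gap described above, the sketch below reconstructs approximate patient counts from the published percentages; the 62 percent figure reflects severity-adjusted modelling that this crude comparison does not reproduce.

```python
# Back-of-the-envelope check of the crude (unadjusted) mortality gap reported above.
# Patient counts are reconstructed from the published percentages, not raw registry data.
total_patients = 103_029
private_share = 0.164                      # ~16.4% arrived by private vehicle

private_n = round(total_patients * private_share)
ems_n = total_patients - private_n

private_deaths = round(private_n * 0.022)  # 2.2% crude mortality, private vehicle
ems_deaths = round(ems_n * 0.116)          # 11.6% crude mortality, ground EMS

crude_rr = (private_deaths / private_n) / (ems_deaths / ems_n)
print(f"Private vehicle: {private_deaths}/{private_n} deaths")
print(f"Ground EMS:      {ems_deaths}/{ems_n} deaths")
print(f"Unadjusted relative risk: {crude_rr:.2f}")
# The headline 62% figure comes from a model adjusted for injury severity,
# so it is more conservative than this crude comparison.
```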

“Unlike CPR and defibrillation for heart attacks, the type of damage done in penetrating trauma often can’t be reversed in a prehospital setting. This study supports other studies showing that prehospital interventions can actually result in less favorable outcomes for certain types of injuries,” says Michael W. Wandling, M.D., M.S., an American College of Surgeons (ACS) Clinical Scholar in Residence, general surgery resident at the Northwestern University Feinberg School of Medicine and the study’s first author.

###

Other authors on this paper include Avery B. Nathens of the ACS and Michael B. Shapiro of the Northwestern University Feinberg School of Medicine.

Haut is primary investigator of a research grant (1R01HS024547) from the Agency for Healthcare Research and Quality titled “Individualized Performance Feedback on Venous Thromboembolism Prevention Practice” and is primary investigator of two contracts from the Patient-Centered Outcomes Research Institute titled “Preventing Venous Thromboembolism: Empowering Patients and Enabling Patient-Centered Care via Health Information Technology” (CE-12-11-4489) and “Preventing Venous Thromboembolism (VTE): Engaging Patients to Reduce Preventable Harm from Missed/Refused Doses of VTE Prophylaxis” (DI-1603-34596). Haut receives royalties from Lippincott, Williams, & Wilkins for a book, “Avoiding Common ICU Errors.”

Haut was the paid author of a paper commissioned by the National Academies of Sciences, Engineering, and Medicine titled “Military Trauma Care’s Learning Health System: The Importance of Data Driven Decision Making” which was used to support the report titled “A National Trauma Care System: Integrating Military and Civilian Trauma Systems to Achieve Zero Preventable Deaths After Injury.”

Colorectal cancer death rates rising in people under 55

Public Release: 8-Aug-2017

 

Rise confined to white population

American Cancer Society

ATLANTA - August 8, 2017 - A new report finds that colorectal cancer mortality rates have increased in adults under 55 since the mid-2000s after falling for decades, strengthening evidence that previously reported increases in incidence in this age group are not solely the result of more screening. The rise was confined to white individuals, according to the report, which appears in the Journal of the American Medical Association.

As reported previously by American Cancer Society investigators, colorectal cancer (CRC) incidence has been increasing in the United States among adults younger than 55 years since at least the mid-1990s. The increase thus far is confined to white men and women and is most rapid for metastatic disease. CRC mortality overall is declining rapidly, masking trends in young adults, which have not been comprehensively examined.

For the current study, ACS investigators led by Rebecca Siegel, MPH, analyzed CRC mortality among persons aged 20 to 54 years by race from 1970 through 2014 using data from the National Center for Health Statistics. The analysis included 242,637 people ages 20 to 54 who died from CRC between 1970 and 2014.

CRC mortality rates among those ages 20 to 54 declined from 6.3 per 100,000 in 1970 to 3.9 in 2004, at which point mortality rates began to increase by 1.0% annually, eventually reaching 4.3 per 100,000 in 2014. The increase was confined to white individuals, among whom mortality rates increased by 1.4% per year, from 3.6 in 2004 to 4.1 in 2014. Among black individuals, mortality declined throughout the study period at a rate of 0.4% to 1.1% annually (from 8.1 in 1970 to 6.1 in 2014). Among other races combined, mortality rates declined from 1970-2006 and were stable thereafter.
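
As a quick arithmetic check of how the quoted annual percent changes compound into the endpoint rates, under the simplifying assumption of a constant annual change:

```python
# Quick arithmetic check: how the reported annual percent changes compound
# into the endpoint rates quoted above (illustration only; the published
# analysis fits trends to the full annual series, not just the endpoints).
rate_2004 = 3.9     # CRC deaths per 100,000, ages 20-54, in 2004
apc = 0.010         # 1.0% increase per year, 2004-2014

projected_2014 = rate_2004 * (1 + apc) ** 10
print(f"All races, projected 2014 rate: {projected_2014:.2f} per 100,000")   # ~4.3

# Same check for white individuals: 3.6 per 100,000 in 2004, rising 1.4% per year.
print(f"White individuals, projected 2014 rate: {3.6 * 1.014 ** 10:.2f} per 100,000")  # ~4.1
```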

While mortality remained stable in white individuals ages 20 to 29 from 1988-2014, it increased from 1995-2014 by 1.6% per year in those ages 30 to 39 years, and from 2005-2014 by 1.9% per year for those ages 40 to 49 years and by 0.9% per year for those ages 50 to 54 years. Conversely, rates declined in black individuals in every age group. The authors note that these disparate racial patterns are inconsistent with trends in major risk factors for colorectal cancer like obesity, which is universally increasing.

The authors say rising colorectal cancer mortality in people in their 50s was particularly unexpected because screening, which can prevent cancer as well as detect it early, when it is more curable, has been recommended starting at age 50 for decades. Screening prevalence has increased for all age groups over 50, but is lower in people 50 to 54 than in those 55 and older: 44% versus 62%, respectively, in 2013, according to the National Health Interview Survey.

“Although the risk of colorectal cancer remains low for young and middle-aged adults, rising mortality strongly suggests that the increase in incidence is not only earlier detection of prevalent cancer, but a true and perplexing escalation in disease occurrence. It is especially surprising for people in their 50s, for whom screening is recommended, and highlights the need for interventions to improve use of age-appropriate screening and timely follow-up of symptoms.”

###

Article: Colorectal Cancer Mortality Rates in Adults Aged 20 to 54 Years in the United States, 1970-2014, doi:10.1001/jama.2017.7630.

Pro-vaccine messages are having the opposite effect

Public Release: 7-Aug-2017

Pro-vaccine messages can boost belief in MMR myths, study shows

University of Edinburgh

Current strategies for correcting misinformation about the dangers of vaccinations have the opposite effect and reinforce ill-founded beliefs, a study suggests.

Presenting scientific facts to disprove misconceptions was found to actually strengthen unfounded opinions, such as that the measles, mumps and rubella (MMR) vaccine causes autism.

Similarly, showing images which suggest unvaccinated children can suffer from disease inspired the strongest belief that vaccines had harmful side effects.

A survey carried out in Scotland and Italy measured people’s attitudes towards popular misconceptions about the MMR vaccine and asked participants whether they would give the vaccine to their child.

The participants were then divided and presented with different approaches to combat misinformation about vaccines. The study was led by the University of Edinburgh.

Countering false information in ways that repeat it appears to amplify and spread the misconception, making it familiar and therefore more acceptable, researchers said.

One group was shown a leaflet that confronted vaccine myths with facts. The second group was given a series of tables comparing potential problems caused by measles, mumps and rubella with potential side effects of the MMR vaccine.

A third group was shown images of children suffering from measles, mumps and rubella. The fourth and final group acted as a control and was given unrelated reading material.

After these interventions, participants took the survey again to see if their attitudes had changed. A week later, they took it a third time to see whether any changes had persisted.

Researchers found that all of the strategies were counter-productive. Beliefs in vaccine myths were strengthened and the likelihood of vaccinating children lessened. This effect only increased over time.

The findings suggest that public health campaigns need more testing, according to the researchers.

They recommend a variety of simultaneous and frequent interventions, as opposed to a single campaign. Experts also suggested addressing other barriers that impede vaccine uptake, such as ease of access and cost.

Professor Sergio Della Sala from the University of Edinburgh’s Department of Psychology, said: “These findings offer a useful example of how factual information is misremembered over time. Even after a short delay, facts fade from the memory, leaving behind the popular misconceptions.”

Genes that lead to coronary disease are needed for fertility

Public Release: 22-Jun-2017

Human genes for coronary artery disease make their carriers more prolific parents

Genome-wide scans suggest that natural selection keeps these genes in the population as they benefit childbearing capacity

PLOS

Coronary artery disease may have persisted in human populations because the genes that cause this late-striking disease also contribute to greater numbers of children, reports Dr Sean Byars of The University of Melbourne and Associate Professor Michael Inouye of the Baker Heart and Diabetes Institute, Australia, in a study published June 22, 2017 in PLOS Genetics.

Coronary artery disease, a condition where plaque builds up gradually in the arteries that feed the heart, is one of the leading causes of death worldwide, and may have plagued humans for thousands of years. One of the big questions surrounding the disease is why natural selection has not removed genes for this common and costly disease. In a new study, researchers used genetic information from the 1000 Genomes database and the International HapMap3 project, along with lifetime reproductive data from the Framingham Heart Study, to identify genetic variation linked to the disease that natural selection had also modified recently.

They showed that these same genetic variations also contribute in multiple ways to greater male and female reproductive success, which appears to represent an evolutionary trade-off between early-life reproductive benefits that compensate for later-life disease costs.

The findings offer an answer to the question of why natural selection cannot weed out genes associated with coronary artery disease – parents pass them on to their offspring before experiencing advanced symptoms and death. The study also provides a novel approach for detecting the influence of natural selection on traits caused by the cumulative effects of multiple genes, which in the past, has been far more difficult to uncover than for disorders linked to a single gene.

###

In your coverage please use this URL to provide access to the freely available article in PLOS Genetics: http://journals.plos.org/plosgenetics/article?id=10.1371/journal.pgen.1006328

Citation: Byars SG, Huang QQ, Gray L-A, Bakshi A, Ripatti S, Abraham G, et al. (2017) Genetic loci associated with coronary artery disease harbor evidence of selection and antagonistic pleiotropy. PLoS Genet 13(6): e1006328. https://doi.org/10.1371/journal.pgen.1006328

Funding: This study was supported by the National Health and Medical Research Council (NHMRC) of Australia (grant no. 1062227) and the National Heart Foundation of Australia. MI was supported by a Career Development Fellowship co-funded by the NHMRC and the National Heart Foundation of Australia (no. 1061435). GA was supported by an NHMRC Peter Doherty Early Career Fellowship (no.1090462). SR was supported by the Academy of Finland Center of Excellence in Complex Disease Genetics (Grant No 213506 and 129680), Academy of Finland (Grant No 251217 and 285380), the Finnish Foundation for Cardiovascular Research, the Sigrid Juselius Foundation, Biocentrum Helsinki and the European Union’s Seventh Framework Programme (FP7/2007-2013) under grant agreement No 201413 (ENGAGE) and 261433 (BioSHaRE-EU), and Horizon 2020 Research and Innovation Programme under grant agreement No 692145 (ePerMed). The Framingham Heart Study is conducted and supported by the National Heart, Lung, and Blood Institute (NHLBI) in collaboration with Boston University (Contract No. N01-HC-25195 and HHSN268201500001I). This manuscript was not prepared in collaboration with investigators of the Framingham Heart Study and does not necessarily reflect the opinions or views of the Framingham Heart Study, Boston University, or NHLBI. Funding for SHARe Affymetrix genotyping was provided by NHLBI Contract N02-HL-64278. SHARe Illumina genotyping was provided under an agreement between Illumina and Boston University. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Competing Interests: The authors have declared that no competing interests exist.

‘Humanlike’ ways of thinking evolved 1.8 million years ago, suggests new study

Public Release: 8-May-2017

 

Indiana University

 

IMAGE

IMAGE: This is Shelby S. Putt.

Credit: University of Iowa

By using highly advanced brain imaging technology to observe modern humans crafting ancient tools, an Indiana University neuroarchaeologist has found evidence that human-like ways of thinking may have emerged as early as 1.8 million years ago.

The results, reported May 8 in the journal Nature Human Behavior, place the appearance of human-like cognition at the emergence of Homo erectus, an early apelike species of human first found in Africa whose evolution predates Neanderthals by nearly 600,000 years.

“This is a significant result because it’s commonly thought our most modern forms of cognition only appeared very recently in terms of human evolutionary history,” said Shelby S. Putt, a postdoctoral researcher with The Stone Age Institute at Indiana University, who is first author on the study. “But these results suggest the transition from apelike to humanlike ways of thinking and behaving arose surprisingly early.”

The study’s conclusions are based upon brain activity in modern individuals taught to create two types of ancient tools: simple Oldowan-era “flake tools” — little more than broken rocks with a jagged edge — and more complicated Acheulian-era hand axes, which resemble a large arrowhead. Both are formed by smashing rocks together using a process known as “flintknapping.”

Oldowan tools, which first appeared about 2.6 million years ago, are among the earliest used by humanity’s ancestors. Acheulian-era tool use dates from 1.8 million to 100,000 years ago.

Putt said that neuroarchaeologists look to modern humans to understand how pre-human species evolved cognition since the act of thinking — unlike fossilized bones or ancient artifacts — leaves no physical trace in the archaeological record.

The methods available for studying modern humans crafting ancient tools were limited until recently by brain imaging technology. Previous studies depended on placing people within the confines of a functional magnetic resonance imaging machine — essentially a narrow metal tube — to observe their brain activity while they watched videos of people crafting tools.

Putt’s study, by contrast, employed more advanced functional near-infrared spectroscopy — a device that resembles a lightweight cap with numerous wires used to shine highly sensitive lasers onto the scalp — to observe brain activity in people as they learned to craft both types of tools with their hands.

In the study, 15 volunteers were taught to craft both types of tools through verbal instruction via videotape. An additional 16 volunteers were shown the same videos without sound to learn toolmaking through nonverbal observation. These experiments were conducted in the lab of John P. Spencer at the University of Iowa, where Putt earned her Ph.D. before joining IU. Spencer is now a faculty member at the University of East Anglia.

The resulting brain scans revealed that visual attention and motor control were required to create the simpler Oldowan tools. A much larger portion of the brain was engaged in the creation of the more complex Acheulian tools, including regions of the brain associated with the integration of visual, auditory and sensorimotor information; the guidance of visual working memory; and higher-order action planning.

“The fact that these more advanced forms of cognition were required to create Acheulean hand axes — but not simpler Oldowan tools — means the date for this more humanlike type of cognition can be pushed back to at least 1.8 million years ago, the earliest these tools are found in the archaeological record,” Putt said. “Strikingly, these parts of the brain are the same areas engaged in modern activities like playing the piano.”

###

In addition to Spencer, other authors on the study were Sobanawartiny Wijeakumar of the University of East Anglia and Robert Franciscus of the University of Iowa. The research was supported in part by the Wenner-Gren Foundation; the Leakey Foundation; Sigma Xi, the Scientific Research Society; the American Association of University Women; and the University of Iowa.

Putt also said the study was inspired in part by a similar experiment previously performed at IU by Nicholas Toth and Kathy Schick, both professors in the IU College of Arts and Sciences’ Cognitive Science Program and co-directors of The Stone Age Institute, and Dietrich Stout, a Ph.D. student in their lab who is now a faculty member at Emory University in Georgia. Putt joined IU in part to pursue additional research on human cognition at The Stone Age Institute under support from the institute’s $3.2 million grant from the Temple Foundation in 2016.

The cold exterminated all of them

Public Release: 6-Mar-2017

 

Using age determinations based on the radioactive decay of uranium, scientists have discovered that one of the greatest mass extinctions was due to an ice age and not to a warming of the Earth’s temperature

Université de Genève

 

IMAGE

Caption

Permian-Triassic boundary in shallow marine sediments, characterised by a significant sedimentation gap between the black shales of Permian and dolomites of Triassic age. This gap documents a globally recognized regression phase, probably linked to a period of a cold climate and glaciation.

Credit: © H. Bucher, Zürich

The Earth has known several mass extinctions over the course of its history. One of the most important happened at the Permian-Triassic boundary 250 million years ago. Over 95% of marine species disappeared and, up until now, scientists have linked this extinction to a significant rise in Earth temperatures. But researchers from the University of Geneva (UNIGE), Switzerland, working alongside the University of Zurich, discovered that this extinction took place during a short ice age which preceded the global climate warming. It’s the first time that the various stages of a mass extinction have been accurately understood and that scientists have been able to assess the major role played by volcanic explosions in these climate processes. This research, which can be read in Scientific Reports, completely calls into question the scientific theories regarding these phenomena, founded on the increase of CO2 in the atmosphere, and paves the way for a new vision of the Earth’s climate history.

Teams of researchers led by Professor Urs Schaltegger from the Department of Earth and Environmental Sciences at the Faculty of Science of the UNIGE and by Hugo Bucher, from the University of Zürich, have been working on absolute dating for many years. They work on determining the age of minerals in volcanic ash, which establishes a precise and detailed chronology of the earth’s climate evolution. They became interested in the Permian-Triassic boundary, 250 million years ago, during which one of the greatest mass extinctions ever took place, responsible for the loss of 95% of marine species. How did this happen? And for how long did marine biodiversity stay at very low levels?

A technique founded on the radioactive decay of uranium

Researchers worked on sediment layers in the Nanpanjiang basin in southern China. They have the particularity of being extremely well preserved, which allowed for an accurate study of the biodiversity and the climate history of the Permian and the Triassic. “We made several cross-sections of hundreds of metres of basin sediments and we determined the exact positions of ash beds contained in these marine sediments,” explained Björn Baresel, first author of the study. They then applied a precise dating technique based on natural radioactive decay of uranium, as Urs Schaltegger added: “In the sedimentary cross-sections, we found layers of volcanic ash containing the mineral zircon which incorporates uranium. It has the specificity of decaying into lead over time at a well-known speed. This is why, by measuring the concentrations of uranium and lead, it was possible for us to date a sediment layer to an accuracy of 35,000 years, which is already fairly precise for periods over 250 million years.”
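
To make the dating principle concrete, the sketch below shows the underlying decay arithmetic; the decay constant is the standard value for 238U, and the input ratio is chosen only to illustrate an age of roughly 250 million years, not taken from the study’s measurements.

```python
import math

# Illustrative U-Pb age calculation (not the authors' actual data reduction).
# Radioactive decay follows N(t) = N0 * exp(-lambda * t), which rearranges to
# t = ln(1 + Pb*/U) / lambda when expressed as a radiogenic daughter/parent ratio.
LAMBDA_U238 = 1.55125e-10   # decay constant of 238U, per year

def u_pb_age(pb206_u238_ratio):
    """Return an age in years from a measured radiogenic 206Pb/238U ratio."""
    return math.log(1.0 + pb206_u238_ratio) / LAMBDA_U238

# A ratio of about 0.0396 corresponds to roughly 250 million years,
# the age of the Permian-Triassic boundary discussed above.
print(f"{u_pb_age(0.0396) / 1e6:.1f} million years")
```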

Ice is responsible for mass extinction

By dating the various sediment layers, researchers realised that the mass extinction of the Permian-Triassic boundary is represented by a gap in sedimentation, which corresponds to a period when the sea-water level decreased. The only explanation for this phenomenon is that there was ice, which stored water, and that this ice age, which lasted 80,000 years, was sufficient to eliminate much of marine life. Scientists from the UNIGE explain the global temperature drop by a stratospheric injection of large amounts of sulphur dioxide, which reduced the intensity of solar radiation reaching the surface of the earth. “We therefore have proof that the species disappeared during an ice age caused by the activity of the first volcanism in the Siberian Traps,” added Urs Schaltegger. This ice age was followed by the formation of limestone deposits through bacteria, marking the return of life on Earth at more moderate temperatures. The period of intense climate warming, related to the emplacement of large amounts of basalt in the Siberian Traps and previously thought to be responsible for the extinction of marine species, in fact happened 500,000 years after the Permian-Triassic boundary.

This study therefore shows that climate warming is not the only explanation of global ecological disasters in the past on Earth: it is important to continue analysing ancient marine sediments to gain a deeper understanding of the earth’s climate system.

Clean water correlated with higher asthma rates?

“Those that had access to good, clean water had much higher asthma rates and we think it is because they were deprived of the beneficial microbes,” said Finlay. “That was a surprise because we tend to think that clean is good but we realize that we actually need some dirt in the world to help protect you.”

Public Release: 17-Feb-2017

Yeast found in babies’ guts increases risk of asthma

University of British Columbia

Credit: UBC Public Affairs

University of British Columbia microbiologists have found a yeast in the gut of new babies in Ecuador that appears to be a strong predictor that they will develop asthma in childhood. The new research furthers our understanding of the role microscopic organisms play in our overall health.

“Children with this type of yeast called Pichia were much more at risk of asthma,” said Brett Finlay, a microbiologist at UBC. “This is the first time anyone has shown any kind of association between yeast and asthma.”

In previous research, Finlay and his colleagues identified four gut bacteria in Canadian children that, if present in the first 100 days of life, seem to prevent asthma. In a followup to this study, Finlay and his colleagues repeated the experiment using fecal samples and health information from 100 children in a rural village in Ecuador.

Canada and Ecuador both have high rates of asthma with about 10 per cent of the population suffering from the disease.

They found that while gut bacteria play a role in preventing asthma in Ecuador, it was the presence of a microscopic fungus or yeast known as Pichia that was more strongly linked to asthma. Instead of helping to prevent asthma, however, the presence of Pichia in those early days puts children at risk.

Finlay also suggests there could be a link between the risk of asthma and the cleanliness of the environment for Ecuadorian children. As part of the study, the researchers noted whether children had access to clean water.

“Those that had access to good, clean water had much higher asthma rates and we think it is because they were deprived of the beneficial microbes,” said Finlay. “That was a surprise because we tend to think that clean is good but we realize that we actually need some dirt in the world to help protect you.”

Now Finlay’s colleagues will re-examine the Canadian samples and look for the presence of yeast in the gut of infants. This technology was not available to the researchers when they conducted their initial study.

###

This research was a collaboration with Marie-Claire Arrieta, a former UBC postdoctoral fellow and now an assistant professor at the University of Calgary, and Philip Cooper, a professor at the Liverpool School of Tropical Medicine.

VIDEO: Yeast linked to asthma in Ecuador https://youtu.be/tC-cO_Z9Bz8

This research was presented today at the 2017 annual meeting for Association for the Advancement of Science: https://aaas.confex.com/aaas/2017/webprogram/Session15097.html. Finlay is in Boston for the conference and is also available by phone.

Experts claim “There is no threshold for LDL cholesterol below which there are no net benefits of statins”

Public Release: 18-Jan-2017

Experts urge for wider prescription of statins in treatment and prevention

Link to diabetes is questionable and inconsequential

Florida Atlantic University

World-renowned researchers from the Charles E. Schmidt College of Medicine at Florida Atlantic University and from Harvard Medical School address the possible but unproven link between statins and diabetes, as well as the implications of statin prescription for clinicians and their patients, in a commentary published in the prestigious American Journal of Medicine. The journal’s editor-in-chief published the commentary, together with an editorial he wrote, online ahead of print.

Charles H. Hennekens, M.D., Dr.P.H., the first Sir Richard Doll professor and senior academic advisor to the dean of the Charles E. Schmidt College of Medicine at FAU; Bettina Teng, BA, a recent pre-med honors graduate of the Harriet L. Wilkes Honors College at FAU; and Marc A. Pfeffer, M.D., Ph.D., the Dzau professor of medicine at HMS, emphasize to clinicians that the risk of diabetes, even if real, pales in comparison to the benefits of statins in both the treatment and primary prevention of heart attacks and strokes.

“The totality of evidence clearly indicates that the more widespread and appropriate utilization of statins, as adjuncts, not alternatives to therapeutic lifestyle changes, will yield net benefits in the treatment and primary prevention of heart attacks and strokes, including among high, medium and low risk patients unwilling or unable to adopt therapeutic lifestyle changes,” said Hennekens.

In the accompanying editorial, Joseph S. Alpert, M.D., editor-in-chief and a renowned cardiologist and professor of medicine at the University of Arizona School of Medicine, reinforces these important and timely clinical and public health challenges in treatment and primary prevention.

“There is no threshold for low density lipoprotein cholesterol below which there are no net benefits of statins either in the treatment or primary prevention of heart attacks and strokes,” said Alpert.

The authors and editorialist express grave concerns that there will be many needless premature deaths as well as preventable heart attacks and strokes if patients who would clearly benefit from statins are not prescribed the drug, refuse to take the drug, or stop using the drug because of ill-advised adverse publicity about benefits and risks, which may include misplaced concerns about the possible but unproven small risk of diabetes.

“These public health issues are especially alarming in primary prevention, particularly among women, for whom cardiovascular disease also is the leading cause of death, and for whom there is even more underutilization of statins than for men,” said Hennekens.

At its national meeting in November 2013, the American Heart Association, in collaboration with the American College of Cardiology, presented and published its new guidelines for the use of statins in the treatment and primary prevention of heart attacks and strokes, in which the organizations also recommended wider utilization in both treatment and prevention.

According to the United States Centers for Disease Control and Prevention, heart disease is the leading killer among men and women, causing approximately 600,000 deaths each year.

The numerous honors and recognitions Hennekens has received include the 2013 Fries Prize for Improving Health for his seminal contributions to the treatment and primary prevention of cardiovascular disease, the 2013 Presidential Award from his alma mater, Queens College, for his distinguished contributions to society, and the 2013 American Heart Association Award, which he shared with the Charles E. Schmidt College of Medicine at FAU for reducing premature deaths from heart attacks and strokes. From 1995 to 2005, Science Watch ranked him as the third most widely cited medical researcher in the world and five of the top 20 were his former trainees and/or fellows. In 2012, Science Heroes ranked Hennekens No. 81 in the history of the world for having saved more than 1.1 million lives. In 2014, he received the Ochsner Award for reducing premature deaths from cigarettes. In 2016, he was ranked the No. 14 “Top Scientist in the World” with an H-index of 173.

###

About the Charles E. Schmidt College of Medicine:

FAU’s Charles E. Schmidt College of Medicine is one of 141 accredited medical schools in the U.S. The college was launched in 2010, when the Florida Board of Governors made a landmark decision authorizing FAU to award the M.D. degree. After receiving approval from the Florida legislature and the governor, it became the 134th allopathic medical school in North America. With more than 70 full and part-time faculty and more than 1,300 affiliate faculty, the college matriculates 64 medical students each year and has been nationally recognized for its innovative curriculum. To further FAU’s commitment to increase much needed medical residency positions in Palm Beach County and to ensure that the region will continue to have an adequate and well-trained physician workforce, the FAU Charles E. Schmidt College of Medicine Consortium for Graduate Medical Education (GME) was formed in fall 2011 with five leading hospitals in Palm Beach County. In June 2014, FAU’s College of Medicine welcomed its inaugural class of 36 residents in its first University-sponsored residency in internal medicine.

About Florida Atlantic University:

Florida Atlantic University, established in 1961, officially opened its doors in 1964 as the fifth public university in Florida. Today, the University, with an annual economic impact of $6.3 billion, serves more than 30,000 undergraduate and graduate students at sites throughout its six-county service region in southeast Florida. FAU’s world-class teaching and research faculty serves students through 10 colleges: the Dorothy F. Schmidt College of Arts and Letters, the College of Business, the College for Design and Social Inquiry, the College of Education, the College of Engineering and Computer Science, the Graduate College, the Harriet L. Wilkes Honors College, the Charles E. Schmidt College of Medicine, the Christine E. Lynn College of Nursing and the Charles E. Schmidt College of Science. FAU is ranked as a High Research Activity institution by the Carnegie Foundation for the Advancement of Teaching. The University is placing special focus on the rapid development of critical areas that form the basis of its strategic plan: Healthy aging, biotech, coastal and marine issues, neuroscience, regenerative medicine, informatics, lifespan and the environment. These areas provide opportunities for faculty and students to build upon FAU’s existing strengths in research and scholarship. For more information, visit http://www.fau.edu.

Scientists engineer animals with ancient genes to test causes of evolution

Public Release: 13-Jan-2017

 

Study overturns a textbook case of genetic adaptation

University of Chicago Medical Center

 

Scientists at the University of Chicago have created the first genetically modified animals containing reconstructed ancient genes, which they used to test the evolutionary effects of genetic changes that happened in the deep past on the animals’ biology and fitness.

The research, published early online in Nature Ecology & Evolution on Jan. 13, is a major step forward for efforts to study the genetic basis of adaptation and evolution. The specific findings, involving the fruit fly’s ability to break down alcohol in rotting fruit, overturn a widely-held hypothesis about the molecular causes of one of evolutionary biology’s classic cases of adaptation.

“One of the major goals of modern evolutionary biology is to identify the genes that caused species to adapt to new environments, but it’s been hard to do that directly, because we’ve had no way to test the effects of ancient genes on animal biology,” said Mo Siddiq, a graduate student in the Department of Ecology and Evolution at the University of Chicago, one of the study’s lead scientists.

“We realized we could overcome this problem by combining two recently developed methods–statistical reconstruction of ancient gene sequences and engineering of transgenic animals,” he said.

Until recently, most studies of molecular adaptation have analyzed gene sequences to identify “signatures of selection”–patterns suggesting that a gene changed so quickly during its evolution that selection is likely to have been the cause.  The evidence from this approach is only circumstantial, however, because genes can evolve quickly for many reasons, such as chance, fluctuations in population size, or selection for functions unrelated to the environmental conditions to which the organism is thought to have adapted.

Siddiq and his advisor, Joe Thornton, PhD, professor of ecology and evolution and human genetics at the University of Chicago, wanted to directly test the effects of a gene’s evolution on adaptation. Thornton has pioneered methods for reconstructing ancestral genes–statistically determining their sequences from large databases of present-day sequences, then synthesizing them and experimentally studying their molecular properties in the laboratory. This strategy has yielded major insights into the mechanisms by which biochemical functions evolve.

Thornton and Siddiq reasoned that by combining ancestral gene reconstruction with techniques for engineering transgenic animals, they could study how genetic changes that occurred in the deep past affected whole organisms–their development, physiology, and even their fitness.

“This strategy of engineering ‘ancestralized animals’ could be applied to many evolutionary questions,” Thornton said. “For the first test case, we chose a classic example of adaptation–how fruit flies evolved the ability to survive the high alcohol concentrations found in rotting fruit. We found that the accepted wisdom about the molecular causes of the flies’ evolution is simply wrong.”

The fruitfly Drosophila melanogaster is one of the most studied organisms in genetics and evolution.  In the wild, D. melanogaster lives in alcohol-rich rotting fruit, tolerating far higher alcohol concentrations than its closest relatives, which live on other food sources. Twenty-five years ago at the University of Chicago, biologists Martin Kreitman and John McDonald invented a new statistical method for finding signatures of selection, which remains to this day one of the most widely used methods in molecular evolution. They demonstrated it on the alcohol dehydrogenase (Adh) gene–the gene for the enzyme that breaks down alcohol inside cells–from this group of flies.  Adh had a strong signature of selection, and it was already known that D. melanogaster flies break down alcohol faster than their relatives. So, the idea that the Adh enzyme was the cause of the fruit fly’s adaptation to ethanol became the first accepted case of a specific gene that mediated adaptive evolution of a species.

Siddiq and Thornton realized that this hypothesis could be tested directly using the new technologies.  Siddiq first inferred the sequences of ancient Adh genes from just before and just after D. melanogaster evolved its ethanol tolerance, some two to four million years ago. He synthesized these genes biochemically, expressed them, and used biochemical methods to measure their ability to break down alcohol in a test tube. The results were surprising: the genetic changes that occurred during the evolution of D. melanogaster had no detectable effect on the protein’s function. 

Working with collaborators David Loehlin at the University of Wisconsin and Kristi Montooth at the University of Nebraska, Siddiq then created and characterized transgenic flies containing the reconstructed ancestral forms of Adh. They bred thousands of these “ancestralized” flies, tested how quickly they could break down alcohol, and how well the larvae and adult flies survived when raised on food with high alcohol content.  Surprisingly, the transgenic flies carrying the more recent Adh were no better at metabolizing alcohol than flies carrying the more ancient form of Adh.  Even more strikingly, they were no better able to grow or survive on increasing alcohol concentrations. Thus, none of the predictions of the classic version of the story were fulfilled. There is no doubt that D. melanogaster did adapt to high-alcohol food sources during its evolution, but not because of changes in the Adh enzyme.

“The Adh story was accepted because the ecology, physiology, and the statistical signature of selection all pointed in the same direction. But three lines of circumstantial evidence don’t make an airtight case,” Thornton said. “That’s why we wanted to test the hypothesis directly, now that we finally have the means to do so.”

Siddiq and Thornton hope that the strategy of making ancestralized transgenics will become the gold standard in the field for decisively linking historical changes in genes to their effects on organisms’ biology and fitness.

For his part, Kreitman, who is still a professor of ecology and evolution at UChicago, has been supportive of the new research, helping advise Siddiq on the project and sharing his vast knowledge about molecular evolution and Drosophila genetics.

“From the beginning, Marty was excited about our experiments, and he was just as supportive when our results overturned well-known conclusions based on his past work,” Siddiq said. “I think that’s extremely inspiring.”

###

The study, “Experimental test and refutation of a classic case of molecular adaptation in Drosophila melanogaster,” was supported by the National Science Foundation, the National Institutes of Health, the Howard Hughes Medical Institute, and the Life Sciences Research Foundation.

About the University of Chicago Medicine

The University of Chicago Medicine & Biological Sciences is one of the nation’s leading academic medical institutions. It comprises the Pritzker School of Medicine, a top 10 medical school in the nation; the University of Chicago Biomedical Sciences Division; and the University of Chicago Medical Center, which recently opened the Center for Care and Discovery, a $700 million specialty medical facility. Twelve Nobel Prize winners in physiology or medicine have been affiliated with the University of Chicago Medicine.

Visit our research blog at sciencelife.uchospitals.edu and our newsroom at uchospitals.edu/news.

Twitter @UChicagoMed, @ScienceLife

Facebook.com/UChicagoMed

When horses are in trouble they ask humans for help

Public Release: 15-Dec-2016

 

Kobe University

 

IMAGE

IMAGE: Horse with caretaker at the equestrian club.

Credit: Kobe University

Research Fellow Monamie RINGHOFER and Associate Professor Shinya YAMAMOTO (Kobe University Graduate School of Intercultural Studies) have proved that when horses face unsolvable problems they use visual and tactile signals to get human attention and ask for help. The study also suggests that horses alter their communicative behavior based on humans’ knowledge of the situation. These findings were published in the online version of Animal Cognition on November 24.

Communicating with other individuals in order to get information about foraging sites and predators is a valuable survival skill. Chimpanzees, who are evolutionarily close to humans, are especially skilled at understanding others. Studies suggest that chimpanzees distinguish the attentional states of other individuals (seeing or not seeing), and they are also able to understand others’ knowledge states (knowing or not knowing). Some domestic animals are also very good at communicating with humans – recent studies of dogs have revealed that they are excellent at understanding various human gestures and expressions. It is thought that these abilities were influenced by the domestication process.

Since they were domesticated 6000 years ago, horses have contributed to human society in various shapes and forms, from transport to companionship. Horse-riding has recently drawn attention for its positive effects on our physical and mental health. The high social cognitive skills of horses towards humans might partially explain why humans and horses have a collaborative relationship today. However, the scientific evidence for this ability is still scarce.

In this study, scientists investigated horses’ social cognitive skills with humans in a problem-solving situation where food was hidden in a place accessible only to humans. The experiment was carried out in a paddock belonging to the equestrian club at Kobe University, where eight horses from the club participated with the cooperation of their student caretakers.

For the first experiment, an assistant experimenter hid food (carrots) in a bucket which the horse could not reach. The researchers observed whether and how the horse sent signals to the caretaker when the caretaker (unaware of the situation) arrived. The horse stayed near the caretaker and looked at, touched and pushed the caretaker. These behaviors occurred over a significantly longer period compared to cases when they carried out the experiment without hiding the food. The results showed that when horses cannot solve problems by themselves they send signals to humans both visually (looking) and physically (touching and pushing).

Building on these results, for the second experiment they tested whether the horses’ behavior changed based on the caretakers’ knowledge of the hidden food. If the caretaker hadn’t watched the food being hidden, the horses gave more signals, demonstrating that horses can change their behavior in response to the knowledge levels of humans.

These two experiments revealed some behaviors used by horses to communicate demands to humans. They also suggest that horses possess high cognitive skills that enable them to flexibly alter their behavior towards humans according to humans’ knowledge state. This high social cognitive ability may have been acquired during the domestication process. In order to identify the characteristic that enables horses to form close bonds with humans, in future research the team aims to compare communication between horses, as well as looking more closely at the social cognitive ability of horses in their communication with humans.

By deepening our understanding of the cognitive abilities held by species who have close relationships with humans, and making comparisons with the cognitive abilities of species such as primates who are evolutionarily close to humans, we can investigate the development of unique communication traits in domesticated animals. This is connected to the influence of domestication on the cognitive ability of animals, and can potentially provide valuable information for realizing stronger bonds between humans and animals.

High school football players, 1956-1970, did not have increased rates of neurodegenerative diseases

Public Release: 12-Dec-2016

 

Mayo Clinic

 

ROCHESTER, Minn. – A Mayo Clinic study published online today in Mayo Clinic Proceedings found that varsity football players from 1956 to 1970 did not have an increased risk of degenerative brain diseases compared with athletes in other varsity sports.

The researchers reviewed all the yearbooks and documented team rosters for Mayo High School and Rochester High School, now called John Marshall High School. The high school football players were compared with non-football playing athletes who were swimmers, basketball players and wrestlers.

Using the medical records-linkage system of the Rochester Epidemiology Project, each student was observed for about 40 years after participation in high school sports.

Among the 296 students who played football, the researchers found:

  • 34 cases of head trauma
  • 5 cases of mild cognitive impairment
  • 3 cases of parkinsonism
  • 2 cases of dementia
  • 0 cases of amyotrophic lateral sclerosis (ALS)

Among the 190 non-football athletes, the researchers found:

  • 14 cases of head trauma
  • 4 cases of mild cognitive impairment
  • 3 cases of parkinsonism
  • 1 case of dementia
  • 0 cases of ALS

The football players were found to have a suggestive increased risk of medically documented head trauma, especially in the 153 students who played football for more than one season, but they still did not show increased risk of neurodegenerative diseases.
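
For orientation, the raw counts above translate into the crude proportions computed in the sketch below; this is only an illustration, since the study itself followed each athlete for about 40 years rather than comparing simple percentages.

```python
# Crude incidence proportions from the counts reported above (a simple
# illustration; the study compared risks over ~40 years of follow-up using
# the Rochester Epidemiology Project records, not raw percentages).
football = {"n": 296, "head trauma": 34, "mild cognitive impairment": 5,
            "parkinsonism": 3, "dementia": 2, "ALS": 0}
other_sports = {"n": 190, "head trauma": 14, "mild cognitive impairment": 4,
                "parkinsonism": 3, "dementia": 1, "ALS": 0}

for outcome in ("head trauma", "mild cognitive impairment",
                "parkinsonism", "dementia", "ALS"):
    p_fb = football[outcome] / football["n"]
    p_ns = other_sports[outcome] / other_sports["n"]
    print(f"{outcome:27s} football {p_fb:5.1%}   other sports {p_ns:5.1%}")
```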

This study mirrors a previous Mayo Clinic study of high school athletes who played between 1946 and 1956. That study also found no increased risk of degenerative brain diseases. While football between 1956 and 1970 is somewhat more similar to that of the present era — including body weight, athletic performance and equipment — football-related concussions still were minimized as “getting your bell rung,” the researchers note.

Football has continued to evolve. Helmets, for example, have gone from leather to hard plastic shells. However, helmets do not eliminate concussions and may provide players with a false sense of protection, says Rodolfo Savica, M.D., Ph.D., senior author of the study and a Mayo Clinic neurologist.

The researchers point out that high school sports offer clear benefits of physical fitness on cardiovascular health, and some studies also have suggested a possible protective effect against later degenerative brain illness. But the researchers caution that additional studies are needed to explore more recent eras and to involve players who participate at the collegiate and professional levels.

“This study should not be interpreted as evidence that football-related head trauma is benign,” the researchers write. “The literature on chronic traumatic encephalopathy in college and professional football players seems irrefutable, with reports of devastating outcomes. However, there may be a gradient of risk, with low potential in high school football players that played in the study period.”

In the future, the researchers plan to replicate the study with football players from more recent eras.

###

In addition to Dr. Savica, Mayo Clinic study co-authors are:

  • Peter Janssen
  • Jay Mandrekar, Ph.D.
  • Michelle Mielke, Ph.D.
  • Eric Ahlskog, Ph.D., M.D.
  • Bradley Boeve, M.D.
  • Keith Josephs, M.D.

About Mayo Clinic Proceedings

Mayo Clinic Proceedings is a monthly peer-reviewed medical journal that publishes original articles and reviews dealing with clinical and laboratory medicine, clinical research, basic science research, and clinical epidemiology. Mayo Clinic Proceedings is sponsored by the Mayo Foundation for Medical Education and Research as part of its commitment to physician education. It publishes submissions from authors worldwide. The journal has been published for more than 80 years and has a circulation of 130,000. Articles are available at http://www.mayoclinicproceedings.org/.

About Mayo Clinic

Mayo Clinic is a nonprofit organization committed to clinical practice, education and research, providing expert, whole-person care to everyone who needs healing. For more information, visit mayoclinic.org/about-mayo-clinic or newsnetwork.mayoclinic.org.

Saturated fat could be good for you?

“Participants on the very-high-fat diet also had substantial improvements in several important cardiometabolic risk factors, such as ectopic fat storage, blood pressure, blood lipids (triglycerides), insulin and blood sugar.”

Public Release: 2-Dec-2016

A Norwegian study challenges the long-held idea that saturated fats are unhealthy

The University of Bergen

A new Norwegian diet intervention study (FATFUNC), performed by researchers at the KG Jebsen center for diabetes research at the University of Bergen, raises questions regarding the validity of a diet hypothesis that has dominated for more than half a century: that dietary fat and particularly saturated fat is unhealthy for most people.

The researchers found strikingly similar health effects of diets based on either lowly processed carbohydrates or fats. In the randomized controlled trial, 38 men with abdominal obesity followed a dietary pattern high in either carbohydrates or fat, of which about half was saturated. Fat mass in the abdominal region, liver and heart was measured with accurate analyses, along with a number of key risk factors for cardiovascular disease.

“The very high intake of total and saturated fat did not increase the calculated risk of cardiovascular diseases,” says professor and cardiologist Ottar Nygård who contributed to the study.

“Participants on the very-high-fat diet also had substantial improvements in several important cardiometabolic risk factors, such as ectopic fat storage, blood pressure, blood lipids (triglycerides), insulin and blood sugar.”

High quality food is healthier

Both groups had similar intakes of energy, proteins and polyunsaturated fatty acids; the food types were the same and varied mainly in quantity; and intake of added sugar was minimized.

“We here looked at effects of total and saturated fat in the context of a healthy diet rich in fresh, lowly processed and nutritious foods, including high amounts of vegetables and rice instead of flour-based products,” says PhD candidate Vivian Veum.

“The fat sources were also minimally processed, mainly butter, cream and cold-pressed oils.”

Total energy intake was within the normal range. Even the participants who increased their energy intake during the study showed substantial reductions in fat stores and disease risk.

“Our findings indicate that the overriding principle of a healthy diet is not the quantity of fat or carbohydrates, but the quality of the foods we eat,” says PhD candidate Johnny Laupsa-Borge.

Saturated fat increases the “good” cholesterol

Saturated fat has been thought to promote cardiovascular diseases by raising the “bad” LDL cholesterol in the blood. But even with a higher fat intake in the FATFUNC study compared to most comparable studies, the authors found no significant increase in LDL cholesterol.

Rather, the “good” cholesterol increased only on the very-high-fat diet.

“These results indicate that most healthy people probably tolerate a high intake of saturated fat well, as long as the fat quality is good and total energy intake is not too high. It may even be healthy,” says Ottar Nygård.

“Future studies should examine which people or patients may need to limit their intake of saturated fat,” says assistant professor Simon Nitter Dankel, who led the study together with professor Gunnar Mellgren, director of the laboratory clinics at Haukeland University Hospital in Bergen, Norway.

“But the alleged health risks of eating good-quality fats have been greatly exaggerated. It may be more important for public health to encourage reductions in processed flour-based products, highly processed fats and foods with added sugar,” he says.

###

Facts: The FATFUNC-study

  • The study (FATFUNC) was performed by researchers at the KG Jebsen Center for Diabetes Research, Department of Clinical Science, at the University of Bergen.
  • In the randomized controlled trial, 38 men with abdominal obesity followed a dietary pattern high in either carbohydrates (53 % of total energy, in line with typical official recommendations) or fat (71 % of total energy, of which about half was saturated).
  • Fat mass in the abdominal region, liver and heart was measured with accurate analyses (computed tomography, CT), along with a number of key risk factors for cardiovascular disease.

The decline in industrial emissions also has negative implications

Public Release: 21-Nov-2016

 

Scientists clarify the causes of the increasing brown colouration of the water in reservoirs

Helmholtz Centre for Environmental Research – UFZ

 

Image caption: The Rappbode reservoir in the Harz region is also affected by the increasing brown colouration of the water. It is one of 36 reservoirs in Germany which have been studied by UFZ scientists in order to identify the causes of increasing brown colouration. Credit: UFZ / André Künzelmann

Due to the burning of biomass and fossil fuels and, above all, due to agriculture, excessive quantities of reactive nitrogen are still being released into the atmosphere, soil and water — with negative effects on biodiversity, the climate and human health. However, a differentiated analysis of nitrogen input pathways from the different sources reveals significant differences. While nitrogen inputs into soils — primarily due to agriculture — have elevated nitrate concentrations in the groundwater of many regions to values above the threshold of 50 mg per litre, atmospheric pollution is decreasing in large parts of Europe and North America due to emission-reducing measures. This means that less nitrogen is released into soils and water via atmospheric depositions. Long-term measurements over the past 20 years clearly indicate that this is the case in Germany: On average 35 mg less atmospheric nitrogen was released into the soils per square metre per year. According to studies conducted by UFZ scientists, this leads to 0.08 mg less nitrate per litre per year entering streams and drinking water reservoirs. “It does not sound like much, but in a number of natural areas not or hardly impacted by industry and agriculture, pre-industrial conditions will set in over time,” says UFZ hydrogeologist Dr. Andreas Musolff. “At less than 6 mg of nitrate per litre of water in some cases, conditions are far from the problematic nitrate concentrations measured in regions heavily impacted by industry or agriculture.”

That this positive development can also have negative implications became apparent when scientists started studying the causes of a brown colouration of water in reservoirs that is increasingly observed in Germany, northern Europe and North America. This brown colouration is especially problematic for drinking water treatment. In reviewing various hypotheses, they noted that the brown colouration of the water was strongly correlated with the decreasing concentrations of nitrate in the riparian soils surrounding the tributary streams of the reservoirs. This is because the presence of nitrate in the riparian wetlands, where most of the stream flow is generated, ensures that carbon, phosphate and various metals remain bound to oxidised iron. Lower nitrate levels allow a chemical reduction of iron compounds and thus the mobilisation of previously adsorbed substances, which are then washed into the streams with the rainwater. In the case of carbon, this means that the concentration of dissolved organic carbon (DOC) increases and becomes visible as the brownish colour of the water. In just under 40 percent of the 110 tributaries of drinking water reservoirs that were studied, the scientists found significantly increased DOC concentrations, rising by an average of 0.12 mg per litre per year. The most significant increase was found in natural, forested catchments where nitrate concentrations in the water were less than 6 mg per litre.

In addition to DOC, phosphate concentrations are also increasing significantly in over 30 per cent of the tributaries. The calculated average increase of 7 μg per litre per year tends to favour algae growth and is equally problematic for water quality in the long run. There is evidence that not only DOC and phosphate, but also adsorbed metals such as arsenic, vanadium, zinc and lead are increasingly becoming mobilised. “We solve one problem by making the air cleaner, but in turn create a different problem in other areas,” says biologist Dr. Jörg Tittel, head of the project at the UFZ, explaining the unexpected effect. “None of the dissolved substances is toxic at this low concentration and the substances are also largely removed by water treatment. However, water treatment is becoming more expensive.”

Initial evidence confirming this hypothesis was provided by the evaluation of data collected for a small 1.7 km² catchment in the Erzgebirge mountains around the Wilzsch, a tributary of the Zwickauer Mulde that feeds the Carlsfeld reservoir. The scientists then moved to a much larger scale, focusing on 110 streams and their catchments entering a total of 36 drinking water reservoirs. Despite much greater diversity in the size of the streams and their catchments, their topography, the amount of precipitation, land use and chemical characteristics, the hypothesis was confirmed in this much larger data set as well: the observed increase in DOC is closely correlated with the decreasing amount of nitrate in the water.
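
As a rough illustration of the kind of trend-and-correlation analysis described above (a sketch on synthetic data, not the UFZ code or monitoring records), one can fit a linear DOC trend for each stream and then test whether those trends relate to each stream’s nitrate level:

```python
# Minimal sketch of the trend-and-correlation analysis described above,
# using synthetic data in place of the reservoir monitoring records.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n_streams = 110          # number of tributaries in the study
n_years = 20             # assumed length of each monitoring record
years = np.arange(n_years)

# Synthetic nitrate levels (mg/L); low-nitrate streams are given a positive DOC trend.
nitrate = rng.uniform(1, 30, n_streams)
true_doc_slope = np.where(nitrate < 6, 0.12, 0.0)   # mg DOC per litre per year (illustrative)

doc_slopes = np.empty(n_streams)
for i in range(n_streams):
    doc_series = 3.0 + true_doc_slope[i] * years + rng.normal(0, 0.3, n_years)
    # Per-stream linear trend of DOC over time
    doc_slopes[i] = stats.linregress(years, doc_series).slope

# Is the DOC trend related to the stream's nitrate level?
r, p = stats.spearmanr(nitrate, doc_slopes)
print(f"Spearman correlation between nitrate and DOC trend: r={r:.2f}, p={p:.1e}")
```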

In the meantime, a discussion has begun with the relevant authorities as to how the results of this meta-analysis can be translated into practical measures to halt the increase in DOC. “The study helps to focus future research on the relevant processes and to plan appropriate field experiments that further improve the basis for decision-making in terms of concrete measures,” says Andreas Musolff.

###

The research findings were produced within the framework of the project “Pollution of drinking water reservoirs by dissolved organic carbon: outlook, precautions, courses of action (TALKO)”, which up until 2015 was funded with more than one million euros by the Federal Ministry of Education and Research (BMBF). The aim of collaboration between the UFZ, reservoir administrations, water suppliers, public authorities and an engineering office was to find ways to reduce discharges into reservoirs, improve predictions and optimise water treatment technologies.

Viruses revealed to be a major driver of human evolution

Public Release: 13-Jul-2016

“The discovery that this constant battle with viruses has shaped us in every aspect–not just the few proteins that fight infections, but everything–is profound. “

Study tracking protein adaptation over millions of years yields insights relevant to fighting today’s viruses

Genetics Society of America

 

Image caption: Aminopeptidase N is a protein that acts as a receptor for coronaviruses, the family of viruses behind recent epidemics of SARS and MERS, among others. Researchers found evidence that this protein has adapted repeatedly during mammalian evolution to evade binding by coronaviruses. Credit: David Enard

BETHESDA, MD – The constant battle between pathogens and their hosts has long been recognized as a key driver of evolution, but until now scientists have not had the tools to look at these patterns globally across species and genomes. In a new study, researchers apply big-data analysis to reveal the full extent of viruses’ impact on the evolution of humans and other mammals.

Their findings suggest that an astonishing 30 percent of all protein adaptations since humans’ divergence from chimpanzees have been driven by viruses.

“When you have a pandemic or an epidemic at some point in evolution, the population that is targeted by the virus either adapts, or goes extinct. We knew that, but what really surprised us is the strength and clarity of the pattern we found,” said David Enard, Ph.D., a postdoctoral fellow at Stanford University and the study’s first author. “This is the first time that viruses have been shown to have such a strong impact on adaptation.”

The study was recently published in the journal eLife and will be presented at The Allied Genetics Conference, a meeting hosted by the Genetics Society of America, on July 14.

Proteins perform a vast array of functions that keep our cells ticking. By revealing how small tweaks in protein shape and composition have helped humans and other mammals respond to viruses, the study could help researchers find new therapeutic leads against today’s viral threats.

“We’re learning which parts of the cell have been used to fight viruses in the past, presumably without detrimental effects on the organism,” said the study’s senior author, Dmitri Petrov, Ph.D., Michelle and Kevin Douglas Professor of Biology and Associate Chair of the Biology Department at Stanford. “That should give us an insight on the pressure points and help us find proteins to investigate for new therapies.”

Previous research on the interactions between viruses and proteins has focused almost exclusively on individual proteins that are directly involved in the immune response–the most logical place you would expect to find adaptations driven by viruses. This is the first study to take a global look at all types of proteins.

“The big advancement here is that it’s not only very specialized immune proteins that adapt against viruses,” said Enard. “Pretty much any type of protein that comes into contact with viruses can participate in the adaptation against viruses. It turns out that there is at least as much adaptation outside of the immune response as within it.”

The team’s first step was to identify all the proteins that are known to physically interact with viruses. After painstakingly reviewing tens of thousands of scientific abstracts, Enard culled the list to about 1,300 proteins of interest. His next step was to build big-data algorithms to scour genomic databases and compare the evolution of virus-interacting proteins to that of other proteins.

The results revealed that adaptations have occurred three times as frequently in virus-interacting proteins compared with other proteins.
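
A hedged sketch of how such an enrichment can be quantified (the counts below are simulated placeholders, not the study’s genomic data): count adaptive changes per protein in the virus-interacting set versus all other proteins, take the ratio of the averages, and check it against a label-shuffling permutation test.

```python
# Illustrative enrichment calculation for virus-interacting proteins (VIPs)
# versus other proteins, on simulated counts rather than real genomic data.
import numpy as np

rng = np.random.default_rng(0)

n_vip, n_other = 1300, 8700                   # roughly the scale described in the text
vip_adaptations = rng.poisson(0.9, n_vip)     # adaptive changes per VIP (assumed rate)
other_adaptations = rng.poisson(0.3, n_other) # adaptive changes per other protein (assumed rate)

enrichment = vip_adaptations.mean() / other_adaptations.mean()
print(f"Observed enrichment: {enrichment:.2f}x")

# Permutation test: shuffle the VIP / non-VIP labels and ask how often an
# enrichment this large arises by chance.
pooled = np.concatenate([vip_adaptations, other_adaptations])
n_perm, exceed = 2000, 0
for _ in range(n_perm):
    rng.shuffle(pooled)
    exceed += pooled[:n_vip].mean() / pooled[n_vip:].mean() >= enrichment
print(f"Permutation p-value: {exceed / n_perm:.4f}")
```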

“We’re all interested in how it is that we and other organisms have evolved, and in the pressures that made us what we are,” said Petrov. “The discovery that this constant battle with viruses has shaped us in every aspect–not just the few proteins that fight infections, but everything–is profound. All organisms have been living with viruses for billions of years; this work shows that those interactions have affected every part of the cell.”

Viruses hijack nearly every function of a host organism’s cells in order to replicate and spread, so it makes sense that they would drive the evolution of the cellular machinery to a greater extent than other evolutionary pressures such as predation or environmental conditions. The study sheds light on some longstanding biological mysteries, such as why closely-related species have evolved different machinery to perform identical cellular functions, like DNA replication or the production of membranes. Researchers previously did not know what evolutionary force could have caused such changes. “This paper is the first with data that is large enough and clean enough to explain a lot of these puzzles in one fell swoop,” said Petrov.

The team is now using the findings to dig deeper into past viral epidemics, hoping for insights to help fight disease today. For example, HIV-like viruses have swept through the populations of our ancestors as well as other animal species at multiple points throughout evolutionary history. Looking at the effects of such viruses on specific populations could yield a new understanding of our constant war with viruses–and how we might win the next big battle.

###

This study will be presented on Thursday, July 14 from 11:15 – 11:30 a.m. during the Natural Selection and Adaptation session, Crystal Ballroom J1 as part of The Allied Genetics Conference, Orlando World Center Marriott, Orlando, Florida.

CITATION

Viruses are a dominant driver of protein adaptation in mammals

David Enard, Le Cai, Carina Gwennap, Dmitri A Petrov

eLife May 2016 5:e12469 doi: http://dx.doi.org/10.7554/eLife.12469

https://elifesciences.org/content/5/e12469

FUNDING

This work is funded by NIH grants R01GM089926 and R01GM097415.

About the Genetics Society of America (GSA)

Founded in 1931, the Genetics Society of America (GSA) is the professional scientific society for genetics researchers and educators. The Society’s more than 5,000 members worldwide work to deepen our understanding of the living world by advancing the field of genetics, from the molecular to the population level. GSA promotes research and fosters communication through a number of GSA-sponsored conferences including regular meetings that focus on particular model organisms. GSA publishes two peer-reviewed, peer-edited scholarly journals: GENETICS, which has published high quality original research across the breadth of the field since 1916, and G3: Genes|Genomes|Genetics, an open-access journal launched in 2011 to disseminate high quality foundational research in genetics and genomics. The Society also has a deep commitment to education and fostering the next generation of scholars in the field. For more information about GSA, please visit http://www.genetics-gsa.org.

USF professor: No association between ‘bad cholesterol’ and elderly deaths

“older people with high levels of a certain type of cholesterol, known as low-density lipoprotein (LDL-C), live as long, and often longer, than their peers with low levels of this same cholesterol. “

Public Release: 27-Jun-2016

 

Systematic review of studies of over 68,000 elderly people also raises questions about the benefits of statin drug treatments

University of South Florida (USF Innovation)

TAMPA, Fla. (June 27, 2016) – A University of South Florida professor and an international team of experts have found that older people with high levels of a certain type of cholesterol, known as low-density lipoprotein (LDL-C), live as long, and often longer, than their peers with low levels of this same cholesterol.

The findings, which came after analyzing past studies involving more than 68,000 participants over 60 years of age, call into question the “cholesterol hypothesis,” which previously suggested people with high cholesterol are more at risk of dying and would need statin drugs to lower their cholesterol.

Appearing online this month in the open access version of the British Medical Journal, the research team’s analysis represents the first review of a large group of prior studies on this issue.

“We have known for decades that high total cholesterol becomes a much weaker risk for cardiovascular disease with advancing age,” said David Diamond, the USF professor and study co-author. “In this analysis, we focused on the so-called ‘bad cholesterol,’ which has been blamed for contributing to heart disease.”

According to the authors, either a lack of association or an inverse relationship between LDL-C and cardiovascular deaths was present in each of the studies they evaluated. Subsequently, the research team called for a reevaluation of the need for drugs, such as statins, which are aimed at reducing LDL-C as a step to prevent cardiovascular diseases.

“We found that several studies reported not only a lack of association between LDL-C and mortality, but that most people in these studies exhibited an inverse relationship, which means that higher LDL-C among the elderly is often associated with longer life,” said Diamond.

Diamond also points to research suggesting that high cholesterol may be protective against diseases which are common in the elderly. For example, high levels of cholesterol are associated with a lower rate of neurological disorders, such as Parkinson’s disease and Alzheimer’s disease. Other studies have suggested that high LDL-C may protect against some often fatal diseases, such as cancer and infectious diseases, and that having low LDL-C may increase one’s susceptibility to these diseases.

“Our results pose several relevant questions for future research,” said study leader and co-author Dr. Uffe Ravnskov, a health researcher. “For example, why is total cholesterol a factor for cardiovascular disease for young and middle-aged people, but not for the elderly? Why do a substantial number of elderly people with high LDL-C live longer than elderly people with low LDL-C?”

Diamond and colleagues have published a number of studies relating to the use and possible misuse of statins for treating cholesterol. Those studies include their recent paper in the medical journal Expert Review of Clinical Pharmacology, which argued that the benefits of taking statins have been exaggerated and are misleading.

“Our findings provide a contradiction to the cholesterol hypothesis,” concluded Diamond. “That hypothesis predicts that cardiovascular disease starts in middle age as a result of high LDL-C cholesterol, worsens with aging, and eventually leads to death from cardiovascular disease. We did not find that trend. If LDL-C is accumulating in arteries over a lifetime to cause heart disease, then why is it that elderly people with the highest LDL-C live the longest? Since people over the age of 60 with high LDL-C live the longest, why should we lower it?”

###

The international team of researchers who carried out this analysis included: Uffe Ravnskov, David Diamond, Rokura Hama, Tomohito Hamazaki, Björn Hammarskjöld, Niamh Hynes, Malcolm Kendrick, Peter H. Langsjoen, Aseem Malhotra, Luca Mascitelli, Kilmer S. McCully, Yoichi Ogushi, Harumi Okuyama, Paul J. Rosch, Tore Schersten, Sherif Sultan and Ralf Sundberg.

The University of South Florida is a high-impact, global research university dedicated to student success. USF is a Top 25 research university among public institutions nationwide in total research expenditures, according to the National Science Foundation. Serving over 48,000 students, the USF System has an annual budget of $1.6 billion and an annual economic impact of $4.4 billion. USF is a member of the American Athletic Conference.

– USF –

Fish can recognize human faces, new research shows

Public Release: 7-Jun-2016

 

University of Oxford

A species of tropical fish has been shown to be able to distinguish between human faces. It is the first time fish have demonstrated this ability.

The research, carried out by a team of scientists from the University of Oxford (UK) and the University of Queensland (Australia), found that archerfish were able to learn and recognize faces with a high degree of accuracy — an impressive feat, given this task requires sophisticated visual recognition capabilities.

The study is published in the journal Scientific Reports.

First author Dr Cait Newport, Marie Curie Research Fellow in the Department of Zoology at Oxford University, said: ‘Being able to distinguish between a large number of human faces is a surprisingly difficult task, mainly due to the fact that all human faces share the same basic features. All faces have two eyes above a nose and mouth, therefore to tell people apart we must be able to identify subtle differences in their features. If you consider the similarities in appearance between some family members, this task can be very difficult indeed.

‘It has been hypothesized that this task is so difficult that it can only be accomplished by primates, which have a large and complex brain. The fact that the human brain has a specialized region used for recognizing human faces suggests that there may be something special about faces themselves. To test this idea, we wanted to determine if another animal with a smaller and simpler brain, and with no evolutionary need to recognize human faces, was still able to do so.’

The researchers found that fish, which lack the sophisticated visual cortex of primates, are nevertheless capable of discriminating one face from up to 44 new faces. The research provides evidence that fish (vertebrates lacking a major part of the brain called the neocortex) have impressive visual discrimination abilities.

In the study, archerfish — a species of tropical fish well known for its ability to spit jets of water to knock down aerial prey – were presented with two images of human faces and trained to choose one of them using their jets. The fish were then presented with the learned face and a series of new faces and were able to correctly choose the face they had initially learned to recognize. They were able to do this task even when more obvious features, such as head shape and colour, were removed from the images.

The fish were highly accurate when selecting the correct face, reaching an average peak performance of 81% in the first experiment (picking the previously learned face from 44 new faces) and 86% in the second experiment (in which facial features such as brightness and colour were standardized).

Dr Newport said: ‘Fish have a simpler brain than humans and entirely lack the section of the brain that humans use for recognizing faces. Despite this, many fish demonstrate impressive visual behaviours and therefore make the perfect subjects to test whether simple brains can complete complicated tasks.

‘Archerfish are a species of tropical freshwater fish that spit a jet of water from their mouth to knock down insects in branches above the water. We positioned a computer monitor that showed images of human faces above the aquariums and trained them to spit at a particular face. Once the fish had learned to recognize a face, we then showed them the same face, as well as a series of new ones.

‘In all cases, the fish continued to spit at the face they had been trained to recognize, proving that they were capable of telling the two apart. Even when we did this with faces that were potentially more difficult because they were in black and white and the head shape was standardized, the fish were still capable of finding the face they were trained to recognize.

‘The fact that archerfish can learn this task suggests that complicated brains are not necessarily needed to recognize human faces. Humans may have special facial recognition brain structures so that they can process a large number of faces very quickly or under a wide range of viewing conditions.’

Human facial recognition has previously been demonstrated in birds. However, unlike fish, they are now known to possess neocortex-like structures. Additionally, fish are unlikely to have evolved the ability to distinguish between human faces.

Body-worn cameras associated with increased assaults against police, and increase in use-of-force if officers choose when to activate cameras

Public Release: 16-May-2016

 

University of Cambridge

New evidence from the largest-yet series of experiments on use of body-worn cameras by police has revealed that rates of assault against police by members of the public actually increased when officers wore the cameras.

The research also found that on average across all officer-hours studied, and contrary to current thinking, the rate of use-of-force by police on citizens was unchanged by the presence of body-worn cameras, but a deeper analysis of the data showed that this finding varied depending on whether or not officers chose when to turn cameras on.

If officers turned cameras on and off during their shift then use-of-force increased, whereas if they kept the cameras rolling for their whole shift, use-of-force decreased.

The findings are released today across two articles published in the European Journal of Criminology and the Journal of Experimental Criminology.

While researchers describe these findings as unexpected, they also urge caution as the work is ongoing, and say these early results demand further scrutiny. However, gathering evidence for what works in policing is vital, they say.

“At present, there is a worldwide uncontrolled social experiment taking place – underpinned by feverish public debate and billions of dollars of government expenditure. Robust evidence is only just keeping pace with the adoption of new technology,” write criminologists from the University of Cambridge and RAND Europe, who conducted the study.

For the latest findings, researchers worked with eight police forces across the UK and US – including West Midlands, Cambridgeshire and Northern Ireland’s PSNI, as well as Ventura, California and Rialto, California PDs in the United States – to conduct ten randomised-controlled trials.

Over the ten trials, the research team found that rates of assault against officers wearing cameras on their shift were an average of 15% higher, compared to shifts without cameras.

The researchers say this could be due to officers feeling more able to report assaults once they are captured on camera – providing them the impetus and/or confidence to do so.

The monitoring by camera also may make officers less assertive and more vulnerable to assault. However, the researchers point out these are just possible explanations, and much more work is needed to unpick the reasons behind these surprising findings.

In the experimental design, the shift patterns of 2,122 participating officers across the forces were split at random between those allocated a camera and those without a camera. A total of 2.2 million officer-hours policing a total population of more than 2 million citizens were covered in the study.

The researchers set out a protocol for officers allocated cameras during the trials: record all stages of every police-public interaction, and issue a warning of filming at the outset. However, many officers preferred to use their discretion, activating cameras depending on the situation.

Researchers found that during shifts with cameras in which officers stuck closer to the protocol, police use-of-force fell by 37% over camera-free shifts. During shifts in which officers tended to use their discretion, police use-of-force actually rose 71% over camera-free shifts.
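
For readers who want to see the arithmetic behind such comparisons, the sketch below computes use-of-force rates per 1,000 officer-hours and the percentage change relative to camera-free shifts. The incident counts and officer-hours are invented so that the changes land near the reported figures; they are not the study’s data.

```python
# Illustrative rate arithmetic for the use-of-force comparisons reported above.
def rate_per_1000_hours(incidents: int, officer_hours: float) -> float:
    """Incidents per 1,000 officer-hours."""
    return 1000.0 * incidents / officer_hours

def percent_change(treatment_rate: float, control_rate: float) -> float:
    """Percentage change of the treatment rate relative to the control rate."""
    return 100.0 * (treatment_rate - control_rate) / control_rate

# Hypothetical counts, chosen only to reproduce roughly -37% and +71%.
control = rate_per_1000_hours(incidents=200, officer_hours=550_000)
full_shift_cameras = rate_per_1000_hours(incidents=126, officer_hours=550_000)
discretionary_cameras = rate_per_1000_hours(incidents=342, officer_hours=550_000)

print(f"Full-shift recording:    {percent_change(full_shift_cameras, control):+.0f}%")
print(f"Discretionary recording: {percent_change(discretionary_cameras, control):+.0f}%")
```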

“The combination of the camera plus the early warning creates awareness that the encounter is being filmed, modifying the behaviour of all involved,” said principal investigator Barak Ariel from the University of Cambridge’s Institute of Criminology.

“If an officer decides to announce mid-interaction they are beginning to film, for example, that could provoke a reaction that results in use-of-force,” Ariel said. “Our data suggests this could be what is driving the results.”

The new results are the latest to come from the research team since their ground-breaking work reporting the first experimental evidence on body-worn cameras with Rialto PD in California – a study widely-cited as part of the rationale for huge investment in this policing technology.

“With so much at stake, these findings must continue to be scrutinised through further research and more studies. In the meantime, it’s clear that more training and engagement with police officers are required to ensure they are confident in the decisions they make while wearing cameras, and are safe in their job,” said co-author and RAND Europe researcher Alex Sutherland.

Ariel added: “It may be that in some places it’s a bad idea to use body-worn cameras, and the only way you can find that out is to keep doing these tests in different kinds of places. After all, what might work for a sheriff’s department in Iowa may not necessarily apply to the Tokyo PD.”

‘A pretend scientist in a bow tie’: Climate-change denying Weather Channel founder SLAMS Bill Nye for criticizing skeptical film ‘Climate Hustle’

  • John Coleman, Weather Channel founder, attacked TV star Nye Friday
  • Nye had criticized ‘Climate Hustle,’ which denies man-made climate change
  • ‘It’s not in the world’s interest,’ Nye said of the documentary film
  • Coleman, who introduces the movie, was ‘offended’ that Nye is so popular
  • Nye has a BS in Mechanical Engineering, but is not a scientist

By JAMES WILKINSON FOR DAILYMAIL.COM

PUBLISHED: 09:45 EST, 30 April 2016 | UPDATED: 12:20 EST, 30 April 2016

Weather Channel founder John Coleman has attacked Bill Nye ‘the science guy’ for raining on his parade ahead of the release of new climate-change denying film ‘Climate Hustle.’

Coleman, a meteorologist of 60 years who will introduce the film on its one-night-only screening on May 2, objected to Nye’s remark that the film is ‘very much not in our national interest and the world’s interest.’

‘I have always been amazed that anyone would pay attention to Bill Nye, a pretend scientist in a bow tie,’ Coleman said Friday, according to Climate Depot.

Upset: Weather Channel founder John Coleman (left) attacked ‘pretend scientist’ Bill Nye (right) after the TV ‘science guy’ criticized ‘Climate Hustle,’ a climate-change-skeptical film hosted by Coleman

Coleman continued: ‘As a man who has studied the science of meteorology for over 60 years and received the AMS [American Meteorological Society’s] “Meteorologist of the Year” award, I am totally offended that Nye gets the press and media attention he does.’

The film, marketed with the tagline ‘Are they trying to control the climate… or you?’ suggests that man-made global warming is a hoax.

Its website says that an ‘overheated environmental con job [is] being used to push for increased government regulations and a new “Green” energy agenda.’

Nye, a children’s TV host who has made a career out of presenting science-themed shows, is highly critical of climate change deniers.

When asked by the filmmakers whether climate-change deniers should be jailed, Nye replied, ‘Well, we’ll see what happens. Was it appropriate to jail the guys from ENRON?’

He continued: ‘Was it appropriate to jail people from the cigarette industry who insisted that this addictive product was not addictive and so on?’

Although Nye stopped short of actually recommending jail terms, Coleman – the original weatherman on ABC’s ‘Good Morning America’ – said: ‘That is the most awful thing since Galileo was jailed for saying the Earth was not the center of the Universe.’

‘In 20 or 30 years, when Nye is an old man, he will realize how wrong he was as the Earth continues to be just a great place to live.’

Engineer: Nye (pictured) is not actually a scientist – he has a BS in Mechanical Engineering – but his views align with those of most climate experts. Coleman says he is ‘offended’ by how much attention Nye gets

Although Nye is not a professional scientist – he has a Bachelor of Science in Mechanical Engineering – his beliefs are in line with the majority of climate scientists, who agree that climate change is occurring due to man-made pollution.

And Coleman’s belief that man-made climate change is not happening is in line with the majority of meteorologists – 64 percent of them in 2013, according to a study by social scientists from George Mason University, the AMS and Yale University.

That’s because most meteorologists are not climate researchers or experts, nor do they have to perform climate research as part of their jobs, The Guardian reported.

Weather forecasters were specifically named as an example of AMS members who do not have to do any climate research as part of their job.

‘Climate Hustle’ will have a one-night-only screening, with an introduction by Coleman and followed by a discussion panel including Bill Nye, at select theaters on May 2.

Read more: http://www.dailymail.co.uk/news/article-3567104/Climate-change-denying-Weather-Channel-founder-SLAMS-Bill-Nye-criticizing-skeptical-film-Climate-Hustle.html#ixzz47N4ndpc8

Carbon dioxide fertilization greening Earth, study finds

 

NASA/Goddard Space Flight Center

 

Image caption: This image shows the change in leaf area across the globe from 1982-2015. Credit: Boston University/R. Myneni

From a quarter to half of Earth’s vegetated lands have shown significant greening over the last 35 years, largely due to rising levels of atmospheric carbon dioxide, according to a new study published in the journal Nature Climate Change on April 25.

An international team of 32 authors from 24 institutions in eight countries led the effort, which involved using satellite data from NASA’s Moderate Resolution Imaging Spectroradiometer and the National Oceanic and Atmospheric Administration’s Advanced Very High Resolution Radiometer instruments to help determine the leaf area index, or amount of leaf cover, over the planet’s vegetated regions. The greening represents an increase in leaves on plants and trees equivalent in area to two times the continental United States.

Green leaves use energy from sunlight through photosynthesis to chemically combine carbon dioxide drawn in from the air with water and nutrients tapped from the ground to produce sugars, which are the main source of food, fiber and fuel for life on Earth. Studies have shown that increased concentrations of carbon dioxide increase photosynthesis, spurring plant growth.

However, carbon dioxide fertilization isn’t the only cause of increased plant growth–nitrogen, land cover change and climate change by way of global temperature, precipitation and sunlight changes all contribute to the greening effect. To determine the extent of carbon dioxide’s contribution, researchers ran the data for carbon dioxide and each of the other variables in isolation through several computer models that mimic the plant growth observed in the satellite data.

Results showed that carbon dioxide fertilization explains 70 percent of the greening effect, said co-author Ranga Myneni, a professor in the Department of Earth and Environment at Boston University. “The second most important driver is nitrogen, at 9 percent. So we see what an outsized role CO2 plays in this process.”
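
The attribution logic described in the preceding paragraphs can be sketched in a few lines: express the leaf-area trend simulated with each driver in isolation as a share of the observed greening trend. The numbers below are illustrative placeholders consistent with the two percentages quoted above (the climate and land-cover values are assumptions), not output from the models used in the study.

```python
# Sketch of factorial attribution: each driver's single-driver simulated trend
# expressed as a share of the observed (normalized) greening trend.
observed_trend = 1.0            # normalized observed leaf-area trend
simulated = {                   # single-driver model trends (illustrative placeholders)
    "CO2 fertilization": 0.70,
    "nitrogen deposition": 0.09,
    "climate change": 0.08,     # assumed value, not from the study
    "land cover change": 0.04,  # assumed value, not from the study
}

for driver, trend in simulated.items():
    print(f"{driver:20s}: {100 * trend / observed_trend:.0f}% of the observed greening trend")

residual = 1 - sum(simulated.values()) / observed_trend
print(f"{'unexplained residual':20s}: {100 * residual:.0f}%")
```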

About 85 percent of Earth’s ice-free lands is covered by vegetation. The area covered by all the green leaves on Earth is equal to, on average, 32 percent of Earth’s total surface area – oceans, lands and permanent ice sheets combined. The extent of the greening over the past 35 years “has the ability to fundamentally change the cycling of water and carbon in the climate system,” said lead author Zaichun Zhu, a researcher from Peking University, China, who did the first half of this study with Myneni as a visiting scholar at Boston University.

Every year, about half of the 10 billion tons of carbon emitted into the atmosphere from human activities remains temporarily stored, in about equal parts, in the oceans and plants. “While our study did not address the connection between greening and carbon storage in plants, other studies have reported an increasing carbon sink on land since the 1980s, which is entirely consistent with the idea of a greening Earth,” said co-author Shilong Piao of the College of Urban and Environmental Sciences at Peking University.

While rising carbon dioxide concentrations in the air can be beneficial for plants, it is also the chief culprit of climate change. The gas, which traps heat in Earth’s atmosphere, has been increasing since the industrial age due to the burning of oil, gas, coal and wood for energy and is continuing to reach concentrations not seen in at least 500,000 years. The impacts of climate change include global warming, rising sea levels, melting glaciers and sea ice as well as more severe weather events.

The beneficial impacts of carbon dioxide on plants may also be limited, said co-author Dr. Philippe Ciais, associate director of the Laboratory of Climate and Environmental Sciences, Gif-sur-Yvette, France. “Studies have shown that plants acclimatize, or adjust, to rising carbon dioxide concentration and the fertilization effect diminishes over time.”

“While the detection of greening is based on data, the attribution to various drivers is based on models,” said co-author Josep Canadell of the Oceans and Atmosphere Division in the Commonwealth Scientific and Industrial Research Organisation in Canberra, Australia. Canadell added that while the models represent the best possible simulation of Earth system components, they are continually being improved.

###

Read the paper at Nature Climate Change.

http://www.nature.com/nclimate/journal/vaop/ncurrent/full/nclimate3004.html

For more information about NASA’s Earth science activities, visit:

http://www.nasa.gov/earthrightnow

The United States absorbed carbon dioxide despite a drought

Public Release: 25-Apr-2016

“The researchers found that the warm spring caused trees, grasses and crops to start growing earlier in the year. The ecosystems thus absorbed more carbon from the air than during a «normal» spring.”

ETH Zurich

In the US, spring 2012 was the warmest on record. The subsequent summer was drier and hotter than any summer since the 1930s, a period that became known in the history books as the ‘Dust Bowl’. In 2012, drought and heat afflicted almost the entire contiguous United States.

Climate researchers suspected that this summer drought four years ago could turn the contiguous United States into a carbon source, as was the case in Europe during the hot summer of 2003. During a normal year, ecosystems take up more carbon from the air than they release. They therefore act as a carbon sink – an effect that plays an important role for the world’s climate.

This is because plants take up carbon dioxide (CO2) for growth during photosynthesis and then store it in the form of biomass and in the soil. Through this mechanism, ecosystems compensate for a third of the anthropogenic CO2 emissions.

A team of researchers from the US, Australia, the Netherlands and ETH Zurich have now shown that the contiguous United States remained a carbon sink in 2012, despite the drought. The study has just been published in the journal PNAS.

The researchers found that the warm spring caused trees, grasses and crops to start growing earlier in the year. The ecosystems thus absorbed more carbon from the air than during a «normal» spring.

However, during the subsequent drought, the ecosystems absorbed less carbon than usual, as the plants reduced growth due to the dry and hot conditions. Nevertheless, the overall carbon balance remained positive. “The increase in carbon uptake during the warm spring compensated for the reductions in uptake during the drought,” says ETH researcher Sebastian Wolf, who led the study.

Grasslands release CO2 during drought

Across the entire contiguous United States, the extensive forests of the Appalachians were a particularly effective carbon sink. These forests absorbed additional carbon during spring, and remained largely unaffected by drought during the summer months. On the other hand, the grasslands of the Midwest also absorbed more carbon during the warm spring but their uptake was substantially reduced during the summer drought, as the vegetation became senescent.

In addition, the scientists found indications that the drought and heat in summer 2012 in the US were probably intensified by a feedback mechanism from the warm spring: as the plants started growing earlier, they also depleted their soil water resources earlier in the year. The ecosystems were thus more susceptible to the drought in summer. And due to the water limitations, plants were forced to close their stomata sooner.

As long as plants have sufficient water, they keep the stomata on their leaves open to exchange CO2, water vapour and oxygen with the atmosphere. When water becomes scarce, they close their stomata and thus evaporate less water vapour. The missing effect of evaporative cooling then intensifies the heat and therefore the stress for the plants.

A unique combination of measured data

Wolf and his colleagues combined various sources of data for their analysis. The study incorporated measurements of environmental conditions from 22 locations across the US. The researchers used special towers to take continuous measurements for at least five years; these measurements included temperature, soil moisture, precipitation, and the exchange of carbon dioxide and water between the ecosystems and the atmosphere.

The scientists also used measurements from the satellite platform MODIS to determine the CO2 uptake of vegetation across the entire US. In addition, measurements of CO2 concentrations from tall towers with a height of up to 300 metres were combined with models in order to estimate the CO2 uptake from an atmospheric perspective.

Through a clever combination of these different datasets, the researchers were able to calculate the carbon exchange for the entire contiguous United States during 2012.
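
As an illustration of the first ingredient, the flux-tower records, the sketch below integrates synthetic half-hourly net ecosystem exchange (NEE) values into an annual carbon balance for a single site. Real flux processing involves gap-filling and quality control, and every number here is invented for demonstration.

```python
# Minimal sketch of turning half-hourly flux measurements into an annual carbon balance.
import numpy as np

rng = np.random.default_rng(7)

half_hours = 365 * 48
t = np.arange(half_hours)

# Synthetic NEE in micromoles CO2 per m^2 per second (negative = uptake by the ecosystem),
# with a simple daytime uptake cycle, constant respiration and weather noise.
daytime_uptake = -4.0 * np.maximum(0.0, np.sin(2 * np.pi * (t % 48) / 48))
respiration = 1.0
nee = daytime_uptake + respiration + rng.normal(0, 0.5, half_hours)

# Integrate to grams of carbon per m^2 per year:
# micromol CO2 -> mol (x 1e-6) -> g C (x 12 g per mol C), times 1800 s per half hour.
g_c_per_m2 = np.sum(nee * 1800) * 1e-6 * 12.0
print(f"Annual NEE: {g_c_per_m2:.0f} g C per m^2 "
      f"({'net sink' if g_c_per_m2 < 0 else 'net source'})")
```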

###

Reference

Wolf S, Keenan TF, Fisher JB, Baldocchi DD, Desai AR, Richardson AD, Scott RL, Law BE, Litvak ME, Brunsell NA, Peters W, van der Laan-Luijkx IT (2016) Warm spring reduced carbon cycle impact of the 2012 US summer drought. Proceedings of the National Academy of Sciences. DOI 10.1073/pnas.1519620113

Children of older mothers do better

Public Release: 12-Apr-2016

 

The benefits associated with being born in a later year outweigh the biological risks associated with being born to an older mother

Max-Planck-Gesellschaft

Children of older mothers are healthier, taller and obtain more education than the children of younger mothers. The reason is that in industrialized countries educational opportunities are increasing, and people are getting healthier by the year. In other words, it pays off to be born later.

Most previous research suggests that the older women are when they give birth, the greater the health risks are for their children. Childbearing at older ages is understood to increase the risk of negative pregnancy outcomes such as Down syndrome, as well as increase the risk that the children will develop Alzheimer’s disease, hypertension, and diabetes later in life.

However, despite the risks associated with delaying childbearing, children may also benefit from mothers delaying childbearing to older ages. These are the findings from a new study conducted by Mikko Myrskylä, the director of the Max Planck Institute for Demographic Research (MPIDR), and his colleague Kieron Barclay at the London School of Economics, which has been published today in Population and Development Review.

Both public health and social conditions have been improving over time in many countries. Previous research on the relationship between maternal age and child outcomes has ignored the importance of these macro-level environmental changes over time. From the perspective of any individual parent, delaying childbearing means having a child with a later birth year. For example, a ten-year difference in maternal age is accompanied by a decade of changes to social and environmental conditions. Taking this perspective, this new MPIDR-study shows that when women delay childbearing to older ages their children are healthier, taller, and more highly educated. It shows that despite the risks associated with childbearing at older ages, which are attributable to aging of the reproductive system, these risks are either counterbalanced, or outweighed, by the positive changes to the environment in the period during which the mother delayed her childbearing.

For example, a woman born in 1950 who had a child at the age of 20 would have given birth in 1970. If that same woman had a child at 40, she would have given birth in 1990. “Those twenty years make a huge difference,” explains Mikko Myrskylä. A child born in 1990, for example, had a much higher probability of going to a college or university than somebody born 20 years earlier.

Barclay and Myrskylä used data from over 1.5 million Swedish men and women born between 1960 and 1991 to examine the relationship between maternal age at the time of birth, and height, physical fitness, grades in high school, and educational attainment of the children. Physical fitness and height are good proxies for overall health, and educational attainment is a key determinant of occupational achievement and lifetime opportunities.

They found that when mothers delayed childbearing to older ages, even as old as 40 or older, they had children who were taller, had better grades in high school, and were more likely to go to university. For example, comparing two siblings born to the same mother decades apart, on average the child born when the mother was in her early 40s spends more than a year longer in the educational system than his or her sibling born when the mother was in her early 20s.

In their statistical analyses, Barclay and Myrskylä compared siblings who share the same biological mother and father. Siblings share 50% of their genes, and also grow up in the same household environment with the same parents. “By comparing siblings who grew up in the same family it was possible for us to pinpoint the importance of maternal age at the time of birth independent of the influence of other factors that might bias the results” said Kieron Barclay.
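
The sibling-comparison idea corresponds to a mother fixed-effects regression: demean the outcome and maternal age within each family and regress one on the other, so that anything siblings share drops out. The sketch below runs this on simulated data with an assumed effect size; it is not the authors’ Swedish register data or code.

```python
# Hedged sketch of a sibling (mother) fixed-effects comparison on simulated data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

n_mothers, kids = 5000, 2
mother = np.repeat(np.arange(n_mothers), kids)
family_quality = np.repeat(rng.normal(0, 1, n_mothers), kids)  # unobserved, shared by siblings

# Mothers in better-off families tend to give birth later (creates confounding).
age = 22 + 3 * family_quality + rng.uniform(0, 18, n_mothers * kids)
# Assumed causal effect: +0.06 years of education per year of maternal age.
edu = 12 + 0.06 * age + 1.5 * family_quality + rng.normal(0, 1, n_mothers * kids)

df = pd.DataFrame({"mother": mother, "age": age, "edu": edu})

# Naive regression is biased by shared family background.
naive = np.polyfit(df["age"], df["edu"], 1)[0]

# Within-family demeaning removes everything siblings share (the fixed effect).
within = df[["age", "edu"]] - df.groupby("mother")[["age", "edu"]].transform("mean")
fixed_effects = np.polyfit(within["age"], within["edu"], 1)[0]

print(f"Naive slope (confounded):    {naive:.3f}")
print(f"Sibling fixed-effects slope: {fixed_effects:.3f}")
```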

“The benefits associated with being born in a later year outweigh the individual risk factors arising from being born to an older mother. We need to develop a different perspective on advanced maternal age. Expectant parents are typically well aware of the risks associated with late pregnancy, but they are less aware of the positive effects” said Myrskylä.

###

Original publication

Kieron Barclay and Mikko Myrskylä
Advanced Maternal Age and Offspring Outcomes: Causal Effects and Countervailing Period Trends
Population and Development Review
Article first published online: 8 APR 2016
DOI: 10.1111/j.1728-4457.2016.00105.x

Why do sunbathers live longer than those who avoid the sun?

“We found smokers in the highest sun exposure group were at a similar risk as non-smokers avoiding sun exposure”

Public Release: 22-Mar-2016

 

Wiley

New research looks into the paradox that women who sunbathe are likely to live longer than those who avoid the sun, even though sunbathers are at an increased risk of developing skin cancer.

An analysis of information on 29,518 Swedish women who were followed for 20 years revealed that longer life expectancy among women with active sun exposure habits was related to a decrease in heart disease and noncancer/non-heart disease deaths, causing the relative contribution of death due to cancer to increase.

Whether the positive effect of sun exposure demonstrated in this observational study is mediated by vitamin D, another mechanism related to UV radiation, or by unmeasured bias cannot be determined. Therefore, additional research is warranted.

“We found smokers in the highest sun exposure group were at a similar risk as non-smokers avoiding sun exposure, indicating avoidance of sun exposure to be a risk factor of the same magnitude as smoking,” said Dr. Pelle Lindqvist, lead author of the Journal of Internal Medicine study. “Guidelines being too restrictive regarding sun exposure may do more harm than good for health.”

###

About Wiley

Wiley is a global provider of knowledge and knowledge-enabled services that improve outcomes in areas of research, professional practice and education. Through the Research segment, the Company provides digital and print scientific, technical, medical, and scholarly journals, reference works, books, database services, and advertising. The Professional Development segment provides digital and print books, online assessment and training services, and test prep and certification. In Education, Wiley provides education solutions including online program management services for higher education institutions and course management tools for instructors and students, as well as print and digital content. The Company’s website can be accessed at http://www.wiley.com.

Basic science disappearing from medical journals, study finds

Public Release: 1-Feb-2016

 

Decline could affect physicians’ understanding of and interest in the basic mechanisms of disease and treatments

St. Michael’s Hospital

TORONTO, Feb. 1, 2016–A new study has found a steep decline in the number of scholarly papers about basic science published in leading medical journals in the last 20 years.

“This rapid decline in basic science publications is likely to affect physicians’ understanding of and interest in the basic mechanisms of disease and treatments,” warned Dr. Warren Lee, lead author of the study published in the February issue of the FASEB Journal, one of the world’s most-cited biology journals.

“If the decline continues, could basic science actually disappear from the pages of specialty medical journals?” asked Dr. Lee, a critical care physician at St. Michael’s Hospital and a scientist in its Keenan Research Centre for Biomedical Science.

Basic science is research that examines cells and molecules to better understand the causes and mechanisms of disease. It differs from clinical research, which includes clinical trials of drugs and epidemiological studies that review information from charts and health databases.

Dr. Lee and his team did a search on PubMed, the main database of medical research, to identify articles on basic science published from 1994 to 2013 in the highest-impact journals in cardiology, endocrinology, gastroenterology, infectious diseases, nephrology, neurology, oncology and pulmonology.

While there was no decline in two of the journals, Diabetes Care and the Journal of the American Society of Nephrology, the number of basic science articles in the remaining six journals fell by 40 to 60 percent.

In contrast, there was no decline in the number of basic science articles published in three well-known, non-clinical journals dealing with biological sciences, which Dr. Lee also surveyed – the Journal of Biological Chemistry, the Journal of Clinical Investigation and Cell.

Dr. Lee said the reasons for the decline in the coverage of basic science articles by medical journals are unclear, but it may be due in part to the fact that papers about clinical research are cited by other researchers more often. The number of times a paper is cited contributes to a journal’s “impact factor,” which indicates its relative importance.

He said the fading of basic science from medical journals also parallels the rise of other forms of research by clinicians, such as epidemiology and more recently medical education, quality of care, and ethics.

“The decline of basic science in these journals worries me, because if medical residents and clinicians are never exposed to basic science, they are going to think that it’s unimportant or irrelevant,” Dr. Lee said. “And it has become a bit of a vicious cycle. If residents think that basic science research is irrelevant, they won’t consider pursuing it as part of their training or their career. Ironically, scientific advances mean that we are on the threshold of what has been called “precision” or “personalized medicine”, where doctors will be able to understand exactly what is wrong with each patient and tailor the therapy accordingly. But all of that depends on understanding the underlying science behind the disease. Scientific discovery forms the underpinning of medical advances, and clinicians and medical students need to be part of that.”

###

About St. Michael’s Hospital

St. Michael’s Hospital provides compassionate care to all who enter its doors. The hospital also provides outstanding medical education to future health care professionals in more than 23 academic disciplines. Critical care and trauma, heart disease, neurosurgery, diabetes, cancer care, and care of the homeless are among the Hospital’s recognized areas of expertise. Through the Keenan Research Centre and the Li Ka Shing International Healthcare Education Center, which make up the Li Ka Shing Knowledge Institute, research and education at St. Michael’s Hospital are recognized and make an impact around the world. Founded in 1892, the hospital is fully affiliated with the University of Toronto.

Media contact:

For more information or to arrange an interview with Dr. Lee, please contact:
Rebecca Goss
Communications and Public Affairs
416-864-6060 Ext. 7178
gossr@smh.ca

Or

Kendra Stephenson
Communications Adviser – Media
Communications & Public Affairs
416-864-5047
stephensonk@smh.ca

Antidepressants double the risk of aggression and suicide in children

Public Release: 27-Jan-2016

 

BMJ

Children and adolescents have a doubled risk of aggression and suicide when taking one of the five most commonly prescribed antidepressants, according to findings of a study published in The BMJ today.

However, the true risk for all associated serious harms–such as deaths, aggression, akathisia and suicidal thoughts and attempts–remains unknown for children, adolescents and adults, say experts.

This is because of the poor design of clinical trials that assess these antidepressants, and the misreporting of findings in published articles.

Selective serotonin reuptake inhibitors (SSRIs) and serotonin-norepinephrine reuptake inhibitors (SNRIs) are the most commonly prescribed drugs for depression.

A team of researchers from Denmark carried out a systematic review and meta-analysis of 68 clinical study reports of 70 trials with 18,526 patients to examine use of antidepressants and associated serious harms.

These included deaths, suicidal thoughts and attempts as well as aggression and akathisia, a form of restlessness that may increase suicide and violence.

They examined double blind placebo controlled trials that contained patient narratives or individual patient listings of associated harms.

Harms associated with antidepressants are often not included in published trial reports, explain the authors. This is why they analysed clinical study reports, prepared by pharmaceutical companies for market authorisation, and summary trial reports, both of which usually include more information.

In adults, they found no significant associations between antidepressants and suicide and aggression. However, a novel finding showed there was a doubling of risk for aggression and suicides in children and adolescents.
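
In meta-analysis terms, a “doubling of risk” corresponds to a pooled odds (or risk) ratio of about 2 across trials. The sketch below pools hypothetical 2×2 tables with the Mantel-Haenszel odds ratio; the counts are invented for illustration and are not taken from the clinical study reports.

```python
# Sketch of pooling harm counts across trials with the Mantel-Haenszel odds ratio.
trials = [
    # (events_drug, n_drug, events_placebo, n_placebo) -- hypothetical counts
    (8, 200, 4, 200),
    (5, 150, 2, 145),
    (11, 320, 6, 310),
]

numerator = 0.0
denominator = 0.0
for a, n1, c, n0 in trials:
    b, d = n1 - a, n0 - c          # non-events in the drug and placebo arms
    n = n1 + n0
    numerator += a * d / n         # Mantel-Haenszel weighting
    denominator += b * c / n

or_mh = numerator / denominator
print(f"Pooled Mantel-Haenszel odds ratio: {or_mh:.2f}")
```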

This study has shown limitations in trials, not only in design, but also in reporting of clinical study reports, which may have led to “serious under-estimation of the harms,” write the authors.

They compared the results from the clinical study reports with data from individual patient listings or narratives of adverse effects. This revealed misclassification of deaths and suicidal events in people taking antidepressants.

For example, four deaths were misreported by a pharmaceutical company, in all cases favouring the antidepressant, and more than half of the suicide attempts and suicidal ideation were coded as “emotional lability” or “worsening of depression.”

In the Eli Lilly summary trial reports, almost all deaths were noted, but suicide attempts were missing in 90% of instances, and information on other outcomes was incomplete. These reports were “even more unreliable than we previously suspected,” write the authors.

Clinical study reports for antidepressants duloxetine, fluoxetine, paroxetine, sertraline and venlafaxine were obtained from regulatory agencies in the UK and Europe. Summary trial reports for duloxetine and fluoxetine were taken from the drug company Eli Lilly’s website.

However, clinical study reports could not be obtained for all trials and all antidepressants, and individual listings of adverse outcomes for all patients were available for only 32 trials.

“The true risk for serious harms is still unknown [because] the low incidence of these rare events, and the poor design and reporting of the trials, makes it difficult to get accurate effect estimates,” they explain.

They recommend “minimal use of antidepressants in children, adolescents, and young adults, as the serious harms seem to be greater, and as their effect seems to be below what is clinically relevant,” and suggest alternative treatments such as exercise or psychotherapy.

They also call for the need to identify “hidden information in clinical study reports to form a more accurate view of the benefits and harms of drugs.”

In an accompanying editorial, Joanna Moncrieff of University College London agrees that “regulators and the public need access to more comprehensive and reliable data”, and that clinical study reports “are likely to underestimate the extent of drug related harms.”

Over half the clinical study reports had no individual patient listings and “this begs the question of how many more adverse events would have been revealed if [these] were available for all trials, and raises concerns why this information is allowed to be withheld.”

Memory capacity of brain is 10 times more than previously thought

Public Release: 20-Jan-2016

Salk Institute

LA JOLLA–Salk researchers and collaborators have achieved critical insight into the size of neural connections, putting the memory capacity of the brain far higher than common estimates. The new work also answers a longstanding question as to how the brain is so energy efficient and could help engineers build computers that are incredibly powerful but also conserve energy.

“This is a real bombshell in the field of neuroscience,” says Terry Sejnowski, Salk professor and co-senior author of the paper, which was published in eLife. “We discovered the key to unlocking the design principle for how hippocampal neurons function with low energy but high computation power. Our new measurements of the brain’s memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web.”

Our memories and thoughts are the result of patterns of electrical and chemical activity in the brain. A key part of the activity happens when branches of neurons, much like electrical wire, interact at certain junctions, known as synapses. An output ‘wire’ (an axon) from one neuron connects to an input ‘wire’ (a dendrite) of a second neuron. Signals travel across the synapse as chemicals called neurotransmitters to tell the receiving neuron whether to convey an electrical signal to other neurons. Each neuron can have thousands of these synapses with thousands of other neurons.

“When we first reconstructed every dendrite, axon, glial process, and synapse from a volume of hippocampus the size of a single red blood cell, we were somewhat bewildered by the complexity and diversity amongst the synapses,” says Kristen Harris, co-senior author of the work and professor of neuroscience at the University of Texas, Austin. “While I had hoped to learn fundamental principles about how the brain is organized from these detailed reconstructions, I have been truly amazed at the precision obtained in the analyses of this report.”

Synapses are still a mystery, though their dysfunction can cause a range of neurological diseases. Larger synapses–with more surface area and vesicles of neurotransmitters–are stronger, making them more likely to activate their surrounding neurons than medium or small synapses.

The Salk team, while building a 3D reconstruction of rat hippocampus tissue (the memory center of the brain), noticed something unusual. In some cases, a single axon from one neuron formed two synapses reaching out to a single dendrite of a second neuron, signifying that the first neuron seemed to be sending a duplicate message to the receiving neuron.

At first, the researchers didn’t think much of this duplication, which occurs about 10 percent of the time in the hippocampus. But Tom Bartol, a Salk staff scientist, had an idea: if they could measure the difference between two very similar synapses such as these, they might glean insight into synaptic sizes, which so far had only been classified in the field as small, medium and large.

To do this, the researchers used advanced microscopy and computational algorithms they had developed to image rat brains and reconstruct the connectivity, shapes, volumes and surface area of the brain tissue down to the nanometer scale.

The scientists expected the synapses would be roughly similar in size, but were surprised to discover the synapses were nearly identical.

“We were amazed to find that the difference in the sizes of the pairs of synapses was very small, on average only about eight percent different in size. No one thought it would be such a small difference. This was a curveball from nature,” says Bartol.

Because the memory capacity of neurons is dependent upon synapse size, this eight percent difference turned out to be a key number the team could then plug into their algorithmic models of the brain to measure how much information could potentially be stored in synaptic connections.

It was known before that the range in sizes between the smallest and largest synapses was a factor of 60 and that most are small.

But armed with the knowledge that synapses of all sizes could vary in increments as little as eight percent between sizes within a factor of 60, the team determined there could be about 26 categories of sizes of synapses, rather than just a few.

“Our data suggests there are 10 times more discrete sizes of synapses than previously thought,” says Bartol. In computer terms, 26 sizes of synapses correspond to about 4.7 “bits” of information. Previously, it was thought that the brain was capable of just one to two bits for short and long memory storage in the hippocampus.
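
As a quick check on that arithmetic (a minimal sketch, assuming only that each of the 26 size states is equally usable), the information carried by a synapse that can occupy N distinguishable states is log2(N) bits:

```python
import math

# Information capacity of a synapse with N distinguishable size states,
# assuming the states can be used roughly interchangeably.
def bits_per_synapse(n_states: int) -> float:
    return math.log2(n_states)

print(f"26 states -> {bits_per_synapse(26):.1f} bits")  # ~4.7 bits, as reported
print(f" 4 states -> {bits_per_synapse(4):.1f} bits")   # ~2 bits, the older upper estimate
print(f" 2 states -> {bits_per_synapse(2):.1f} bits")   # ~1 bit, the older lower estimate
```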

“This is roughly an order of magnitude of precision more than anyone has ever imagined,” says Sejnowski.

What makes this precision puzzling is that hippocampal synapses are notoriously unreliable. When a signal travels from one neuron to another, it typically activates that second neuron only 10 to 20 percent of the time.

“We had often wondered how the remarkable precision of the brain can come out of such unreliable synapses,” says Bartol. One answer, it seems, is in the constant adjustment of synapses, averaging out their success and failure rates over time. The team used their new data and a statistical model to find out how many signals it would take a pair of synapses to get to that eight percent difference.

The researchers calculated that for the smallest synapses, about 1,500 signaling events (roughly 20 minutes’ worth) are needed to cause a change in their size and strength, while for the largest synapses only a couple of hundred signaling events (1 to 2 minutes) cause a change.

“This means that every 2 or 20 minutes, your synapses are going up or down to the next size. The synapses are adjusting themselves according to the signals they receive,” says Bartol.

“Our prior work had hinted at the possibility that spines and axons that synapse together would be similar in size, but the reality of the precision is truly remarkable and lays the foundation for whole new ways to think about brains and computers,” says Harris. “The work resulting from this collaboration has opened a new chapter in the search for learning and memory mechanisms.” Harris adds that the findings suggest more questions to explore, for example, if similar rules apply for synapses in other regions of the brain and how those rules differ during development and as synapses change during the initial stages of learning.

“The implications of what we found are far-reaching,” adds Sejnowski. “Hidden under the apparent chaos and messiness of the brain is an underlying precision to the size and shapes of synapses that was hidden from us.”

The findings also offer a valuable explanation for the brain’s surprising efficiency. The waking adult brain generates only about 20 watts of continuous power–as much as a very dim light bulb. The Salk discovery could help computer scientists build ultraprecise, but energy-efficient, computers, particularly ones that employ “deep learning” and artificial neural nets–techniques capable of sophisticated learning and analysis, such as speech, object recognition and translation.

“This trick of the brain absolutely points to a way to design better computers,” says Sejnowski. “Using probabilistic transmission turns out to be just as accurate and to require much less energy for both computers and brains.”

###

Other authors on the paper were Cailey Bromer of the Salk Institute; Justin Kinney of the McGovern Institute for Brain Research; and Michael A. Chirillo and Jennifer N. Bourne of the University of Texas, Austin.

The work was supported by the NIH and the Howard Hughes Medical Institute.

About the Salk Institute for Biological Studies:

The Salk Institute for Biological Studies is one of the world’s preeminent basic research institutions, where internationally renowned faculty probes fundamental life science questions in a unique, collaborative and creative environment. Focused both on discovery and on mentoring future generations of researchers, Salk scientists make groundbreaking contributions to our understanding of cancer, aging, Alzheimer’s, diabetes and infectious diseases by studying neuroscience, genetics, cell and plant biology and related disciplines.

Faculty achievements have been recognized with numerous honors, including Nobel Prizes and memberships in the National Academy of Sciences. Founded in 1960 by polio vaccine pioneer Jonas Salk, MD, the Institute is an independent nonprofit organization and architectural landmark.

Textbook view of how blood is made “doesn’t actually even exist”

Public Release: 5-Nov-2015

Stem-cell scientists redefine how blood is made

Toppling conventional ‘textbook’ view from 1960s

University Health Network

(TORONTO, Canada – Nov. 5, 2015) – Stem-cell scientists led by Dr. John Dick have discovered a completely new view of how human blood is made, upending conventional dogma from the 1960s.

The findings, published online today in the journal Science, prove “that the whole classic ‘textbook’ view we thought we knew doesn’t actually even exist,” says principal investigator John Dick, Senior Scientist at Princess Margaret Cancer Centre, University Health Network (UHN), and Professor in the Department of Molecular Genetics, University of Toronto.

“Instead, through a series of experiments we have been able to finally resolve how different kinds of blood cells form quickly from the stem cell – the most potent blood cell in the system – and not further downstream as has been traditionally thought,” says Dr. Dick, who holds a Canada Research Chair in Stem Cell Biology and is also Director of the Cancer Stem Cell Program at the Ontario Institute for Cancer Research. He talks about the research at https://youtu.be/D08FMKDppVQ.

The research also topples the textbook view that the blood development system is stable once formed. Not so, says Dr. Dick. “Our findings show that the blood system is two-tiered and changes between early human development and adulthood.”

Co-authors Dr. Faiyaz Notta and Dr. Sasan Zandi from the Dick lab write that in redefining the architecture of blood development, the research team mapped the lineage potential of nearly 3,000 single cells from 33 different cell populations of stem and progenitor cells obtained from human blood samples taken at various life stages and ages.

For people with blood disorders and diseases, the potential clinical utility of the findings is significant, unlocking a distinct route to personalizing therapy.

Dr. Dick says: “Our discovery means we will be able to understand far better a wide variety of human blood disorders and diseases – from anemia, where there are not enough blood cells, to leukemia, where there are too many blood cells. Think of it as moving from the old world of black-and-white television into the new world of high definition.”

There are also promising implications for advancing the global quest in regenerative medicine to manufacture mature cell types such as platelets or red blood cells by engineering cells (a process known as inducing pluripotent stem cells), says Dr. Dick, who collaborates closely with Dr. Gordon Keller, Director of UHN’s McEwen Centre for Regenerative Medicine.

“By combining the Keller team’s ability to optimize induced pluripotent stem cells with our newly identified progenitors that give rise only to platelets and red blood cells, we will be able to develop better methods to generate these mature cells,” he says. Currently, human donors are the sole source of platelets – which cannot be stored or frozen – for transfusions needed by many thousands of patients with cancer and other debilitating disorders.

Today’s discovery builds on Dr. Dick’s break-through research in 2011, also published in Science, when the team isolated a human blood stem cell in its purest form – as a single stem cell capable of regenerating the entire blood system.

“Four years ago, when we isolated the pure stem cell, we realized we had also uncovered populations of stem-cell like ‘daughter’ cells that we thought at the time were other types of stem cells,” says Dr. Dick.

“When we burrowed further to study these ‘daughters’, we discovered they were actually already mature blood lineages. In other words, lineages that had broken off almost immediately from the stem cell compartment and had not developed downstream through the slow, gradual ‘textbook’ process.

“So in human blood formation, everything begins with the stem cell, which is the executive decision-maker quickly driving the process that replenishes blood at a daily rate that exceeds 300 billion cells.”

For 25 years, Dr. Dick’s research has focused on understanding the cellular processes that underlie how normal blood stem cells work to regenerate human blood after transplantation and how blood development goes wrong when leukemia arises. His research follows on the original 1961 discovery of the blood stem cell by Princess Margaret Cancer Centre scientists Dr. James Till and the late Dr. Ernest McCulloch, which formed the basis of all current stem-cell research.

###

The research published today was funded by the Canadian Institutes of Health Research, the Canadian Cancer Society, the Terry Fox Foundation, Genome Canada through the Ontario Genomics Institute, the Ontario Institute for Cancer Research, a Canada Research Chair, the Ontario Ministry of Health and Long-Term Care, and The Princess Margaret Cancer Foundation.

About Princess Margaret Cancer Centre, University Health Network

The Princess Margaret Cancer Centre has achieved an international reputation as a global leader in the fight against cancer and delivering personalized cancer medicine. The Princess Margaret, one of the top five international cancer research centres, is a member of the University Health Network, which also includes Toronto General Hospital, Toronto Western Hospital and Toronto Rehabilitation Institute. All are research hospitals affiliated with the University of Toronto. For more information, go to http://www.theprincessmargaret.ca or http://www.uhn.ca .

Death rates, health problems rise among middle-aged white Americans

Public Release: 2-Nov-2015

Suicide, drug and alcohol poisoning, liver diseases are top causes of death

NIH/National Institute on Aging

Deaths among white U.S. men and women aged 45-54 rose significantly between 1999 and 2013, according to a new analysis. This change reversed decades of progress in mortality and was unique to non-Hispanic whites in the United States. In parallel, morbidity rates increased as well. The study found self-reported declines in health, mental health, and abilities to conduct activities of daily living, accompanied by increases in reports of chronic pain, inability to work, and deterioration of liver function among this group.

Anne Case, Ph.D., and Angus Deaton, Ph.D., of Princeton University, detail these findings in a study published online on November 2, 2015 in the Proceedings of the National Academy of Sciences. The analysis was funded by the National Institute on Aging (NIA), part of the National Institutes of Health.

The three causes of death that accounted for the change in mortality among non-Hispanic whites were suicide, drug and alcohol poisoning, and chronic liver diseases and cirrhosis. The researchers used data from the Centers for Disease Control and Prevention, the U.S. Census Bureau, individual death records, and other sources for their analysis.

From 1978 to 1998, the mortality rate for middle-aged white Americans fell by an average of two percent per year. This matched the average rate of decline in France, Germany, Canada, Australia, Sweden, and the United Kingdom, as well as the average across all European Union countries. Mortality in other rich countries continued to decline at about two percent per year after 1998, but the rate for middle-aged white Americans began to increase by half a percent a year, starting in 1999.
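
To make the divergence concrete, the sketch below compounds the two trends from an arbitrary index value of 100; the starting level is hypothetical and the exercise is illustrative arithmetic, not a reproduction of the study’s figures.

```python
# Illustrative arithmetic only: compounding the two mortality trends described above.
def project(index: float, annual_change: float, years: int) -> float:
    """Apply a constant annual percentage change to an indexed rate."""
    return index * (1 + annual_change) ** years

start = 100.0  # hypothetical index value
print(f"15 years at -2.0%/yr: {project(start, -0.02, 15):.0f}")   # ~74, steady improvement
print(f"15 years at +0.5%/yr: {project(start, +0.005, 15):.0f}")  # ~108, the US reversal
```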

The authors note that the increase in midlife mortality is only partly understood. Increased availability of opioid prescription drugs, chronic pain (for which opioids are often prescribed), and the economic crisis which began in 2008 may all have contributed to an increase in overdoses, suicide, and increased liver disease associated with alcohol abuse. In their discussion, the researchers also noted that the reversal in health trends indicates that today’s middle-aged adults will be entering their senior years and Medicare eligibility in worse health than today’s adults age 65 and older.

###

ARTICLE: “Rising Morbidity and Mortality in Midlife Among White non-Hispanic Americans in the 21st Century” by Anne Case and Angus Deaton. Proceedings of the National Academy of Sciences. Published online on November 2, 2015.

SPOKESPERSONS: John Phillips, Ph.D., chief, Population and Social Processes Branch, and Lis Nielsen, Ph.D., chief, Individual Behavioral Processes Branch, NIA Division of Behavioral and Social Research, are available to discuss the article.

CONTACT: To schedule interviews, contact Barbara Cire in the NIA Office of Communications and Public Liaison, (301) 496-1752, nianews3@mail.nih.gov.

About the National Institute on Aging: The NIA leads the federal government effort conducting and supporting research on aging and the health and well-being of older people. The Institute’s broad scientific program seeks to understand the nature of aging and to extend the healthy, active years of life. For more information on research, aging, and health, go to http://www.nia.nih.gov.

About the National Institutes of Health (NIH): NIH, the nation’s medical research agency, includes 27 Institutes and Centers and is a component of the U.S. Department of Health and Human Services. NIH is the primary federal agency conducting and supporting basic, clinical, and translational medical research, and is investigating the causes, treatments, and cures for both common and rare diseases. For more information about NIH and its programs, visit http://www.nih.gov.

Plague in humans ‘twice as old’ but didn’t begin as flea-borne, ancient DNA reveals

Public Release: 22-Oct-2015

University of Cambridge

New research using ancient DNA has revealed that plague has been endemic in human populations for more than twice as long as previously thought, and that the ancestral plague would have been predominantly spread by human-to-human contact — until genetic mutations allowed Yersinia pestis (Y. pestis), the bacteria that causes plague, to survive in the gut of fleas.

These mutations, which may have occurred near the turn of the 1st millennium BC, gave rise to the bubonic form of plague that spreads at terrifying speed through flea — and consequently rat — carriers. The bubonic plague caused the pandemics that decimated global populations, including the Black Death, which wiped out half the population of Europe in the 14th century.

Before its flea-borne evolution, however, researchers say that plague was in fact endemic in the human populations of Eurasia at least 3,000 years before the first plague pandemic in historical records (the Plague of Justinian in 541 AD).

They say the new evidence that Y. pestis bacterial infection in humans actually emerged around the beginning of the Bronze Age suggests that plague may have been responsible for major population declines believed to have occurred in the late 4th and early 3rd millennium BC.

The work was conducted by an international team including researchers from the universities of Copenhagen, Denmark, and Cambridge, UK, and the findings are published today in the journal Cell.

“We found that the Y. pestis lineage originated and was widespread much earlier than previously thought, and we narrowed the time window as to when and how it developed,” said senior author Professor Eske Willerslev, who recently joined Cambridge University’s Department of Zoology from the University of Copenhagen.

“The underlying mechanisms that facilitated the evolution of Y. pestis are present even today. Learning from the past may help us understand how future pathogens may arise and evolve,” he said.

Researchers analysed ancient genomes extracted from the teeth of 101 adults dating from the Bronze Age and found across the Eurasian landmass from Siberia to Poland.

They found Y. pestis bacteria in the DNA of seven of the adults, the oldest of whom died 5,783 years ago — the earliest evidence of plague. Previously, direct molecular evidence for Y. pestis had not been obtained from skeletal material older than 1,500 years.

However, six of the seven plague samples were missing two key genetic components found in most modern strains of plague: a “virulence gene” called ymt, and a mutation in an “activator gene” called pla.

The ymt gene protects the bacteria from being destroyed by the toxins in flea guts, so that it multiplies, choking the flea’s digestive tract. This causes the starving flea to frantically bite anything it can, and, in doing so, spread the plague.

The mutation in the pla gene allows Y. pestis bacteria to spread across different tissues, turning the localised lung infection of pneumonic plague into one of the blood and lymph nodes.

Researchers concluded these early strains of plague could not have been carried by fleas without ymt. Nor could they cause bubonic plague — which affects the lymphatic immune system, and inflicts the infamous swollen buboes of the Black Death — without the pla mutation.

Consequently, the plague that stalked populations for much of the Bronze Age must have been pneumonic, which directly affects the respiratory system and causes desperate, hacking coughing fits just before death. Breathing around infected people leads to inhalation of the bacteria, the crux of its human-to-human transmission.

Study co-author Dr Marta Mirazón-Lahr, from Cambridge’s Leverhulme Centre for Human Evolutionary Studies (LCHES), points out that a study earlier this year from Willerslev’s Copenhagen group showed the Bronze Age to be a highly active migratory period, which could have led to the spread of pneumonic plague.

“The Bronze Age was a period of major metal weapon production, and it is thought to have seen increased warfare, which is compatible with emerging evidence of large population movements at the time. If pneumonic plague was carried as part of these migrations, it would have had devastating effects on the small groups they encountered,” she said.

“Well-documented cases have shown the pneumonic plague’s chain of infection can go from a single hunter or herder to ravaging an entire community in two to three days.”

The most recent of the seven ancient genomes to reveal Y. pestis in the new study has both of the key genetic mutations, indicating an approximate timeline for the evolution that spawned flea-borne bubonic plague.

“Among our samples, the mutated plague strain is first observed in Armenia in 951 BC, yet is absent in the next most recent sample from 1686 BC — suggesting bubonic strains evolve and become fixed in the late 2nd and very early 1st millennium BC,” said Mirazón-Lahr.

“However, the 1686 BC sample is from the Altai mountains near Mongolia. Given the distance between Armenia and Altai, it’s also possible that the Armenian strain of bubonic plague has a longer history in the Middle East, and that historical movements during the 1st millennium BC exported it elsewhere.”

The Books of Samuel in the Bible describe an outbreak of plague among the Philistines in 1320 BC, complete with swellings in the groin, which the World Health Organization has argued fits the description of bubonic plague. Mirazón-Lahr suggests this may support the idea of a Middle Eastern origin for the plague’s highly lethal genetic evolution.

Co-author Professor Robert Foley, also from Cambridge’s LCHES, suggests that the lethality of bubonic plague may have required the right population demography before it could thrive.

“Every pathogen has a balance to maintain. If it kills a host before it can spread, it too reaches a ‘dead end’. Highly lethal diseases require certain demographic intensity to sustain them.

“The endemic nature of pneumonic plague was perhaps more adapted for an earlier Bronze Age population. Then, as Eurasian societies grew in complexity and trading routes continued to open up, maybe the conditions started to favour the more lethal form of plague,” Foley said.

“The Bronze Age is the edge of history, and ancient DNA is making what happened at this critical time more visible,” he said.

Willerslev added: “These results show that the ancient DNA has the potential not only to map our history and prehistory, but also discover how disease may have shaped it.”

Evidence that Earth’s first mass extinction was caused by critters not catastrophe

Public Release: 2-Sep-2015

A powerful analogy for what is happening today

Vanderbilt University

NASHVILLE, Tenn. – In the popular mind, mass extinctions are associated with catastrophic events, like giant meteorite impacts and volcanic super-eruptions.

But the world’s first known mass extinction, which took place about 540 million years ago, now appears to have had a more subtle cause: evolution itself.

“People have been slow to recognize that biological organisms can also drive mass extinction,” said Simon Darroch, assistant professor of earth and environmental sciences at Vanderbilt University. “But our comparative study of several communities of Ediacarans, the world’s first multicellular organisms, strongly supports the hypothesis that it was the appearance of complex animals capable of altering their environments, which we define as ‘ecosystem engineers,’ that resulted in the Ediacaran’s disappearance.”

The study is described in the paper “Biotic replacement and mass extinction of the Ediacara biota” published Sept. 2 in the journal Proceedings of the Royal Society B.

“There is a powerful analogy between the Earth’s first mass extinction and what is happening today,” Darroch observed. “The end-Ediacaran extinction shows that the evolution of new behaviors can fundamentally change the entire planet, and we are the most powerful ‘ecosystem engineers’ ever known.”

The earliest life on Earth consisted of microbes – various types of single-celled microorganisms. They ruled the Earth for more than 3 billion years. Then some of these microorganisms discovered how to capture the energy in sunlight. The photosynthetic process that they developed had a toxic byproduct: oxygen. Oxygen was poisonous to most microbes that had evolved in an oxygen-free environment, making it the world’s first pollutant.

But for the microorganisms that developed methods for protecting themselves, oxygen served as a powerful new energy source. Among a number of other things, it gave them the added energy they needed to adopt multicellular forms. Thus, the Ediacarans arose about 600 million years ago during a warm period following a long interval of extensive glaciation.

“We don’t know very much about the Ediacarans because they did not produce shells or skeletons. As a result, almost all we know about them comes from imprints of their shapes preserved in sand or ash,” said Darroch.

What scientists do know is that, in their heyday, Ediacarans spread throughout the planet. They were a largely immobile form of marine life shaped like discs and tubes, fronds and quilted mattresses. The majority were extremely passive, remaining attached in one spot for their entire lives. Many fed by absorbing chemicals from the water through their outer membranes, rather than actively gathering nutrients.

Paleontologists have coined the term “Garden of Ediacara” to convey the peace and tranquility that must have prevailed during this period. But there was a lot of churning going on beneath that apparently serene surface.

After 60 million years, evolution gave birth to another major innovation: animals. All animals share two characteristics: they can move spontaneously and independently, at least at some point in their lives, and they sustain themselves by eating other organisms or what those organisms produce. Animals burst onto the scene in a frenzy of diversification that paleontologists have labeled the Cambrian explosion, a 25-million-year period when most of the modern animal families – vertebrates, molluscs, arthropods, annelids, sponges and jellyfish – came into being.

“These new species were ‘ecological engineers’ who changed the environment in ways that made it more and more difficult for the Ediacarans to survive,” said Darroch.

He and his colleagues performed an extensive paleoecological and geochemical analysis of the youngest known Ediacaran community exposed in hillside strata in southern Namibia. The site, called Farm Swartpunt, is dated at 545 million years ago, in the waning one to two million years of the Ediacaran reign.

“We found that the diversity of species at this site was much lower, and there was evidence of greater ecological stress, than at comparable sites that are 10 million to 15 million years older,” Darroch reported. Rocks of this age also preserve an increasing diversity of burrows and tracks made by the earliest complex animals, presenting a plausible link between their evolution and extinction of the Ediacarans.

The older sites were Mistaken Point in Newfoundland, dating from 579 to 565 million years ago; Nilpena in South Australia, dating from 555 to 550 million years ago; and the White Sea in Russia, also dating from 555 to 550 million years ago.

Darroch and his colleagues made extensive efforts to ensure that the differences they recorded were not due to some external factor.

For example, they ruled out the possibility that the Swartpunt site might have been lacking in some vital nutrients by closely comparing the geochemistry of the sites.

It is a basic maxim in paleontology that the more effort is put into investigating a given site, the greater the diversity of fossils that will be found there. So the researchers used statistical methods to compensate for differences in the amount of effort that had been spent studying the different sites.

Having ruled out any extraneous factors, Darroch and his collaborators concluded that “this study provides the first quantitative palaeoecological evidence to suggest that evolutionary innovation, ecosystem engineering and biological interactions may have ultimately caused the first mass extinction of complex life.”

###

Marc Laflamme, Thomas Boag and Sara Mason from the University of Toronto; Douglas Erwin and Sarah Tweedt from the Smithsonian Institution; Erik Sperling from Stanford University; Alex Morgan and Donald Johnston from Harvard University; Rachel Racicot from Yale University; and Paul Myrow from Colorado College collaborated in the study.

The project was supported by grants from the Connaught Foundation, the Natural Sciences and Engineering Research Council of Canada, the NASA Astrobiology Institute, the National Geographic Society and National Science Foundation grant EAR 1324095.

Visit Research News @ Vanderbilt for more research news from Vanderbilt.

New England Journal of Medicine says Conflicts of Interest are a Good Thing?

Public Release: 2-Jun-2015

Are commercial conflicts of interests justifiable in medical journals?

Experts criticize a leading journal’s backtrack regarding policies on conflict of interest

BMJ

A group of former senior editors, writing in The BMJ today, criticise a “seriously flawed and inflammatory attack” by The New England Journal of Medicine (NEJM) on what that journal believes have become overly stringent policies on conflicts of interest.

The NEJM was the first major medical journal to introduce conflict of interest policies, in 1984. It required all authors to disclose any financial ties to health industries and made conflicts of interest more transparent.

But recently the NEJM published a series of commentaries and an editorial that attempt to justify and rationalise financial conflicts of interest in medicine, and assert that such policies have had negative consequences.

The articles, written by national correspondent Lisa Rosenbaum and supported by editor in chief Jeffrey Drazen, “reinterpret and downplay the importance of conflicts of interest in medicine” and do not provide evidence to back their claims, argue the former senior editors of the NEJM.

They explain that the key concerns for medical journals are not about doctors and researchers being bought by drug companies, or being motivated by a desire for financial gain.

Rather the essential issue is that the objectivity of authors with financial conflicts of interest “might be compromised, either consciously or unconsciously and there would be no easy way to know whether it had been.”

They explain that judges and journalists, for example, are expected to stay away from cases or stories in which they have a financial conflict of interest.

“Yet Rosenbaum and Drazen seem to think it is insulting to physicians and medical researchers to suggest that their judgment can be affected in the same way,” they add.

The authors acknowledge that doctors and researchers sometimes have financial ties with industry for research and consulting specifically related to research, but argue that physicians who develop products and hold patents or receive royalties should not evaluate the products they develop.

Financial conflicts of interest have eroded the credibility of the medical profession, and doctors and the public expect journals to be trustworthy, they explain.

In addition, they commend The BMJ’s introduction of a “zero tolerance” policy on educational articles by authors with any industry ties.

In an accompanying editorial, a group of senior editors at The BMJ also respond to NEJM’s articles, saying that they are “deeply troubled by a possible retreat from policies that prevent experts with relevant commercial ties from authoring commentary or review articles.”

Such policies were not motivated by a few events, as Drazen suggests, but by recognition of extensive, systemic problems, they add, and argue for a separation between doctors working with industry to develop treatments and those who can assess medical evidence without any conflict of interest.

While they agree that experts with industry funding may be able to express independent views, journal readers and editors do not have a reliable way to see which thoughts might be influenced. “Bias is not always overt or easily detected,” they explain.

They conclude: “It is a mistake by NEJM to suggest that rigorous standards should be revisited. To do so would undermine the trustworthiness of medical journals and be a disservice to clinical practice and patient safety.”

Viruses: You’ve heard the bad — here’s the good

Public Release: 30-Apr-2015

American Society for Microbiology

“The word, virus, connotes morbidity and mortality, but that bad reputation is not universally deserved,” said Marilyn Roossinck, PhD, Professor of Plant Pathology and Environmental Microbiology and Biology at the Pennsylvania State University, University Park. “Viruses, like bacteria, can be important beneficial microbes in human health and in agriculture,” she said. Her review of the current literature on beneficial viruses appeared ahead of print April 24 in the Journal of Virology, which is published by the American Society for Microbiology.

In sharp contrast to the gastrointestinal distress it causes in humans, the murine (mouse infecting) norovirus plays a role in development of the mouse intestine and its immune system, and can actually replace the beneficial effects of certain gut bacteria when these have been decimated by antibiotics. Normal, healthy gut bacteria help prevent infection by bacteria that cause gastrointestinal illness, but excessive antibiotic intake can kill the normal gut flora, and make one vulnerable to gastrointestinal disease. However, norovirus infection of mice actually restored the normal function of the immune system’s lymphocytes and the normal morphology of the intestine, said Roossinck.

Study links quitting smoking with deterioration in diabetes control

Public Release: 29-Apr-2015

Coventry University

Sufferers of type 2 diabetes mellitus (T2DM) who quit smoking are likely to see a temporary deterioration in their glycaemic control which could last up to three years, according to new research published today in The Lancet Diabetes & Endocrinology.

The research team, led by Dr Deborah Lycett of Coventry University and funded by the National Institute for Health Research’s School for Primary Care Research, examined the primary care records of 10,692 adult smokers with T2DM over six years to investigate whether or not quitting was associated with altered diabetes control.

The study found that in the 3,131 (29%) people who quit and remained abstinent for at least one year, HbA1c – which is an average measurement indicating how well the body is controlling blood glucose levels – increased by 2.3mmol/mol (0.21%) before decreasing gradually as abstinence continued.
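
As a side note on the units, HbA1c is reported either in IFCC units (mmol/mol) or as an NGSP/DCCT percentage; assuming the standard master equation NGSP(%) = 0.0915 × IFCC + 2.15, a change of 2.3 mmol/mol works out to roughly the 0.21 percentage points quoted above:

```python
# Conversion of an HbA1c *change* from IFCC (mmol/mol) to NGSP (percentage points),
# assuming the standard master equation NGSP(%) = 0.0915 * IFCC(mmol/mol) + 2.15.
def ifcc_delta_to_ngsp_delta(delta_mmol_per_mol: float) -> float:
    return 0.0915 * delta_mmol_per_mol  # the additive constant cancels for a difference

print(f"{ifcc_delta_to_ngsp_delta(2.3):.2f} percentage points")  # ~0.21, as reported
```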

Humans with genetically long telomeres have an increased risk of dying from cancer – which is the exact opposite of what the researchers expected

Public Release: 29-Apr-2015

Danish discovery may change cancer treatment

University of Copenhagen – The Faculty of Health and Medical Sciences

Danish researchers from the University of Copenhagen and Herlev Hospital have made a discovery that may change the principles for treating certain types of cancer.

The discovery relates to the so-called telomeres that constitute the ends of human chromosomes. Short telomeres are related to unhealthy lifestyles, old age and the male gender – all of which are risk factors in terms of high mortality. Up until now, the assumption has been that short telomeres are related to ill health. The challenge for researchers worldwide has therefore been to find out whether or not the short telomeres were indeed a signifier or an indirect cause of increased mortality.

Link between serotonin and depression is a myth, says top psychiatrist

Public Release: 21-Apr-2015

BMJ

The widely held belief that depression is due to low levels of serotonin in the brain – and that effective treatments raise these levels – is a myth, argues a leading psychiatrist in The BMJ this week.

David Healy, Professor of Psychiatry at the Hergest psychiatric unit in North Wales, points to a misconception that lowered serotonin levels in depression are an established fact, which he describes as “the marketing of a myth.”

The serotonin reuptake inhibiting (SSRI) group of drugs came on stream in the late 1980s, nearly two decades after first being mooted, writes Healy. The delay centred on finding an indication.

After concerns emerged about tranquilliser dependence in the early 1980s, drug companies marketed SSRIs for depression, “even though they were weaker than older tricyclic antidepressants, and sold the idea that depression was the deeper illness behind the superficial manifestations of anxiety,” he explains. The approach was an astonishing success, “central to which was the notion that SSRIs restored serotonin levels to normal, a notion that later transmuted into the idea that they remedied a chemical imbalance.”

Mothers’ hepatitis B infection may give infants a survival advantage against bacterial infection early in life

Public Release: 25-Mar-2015

HBV exposure matures infants’ immune systems

Duke-NUS Graduate Medical School Singapore

Image: Hepatitis B virus surface antigen in cord blood cells from HBV-positive mothers. (Credit: Michelle Hong / Duke-NUS Graduate Medical School Singapore)

A Singapore-led study has shown that exposure to the hepatitis B virus (HBV) increases immune system maturation in infants, which may give them a survival advantage in countering bacterial infection during early life. These findings radically change the way that vertical (mother-to-child) HBV infection of neonates is portrayed, and suggest a paradigm shift in the approach to treating patients with chronic hepatitis B.

The research, published in Nature Communications on 25 March 2015, was led by Professor Antonio Bertoletti from the Emerging Infectious Diseases Program (EID) at Duke-NUS Graduate Medical School (Duke-NUS).

Currently widespread in Asia, HBV affects approximately 300 million people worldwide while 6 in 100 Singaporeans are chronic carriers. The majority of HBV chronic infections in Asia are acquired at birth. While there is a safe and effective vaccine available, 5% to 10% of babies born to HBV positive mothers still contract the infection. Conventionally, HBV is thought to exploit the immaturity of the neonatal immune system to establish persistent infection.

Current guidelines from international liver associations recommend treatment for HBV carriers only when they show clear signs of active liver disease, typically after the age of 30. This is based on the assumptions that HBV is considered harmless until symptoms of the disease emerge, and that young patients are immune-tolerant to HBV, meaning they have no protective response to the virus and are unable to react to treatment.

Prof Bertoletti, in collaboration with the National University Hospital (Singapore) and Universitaria di Parma (Italy), showed that, contrary to current belief, infants exposed to HBV are not immune-tolerant but have more mature immune systems. The team examined the immune cells in the cord blood of mothers who were HBV positive and discovered that both the innate and adaptive immune cells are more activated and mature, and that they respond better to bacterial challenge, a phenomenon called “trained immunity”. This suggests that their immune cells may be more acclimatised to dealing with potential bacterial infections than the cells from cord blood of healthy mothers.

First author, Duke-NUS Research Fellow Michelle Hong, is heartened about contributing to the understanding of a disease that is endemic in Asia. “Our work contributes to the understanding of how HBV exposure before birth shapes the global immune response of newborn infants and transforms the way we look at HBV. Despite causing diseases later in life, HBV might actually be beneficial to humans early in life.”

Previously, Prof Bertoletti had shown that young adults (aged 14 to 30) with chronic HBV infection are not immune tolerant and possess immune cells able to counter the virus. Moving forward, he plans to examine the impact of HBV infection in paediatric patients (those aged two to 12) to determine how their immune systems respond to the virus. The combined findings from these different studies are poised to shape the guidelines for treating chronic HBV in patients, starting with young adults or even earlier.


Journal Reference:

  1. Michelle Hong, Elena Sandalova, Diana Low, Adam J. Gehring, Stefania Fieni, Barbara Amadei, Simonetta Urbani, Yap-Seng Chong, Ernesto Guccione & Antonio Bertoletti. Trained immunity in newborn infants of HBV-infected mothers. Nature Communications, March 2015. DOI: 10.1038/ncomms7588

Education may not improve our life chances of happiness

Public Release: 25-Mar-2015

University of Warwick

Getting a good education may not improve your life chances of happiness, according to new mental health research from the University of Warwick.

In a new study published in the British Journal of Psychiatry, researchers from Warwick Medical School examined socioeconomic factors related to high mental wellbeing, such as level of education and personal finances.

Low educational attainment is strongly associated with mental illness but the research team wanted to find out if higher educational attainment is linked with mental wellbeing.

The team found all levels of educational attainment had similar odds of high mental well-being.

High mental wellbeing was defined as ‘feeling good and functioning well’. People with high levels of mental wellbeing feel happy and contented with their lives more often than those who don’t, largely because of the way they manage problems and challenges, especially in their relationships with others.

Medical expansion has made us feel sicker not healthier

Public Release: 19-Mar-2015

Medical expansion has led people worldwide to feel less healthy

In 28 countries, more medicine has unexpected effects

Ohio State University

COLUMBUS, Ohio – Across much of the Western world, 25 years of expansion of the medical system has actually led to people feeling less healthy over time, a new study has found.

A researcher at The Ohio State University used several large multinational datasets to examine changes in how people rated their health between 1981 and 2007 and compared that to medical expansion in 28 countries that are members of the Organization for Economic Co-operation and Development.

During that time, the medical industry expanded dramatically in many of those countries, which you might expect would have led to people feeling healthier.

But that’s not what Hui Zheng, assistant professor of sociology at Ohio State, found.

Wealth and power may have played a stronger role than ‘survival of the fittest’

Public Release: 16-Mar-2015

Number of reproducing males declined during global growth

Arizona State University

Image caption: Four to eight thousand years ago, there was an extreme reduction in the number of males who reproduced, but not in the number of females; wealth and power may have played a stronger role in shaping recent human evolution than ‘survival of the fittest’. (Credit: Sabine Deviche)

Tempe, Ariz. — The DNA you inherit from your parents contributes to the physical make-up of your body — whether you have blue eyes or brown, black hair or red, or are male or female. Your DNA can also influence whether you might develop certain diseases or disorders such as Crohn’s Disease, cystic fibrosis, hemophilia or neurofibromatosis, to name a few.

In a study led by scientists from Arizona State University, the University of Cambridge, University of Tartu and Estonian Biocentre, and published March 13 in an online issue of the journal Genome Research, researchers discovered a dramatic decline in genetic diversity in male lineages four to eight thousand years ago — likely the result of the accumulation of material wealth, while in contrast, female genetic diversity was on the rise. This male-specific decline occurred during the mid- to late-Neolithic period.
