Seven in eight children’s tonsillectomies are unnecessary, study reveals

Public Release: 5-Nov-2018

 

University of Birmingham

A new study by the University of Birmingham has found that seven in every eight children who have their tonsils removed are unlikely to benefit from the operation.

Researchers analysed the electronic medical records of over 1.6 million children from more than 700 UK general practices dating between 2005 and 2016. They found that out of 18,271 children who had their tonsils removed during this time, only 2,144 (11.7 per cent) had enough sore throats to justify surgery.

The researchers at the University’s Institute of Applied Health Research concluded that their evidence, published today (Nov 6th) in the British Journal of General Practice, showed that annually 32,500 children undergo needless tonsillectomies at a cost to the NHS of £36.9 million.

What’s more, they found that many children who might benefit from having their tonsils removed are not having the surgical procedure. They found that of 15,764 children who had records showing sufficient sore throats to undergo a tonsillectomy, just 2,144 (13.6 per cent) actually went on to have one.

Current UK health policy, based on the best scientific evidence, is that to meet the criteria to benefit from a tonsillectomy children must suffer from either more than seven documented sore throats in a year; more than five sore throats per year for two successive years; or three sore throats per year for three successive years.
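The policy thresholds above can be read as a simple eligibility rule. A minimal sketch follows; the record format is hypothetical (not from the study), and the three-year rule is read here as at least three episodes per year:

```python
def meets_criteria(yearly_counts):
    """Check the documented sore-throat thresholds described in UK policy.

    yearly_counts: documented sore-throat episodes for successive years,
    most recent year first (a hypothetical record format for illustration).
    """
    # More than seven documented sore throats in a single year.
    if yearly_counts and yearly_counts[0] > 7:
        return True
    # More than five per year for two successive years.
    if len(yearly_counts) >= 2 and all(n > 5 for n in yearly_counts[:2]):
        return True
    # Three (or more) per year for three successive years.
    if len(yearly_counts) >= 3 and all(n >= 3 for n in yearly_counts[:3]):
        return True
    return False
```

For example, a child with six documented sore throats in each of the last two years would meet the second criterion, while a child with four in a single year would meet none.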

The researchers found that, of those who had undergone a tonsillectomy, 12.4 per cent had reported five to six sore throats in a year; 44.7 per cent had suffered two to four sore throats in a year; and 9.9 per cent had just one sore throat in a year.

Tom Marshall, Professor of Public Health and Primary Care at the University of Birmingham, said: “Research shows that children with frequent sore throats usually suffer fewer sore throats over the next year or two. In those children with enough documented sore throats, the improvement is slightly quicker after tonsillectomy, which means surgery is justified.

“But research suggests children with fewer sore throats don’t benefit enough to justify surgery, because the sore throats tend to go away anyway.

“Our research showed that most children who had their tonsils removed weren’t severely enough affected to justify treatment, while on the other hand, most children who were severely enough affected with frequent sore throats did not have their tonsils removed. The pattern changed little over the 12 year period.

“Children may be more harmed than helped by a tonsillectomy. We found that even among severely affected children only a tiny minority ever have their tonsils out. It makes you wonder if tonsillectomy is ever really essential in any child.”

###

For more information please contact Emma McKinney, Communications Manager (Health Sciences), University of Birmingham, Email: e.j.mckinney@bham.ac.uk or tel: +44 (0) 121 414 6681, or contact the press office on +44 (0) 7789 921 165 or pressoffice@contacts.bham.ac.uk

Notes to Editors

  • The University of Birmingham is ranked amongst the world’s top 100 institutions. Its work brings people from across the world to Birmingham, including researchers, teachers and more than 5,000 international students from over 150 countries.
  • Šumilo et al (2018). ‘Incidence of indications for tonsillectomy and frequency of evidence-based surgery: a 12-year retrospective cohort study of primary care electronic records’. British Journal of General Practice. DOI: 10.3399/bjgp18X699833
  • Once the embargo lifts, the paper will be available at https://doi.org/10.3399/bjgp18X699833

Economic impact of excess weight now exceeds $1.7 trillion

Public Release: 30-Oct-2018

Costs include $1.24 trillion in lost productivity, according to Milken Institute study documenting role of obesity and overweight in chronic diseases

Milken Institute

LOS ANGELES, Tuesday, October 30, 2018–The impact of obesity and overweight on the U.S. economy has eclipsed $1.7 trillion, an amount equivalent to 9.3 percent of the nation’s gross domestic product, according to a new Milken Institute report on the role excess weight plays in the prevalence and cost of chronic diseases.

The estimate includes $480.7 billion in direct health-care costs and $1.24 trillion in lost productivity, as documented in America’s Obesity Crisis: The Health and Economic Impact of Excess Weight. The study draws on research that shows how overweight and obesity elevate the risk of diseases such as breast cancer, heart disease, and osteoarthritis, and estimates the cost of medical treatment and lost productivity for each disease.

For example, the treatment cost for all type 2 diabetes cases – one of the most prevalent chronic diseases connected to excess weight – was $121 billion and indirect costs were $215 billion. On an individual basis, that comes to $7,109 in treatment costs per patient and $12,633 in productivity costs.
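As a back-of-envelope consistency check (a sketch using only the per-patient figures and the indirect-cost total quoted here), the implied number of patients and direct-cost total can be recovered:

```python
indirect_total = 215e9         # indirect (productivity) costs for type 2 diabetes
indirect_per_patient = 12_633  # productivity cost per patient
direct_per_patient = 7_109     # treatment cost per patient

# Implied patient count, and the direct-cost total those figures imply.
patients = indirect_total / indirect_per_patient
direct_total = patients * direct_per_patient
print(f"{patients / 1e6:.1f} million patients, "
      f"${direct_total / 1e9:.0f} billion in direct treatment costs")
# → 17.0 million patients, $121 billion in direct treatment costs
```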

America’s Obesity Crisis assesses the role excess weight plays in the prevalence of 23 chronic diseases and the economic consequences that result. To mention a few, obesity and overweight are linked to:

  • 75 percent of osteoarthritis cases
  • 64 percent of Type 2 diabetes cases
  • 73 percent of kidney disease cases

The findings suggest that more effective weight-control strategies could reduce both the health and economic burdens of chronic diseases, according to co-author Hugh Waters, director of health economics research at the Milken Institute.

“Despite the billions of dollars spent each year on public health programs and consumer weight-loss products, the situation isn’t improving,” Waters said. “A new approach is needed.”

The impact of obesity on chronic disease is not limited to the stress that added weight places on joints and the cardiovascular system. For example, research indicates that hormones secreted by fat cells may trigger inflammation and increase insulin resistance. These reactions can, in turn, contribute to greater risk of type 2 diabetes, cardiovascular disease, and some cancers.

Nearly 40 percent of Americans were obese and 33 percent were overweight but not obese in 2016, according to the Centers for Disease Control and Prevention. The numbers have climbed steadily since 1962, when 13 percent of the population were obese and 32 percent were overweight.

Direct medical costs include payments made by individuals, families, employers, and insurance companies to treat the diseases in question. Indirect costs include the economic impact of work absences, lost wages, and reduced productivity of patients and caregivers.

The estimates in America’s Obesity Crisis are based on an analysis of data compiled by the Centers for Disease Control and Prevention, the National Center for Health Statistics, the U.S. Agency for Healthcare Research and Quality, and the Bureau of Labor Statistics. The report relies on the World Health Organization’s definition of overweight as a body mass index of 25 to 29.9 and obesity as a BMI of 30 or higher.
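The WHO cut-offs the report relies on are a simple function of weight and height (BMI is weight in kilograms divided by height in metres squared); a minimal sketch:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def who_category(bmi_value: float) -> str:
    """WHO adult categories as used in the report: 25-29.9 overweight, 30+ obese."""
    if bmi_value >= 30:
        return "obese"
    if bmi_value >= 25:
        return "overweight"
    return "not overweight"
```

An adult of 1.75 m weighing 85 kg has a BMI of about 27.8 and would be counted as overweight under these definitions.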

Fighting mutant influenza

Public Release: 24-Oct-2018

American Chemical Society

Another flu season is here, which means another chance for viruses to mutate. Already, most influenza A viruses contain a mutation that confers resistance against one class of antiviral medications, and the bugs are steadily gaining resistance against another class. Scientists report in ACS Medicinal Chemistry Letters a series of experiments designed to develop new medications that could potentially fight off the resistant and sensitive types of influenza A.

For most people, the flu is a nuisance, causing aches and pains, as well as coughs and runny noses, for a few weeks. But for the elderly and young children, the illness can be deadly. And in the most recent flu season, even some seemingly healthy adults died after being infected.

Because the virus can mutate, it has built up resistance against some drugs that had been used to help fight the infection. Influenza A is responsible for most cases of the flu, and about 95 percent of influenza A viruses have a mutation called S31N in the channel protein AM2. U.S. Food and Drug Administration (FDA)-approved antivirals called adamantanes target that protein, but are no longer recommended because the S31N mutation renders these drugs useless. The other FDA-approved class of antiviral medications includes oseltamivir. Although these drugs are still effective, resistance is growing with the rise of a mutation in a different viral protein. That’s why Jun Wang and colleagues sought to develop a new medication that would work in both resistant strains and those that still respond to oseltamivir.

The key, Wang’s team realized, was to target the AM2 protein with the S31N mutation, since it is found in almost all influenza A viruses. Using a step-wise process, the researchers identified one sulfur-containing inhibitor of AM2 S31N that was stable under conditions that mimic human metabolism. They then spun that discovery into a series of similar sulfur-bearing molecules, eventually identifying two compounds with even better antiviral properties than oseltamivir, fighting off drug-resistant and drug-sensitive strains. In addition, the researchers note that the two compounds have optimal in vitro pharmacokinetic properties, making them well-suited for the next step of in vivo studies in mice.

###

The authors acknowledge startup funding from the University of Arizona and funding from the National Institutes of Health.


The American Chemical Society, the world’s largest scientific society, is a not-for-profit organization chartered by the U.S. Congress. ACS is a global leader in providing access to chemistry-related information and research through its multiple databases, peer-reviewed journals and scientific conferences. ACS does not conduct research, but publishes and publicizes peer-reviewed scientific studies. Its main offices are in Washington, D.C., and Columbus, Ohio.

To automatically receive news releases from the American Chemical Society, contact newsroom@acs.org.


Scientists accidentally reprogram mature mouse GABA neurons into dopaminergic-like neurons

Public Release: 11-Oct-2018

Cell Press

IMAGE: This image shows Dr. Chun-Li Zhang and Lei-Lei Wang. Credit: David Gresham / UT Southwestern

Attempting to make dopamine-producing neurons out of glial cells in mouse brains, a group of researchers instead converted mature inhibitory neurons into dopaminergic cells. Their findings, appearing October 11 in the journal Stem Cell Reports, reveal that–contrary to previous belief–it is possible to reprogram one mature neuron type into another without first reverting it to a stem-cell-like state.

“Initially, I was a little disappointed that we converted medium spiny neurons instead of glia,” says senior author Chun-Li Zhang, a professor of molecular biology at UT Southwestern Medical Center (@UTSWNews). “But when we realized the novelty of our results, we were kind of amazed. To our knowledge, changing the phenotype of resident, already-mature neurons has never been accomplished before.”

Dopaminergic cells are important for controlling voluntary movement and emotions such as motivation and reward that drive behavior. They are often lost in movement disorders like Parkinson’s disease. Many neuroscientists are interested in the therapeutic potential of creating new dopaminergic cells.

Zhang and his team attempted to induce the glia – cells surrounding neurons with protective and other functions – to morph inside live mouse brains. They injected a viral vector expressing a cocktail of proteins into the striatum, a region of the brain rich in GABAergic neurons that help control muscle movement. The cocktail consisted of three transcription factors, NURR1, FOXA2, and LMX1A, which help decode genetic instructions for building dopaminergic neurons. The mice were also treated with valproic acid, which was previously shown to play a role in cell reprogramming.

The team targeted glial cells due to their ability to regenerate and multiply more readily than neurons, theoretically making them better therapeutic candidates. But when they looked at the brain slices of the injected mice, they found the glia unchanged. Instead, some GABAergic medium spiny neurons–cells that are directly controlled by dopaminergic neurons–had transformed.

The new cells appeared to behave more like native dopaminergic neurons, although they also retained residual features of the medium spiny neurons. They showed rhythmic activity and formed network connections similarly to dopaminergic cells, as the researchers discovered through electrode recordings and reporter assays.

Subsequent immunohistochemistry and reporter assays revealed that the new cells sprang from mature medium spiny neurons without passing through a proliferative progenitor stage.

“Our results offer a new perspective on neuronal plasticity,” says Zhang. “We traditionally think of mature cell identity and function as fixed, but our findings suggest that they are more dependent on biochemical factors in their environment than we thought. This could mean that no cell type is fixed even for a functional, mature neuron.”

Zhang and his team next seek to address some of the limitations of their findings by clarifying the exact reprogramming mechanism and, of course, identifying the conditions that can reprogram glia into dopaminergic neurons, as they originally sought.

“We hope that the ability to change neuron identity will someday be directed to treat neurological diseases, including Parkinson’s disease,” says Zhang.

###

This research was funded by the Welch Foundation, the Mobility Foundation, the Michael J. Fox Foundation, the Decherd Foundation, the Pape Adams Foundation, Texas Institute for Brain Injury and Repair, Kent Waldrep Foundation Center for Basic Research on Nerve Growth and Regeneration and the National Institutes of Health.

Stem Cell Reports, Zhang et al.: “Phenotypic reprogramming of striatal neurons into dopaminergic neuron-like cells in the adult mouse brain” https://www.cell.com/stem-cell-reports/fulltext/S2213-6711(18)30389-8

Stem Cell Reports, published by Cell Press for the International Society for Stem Cell Research (@ISSCR), is a monthly open-access forum communicating basic discoveries in stem cell research, in addition to translational and clinical studies. The journal focuses on shorter, single-point manuscripts that report original research with conceptual or practical advances that are of broad interest to stem cell biologists and clinicians. Visit http://www.cell.com/stem-cell-reports. To receive Cell Press media alerts, please contact press@cell.com.

Link between gut flora and multiple sclerosis discovered

Public Release: 11-Oct-2018

University of Zurich

IMAGE: Diminishing myelin sheaths: The damaged areas (at the bottom of the image) of the brains of MS patients lack myelin (at the top, in blue). Credit: Dr. med. Imke Metz, University of Göttingen, Germany

Multiple sclerosis is an autoimmune disease in which the body’s own immune system attacks and damages the protective coating around nerve cells. This coating is made up of myelin – a biological membrane of protein and fatty substances – which is why research efforts to find the disease’s target antigen have so far focused on the myelin membrane’s components. New findings made by the research group of Mireia Sospedra and Roland Martin from the University of Zurich’s Clinical Research Priority Program Multiple Sclerosis now suggest that it is worth broadening the research perspective to gain a better understanding of the pathological processes.

Inflammatory cascade

In the journal Science Translational Medicine, the scientists report that T cells – i.e. the immune cells responsible for pathological processes – react to a protein called GDP-L-fucose synthase. This enzyme is formed in human cells as well as in bacteria frequently found in the gastrointestinal flora of patients suffering from multiple sclerosis. “We believe that the immune cells are activated in the intestine and then migrate to the brain, where they cause an inflammatory cascade when they come across the human variant of their target antigen,” says Mireia Sospedra.

For the genetically defined subgroup of MS patients examined by the researchers, results show that gut microbiota could play a far greater role in the pathogenesis of the disease than previously assumed. Mireia Sospedra hopes that these findings can soon also be translated into therapy; she plans to test the immunoactive components of GDP-L-fucose synthase using an approach that the researchers have been pursuing for several years already.

Re-educating the immune system

“Our clinical approach specifically targets the pathological autoreactive immune cells,” says Sospedra. This approach therefore differs radically from other treatments that are currently available, which throttle the whole immune system. While these treatments often succeed in stopping the progression of the disease, they also weaken the immune system – and can thus cause severe side effects.

The clinical approach of the research group involves drawing blood from MS patients in a clinical trial and then attaching the immunoactive protein fragments onto the surface of red blood cells in a laboratory. When the blood is reintroduced into the bloodstream of patients, the fragments help to “re-educate” their immune system and make it “tolerate” its own brain tissue. This therapeutic approach aims for effective targeted treatment without severe side effects.

###

Literature

Raquel Planas, Radleigh Santos, Paula Tomas-Ojer, Carolina Cruciani, Andreas Lutterotti, Wolfgang Faigle, Nicole Schaeren-Wiemers, Carmen Espejo, Herena Eixarch, Clemencia Pinilla, Roland Martin, Mireia Sospedra. GDP-L-Fucose Synthase As A Novel CD4+ T Cell-Specific Autoantigen in DRB3*02:02 Multiple Sclerosis Patients. Science Translational Medicine. 10 Oct 2018. DOI: 10.1126/scitranslmed.aat4301.

Dutch study estimates 1 in 2 women and 1 in 3 men set to develop dementia/parkinsonism/stroke

Public Release: 1-Oct-2018

 

Preventive strategies could, in theory, more than halve lifetime risk for those aged 85+, say researchers

BMJ

One in two women and one in three men will likely be diagnosed with dementia, Parkinson’s disease, or stroke in their lifetime, estimate Dutch researchers in an observational study published online in the Journal of Neurology Neurosurgery & Psychiatry.

But preventive strategies, which delay the onset of these common diseases by even a few years, could, in theory, cut this lifetime risk by between 20 and more than 50 per cent, they say.

The global costs of dementia, stroke, and parkinsonism are thought to amount to more than 2 per cent of the world’s annual economic productivity (GDP), a figure that is set to rise steeply as life expectancy continues to increase.

But while the lifetime risks of other serious illnesses, such as breast cancer and heart disease, are well known and used to raise public awareness, the same can’t be said of dementia, stroke, and parkinsonism, say the researchers.

To try to redress this, they tracked the neurological health of more than 12,000 people taking part in the Rotterdam Study between 1990 and 2016. This study has been looking at the incidence of, and influential factors behind, diseases of ageing in the general population.

All the participants were aged at least 45 years old when they were recruited, and more than half (just under 58 per cent) were women.

When they joined, participants were given a thorough health check, which was repeated every four years. Family doctor health records were also scrutinised for signs of disease or diagnoses arising between the four yearly check-ups.

Monitoring for dementia, parkinsonism, and stroke continued until death, or January 1 2016, whichever came first.

Between 1990 and 2016, 5291 people died, 3260 of whom had not been diagnosed with any neurological disease. But 1489 people were diagnosed with dementia, mostly Alzheimer’s disease (just under 80%); 1285 had a stroke, nearly two thirds of which (65%) were caused by a blood clot (ischaemic); and 263 were diagnosed with parkinsonism.

A higher prevalence of high blood pressure, abnormal heart rhythm (atrial fibrillation), high cholesterol and type 2 diabetes was evident at the start of the monitoring period among those subsequently diagnosed with any of the three conditions.

Unsurprisingly, the risk of developing any of them rose steeply with age, but based on the data, the overall lifetime risk of a 45 year-old developing dementia, parkinsonism, or having a stroke was one in two for a woman (48%) and one in three for a man (36%).

This gender difference was largely driven by women being at heightened risk of developing dementia before men. But there were other gender differences in risk.

While 45 year-olds of both sexes had a similar lifetime risk of stroke, men were at substantially higher risk of having a stroke at younger ages than women.

And women were twice as likely as men to be diagnosed with both dementia and stroke during their lifetime.

The researchers calculated that if the onset of dementia, stroke, and parkinsonism were delayed by 1 to 3 years, the remaining lifetime risk could, in theory, be reduced by 20 per cent in 45 year-olds, and by more than 50 per cent in those aged 85+.

A delay of only a few years for one disease could also have a significant impact on combined lifetime risk, suggest the researchers.

“For instance, delaying dementia onset by 3 years has the potential to reduce lifetime risk of any disease by 15 per cent for men and women aged 45, and by up to 30 per cent for those aged 85 and older,” they write.

The researchers point out that their study included only people of European ancestry with a relatively long life expectancy, so might not be applicable to other ethnicities/populations, and that they weren’t able to measure the severity of any of the diagnosed conditions.

This research is observational, so no definitive conclusions can be drawn. But the researchers nevertheless conclude: “These findings strengthen the call for prioritising the focus on preventive interventions at population level which could substantially reduce the burden of common neurological diseases in the ageing population.”

###

Peer reviewed? Yes
Evidence type: Observational
Subjects: People

High gluten diet in pregnancy linked to increased risk of diabetes in children

Public Release: 19-Sep-2018

 

Further studies needed to confirm or rule out findings, and to explore possible underlying mechanism

BMJ

A high gluten intake by mothers during pregnancy is associated with an increased risk of their child developing type 1 diabetes, suggests a study published by The BMJ today.

However, the researchers say that further studies are needed to confirm or rule out these findings before any changes to dietary recommendations could be justified.

Gluten is a general name for the proteins found in wheat, rye, and barley and is suggested to affect the development of type 1 diabetes. In animal studies, a gluten free diet during pregnancy almost completely prevented type 1 diabetes in offspring, but no intervention study has been undertaken in pregnant women.

To better understand the nature of this association, researchers led by Julie Antvorskov at the Bartholin Institute in Denmark in collaboration with researchers at Denmark’s Statens Serum Institut, set out to examine whether gluten intake during pregnancy is associated with subsequent risk of type 1 diabetes in children.

They analysed data for 63,529 pregnant women enrolled into the Danish National Birth Cohort between January 1996 and October 2002.

Women reported their diet using a food frequency questionnaire at week 25 of pregnancy and information on type 1 diabetes in their children was obtained through the Danish Registry of Childhood and Adolescent Diabetes.

Average gluten intake was 13 g/day, ranging from less than 7 g/day to more than 20 g/day, and the researchers identified 247 cases of type 1 diabetes (a rate of 0.37%) among the participants’ children.

After taking account of potentially influential factors, such as mother’s age, weight (BMI), total energy intake, and smoking during pregnancy, they found that the child’s risk of type 1 diabetes increased proportionally with the mother’s gluten intake during pregnancy (per 10 g/day increase).

For example, children of women with the highest gluten intake (20 g/day or more) versus those with the lowest gluten intake (less than 7 g/day) had double the risk of developing type 1 diabetes over a mean follow-up period of 15.6 years.
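To put that doubling in absolute terms, a back-of-envelope sketch (assuming the cohort-wide rate quoted above roughly applies to the lowest-intake group, which is an approximation):

```python
baseline = 0.0037     # overall rate of type 1 diabetes in the cohort (0.37%)
doubled = 2 * baseline  # reported risk for highest- vs lowest-intake group
print(f"absolute risk rises from {baseline:.2%} to {doubled:.2%}, "
      f"a difference of {doubled - baseline:.2%}")
# → absolute risk rises from 0.37% to 0.74%, a difference of 0.37%
```

In other words, a doubling of relative risk here corresponds to an absolute increase of well under one percentage point.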

This is an observational study, so no firm conclusions can be drawn about cause and effect. However, the researchers say this was a high quality study with a large sample size, and they were able to adjust for a number of factors that could have affected the results.

The mechanisms that might explain this association are not known, but could include increased inflammation or increased gut permeability (so-called leakiness of the gut), they write. However, more evidence is needed before changes to dietary recommendations could be justified, they conclude.

In a linked editorial, researchers at the National Institute for Health and Welfare in Finland, say further studies are needed “to identify whether the proposed association really is driven by gluten, or by something else in the grains or the diet.”

The authors agree that it is too early to change dietary recommendations on gluten intake in pregnancy, but say doctors, researchers, and the public “should be aware of the possibility that consuming large amounts of gluten might be associated with an increased risk for the child to develop type 1 diabetes, and that further studies are needed to confirm or rule out these findings, and to explore possible underlying mechanisms.”

###

Externally peer-reviewed? Yes (research), No (linked editorial)
Type of evidence: Observational
Subjects: People

One in three college freshmen worldwide reports mental health disorder

Public Release: 13-Sep-2018

Students from 19 colleges in eight countries report symptoms consistent with psychological disorder, study says

American Psychological Association

As if college were not difficult enough, more than one-third of first-year university students in eight industrialized countries around the globe report symptoms consistent with a diagnosable mental health disorder, according to research published by the American Psychological Association.

“While effective care is important, the number of students who need treatment for these disorders far exceeds the resources of most counseling centers, resulting in a substantial unmet need for mental health treatment among college students,” said lead author Randy P. Auerbach, PhD, of Columbia University. “Considering that students are a key population for determining the economic success of a country, colleges must take a greater urgency in addressing this issue.”

Auerbach and his co-authors analyzed data from the World Health Organization’s World Mental Health International College Student Initiative, in which almost 14,000 students from 19 colleges in eight countries (Australia, Belgium, Germany, Mexico, Northern Ireland, South Africa, Spain and the United States) responded to questionnaires to evaluate common mental disorders, including major depression, generalized anxiety disorder and panic disorder.

The researchers found that 35 percent of the respondents reported symptoms consistent with at least one mental health disorder as defined by the Diagnostic and Statistical Manual of Mental Disorders, 4th Edition. Major depressive disorder was the most common, followed by generalized anxiety disorder. The findings were published in the Journal of Abnormal Psychology.

“The finding that one-third of students from multiple countries screened positive for at least one of six mental health disorders represents a key global mental health issue,” said Auerbach.

Previous research suggests that only 15-20 percent of students will seek services at their respective counseling center, which may already be overtaxed, according to Auerbach. If students need help outside of their school counseling center or local psychologists, Auerbach suggested that they seek Internet resources, such as online cognitive behavioral therapy.

“University systems are currently working at capacity and counseling centers tend to be cyclical, with students ramping up service use toward the middle of the semester, which often creates a bottleneck,” said Auerbach. “Internet-based clinical tools may be helpful in providing treatment to students who are less inclined to pursue services on campus or are waiting to be seen.”

Future research needs to focus on identifying which interventions work best for specific disorders, said Auerbach. For example, certain types of depression or anxiety may be best treated with certain types of Internet interventions, whereas other disorders, such as substance use, may require treatment in person by a psychologist or other mental health professional.

“Our long-term goal is to develop predictive models to determine which students will respond to different types of interventions,” said Auerbach. “It is incumbent on us to think of innovative ways to reduce stigma and increase access to tools that may help students better manage stress.”

###

Article: “The WHO World Mental Health Surveys International College Student Project: Prevalence and Distribution of Mental Disorders,” by Randy Auerbach, PhD, Columbia University; Jordi Alonso, MD, PhD, and Gemma Vilagut, PhD, IMIM Hospital del Mar Medical Research Institute, Barcelona; Pim Cuijpers, PhD, Amsterdam Public Health Research Institute; David Ebert, PhD, Friedrich-Alexander University Erlangen Nuremberg; Penelope Hasking, PhD, Curtin University; Matthew Nock, PhD, Harvard University; Dan Stein, PhD, University of Cape Town; Alan Zaslavsky, PhD, Ronald Kessler, PhD, Stephanie Pinder-Amaker, PhD, and Nancy Simpson, PhD, Harvard Medical School; Philippe Mortier, MD, PhD, Koen Demyttenaere, MD, PhD, and Ronny Bruffaerts, PhD, Katholieke Universiteit Leuven; Corina Benjet, PhD, National Institute of Psychiatry Ramon de la Fuente Muniz; Jennifer Greif Green, PhD, Boston University; and Elaine Murray, PhD, Ulster University. Journal of Abnormal Psychology, published Sept. 13, 2018.

Full text of the article is available from the APA Public Affairs Office and at
http://www.apa.org/pubs/journals/releases/abn-abn0000362.pdf

Contact: Randy Auerbach via email at rpa2009@columbia.edu or by phone at (646) 774-5745.

The American Psychological Association, in Washington, D.C., is the largest scientific and professional organization representing psychology in the United States. APA’s membership includes nearly 115,700 researchers, educators, clinicians, consultants and students. Through its divisions in 54 subfields of psychology and affiliations with 60 state, territorial and Canadian provincial associations, APA works to advance the creation, communication and application of psychological knowledge to benefit society and improve people’s lives.

Researcher links diplomats’ mystery illness to radiofrequency/microwave radiation

Public Release: 29-Aug-2018

 

University of California – San Diego

Writing in advance of the September 15 issue of Neural Computation, Beatrice Golomb, MD, PhD, professor of medicine at University of California San Diego School of Medicine, says publicly reported symptoms and experiences of a “mystery illness” afflicting American and Canadian diplomats in Cuba and China strongly match known effects of pulsed radiofrequency/microwave electromagnetic (RF/MW) radiation.

Her conclusions, she said, may aid in the treatment of the diplomats (and affected family members) and assist U.S. government agencies seeking to determine the precise cause. More broadly, Golomb said her research draws attention to a larger population of people who are affected by similar health problems.

“I looked at what’s known about pulsed RF/MW in relation to diplomats’ experiences,” said Golomb. “Everything fits. The specifics of the varied sounds that the diplomats reported hearing during the apparent inciting episodes, such as chirping, ringing and buzzing, cohere in detail with known properties of so-called ‘microwave hearing,’ also known as the Frey effect.

“And the symptoms that emerged fit, including the dominance of sleep problems, headaches and cognitive issues, as well as the distinctive prominence of auditory symptoms. Even objective findings reported on brain imaging fit with what has been reported for persons affected by RF/MW radiation.”

Beginning in 2016, personnel at the U.S. Embassy in Havana, Cuba (as well as Canadian diplomats and family members) described hearing strange sounds, followed by development of an array of symptoms. The source of the health problems has not been determined. Though some officials and media have described the events as “sonic attacks,” some experts on sound have rejected this explanation. In May of this year, the State Department reported that U.S. government employees in Guangzhou, China had also experienced similar sounds and health problems.

Affected diplomats and family members from both locations were medically evacuated to the U.S. for treatment, but despite multiple government investigations, an official explanation of events and subsequent illnesses has not been announced. At least two early published studies examining available data were inconclusive.

In her paper, scheduled to be published September 15 in Neural Computation, Golomb compared rates of described symptoms among diplomats with a published 2012 study of symptoms reported by people affected by electromagnetic radiation in Japan. By and large, she said the cited symptoms — headache, cognitive problems, sleep issues, irritability, nervousness or anxiety, dizziness and tinnitus (ringing in the ears) — occurred at strikingly similar rates.

Some diplomats reported hearing loss. That symptom was not assessed in both studies so rates could not be compared, but Golomb said it is widely reported in both conditions. She also noted that previous brain imaging research in persons affected by RF/EMR “showed evidence of traumatic brain injury, paralleling reports in diplomats.”

David O. Carpenter, MD, is director of the Institute for Health and the Environment at the University at Albany, part of the State University of New York. He was not involved in Golomb’s study. He said evidence cited by Golomb illustrates “microwave hearing,” which results “from heating induced in tissue, which causes ‘waves’ in the ear and results in clicks and other sounds.” Reported symptoms, he said, characterize the syndrome of electrohypersensitivity (EHS), in which unusual exposure to radiofrequency radiation can trigger symptoms in vulnerable persons that may be permanent and disabling.

“We have seen this before when the Soviets irradiated the U.S. Embassy in Moscow in the days of the Cold War,” he said.

Golomb, whose undergraduate degree was in physics, conducts research investigating the relationship of oxidative stress and mitochondrial function — mechanisms shown to be involved with RF/EMR injury — to health, aging, behavior and illness. Her work is wide-ranging, with published studies on Gulf War illness, statins, antibiotic toxicity, ALS, autism and the health effects of chocolate and trans fats, with a secondary interest in research methods, including placebos.

Golomb said an analysis of 100 studies examining whether low-level RF produced oxidative injury found that 93 studies concluded that it did. Oxidative injury or stress arises when there is an imbalance between the production of reactive oxygen species (free radicals) and the body’s detoxifying antioxidant defenses. Oxidative stress has been linked to a range of diseases and conditions, from Alzheimer’s disease, autism and depression to cancer and chronic fatigue syndrome, as well as toxic effects linked to certain drugs and chemicals. More to the point, Golomb said, oxidative injury has been linked to the symptoms and conditions reported in diplomats.

The health consequences of RF/MW exposure are a matter of ongoing debate. Some government agencies, such as the National Institute of Environmental Health Sciences and the National Cancer Institute, publicly assert that low- to mid-frequency, non-ionizing radiation like that from microwaves and RF is generally harmless. They cite studies that have found no conclusive link between exposure and harm.

But others, including researchers like Golomb, dispute that conclusion, noting that many of the no-harm studies were funded by vested industries or had other conflicts of interest. She said independent studies over decades have reported biological effects and harms to health from nonionizing radiation, specifically RF/MW radiation, including via oxidative stress and downstream mechanisms, such as inflammation, autoimmune activation and mitochondrial injury.

Golomb compared the situation to persons with peanut allergies: Most people do not experience any adverse effect from peanut exposure, but for a vulnerable subgroup, exposure produces negative, even life-threatening, consequences.

In her analysis, Golomb concludes that “of hypotheses tendered to date, (RF/MW exposure) alone fits the facts, including the peculiar ones” regarding events in Cuba and China. She said her findings advocate for more robust attention to pulsed RF/MW and associated adverse health effects.

“The focus must be on research by parties free from ties to vested interests. Such research is needed not only to explain and address the symptoms in diplomats, but also for the benefit of the small fraction, but large number, of persons outside the diplomatic corps who are beset by similar problems.”

###

This study was unfunded.

For first time in 40 years, cure for acute leukemia within reach

Public Release: 24-Aug-2018

Hebrew University drug trials show 50 percent cure rate in lab mice

The Hebrew University of Jerusalem

IMAGE: Leukemia cancer cells before and after new drug treatment. Credit: Waleed Minzel/Hebrew University.

Acute myeloid leukemia is one of the most aggressive cancers. While other cancers have benefitted from new treatments, there has been no encouraging news for most leukemia patients for the past 40 years. Until now.

As published today in the scientific journal Cell, Professor Yinon Ben-Neriah and his research team at the Hebrew University of Jerusalem (HU)’s Faculty of Medicine have developed a new biological drug with a cure rate of 50% for lab mice with acute leukemia.

Leukemic cells produce a variety (and a high quantity) of proteins that together give them rapid growth and protection from death by chemotherapy.

To date, most of the biological cancer drugs used to treat leukemia target only individual leukemic cell proteins. However, during “targeted therapy” treatments, leukemic cells quickly activate their other proteins to block the drug. The result is drug-resistant leukemic cells which quickly regrow and renew the disease.

However, the new drug developed by Ben-Neriah and his team functions like a cluster bomb. It attacks several leukemic proteins at once, making it difficult for the leukemia cells to activate other proteins that can evade the therapy. Further, this single-molecule drug accomplishes the work of three or four separate drugs, reducing patients’ need to be exposed to several therapies and to endure their often unbearable side effects.

Additionally promising is the new drug’s ability to eradicate leukemia stem cells. This has long been a major challenge in cancer therapy and one of the main reasons that scientists have been unable to cure acute leukemia.

“We were thrilled to see such a dramatic change even after only a single dose of the new drug. Nearly all of the lab mice’s leukemia signs disappeared overnight,” said Professor Ben-Neriah.

BioTheryX recently bought the rights to this promising drug from HU’s technology transfer company Yissum. Together with Ben-Neriah’s research team, they are now applying for FDA approval for phase I clinical studies.

Lung cancer mortality rates among women projected to increase by over 40 percent by 2030

Public Release: 1-Aug-2018

American Association for Cancer Research

Bottom Line: The global age-standardized lung cancer mortality rate among women is projected to increase by 43 percent from 2015 to 2030, according to an analysis of data from 52 countries. The global age-standardized breast cancer mortality rate is projected to decrease by 9 percent in the same time frame.

Journal in Which the Study was Published: Cancer Research, a journal of the American Association for Cancer Research.

Author: Jose M. Martínez-Sánchez, PhD, MPH, BSc, associate professor and director of the Department of Public Health, Epidemiology and Biostatistics at Universitat Internacional de Catalunya (UIC Barcelona)

Background: “While we have made great strides in reducing breast cancer mortality globally, lung cancer mortality rates among women are on the rise worldwide,” said Martínez-Sánchez. “If we do not implement measures to reduce smoking behaviors in this population, lung cancer mortality will continue to increase throughout the world.”

While previous work has focused on projections in lung and breast cancer mortality among women in a single country or continent, few studies have estimated trends in mortality caused by these two common cancers on a global scale, noted Martínez-Sánchez.

How the Study Was Conducted: In this study, Martínez-Sánchez and colleagues analyzed breast and female lung cancer mortality data from the World Health Organization (WHO) Mortality Database from 2008 to 2014. For inclusion in the study, countries must have reported data for at least four years between 2008 and 2014 and must have a population greater than 1 million. Fifty-two countries fulfilled these criteria: 29 from Europe; 14 from the Americas; seven from Asia; and two from Oceania. Lung and breast cancer age-standardized mortality rates in women, reported as per 100,000 person years, were calculated for each country based on the WHO World Standard Population, which allows for the comparison of countries with different age distributions, thereby eliminating age as a confounding variable in the projected rates.
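The age-standardization step described above can be sketched in a few lines: each country's age-specific death rates are weighted by the share of each age band in a fixed standard population, so that countries with older or younger populations become comparable. The numbers and age bands below are invented for illustration (the real WHO World Standard Population uses five-year bands); this is a minimal sketch of the method, not the study's actual data or code.

```python
# Illustrative sketch of direct age standardization. All numbers are
# hypothetical; the WHO World Standard Population weights are approximated
# with coarse, made-up age bands for brevity.

# Hypothetical country data: age band -> (deaths, person-years)
country = {
    "30-49": (120, 2_000_000),
    "50-69": (900, 1_500_000),
    "70+":   (1100, 600_000),
}

# Assumed standard-population weights for the same bands (must sum to 1)
who_weights = {"30-49": 0.55, "50-69": 0.35, "70+": 0.10}

def age_standardized_rate(data, weights):
    """Weighted sum of age-specific rates, expressed per 100,000 person-years."""
    rate = 0.0
    for band, (deaths, person_years) in data.items():
        age_specific = deaths / person_years   # crude rate within this age band
        rate += weights[band] * age_specific   # weight by the standard population
    return rate * 100_000

print(round(age_standardized_rate(country, who_weights), 1))
```

Because every country is weighted by the same standard age structure, differences in the resulting rates reflect differences in age-specific mortality rather than in population age distribution, which is the point of the method.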

Results: Globally, among women, the mortality rate for lung cancer is projected to increase from 11.2 in 2015 to 16.0 in 2030; the highest lung cancer mortality rates in 2030 are projected in Europe and Oceania, while the lowest lung cancer mortality rates in 2030 are projected in the Americas and Asia. Only Oceania is predicted to see a decrease in lung cancer mortality, which is projected to fall from 17.8 in 2015 to 17.6 in 2030.

“Different timelines have been observed in the tobacco epidemic across the globe,” said Martínez-Sánchez. “This is because it was socially acceptable for women to smoke in the European and Oceanic countries included in our study many years before this habit was commonplace in America and Asia, which reflects why we are seeing higher lung cancer mortality rates in these countries.”

Globally, the mortality rate for breast cancer is projected to decrease from 16.1 in 2015 to 14.7 in 2030. The highest breast cancer mortality rate is predicted in Europe with a decreasing trend overall, while the lowest breast cancer mortality rate is predicted in Asia with an increasing trend overall.
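The headline percentages quoted in the release follow directly from the projected rates above; a quick check of the arithmetic:

```python
# Sanity check of the headline figures from the projected age-standardized
# rates (per 100,000 person-years) quoted in the release.
def percent_change(start, end):
    return (end - start) / start * 100

lung = percent_change(11.2, 16.0)    # lung cancer in women, 2015 -> 2030
breast = percent_change(16.1, 14.7)  # breast cancer, 2015 -> 2030

print(round(lung))    # increase of about 43 percent
print(round(breast))  # decrease of about 9 percent
```

This reproduces the 43 percent projected increase in lung cancer mortality and the 9 percent projected decrease in breast cancer mortality stated in the study's bottom line.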

“Breast cancer is associated with many lifestyle factors,” Martínez-Sánchez explained. “We are seeing an increase in breast cancer mortality in Asia because this culture is adopting a Westernized lifestyle, which often leads to obesity and increased alcohol intake, both of which can lead to breast cancer. On the other hand, we are witnessing a decrease in breast cancer mortality in Europe, which may be related to the awareness of breast cancer among this population, leading to active participation in screening programs and the improvement of treatments.”

Compared to middle-income countries, high-income countries have the highest projected age-standardized mortality rates for both lung and breast cancer in 2030. However, high-income countries are more likely to have decreasing breast cancer mortality rates. Furthermore, the first to witness lung cancer mortality rates surpass breast cancer mortality rates are mostly developed countries, noted Martínez-Sánchez.

Author’s Comments: “This research is particularly important because it provides evidence for health professionals and policymakers to decide on global strategies to reduce the social, economic, and health impacts of lung cancer among women in the future,” said Martínez-Sánchez.

Study Limitations: Limitations of the study include the assumption that recent trends in lung and breast cancer mortality will continue for the next two decades; however, certain habits, such as switching from conventional cigarettes to electronic cigarettes may alter lung cancer mortality trends, Martínez-Sánchez said. Future screening technology and therapeutics may lower mortality rates, he added. Additionally, due to small population size and lack of data, no countries from Africa were included in this study.

Funding & Disclosures: This study was sponsored by the Ministry of Universities and Research of the Government of Catalonia. Martínez-Sánchez declares no conflict of interest.

Hospital-associated bacterial species becoming tolerant to alcohol disinfectants

Public Release: 1-Aug-2018

 

American Association for the Advancement of Science

A multidrug-resistant bacterial species that can cause infections in hospitals is becoming increasingly tolerant to the alcohols used in handwash disinfectants, a new study finds. The analysis of bacterial samples taken from two Australian hospitals over 19 years suggests that the species Enterococcus faecium is adapting to a mainstay of infection control used in healthcare facilities worldwide. Treatment-resistant bacterial species such as methicillin-resistant Staphylococcus aureus have become an increasing source of concern for hospital workers over the past several decades. Hospitals have therefore adopted stringent hygienic procedures to keep dangerous microbes from infecting patients; these procedures often involve the use of hand rubs and washes that contain disinfectants based on isopropyl or ethyl alcohol. However, drug-resistant E. faecium infections have increased despite the use of alcohol disinfectants, and currently represent a leading cause of infections acquired in hospitals.

This alarming development prompted Sacha Pidot and colleagues to investigate whether E. faecium could be developing resistance to the alcohols used in hand rubs. They screened 139 E. faecium isolates – or isolated bacterial samples – previously collected between 1997 and 2015 from two hospitals in Melbourne, Australia, and studied how well each isolate survived when exposed to diluted isopropyl alcohol. The isolates gathered after 2009 were on average more tolerant to the alcohol compared to bacteria taken from before 2004.

The authors then seeded different E. faecium isolates onto the floors of mouse cages and found that the alcohol-tolerant isolates better colonized the guts of mice that were housed in the cages after the cages were cleaned with isopropyl alcohol wipes. Analysis of the bacterial genome revealed that the tolerant isolates harbored several mutations in genes involved in metabolism that conferred increased alcohol resistance.
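The comparison at the heart of the screening described above — survival after alcohol exposure, grouped by collection era — can be sketched as follows. All survival fractions here are invented for illustration; the study's actual assay conditions and data are in the original paper.

```python
# A minimal sketch of the era comparison described above: fraction of cells
# from each isolate surviving a timed exposure to diluted isopropyl alcohol,
# grouped by when the isolate was collected. All numbers are hypothetical.
from statistics import mean

pre_2004 = [0.02, 0.05, 0.03, 0.04, 0.06]    # older isolates
post_2009 = [0.12, 0.18, 0.15, 0.20, 0.10]   # more recent isolates

fold_increase = mean(post_2009) / mean(pre_2004)
print(f"mean survival, pre-2004:  {mean(pre_2004):.3f}")
print(f"mean survival, post-2009: {mean(post_2009):.3f}")
print(f"fold increase in tolerance: {fold_increase:.1f}x")
```

In the real study such a difference would of course be assessed with appropriate statistics across all 139 isolates, not a comparison of two small means.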
The authors say that examination of isolates from hospitals in other geographical regions is necessary before any major conclusions can be drawn. However, the results indicate that global efforts to mitigate bacterial resistance should consider how microbes can adapt not only to drugs but also to alcohols and other ingredients used in disinfectants.

Scientists reverse aging-associated skin wrinkles and hair loss in a mouse model

Public Release: 20-Jul-2018

A gene mutation causes wrinkled skin and hair loss; turning off that mutation restores the mouse to normal appearance.

University of Alabama at Birmingham

BIRMINGHAM, Ala. – Wrinkled skin and hair loss are hallmarks of aging. What if they could be reversed?

Keshav Singh, Ph.D., and colleagues have done just that, in a mouse model developed at the University of Alabama at Birmingham. When a mutation leading to mitochondrial dysfunction is induced, the mouse develops wrinkled skin and extensive, visible hair loss in a matter of weeks. When the mitochondrial function is restored by turning off the gene responsible for mitochondrial dysfunction, the mouse returns to smooth skin and thick fur, indistinguishable from a healthy mouse of the same age.

“To our knowledge, this observation is unprecedented,” said Singh, a professor of genetics in the UAB School of Medicine.

Importantly, the mutation that does this is in a nuclear gene affecting the function of mitochondria, the tiny organelles known as the powerhouses of the cell. The numerous mitochondria in cells produce 90 percent of the chemical energy cells need to survive.

In humans, a decline in mitochondrial function is seen during aging, and mitochondrial dysfunction can drive age-related diseases. A depletion of the DNA in mitochondria is also implicated in human mitochondrial diseases, cardiovascular disease, diabetes, age-associated neurological disorders and cancer.

“This mouse model,” Singh said, “should provide an unprecedented opportunity for the development of preventive and therapeutic drug strategies to augment mitochondrial functions for the treatment of aging-associated skin and hair pathology and other human diseases in which mitochondrial dysfunction plays a significant role.”

The mutation in the mouse model is induced when the antibiotic doxycycline is added to the food or drinking water. This causes depletion of mitochondrial DNA because the enzyme to replicate the DNA becomes inactive.

In four weeks, the mice showed gray hair, reduced hair density, hair loss, slowed movements and lethargy, changes that are reminiscent of natural aging. Wrinkled skin was seen four to eight weeks after induction of the mutation, and females had more severe skin wrinkles than males.

Dramatically, this hair loss and wrinkled skin could be reversed by turning off the mutation. In the study, the hair loss and wrinkled skin were evident after two months of doxycycline induction; a month after doxycycline was stopped, allowing restoration of the depleted mitochondrial DNA, the same mouse had returned to normal appearance.

Little change was seen in other organs when the mutation was induced, suggesting an important role for mitochondria in skin compared to other tissues.

The wrinkled skin showed changes similar to those seen in both intrinsic and extrinsic aging — intrinsic aging is the natural process of aging, and extrinsic aging is the effect of external factors that influence aging, such as skin wrinkles that develop from excess sun or long-term smoking.

Among the details, the skin of induced-mutation mice showed increased numbers of skin cells, abnormal thickening of the outer layer, dysfunctional hair follicles and increased inflammation that appeared to contribute to skin pathology. These are similar to extrinsic aging of the skin in humans. The mice with depleted mitochondrial DNA also showed changed expression of four aging-associated markers in cells, similar to intrinsic aging.

The skin also showed disruption in the balance between matrix metalloproteinase enzymes and their tissue-specific inhibitor — a balance of these two is necessary to maintain the collagen fibers in the skin that prevent wrinkling.

The mitochondria of induced-mutation mice had reduced mitochondrial DNA content, altered mitochondrial gene expression, and instability of the large complexes in mitochondria that are involved in oxidative phosphorylation.

Reversal of the mutation restored mitochondrial function, as well as the skin and hair pathology. This showed that mitochondria are reversible regulators of skin aging and loss of hair, an observation that Singh calls “surprising.”

“It suggests that epigenetic mechanisms underlying mitochondria-to-nucleus cross-talk must play an important role in the restoration of normal skin and hair phenotype,” said Singh, who has a secondary UAB appointment as professor of pathology. “Further experiments are required to determine whether phenotypic changes in other organs can also be reversed to wildtype level by restoration of mitochondrial DNA.”

###

Co-authors with Singh for the paper, “Reversing wrinkled skin and hair loss in mice by restoring mitochondrial function,” published in Cell Death and Disease, a Nature online journal, are Bhupendra Singh, Trenton R. Schoeb and Prachi Bajpai, UAB Department of Genetics; and Andrzej Slominski, UAB Department of Dermatology.

This work was supported by Veterans Administration grant 1I01BX001716 and National Institutes of Health grants CA204430, AR071189-01A1 and AR073004.

At UAB, Singh holds the Joy and Bill Harbert Endowed Chair in Cancer Genetics, and Slominski holds the Endowed Professorship in Basic Research in the Department of Dermatology.

Amyloid beta protein protects brain from herpes infection by entrapping viral particles

Public Release: 5-Jul-2018

Chronic viral infection could induce overproduction of Alzheimer’s-disease-associated protein and cause damaging inflammation

Massachusetts General Hospital

A Massachusetts General Hospital (MGH) study has found the mechanism by which amyloid beta (A-beta) – the protein deposited into plaques in the brains of patients with Alzheimer’s disease – protects against herpes viruses commonly found in the brain. Along with another study appearing in the same July 11 issue of Neuron, which found elevated levels of three types of herpes viruses in the brains of patients with Alzheimer’s disease, the MGH team’s results support a potential role for viral infection in accelerating A-beta deposition and Alzheimer’s progression.

“There have been multiple epidemiological studies suggesting people with herpes infections are at higher risk for Alzheimer’s disease, along with the most recent findings from Icahn School of Medicine at Mt. Sinai that are being published with our study,” says Rudolph Tanzi, PhD, director of the Genetics and Aging Research Unit in the MassGeneral Institute for Neurodegenerative Disease (MIND) and co-corresponding author of the Neuron paper. “Our findings reveal a simple and direct mechanism by which herpes infections trigger the deposition of brain amyloid as a defense response in the brain. In this way, we have merged the infection hypothesis and amyloid hypothesis into one ‘Antimicrobial Response Hypothesis’ of Alzheimer’s disease.”

Previous studies led by Tanzi and co-corresponding author Robert Moir, PhD, also of the MIND Genetics and Aging Research Unit, found evidence indicating that A-beta – long thought to be useless “metabolic garbage” – was an antimicrobial protein of the body’s innate immune system, capable of protecting animal models and cultured human brain cells from dangerous infections. Given that brain infection with herpes simplex – the virus that causes cold sores – is known to increase with aging, leading to almost universal presence of that and other herpes strains in the brain by adulthood, the MGH team set out to find whether A-beta could protect against herpes infection and, if so, the mechanism by which such protection takes place.

After first finding that transgenic mice engineered to express human A-beta survive significantly longer after injections of herpes simplex into their brains than do nontransgenic mice, the researchers found that A-beta inhibited infection of cultured human brain cells with herpes simplex and two other herpes strains by binding to proteins on the viral membranes and clumping into fibrils that entrap the virus and prevent it from entering cells. Further experiments with the transgenic mice revealed that introduction of herpes simplex into the brains of 5- to 6-week-old animals induced rapid development of A-beta plaques, which usually appear only when the animals are 10 to 12 weeks old.

“Our findings show that amyloid entrapment of herpes viruses provides immediate, effective protection from infection,” says Moir. “But it’s possible that chronic infection with pathogens like herpes that remain present throughout life could lead to sustained and damaging activation of the amyloid-based immune response, triggering the brain inflammation that drives a cascade of pathologies leading to the onset of Alzheimer’s disease. A key insight is that it’s not direct killing of brain cells by herpes that causes Alzheimer’s, rather it’s the immune response to the virus that leads to brain-damaging neuroinflammation.”

He continues, “Our data and the Mt. Sinai findings suggest that an antimicrobial protection model utilizing both anti-herpes and anti-amyloid drugs could be effective against early Alzheimer’s disease. Later on when neuroinflammation has begun, greater benefit may come from targeting inflammatory molecules. However, it remains unclear whether infection is the disease’s root cause. After all, Alzheimer’s is a highly heterogeneous disease, so multiple factors may be involved in its development.”

Tanzi says, “We are currently conducting what we call the ‘Brain Microbiome Project,’ to characterize the population of microbes normally found in the brain. The brain used to be considered sterile but it turns out to have a resident population of microbes, some of which may be needed for normal brain health. Our preliminary findings suggest that the brain microbiome is severely disturbed in Alzheimer’s disease and that bad players – including herpes viruses – seem to take advantage of the situation, leading to trouble for the patient. We are exploring whether Alzheimer’s pathogenesis parallels the disrupted microbiome models seen in conditions like inflammatory bowel disease, and the data generated to date are both surprising and fascinating.”

###

Tanzi is the Joseph P. and Rose F. Kennedy Professor of Neurology, and Moir is an assistant professor of Neurology at Harvard Medical School. The lead author of the Neuron paper is William Eimer, PhD, of the MIND Genetics and Aging Unit. Additional co-authors are Deepak K.V. Kumar, PhD, Nanda K. N. Shanmugam, PhD, Alex S. Rodriguez, Teryn Mitchell and Kevin J. Washicosky, MIND Genetics and Aging Unit; and Bence György and Xandra O. Breakefield, PhD, MGH Neurology. The study was funded by grants from the Cure Alzheimer’s Fund, Good Ventures and the Open Philanthropy Project.

Massachusetts General Hospital, founded in 1811, is the original and largest teaching hospital of Harvard Medical School. The MGH Research Institute conducts the largest hospital-based research program in the nation, with an annual research budget of more than $900 million and major research centers in HIV/AIDS, cardiovascular research, cancer, computational and integrative biology, cutaneous biology, genomic medicine, medical imaging, neurodegenerative disorders, regenerative medicine, reproductive biology, systems biology, photomedicine and transplantation biology. The MGH topped the 2015 Nature Index list of health care organizations publishing in leading scientific journals and earned the prestigious 2015 Foster G. McGaw Prize for Excellence in Community Service. In August 2017 the MGH was once again named to the Honor Roll in the U.S. News & World Report list of “America’s Best Hospitals.”

Consciousness is partly preserved during general anesthesia

Public Release: 3-Jul-2018

University of Turku

When people are administered an anaesthetic, they seem to lose consciousness – or at least they stop reacting to their environment. But is consciousness fully lost during anaesthesia, or does it persist in the brain in an altered state? This question was explored in “The Conscious Mind: Integrating subjective phenomenology with objective measurements”, a joint research project of the University of Turku and the Hospital District of Southwest Finland that studies the neural mechanisms of human consciousness. In the study, the changes caused by the anaesthetics were monitored with electroencephalography (EEG) and positron emission tomography (PET).

The study is a joint project between the research group of Adjunct Professor of Pharmacology and Anaesthesiologist Harry Scheinin, which studies anaesthesia mechanisms, and the research group of Professor of Psychology Antti Revonsuo, which studies human consciousness and the brain from the point of view of philosophy and psychology. The study was conducted in collaboration with investigators from the University of Michigan, Ann Arbor, and the University of California, Irvine, USA. The latest research findings in the project have been published as four separate articles in the July issues of the two leading journals in anaesthesiology. The main funders of the project are the Academy of Finland and the Jane and Aatos Erkko Foundation.

The brain dreams and processes words during anaesthesia

In the first part of the study, healthy voluntary participants were anaesthetised either with dexmedetomidine or propofol. The drugs were administered with computer-driven target-controlled infusions until the subject just barely lost responsiveness. From this state, the subjects could be woken up with light shaking or a loud voice without changing the drug infusion. Immediately after the subjects regained responsiveness, they were asked whether they experienced anything during the anaesthesia period.

“Nearly all participants reported dream-like experiences that sometimes mixed with reality,” says Professor Revonsuo.

The subjects were played Finnish sentences during the anaesthesia, half of which ended with an expected (congruent) word and half with an unexpected (incongruent) word, such as “The night sky was filled with shimmering tomatoes”. Normally, when a person is awake, the unexpected word causes a response in the EEG, which reflects how the brain processes the meaning of the sentence and word. The researchers tested whether the subjects detected and understood words or entire sentences while under anaesthesia.

“The responses in the EEG showed that the brain cannot differentiate between normal and bizarre sentences when under anaesthesia. When we used dexmedetomidine, even the expected words created a significant response, meaning that the brain was trying to interpret the meaning of the words. However, after the participants woke from the anaesthesia, they did not remember the sentences they had heard, and the results were the same with both drugs,” says Senior Researcher, Adjunct Professor Katja Valli, who participated in the study.
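The logic of the congruent/incongruent comparison can be illustrated with synthetic data: average the responses to each sentence type and look at the difference. This is a toy simulation under assumed numbers, not the study's EEG analysis; in real data the effect is a time-locked deflection (an event-related potential), not a shift in a whole-epoch mean.

```python
# Toy illustration of the comparison described above: in the 'awake'
# condition an incongruent sentence ending adds an extra deflection to the
# noise; in the 'anaesthetised' condition it does not. All values synthetic.
import random

random.seed(0)

def simulated_epoch(incongruent, awake, n_samples=50):
    """One synthetic EEG epoch; an extra response is added to the noise
    only when an awake brain hears an incongruent word."""
    effect = 2.0 if (incongruent and awake) else 0.0
    return [random.gauss(0, 1) + effect for _ in range(n_samples)]

def mean_amplitude(epochs):
    return sum(sum(e) / len(e) for e in epochs) / len(epochs)

for awake in (True, False):
    congruent = [simulated_epoch(False, awake) for _ in range(40)]
    incongruent = [simulated_epoch(True, awake) for _ in range(40)]
    diff = mean_amplitude(incongruent) - mean_amplitude(congruent)
    state = "awake" if awake else "anaesthetised"
    print(f"{state}: incongruent-congruent difference = {diff:.2f}")
```

The awake condition shows a clear difference between sentence types while the anaesthetised condition shows essentially none, mirroring the pattern the researchers report.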

The subjects were also played unpleasant sounds during the anaesthesia. After the subjects woke up, the sounds were played again and, surprisingly, they reacted faster to these sounds than to new sounds they had not heard before. The subjects who were given dexmedetomidine also recognised the played sounds better than by chance, even though they could not recall them spontaneously.

“In other words, the brain can process sounds and words even though the subjects did not recall them afterwards. Contrary to common belief, anaesthesia does not require full loss of consciousness, as it is sufficient to just disconnect the patient from the environment,” explains Dr. Scheinin.

The study design enabled the separation of consciousness from other drug effects

The observed changes in the EEG were largely consistent with earlier studies. However, the current study used a constant infusion both while the participants were asleep and while they were awake, which enabled the researchers to differentiate the drugs’ effects on consciousness from other possible direct or indirect effects. Partly because these effects are intertwined, estimating the depth of anaesthesia during surgery remains a great challenge.

The project also studied the effects of four different anaesthetics on regional cerebral glucose metabolism with PET imaging. The findings alleviated concerns about potentially harmful effects of dexmedetomidine on the ratio of cerebral blood flow to metabolism. In the future, the project will further analyse the association between cerebral blood flow or metabolism and the state of consciousness.

Consciousness is in a dream-like state during anaesthesia

All in all, the findings indicate that consciousness is not necessarily fully lost during anaesthesia, even though the person is no longer reacting to their environment. However, dream-like experiences and thoughts might still float in consciousness. The brain might still register speech and try to decipher words, but the person will not understand or remember them consciously, and the brain cannot construe full sentences from them.

“The state of consciousness induced by anaesthetics can be similar to natural sleep. While sleeping, people dream and the brain observes the occurrences and stimuli in the environment subconsciously,” summarises Professor Revonsuo.

“Anaesthesia could resemble normal sleep more than we have previously thought,” adds Dr. Scheinin.

###

The research articles were published in the July issues of Anesthesiology and the British Journal of Anaesthesia. Based on their impact factors, these are the leading anaesthesiology journals in the world. All the articles are open access and can be freely downloaded.

New study suggests viral connection to Alzheimer’s disease

Public Release: 21-Jun-2018

Arizona State University

Caption: HHV 6A and 7 are common herpesviruses to which most people are exposed as children. The two viruses were detected in higher abundance in brains with Alzheimer’s disease, and their activity appears related to certain hallmarks of the disease.

Credit: Graphic by Shireen Dooling for the Biodesign Institute at ASU

Of the major illnesses facing humanity, Alzheimer’s disease (AD) remains among the most pitiless and confounding. Over a century after its discovery, no effective prevention or treatment exists for this progressive deterioration of brain tissue, memory and identity. With more people living to older ages, there is a growing need to clarify Alzheimer’s disease risk factors and disease mechanisms and use this information to find new ways in which to treat and prevent this terrible disorder.

A first-of-its-kind study implicates another culprit in the path to Alzheimer’s disease: the presence of viruses in the brain.

In research appearing in the advanced online edition of the journal Neuron, scientists at the Arizona State University-Banner Neurodegenerative Disease Research Center (NDRC) and their colleagues at the Icahn School of Medicine at Mount Sinai used large data sets from clinically and neuropathologically characterized brain donors and sophisticated “big data” analysis tools to make sense of both the genes that are inherited and those that are preferentially turned on or off in the brains of persons with Alzheimer’s disease. They provide multiple lines of evidence to suggest that certain species of herpesviruses contribute to the development of this disorder.

The new work brings science a step closer to clarifying the mechanisms by which infectious agents may play important roles in the disease. To achieve this, the team capitalized on DNA and RNA sequencing data from 622 brain donors with the clinical and neuropathological features of Alzheimer’s disease and 322 brain donors without the disease, drawing on data generated by the NIH-sponsored Accelerated Medicines Partnership for Alzheimer’s Disease (AMP-AD).

Whole-exome DNA sequencing provided detailed information about each person’s inherited genes. RNA sequencing from several brain regions provided detailed information about the genes that are expressed differently in donors with and without the disease.

Clinical assessments performed before the research participants died provided detailed information about their trajectory of cognitive decline, and neuropathological assessments performed after they died provided relevant neuropathological information, including the severity of amyloid plaques and tangles, the cardinal features of Alzheimer’s disease. Sophisticated computational tools were used to develop a kind of grand unified picture of the viral-AD nexus.

Big challenges, big data

Big data-driven analyses offer a particularly powerful approach for exploring diseases like Alzheimer’s, which involve many interdependent variables acting in concert in profoundly complex systems. In the current study, researchers explored viral presence in six key brain regions known to be highly vulnerable to the ravages of AD. (It is now accepted that damaging effects to these areas often precede clinical diagnosis of the disease by several decades.)

The study identifies high levels of human herpesvirus (HHV) 6A and 7 in brain samples showing signs of AD neuropathology, compared with the lower levels found in normal brains. Further, through the careful comparison of large data sets of viral RNA and DNA with networks of human genes associated with AD and signposts of neuropathology, the study offers the first hints of the viral mechanisms that could trigger or exacerbate the disease.

The findings, originally hinted at from samples provided by Translational Genomics (TGen) in Phoenix, were confirmed in the Mount Sinai Brain Bank, and then replicated in samples from the Mayo Clinic Brain Bank, Rush Alzheimer’s Disease Center, and the Banner-Sun Health Research Institute’s Brain and Body Donation Program.

Uninvited guests

According to Ben Readhead, lead author of the new study, the researchers’ general goal was to discover disease mechanisms, including those that could be targeted by repurposed or investigational drug therapies. “We didn’t go looking for viruses, but viruses sort of screamed out at us,” Readhead said.

Although the study found a number of common viruses in normal aging brains, viral abundance of two key viruses, HHV 6A and 7, was greater in brains stricken with Alzheimer’s.

“We were able to use a range of network biology approaches to tease apart how these viruses may be interacting with human genes we know are relevant to Alzheimer’s,” Readhead said.

Readhead is an assistant research professor in the NDRC, housed at ASU’s Biodesign Institute. Much of the research described in the new study was performed in the laboratory of Joel Dudley, associate professor of Genetics and Genomic Sciences at the Icahn School of Medicine at Mount Sinai, associate research professor in the NDRC, and senior author of the paper in Neuron.

The nature and significance of viruses and other pathogens in the brain are currently hot topics in neuroscience, though the exploration is still in its early stages. One of the primary questions is whether such pathogens play an active, causative role in the disease or enter the brain simply as opportunistic passengers, taking advantage of the neural deterioration characteristic of AD.

“Previous studies of viruses and Alzheimer’s have always been very indirect and correlative. But we were able to perform a more sophisticated computational analysis using multiple levels of genomic information measured directly from affected brain tissue. This analysis allowed us to identify how the viruses are directly interacting with or coregulating known Alzheimer’s genes,” says Dudley. “I don’t think we can answer whether herpesviruses are a primary cause of Alzheimer’s disease. But what’s clear is that they’re perturbing and participating in networks that directly underlie Alzheimer’s pathophysiology.”

Network news

The new study uses a network biology approach to holistically incorporate molecular, clinical and neuropathological features of AD with viral activity in the brain. Using techniques in bioinformatics, the study integrates high-throughput data into probabilistic networks that are postulated to account for the associations between herpes viruses and the telltale effects of AD.

The networks described suggest that the hallmarks of AD may arise as collateral damage caused by the brain’s response to viral insult. According to the so-called pathogen hypothesis of AD, the brain reacts to infection by engulfing viruses with the protein amyloid beta (Aβ), sequestering the invaders and preventing them from binding with cell surfaces and inserting their viral genetic payload into healthy cells.

As Readhead explains, “a number of viruses looked interesting. We saw a key virus, HHV 6A, regulating the expression of quite a few AD risk genes and genes known to regulate the processing of amyloid, a key ingredient in AD neuropathology.” (Amyloid accumulates into characteristic plaques in the brain. These plaques, along with neurofibrillary tangles formed by another protein, known as tau, are the microscopic brain abnormalities used to diagnose Alzheimer’s.)

Both HHV 6A and 7 are common herpesviruses belonging to the genus Roseolovirus. Most people are exposed to them early in life. The likely route of entry for such viruses is through the nasopharyngeal lining. The higher abundance of these viruses in AD-affected brains may initiate an immune cascade leading to deterioration and cell death, or act in other ways to promote AD.

Mounting evidence

The results from human brain tissue were further supplemented by mouse studies. Here, researchers examined the effect of depleting miR155, a small snippet of RNA (a microRNA) that is an important regulator of the innate and adaptive immune systems. Results showed increased deposition of amyloid plaques in miR155-depleted mice, coupled with behavioral changes. As the authors note, HHV 6A is known to deplete miR155, lending further weight to a viral contribution to AD.

The new research is the fruitful result of close working relationships among researchers from Arizona State University, Banner, Mount Sinai, and other research organizations, as well as public-private partnerships in AMP-AD.

“This study illustrates the promise of leveraging human brain samples, emerging big data analysis methods, converging findings from experimental models, and intensely collaborative approaches in the scientific understanding of Alzheimer’s disease and the discovery of new treatments,” said study co-author Eric Reiman, Executive Director of the Banner Alzheimer’s Institute and University Professor of Neuroscience at Arizona State University. “We are excited about the chance to capitalize on this approach to help in the scientific understanding, treatment and prevention of Alzheimer’s and other neurodegenerative diseases.”

Enemy with a thousand faces

In the meantime, Alzheimer’s continues its devastating trajectory. Among the many challenges facing researchers is the fact that the earliest effects of the disease on vulnerable brain regions occur 20 or 30 years before memory loss, confusion, mood changes and other clinical symptoms appear. Without a cure or effective treatment, AD is expected to strike a new victim in the United States every 33 seconds by mid-century and costs are projected to exceed $1 trillion annually.

The research study does not suggest that Alzheimer’s disease is contagious. But if viruses or other infections are confirmed to have roles in the pathogenesis of Alzheimer’s, it could set the stage for researchers to find novel anti-viral or immune therapies to combat the disease, even before the onset of symptoms.

###

More on this research is available in announcements from the Icahn School of Medicine at Mount Sinai, NIH/National Institute on Aging and Cell Press, the publisher of Neuron.

More info: NIH/National Institute on Aging: https://bit.ly/2HRBzi6

Additional contributors to the study include: Center for NFL Neurological Care, Department of Neurology, New York; James J. Peters VA Medical Center, New York; Arizona Alzheimer’s Consortium, Phoenix, AZ; Department of Psychiatry, University of Arizona, Tucson, AZ; Banner Alzheimer’s Institute, Phoenix, AZ; Neurogenomics Division, Translational Genomics Research Institute, Phoenix, AZ; Institute for Systems Biology, Seattle, WA.

Postmortem brain tissue was collected through the NIH-designated NeuroBioBank (NBB) System that contributes to support of the Mount Sinai VA/Alzheimer’s Disease Research Center Brain Bank (AG005138).

The Dudley Laboratory at the Icahn School of Medicine at Mount Sinai has an institutional partnership with Banner-ASU Neurodegenerative Disease Research Center.

Dr. Vahram Haroutunian from the Mount Sinai School of Medicine is Director of the NeuroBioBank.

Additional postmortem data collection was supported through funding by NIA grants P50 AG016574, R01 AG032990, U01 AG046139, R01 AG018023, U01 AG006576, U01 AG006786, R01 AG025711, R01 AG017216, R01 AG003949, R01 NS080820, Cure PSP Foundation, and support from Mayo Foundation, U24 NS072026, P30 AG19610, Michael J. Fox Foundation for Parkinson’s Research P30AG10161, R01AG15819, R01AG17917, R01AG30146, R01AG36836, U01AG32984, U01AG46152, the Illinois Department of Public Health, and the Translational Genomics Research Institute.

Additional work performed in this study was supported by U01 AG046170, R56AG058469, and philanthropic financial support was provided by Katherine Gehl.

About the Biodesign Institute at Arizona State University

The Biodesign Institute at Arizona State University works to improve human health and quality of life through its translational research mission in health care, energy and the environment, global health and national security. Grounded on the premise that scientists can best solve complex problems by emulating nature, Biodesign serves as an innovation hub that fuses previously separate areas of knowledge to serve as a model for 21st century academic research. By fusing bioscience/biotechnology, nanoscale engineering and advanced computing, Biodesign’s research scientists and students take an entrepreneurial team approach to accelerating discoveries to market. They also educate future generations of scientists by providing hands-on laboratory research training in state-of-the-art facilities for ASU students.

About the Mount Sinai Health System

The Mount Sinai Health System is New York City’s largest integrated delivery system encompassing seven hospital campuses, a leading medical school, and a vast network of ambulatory practices throughout the greater New York region. Mount Sinai’s vision is to produce the safest care, the highest quality, the highest satisfaction, the best access and the best value of any health system in the nation.

The System includes approximately 7,100 primary and specialty care physicians; 10 joint-venture ambulatory surgery centers; more than 140 ambulatory practices throughout the five boroughs of New York City, Westchester, Long Island, and Florida; and 31 affiliated community health centers. The Icahn School of Medicine is one of 3 medical schools that have earned distinction by multiple indicators: ranked in the top 20 by U.S. News & World Report’s “Best Medical Schools”, aligned with a U.S. News & World Report’s “Honor Roll” Hospital, No. 13 in the nation for National Institutes of Health funding, and among the top 10 most innovative research institutions as ranked by the journal Nature in its Nature Innovation Index. This reflects a special level of excellence in education, clinical practice, and research. The Mount Sinai Hospital is ranked No. 18 on U.S. News & World Report’s “Honor Roll” of top U.S. hospitals; it is one of the nation’s top 20 hospitals in Cardiology/Heart Surgery, Diabetes/Endocrinology, Gastroenterology/GI Surgery, Geriatrics, Nephrology, and Neurology/Neurosurgery, and in the top 50 in four other specialties in the 2017-2018 “Best Hospitals” issue. Mount Sinai’s Kravis Children’s Hospital also is ranked in six out of ten pediatric specialties by U.S. News & World Report. The New York Eye and Ear Infirmary of Mount Sinai is ranked 12th nationally for Ophthalmology and 50th for Ear, Nose, and Throat, while Mount Sinai Beth Israel, Mount Sinai St. Luke’s and Mount Sinai West are ranked regionally.

For more information, visit https://www.mountsinai.org or find Mount Sinai on Facebook, Twitter and YouTube.

About AMP-AD: The Alzheimer’s disease initiative is a project of the Accelerating Medicines Partnership, a joint venture among the National Institutes of Health, the Food and Drug Administration, 12 biopharmaceutical and life science companies and 13 non-profit organizations, managed by the Foundation for the NIH, to identify and validate promising biological targets of disease. AMP-AD is one of the four initiatives under the AMP umbrella; the other three are focused on type 2 diabetes (AMP-T2D), rheumatoid arthritis and systemic lupus erythematosus (AMP-RA/SLE) and Parkinson’s disease (AMP-PD). The AMP-AD knowledge portal already has over 1300 total users. To learn more about the AMP-AD Target Discovery and Preclinical Validation Project please visit: https://www.nia.nih.gov/research/amp-ad.

About the National Institute on Aging: The NIA leads the federal government effort conducting and supporting research on aging and the health and well-being of older people. The NIA is designated as the lead NIH institute for information on Alzheimer’s disease. It provides information on age-related cognitive change and neurodegenerative disease, including participation in clinical studies, specifically on its Alzheimer’s website.

About the National Institutes of Health (NIH): NIH, the nation’s medical research agency, includes 27 Institutes and Centers and is a component of the U.S. Department of Health and Human Services. NIH is the primary federal agency conducting and supporting basic, clinical, and translational medical research, and is investigating the causes, treatments, and cures for both common and rare diseases. For more information about NIH and its programs, visit http://www.nih.gov.

Author Contacts:

Joel Dudley
Icahn School of Medicine at Mount Sinai
joel.dudley@mssm.edu

Sam Gandy
Icahn School of Medicine at Mount Sinai
samuel.gandy@mssm.edu

Ben Readhead
Banner-ASU, Neurodegenerative Disease Research Center & Icahn School of Medicine at Mount Sinai
ben.readhead@asu.edu, ben.readhead@mssm.edu

Media Contacts:

Joseph Caspermeyer
Managing Editor
Biodesign Institute at ASU
(480) 727-0369
Joseph.Caspermeyer@asu.edu

Rachel Zuckerman
Associate Director of Media
Mount Sinai Health System
rachel.zuckerman@mountsinai.org
(917) 816-3475

Joe Balintfy
Office of Communications and Public Liaison
National Institute on Aging
(301) 496-1752
nianews3@mail.nih.gov

Why being left-handed matters for mental health treatment

Public Release: 18-Jun-2018

Cornell University

ITHACA, N.Y. – Treatment for the most common mental health problems could be ineffective or even detrimental to about 50 percent of the population, according to a radical new model of emotion in the brain.

Since the 1970s, hundreds of studies have suggested that each hemisphere of the brain is home to a specific type of emotion. Emotions linked to approaching and engaging with the world – like happiness, pride and anger – live in the left side of the brain, while emotions associated with avoidance – like disgust and fear – are housed in the right.

But those studies were done almost exclusively on right-handed people. That simple fact has given us a skewed understanding of how emotion works in the brain, according to Daniel Casasanto, associate professor of human development and psychology at Cornell University.

That longstanding model is, in fact, reversed in left-handed people, whose emotions like alertness and determination are housed in the right side of their brains, Casasanto suggests in a new study. Even more radical: The location of a person’s neural systems for emotion depends on whether they are left-handed, right-handed or somewhere in between, the research shows.

The study, “Approach motivation in human cerebral cortex,” is published in Philosophical Transactions of the Royal Society B: Biological Sciences.

According to the new theory, called the “sword and shield hypothesis,” the way we perform actions with our hands determines how emotions are organized in our brains. Sword fighters of old would wield their swords in their dominant hand to attack the enemy — an approach action — and raise their shields with their non-dominant hand to fend off attack — an avoidance action. Consistent with these action habits, results show that approach emotions depend on the hemisphere of the brain that controls the dominant “sword” hand, and avoidance emotions on the hemisphere that controls the non-dominant “shield” hand.

The work has implications for a current treatment for recalcitrant anxiety and depression called neural therapy. Similar to the technique used in the study and approved by the Food and Drug Administration, it involves a mild electrical or magnetic stimulation to the left side of the brain to encourage approach-related emotions.

But Casasanto’s work suggests the treatment could be damaging for left-handed patients. Stimulation on the left would decrease life-affirming approach emotions. “If you give left-handers the standard treatment, you’re probably going to make them worse,” Casasanto said.

“And because many people are neither strongly right- nor left-handed, the stimulation won’t make any difference for them, because their approach emotions are distributed across both hemispheres,” he said.

“This suggests strong righties should get the normal treatment, but they make up only 50 percent of the population. Strong lefties should get the opposite treatment, and people in the middle shouldn’t get the treatment at all.”

However, Casasanto cautions that this research studied only healthy participants and more work is needed to extend these findings to a clinical setting.

###

The research was funded by a James S. McDonnell Foundation Scholar Award and the National Science Foundation.

Cornell University has television, ISDN and dedicated Skype/Google+ Hangout studios available for media interviews.

Genes found only in humans influence brain size

Public Release: 31-May-2018

New genes arose in human ancestors just before a dramatic increase in brain size and are involved in genetic defects associated with neurological disorders

University of California – Santa Cruz

IMAGE: Researchers studied the effects of NOTCH2NL genes in cortical organoids grown from human embryonic stem cells. Immunofluorescence staining shows markers for radial glia (green) and cortical neurons (red).

A set of three nearly identical genes found only in humans appear to play a critical role in the development of our large brains, according to a study led by researchers at the University of California, Santa Cruz.

The genes appeared between 3 and 4 million years ago, just before the period when fossils show a dramatic increase in the brain sizes of human ancestors. In modern humans, the genes are involved in genetic defects associated with neurological disorders.

Published May 31 in Cell, the study represents more than five years of work to characterize the genes, their role in neurological development, and their evolutionary origins. They belong to an ancient family of genes known as Notch genes, first discovered in fruit flies and named for a genetic defect causing notched wings.

“This is a family of genes that goes back hundreds of millions of years in evolutionary history and is known to play important roles in embryonic development. To find that humans have a new member of this family that is involved in brain development is extremely exciting,” said senior author David Haussler, professor of biomolecular engineering and scientific director of the UC Santa Cruz Genomics Institute.

The site of the genes on the long arm of chromosome 1 is involved in genetic defects in which large segments of DNA are either duplicated or deleted, leading to neurological disorders known collectively as 1q21.1 deletion/duplication syndrome. Deletions are often associated with microcephaly (abnormally small head size) and autism, while duplications are often associated with macrocephaly (abnormally large head size) and schizophrenia.

The new human-specific Notch genes were derived from NOTCH2, one of four previously known mammalian Notch genes, through a duplication event that inserted an extra partial copy of NOTCH2 into the genome. This happened in an ancient ape species that was a common ancestor of humans, chimpanzees, and gorillas. The partial duplicate was a nonfunctional “pseudogene,” versions of which are still found in chimp and gorilla genomes. In the human lineage, however, this pseudogene was “revived” when additional NOTCH2 DNA was copied into its place, creating a functional gene. This new gene was then duplicated several more times, resulting in four related genes, called NOTCH2NL genes, found only in humans.

One of the four NOTCH2NL genes appears to be a nonfunctional pseudogene, but the other three (NOTCH2NLA, NOTCH2NLB, and NOTCH2NLC) are active genes that direct the production of truncated versions of the original NOTCH2 protein. Notch proteins are involved in signaling between and within cells. In many cases, the Notch signaling pathway regulates the differentiation of stem cells in developing organs throughout the body, telling stem cells when to become, for example, mature heart cells or neurons.

“Notch signaling was already known to be important in the developing nervous system,” said senior author Sofie Salama, a research scientist at UC Santa Cruz. “NOTCH2NL seems to amplify Notch signaling, which leads to increased proliferation of neural stem cells and delayed neural maturation.”

The NOTCH2NL genes are especially active in the pool of neural stem cells thought to generate most of the cortical neurons. By delaying their maturation, the genes allow a larger pool of these stem cells (called “radial glia”) to build up in the developing brain, ultimately leading to a larger number of mature neurons in the neocortex (the outer layer of the brain in mammals; in humans, it hosts higher cognitive functions such as language and reasoning).

This delayed development of cortical neurons fits a pattern of delayed maturation characteristic of human development, Haussler said. “One of our most distinguishing features is larger brains and delayed brain development, and now we’re seeing molecular mechanisms supporting this evolutionary trend even at a very early stage of brain development,” he said.

Salama noted that the new genes are just one of many factors that contribute to cortical development in humans. “NOTCH2NL doesn’t act in a vacuum, but it arose at a provocative time in human evolution, and it is associated with neural developmental disorders. That combination makes it especially interesting,” she said.

The DNA copying errors that created the NOTCH2NL genes in the first place are the same type of errors that cause the 1q21.1 deletion/duplication syndrome. These errors tend to occur in places on the chromosomes where there are long stretches of nearly identical DNA sequences.

“These long segments of DNA that are almost identical can confuse the replication machinery and cause instability in the genome,” Haussler explained. “We may have gained our larger brains in part through the duplication of these genes, but at the expense of greater instability in that region of chromosome 1, which makes us susceptible to the deletion/duplication syndrome.”

Long stretches of repetitive DNA also present challenges for DNA sequencing technologies. In fact, the location of NOTCH2NL in the human reference genome was not accurate when Haussler’s team first started investigating it.

“When we looked at the reference genome to see where NOTCH2NL was, we found that it was near the area involved in the 1q21.1 syndrome, but not part of the region that was deleted or duplicated,” Haussler said. “This explains why the gene was not looked at before by geneticists studying the syndrome.”

After checking other genome data and contacting the team working on the next iteration of the reference genome, however, Haussler found that NOTCH2NL is in fact located in the interval where the defects occur. The new reference genome (the 38th version, released in late 2013) also shows the additional copies of the gene. Haussler’s team subsequently showed that the duplications or deletions in the syndrome result in an increase or decrease (respectively) in the number of copies of NOTCH2NL genes in the affected person’s genome. Other genes are also duplicated or deleted and may also be involved in the syndrome.

Interestingly, these genetic changes do not always result in neurological disorders. In about 20 to 50 percent of affected children, the syndrome is the result of a new genetic mistake, but in many cases one of the parents is found to also carry the genetic defect, without showing any apparent symptoms. According to Haussler, this is not uncommon in genetic diseases and underscores the importance of multiple factors in the development of disease.

“It’s amazing how often we find people with what seem to be serious genetic conditions, yet something else compensates for it,” he said.

The investigation of these genes began in 2012 when Frank Jacobs, now at the University of Amsterdam and the third senior author of the paper, was working with Haussler and Salama at UC Santa Cruz as a postdoctoral researcher. His project involved coaxing human embryonic stem cells to differentiate into neurons and studying the genes that are expressed during this process. As the cells develop into cortical neurons in the petri dish, they self-organize into a layered structure like a miniature version of the brain’s cortex, which researchers call a “cortical organoid.”

Jacobs was comparing gene expression patterns in cortical organoids grown from embryonic stem cells of humans and rhesus monkeys. Many genes showed differences in the timing and amount of expression, but NOTCH2NL was exceptional. “It was screaming hot in human cells and zero in rhesus. Rhesus cells just don’t have this gene,” Salama said. “Finding a new Notch gene in humans set us off on a long journey.”

Haussler, a Howard Hughes Medical Institute investigator, said he remembers presenting their initial findings in 2013 to scientists at HHMI. “Their general reaction was, ‘Well, it’s amazing if it’s true, but we’re not convinced yet.’ So we spent the next five years working to convince everybody.”

The development of the CRISPR/Cas9 system for making genetic modifications provided a crucial tool for their work. Salama’s team used it to delete the NOTCH2NL genes from human embryonic stem cells. Cortical organoids grown from these cells showed an acceleration of neural maturation and were smaller in size than organoids from normal cells. The researchers also inserted NOTCH2NL genes into mouse embryonic stem cells and showed that the genes promote Notch signaling and delay neural maturation in mouse cortical organoids.

“The fact that we can genetically manipulate stem cells with CRISPR and then grow them into cortical organoids in the lab is extremely powerful,” Haussler said. “My dream for decades has been to peer into human evolution at the level of individual genes and gene functions. It’s incredibly exciting that we’re able to do that now.”

A major part of the research involved careful and precise sequencing of the region of chromosome 1 where the NOTCH2NL genes are located, in 8 normal individuals and 6 patients with 1q21.1 deletion/duplication syndrome. (The researchers also analyzed the genomes of three archaic humans – two Neanderthals and one Denisovan – finding in all of them the same three active NOTCH2NL genes that are present in modern humans.)

The sequencing results showed that the NOTCH2NL genes are variable in modern humans. The researchers identified eight different versions of NOTCH2NL, and Haussler said there are probably more. Each version has a slightly different DNA sequence, but it remains unclear what effects these differences may have.

“We’ve found that all of them can promote Notch signaling. They behaved in subtly different ways when we tested them in cell cultures, but we have a lot more work to do before we can start to get a handle on what this means,” Salama said.

Other genes involved in human brain development seem to have arisen through a duplication process similar to the creation of NOTCH2NL. A notable example is the gene SRGAP2C, which is thought to increase the number of connections between neurons. Locations in the genome where such duplications and rearrangements occur repeatedly, known as “duplication hubs,” make up about 5 percent of the human genome and seem to have been important in human evolution, Haussler said.

###

The first authors of the paper are Ian Fiddes, a graduate student working with Haussler at UC Santa Cruz, and Gerrald Lodewijk, a graduate student working with Jacobs at the University of Amsterdam. Other coauthors include researchers at Stanford University, UC San Francisco, University of Washington, Broad Institute of MIT and Harvard, Medical Genetics Service in Lausanne, Switzerland, and Institute of Genetic Medicine in Newcastle upon Tyne, U.K. This work was supported by the Howard Hughes Medical Institute, U.S. National Institutes of Health, European Research Council, California Institute for Regenerative Medicine, Netherlands Organization for Scientific Research (NWO), and European Molecular Biology Organization.

On current trends, almost a quarter of people in the world will be obese by 2045, and 1 in 8 will have type 2 diabetes

Public Release: 22-May-2018

European Association for the Study of Obesity

New research from various cities around the world, presented at this year’s European Congress on Obesity in Vienna, Austria (23-26 May), demonstrates that if current trends continue, almost a quarter (22%) of the world’s people will be obese by 2045 (up from 14% in 2017), and one in eight (12%) will have type 2 diabetes (up from 9% in 2017).

The study presented by Dr Alan Moses of Novo Nordisk Research and Development, Søborg, Denmark and Niels Lund of Novo Nordisk Health Advocacy, Bagsværd, Denmark and colleagues from the Steno Diabetes Centre, Gentofte, Denmark, and University College London, UK, also indicates that in order to prevent the prevalence of type 2 diabetes from going above 10% in 2045, global obesity levels must be reduced by 25%.

Population data for all countries in the world were obtained from the Non-communicable Disease Risk Factor Collaboration (a WHO database). For each country, the population was divided into age groups. For 2000-2014 (chosen because data are most reliable from 2000 onwards), the population in each age group was divided into body mass index (BMI) categories. For each country and age group, the share of people in each BMI class was projected forward. The diabetes risk for each age and BMI group was then applied, allowing diabetes prevalence to be estimated for each country for each year. The prevalence for each country was calibrated to match the International Diabetes Federation’s regional estimates, thereby taking into account differences in lifestyle, nutrition and genetic predisposition to diabetes.
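As a rough illustration of this kind of calculation (this is not the authors’ actual model, and every number below is a hypothetical placeholder), diabetes prevalence for one age group can be estimated as a risk-weighted sum over that group’s BMI distribution:

```python
# Illustrative sketch only: combining the projected share of an age group
# in each BMI class with an assumed per-class diabetes risk.
# All figures are hypothetical placeholders, not study data.

bmi_shares = {"normal": 0.55, "overweight": 0.30, "obese": 0.15}
diabetes_risk = {"normal": 0.04, "overweight": 0.09, "obese": 0.20}

def diabetes_prevalence(shares, risk):
    """Weighted sum of per-BMI-class risk over the group's BMI distribution."""
    return sum(shares[c] * risk[c] for c in shares)

prev = diabetes_prevalence(bmi_shares, diabetes_risk)
print(f"Estimated prevalence: {prev:.1%}")  # 0.55*0.04 + 0.30*0.09 + 0.15*0.20 = 7.9%
```

Repeating this for every age group and year, then averaging with population weights, gives the country-level prevalence that is afterwards calibrated to the International Diabetes Federation’s regional estimates.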

In 2014, these three institutions collaborated to launch the Cities Changing Diabetes programme to accelerate the global fight against urban diabetes. The program began with eight cities: Copenhagen, Rome, Houston, Johannesburg, Vancouver, Mexico City, Tianjin, Shanghai. These have since been joined by a further seven cities: Beijing, Buenos Aires, Hangzhou, Koriyama, Leicester, Mérida and Xiamen. The programme has established local partnerships in these 15 cities to address the social factors and cultural determinants that can increase type 2 diabetes vulnerability among people living in their cities. Part of this work included projections of obesity and diabetes based on both current trends and on a global target scenario. The research has led to an increased understanding of the different challenges each city is faced with regarding genetic, environmental and social determinants of diabetes in that city.

The startling global projection is that, on current trends, obesity prevalence worldwide will rise from 14% in 2017 to 22% in 2045. Diabetes prevalence will increase from 9.1% to 11.7% over the same period, placing further massive strain on health systems that already spend huge sums just to treat diabetes.

Although immediate action will not quickly reverse the epidemic of diabetes and obesity, it is essential to begin work now to prevent new cases of obesity and diabetes. The authors’ model suggests that, in order to stabilise global diabetes prevalence at 10%, obesity prevalence must fall steadily, in total by around a quarter, from the current level of 14% to just over 10% by 2045.

The authors note that the above numbers are for the ‘global’ scenario. Individual countries display individual trends and should have their own targets. For example, if current trends in the USA continue, obesity will increase from 39% in 2017 to 55% in 2045, and diabetes rates from 14% to 18%. To keep diabetes rates in the USA stable between 2017 and 2045, obesity must fall from 38% today to 28%. And in the United Kingdom, current trends predict that obesity will rise from 32% today to 48% in 2045, while diabetes levels will rise from 10.2% to 12.6%, a 28% rise. To stabilise UK diabetes rates at 10%, obesity prevalence must fall from 32% to 24%.

“These numbers underline the staggering challenge the world will face in the future in terms of numbers of people who are obese, or have type 2 diabetes, or both. As well as the medical challenges these people will face, the costs to countries’ health systems will be enormous,” says Dr Moses. “The global prevalence of obesity and diabetes is projected to increase dramatically unless prevention of obesity is significantly intensified. Developing effective global programs to reduce obesity offer the best opportunity to slow or stabilise the unsustainable prevalence of diabetes. The first step must be the recognition of the challenge that obesity presents and the mobilisation of social service and disease prevention resources to slow the progression of these two conditions.”

He adds: “Each country is different based on unique genetic, social and environmental conditions which is why there is no ‘one size fits all’ approach that will work. Individual countries must work on the best strategy for them.”

He concludes: “Despite the challenge all countries are facing with obesity and diabetes, the tide can be turned – but it will take aggressive and coordinated action to reduce obesity and individual cities should play a key role in confronting the issues around obesity, some of which are common to them all and others that are unique to each of them.”

Smarter brains run on sparsely connected neurons

Public Release: 17-May-2018

Princeton researchers crowdsource brain mapping with gamers, discover 6 new neuron types

Caption

By turning a time-intensive research problem into an interactive game, Princeton neuroscientist Sebastian Seung has built an unprecedented data set of neurons, which he is now turning over to the public via the Eyewire Museum. These 17 retinal neurons, mapped by Eyewire gamers, include ganglion cell types in blue and green and amacrine cells in yellow and red.

Credit: Image by Alex Norton, Eyewire

With the help of a quarter-million video game players, Princeton researchers have created and shared detailed maps of more than 1,000 neurons — and they’re just getting started.

“Working with Eyewirers around the world, we’ve made a digital museum that shows off the intricate beauty of the retina’s neural circuits,” said Sebastian Seung, the Evnin Professor in Neuroscience and a professor of computer science and the Princeton Neuroscience Institute (PNI). The related paper is published May 17 in the journal Cell.

Seung is unveiling the Eyewire Museum, an interactive archive of neurons available to the general public and neuroscientists around the world, including the hundreds of researchers involved in the federal Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative.

“This interactive viewer is a huge asset for these larger collaborations, especially among people who are not physically in the same lab,” said Amy Robinson Sterling, a crowdsourcing specialist with PNI and the executive director of Eyewire, the online gaming platform for the citizen scientists who have created this data set.

“This museum is something like a brain atlas,” said Alexander Bae, a graduate student in electrical engineering and one of four co-first authors on the paper. “Previous brain atlases didn’t have a function where you could visualize by individual cell, or a subset of cells, and interact with them. Another novelty: Not only do we have the morphology of each cell, but we also have the functional data, too.”

The neural maps were developed by Eyewirers, members of an online community of video game players who have devoted hundreds of thousands of hours to painstakingly piecing together these neural cells, using data from a mouse retina gathered in 2009.

Eyewire pairs machine learning with gamers who trace the twisting and branching paths of each neuron. Humans are better at visually identifying the patterns of neurons, so every player’s moves are recorded and checked against each other by advanced players and Eyewire staffers, as well as by software that is improving its own pattern recognition skills.
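The cross-checking of players’ moves can be sketched as a simple consensus vote; the data layout and two-vote threshold below are hypothetical illustrations, not Eyewire’s actual pipeline:

```python
# Minimal consensus-tracing sketch, loosely inspired by the article's
# description of checking players' moves against each other.
# Voxel coordinates and the vote threshold are made up for illustration.
from collections import Counter

def consensus_voxels(traces, min_votes):
    """Keep voxels marked by at least `min_votes` independent players."""
    votes = Counter(v for trace in traces for v in set(trace))
    return {v for v, n in votes.items() if n >= min_votes}

# Three players trace overlapping voxel sets within one cube
player_traces = [
    {(0, 0, 0), (0, 0, 1), (0, 1, 1)},
    {(0, 0, 0), (0, 0, 1)},
    {(0, 0, 1), (1, 1, 1)},
]
print(sorted(consensus_voxels(player_traces, min_votes=2)))  # [(0, 0, 0), (0, 0, 1)]
```

In the real system, disputed regions would go to advanced players, staff, and the pattern-recognition software rather than being settled by a fixed threshold alone.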

Since Eyewire’s launch in 2012, more than 265,000 people have signed onto the game, and they’ve collectively colored in more than 10 million 3-D “cubes,” resulting in the mapping of more than 3,000 neural cells, of which about a thousand are displayed in the museum.

Each cube is a tiny subset of a single cell, about 4.5 microns across, so a 10-by-10 block of cubes would be the width of a human hair. Every cell is reviewed by between 5 and 25 gamers before it is accepted into the system as complete.

“Back in the early years it took weeks to finish a single cell,” said Sterling. “Now players complete multiple neurons per day.” The Eyewire user experience stays focused on the larger mission — “For science!” is a common refrain — but it also replicates a typical gaming environment, with achievement badges, a chat feature to connect with other players and technical support, and the ability to unlock privileges with increasing skill. “Our top players are online all the time — easily 30 hours a week,” Sterling said.

Dedicated Eyewirers have also contributed in other ways, including donating the swag that gamers win during competitions and writing program extensions “to make game play more efficient and more fun,” said Sterling, including profile histories, maps of player activity, a top 100 leaderboard and ever-increasing levels of customizability.

“The community has really been the driving force behind why Eyewire has been successful,” Sterling said. “You come in, and you’re not alone. Right now, there are 43 people online. Some of them will be admins from Boston or Princeton, but most are just playing — now it’s 46.”

For science!

With 100 billion neurons linked together via trillions of connections, the brain is immeasurably complex, and neuroscientists are still assembling its “parts list,” said Nicholas Turner, a graduate student in computer science and another of the co-first authors. “If you know what parts make up the machine you’re trying to break apart, you’re set to figure out how it all works,” he said.

The researchers have started by tackling Eyewire-mapped ganglion cells from the retina of a mouse. “The retina doesn’t just sense light,” Seung said. “Neural circuits in the retina perform the first steps of visual perception.”

The retina grows from the same embryonic tissue as the brain, and while much simpler than the brain, it is still surprisingly complex, Turner said. “Hammering out these details is a really valuable effort,” he said, “showing the depth and complexity that exists in circuits that we naively believe are simple.”

The researchers’ fundamental question is identifying exactly how the retina works, said Bae. “In our case, we focus on the structural morphology of the retinal ganglion cells.”

“Why the ganglion cells of the eye?” asked Shang Mu, an associate research scholar in PNI and fellow first author. “Because they’re the connection between the retina and the brain. They’re the only cell class that go back into the brain.” Different types of ganglion cells are known to compute different types of visual features, which is one reason the museum has linked shape to functional data.

Using Eyewire-produced maps of 396 ganglion cells, the researchers in Seung’s lab successfully classified these cells more thoroughly than has ever been done before.

“The number of different cell types was a surprise,” said Mu. “Just a few years ago, people thought there were only 15 to 20 ganglion cell types, but we found more than 35 — we estimate between 35 and 50 types.”

Of those, six appear to be novel, in that the researchers could not find any matching descriptions in a literature search.

A brief scroll through the digital museum reveals just how remarkably flat the neurons are — nearly all of the branching takes place along a two-dimensional plane. Seung’s team discovered that different cells grow along different planes, with some reaching high above the nucleus before branching out, while others spread out close to the nucleus. Their resulting diagrams resemble a rainforest, with ground cover, an understory, a canopy and an emergent layer overtopping the rest.

All of these are subdivisions of the inner plexiform layer, one of the five previously recognized layers of the retina. The researchers also identified a “density conservation principle” that they used to distinguish types of neurons.

One of the biggest surprises of the research project has been the extraordinary richness of the original sample, said Seung. “There’s a little sliver of a mouse retina, and almost 10 years later, we’re still learning things from it.”

###

“Digital museum of retinal ganglion cells with dense anatomy and physiology,” by Alexander Bae, Shang Mu, Jinseop Kim, Nicholas Turner, Ignacio Tartavull, Nico Kemnitz, Chris Jordan, Alex Norton, William Silversmith, Rachel Prentki, Marissa Sorek, Celia David, Devon Jones, Doug Bland, Amy Sterling, Jungman Park, Kevin Briggman, Sebastian Seung and the Eyewirers, was published May 17 in the journal Cell with DOI 10.1016/j.cell.2018.04.040. The research was supported by the Gatsby Charitable Foundation, National Institute of Health-National Institute of Neurological Disorders and Stroke (U01NS090562 and 5R01NS076467), Defense Advanced Research Projects Agency (HR0011-14-2- 0004), Army Research Office (W911NF-12-1-0594), Intelligence Advanced Research Projects Activity (IARPA) (D16PC00005), KT Corporation, Amazon Web Services Research Grants, Korea Brain Research Institute (2231-415) and Korea National Research Foundation Brain Research Program (2017M3C7A1048086).

UCLA biologists ‘transfer’ a memory

Public Release: 14-May-2018

 

Research in marine snails could lead to new treatments to restore memories and alter traumatic ones

University of California – Los Angeles

IMAGE

IMAGE: This is David Glanzman holding a marine snail.

Credit: Christelle Snow/UCLA

UCLA biologists report they have transferred a memory from one marine snail to another by injecting RNA, creating an artificial memory. The research could lead to new ways of using RNA to lessen the trauma of painful memories and to restore lost memories.

“I think in the not-too-distant future, we could potentially use RNA to ameliorate the effects of Alzheimer’s disease or post-traumatic stress disorder,” said David Glanzman, senior author of the study and a UCLA professor of integrative biology and physiology and of neurobiology. The team’s research is published May 14 in eNeuro, the online journal of the Society for Neuroscience.

RNA, or ribonucleic acid, has been widely known as a cellular messenger that makes proteins and carries out DNA’s instructions to other parts of the cell. It is now understood to have other important functions besides protein coding, including regulation of a variety of cellular processes involved in development and disease.

The researchers gave mild electric shocks to the tails of a species of marine snail called Aplysia. The snails received five tail shocks, one every 20 minutes, and then five more 24 hours later. The shocks enhance the snail’s defensive withdrawal reflex, a response it displays for protection from potential harm. When the researchers subsequently tapped the snails, they found those that had been given the shocks displayed a defensive contraction that lasted an average of 50 seconds, a simple type of learning known as “sensitization.” Those that had not been given the shocks contracted for only about one second.

The life scientists extracted RNA from the nervous systems of marine snails that received the tail shocks the day after the second series of shocks, and also from marine snails that did not receive any shocks. Then the RNA from the first (sensitized) group was injected into seven marine snails that had not received any shocks, and the RNA from the second group was injected into a control group of seven other snails that also had not received any shocks.

Remarkably, the scientists found that the seven that received the RNA from snails that were given the shocks behaved as if they themselves had received the tail shocks: They displayed a defensive contraction that lasted an average of about 40 seconds.

“It’s as though we transferred the memory,” said Glanzman, who is also a member of UCLA’s Brain Research Institute.

As expected, the control group of snails did not display the lengthy contraction.

Next, the researchers added RNA to Petri dishes containing neurons extracted from different snails that did not receive shocks. Some dishes had RNA from marine snails that had been given electric tail shocks, and some dishes contained RNA from snails that had not been given shocks. Some of the dishes contained sensory neurons, and others contained motor neurons, which in the snail are responsible for the reflex.

When a marine snail is given electric tail shocks, its sensory neurons become more excitable. Interestingly, the researchers discovered, adding RNA from the snails that had been given shocks also produced increased excitability in sensory neurons in a Petri dish; it did not do so in motor neurons. Adding RNA from a marine snail that was not given the tail shocks did not produce this increased excitability in sensory neurons.

In the field of neuroscience, it has long been thought that memories are stored in synapses. (Each neuron has several thousand synapses.) Glanzman holds a different view, believing that memories are stored in the nucleus of neurons.

“If memories were stored at synapses, there is no way our experiment would have worked,” said Glanzman, who added that the marine snail is an excellent model for studying the brain and memory.

Scientists know more about the cell biology of this simple form of learning in this animal than about any other form of learning in any other organism, Glanzman said. The cellular and molecular processes seem to be very similar between the marine snail and humans, even though the snail has about 20,000 neurons in its central nervous system and humans are thought to have about 100 billion.

In the future, Glanzman said, it is possible that RNA can be used to awaken and restore memories that have gone dormant in the early stages of Alzheimer’s disease. He and his colleagues published research in the journal eLife in 2014 indicating that lost memories can be restored.

There are many kinds of RNA, and in future research, Glanzman wants to identify the types of RNA that can be used to transfer memories.

###

Co-authors are Alexis Bédécarrats, a UCLA postdoctoral scholar who worked in Glanzman’s laboratory; and Shanping Chen, Kaycey Pearce and Diancai Cai, research associates in Glanzman’s laboratory.

The research was funded by the National Institute of Neurological Disorders and Stroke, the National Institute of Mental Health and the National Science Foundation.

Oral antibiotics linked to increased kidney stone risk for several years after use

PUBLIC RELEASE: 10-MAY-2018

Risk appears to be highest among children

AMERICAN SOCIETY OF NEPHROLOGY

Highlights

  • Use of oral antibiotics was linked with an increased risk of developing kidney stones.
  • Risk decreased over time but was still elevated several years after antibiotic use.
  • Risk was highest for young patients.

Washington, DC (May 10, 2018) — The potential to promote antibiotic resistance in bacteria isn’t the only reason to avoid using antibiotics when possible. A new study reveals that antibiotics are also linked with an increased risk of developing kidney stones, with the greatest risk among children. The findings appear in an upcoming issue of the Journal of the American Society of Nephrology (JASN).

For reasons that are unclear, the prevalence of kidney stones–or nephrolithiasis–has increased 70% over the last 30 years, with the most disproportionate increase experienced by children and adolescents. Because perturbations in bacterial communities residing in the intestines and urinary tract have been associated with nephrolithiasis, a team led by Gregory Tasian MD, MSc, MSCE and Michelle Denburg MD, MSCE (The Children’s Hospital of Philadelphia) examined whether the use of antibiotics might affect individuals’ risk of developing the condition.

For their study, the investigators determined the association between 12 classes of oral antibiotics and nephrolithiasis in a population-based study within 641 general practices providing electronic health record data for >13 million children and adults from 1994 to 2015 in the United Kingdom. The team matched 25,981 patients with nephrolithiasis to 259,797 controls by age, sex, and practice at the date of diagnosis (termed the index date).

Exposure to any one of five different antibiotic classes 3-12 months before the index date was associated with nephrolithiasis. Risks were increased 2.3, 1.9, 1.7, 1.7, and 1.3 times for sulfas, cephalosporins, fluoroquinolones, nitrofurantoin/methenamine, and broad-spectrum penicillins, respectively. The risk of nephrolithiasis decreased over time but remained elevated 3-5 years after the antibiotic prescription. The risk was also greatest for exposures at younger ages. Previous research has shown that children receive more antibiotics than any other age group, and that 30% of antibiotics prescribed during ambulatory care visits are inappropriate.
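For illustration, the kind of association behind figures like “risks were increased 2.3-times” is an exposure odds ratio computed from a 2x2 table of cases and controls; the counts below are invented for the sketch and are not the study’s data:

```python
# Hedged illustration of an exposure odds ratio in a case-control design.
# The counts are hypothetical, chosen only to show the arithmetic.

def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Cross-product ratio from a 2x2 exposure-by-outcome table."""
    return (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)

# Hypothetical counts: antibiotic exposure 3-12 months before the index date
result = odds_ratio(exposed_cases=300, unexposed_cases=700,
                    exposed_controls=1500, unexposed_controls=8500)
print(f"Odds ratio: {result:.2f}")  # (300*8500)/(700*1500) = 2.43
```

The published estimates would additionally be adjusted for the matching variables (age, sex, and practice) and other covariates, which a raw cross-product ratio does not capture.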

“These findings demonstrate that exposure to certain antibiotics is a novel risk factor for kidney stones and that the risk may be greatest when exposure to these antibiotics occurs at younger ages,” said Dr. Tasian. “Consequently, these results suggest that the risk of nephrolithiasis may be decreased by reducing inappropriate antibiotic exposure and choosing alternative antibiotics, particularly for those patients who are at increased risk of stone formation.”

###

Study co-authors include Thomas Jemielita, PhD, David S. Goldfarb, MD, Lawrence Copelovitch, MD, Jeffrey Gerber MD, PhD, MSCE, and Qufei Wu, MS.

Disclosures: The authors have no conflicts of interest to declare.

The article, entitled “Oral Antibiotic Exposure and Kidney Stone Disease,” will appear online at http://jasn.asnjournals.org/ on May 10, 2018, doi: 10.2215/ASN.2017111213.

The content of this article does not reflect the views or opinions of The American Society of Nephrology (ASN). Responsibility for the information and views expressed therein lies entirely with the author(s). ASN does not offer medical advice. All content in ASN publications is for informational purposes only, and is not intended to cover all possible uses, directions, precautions, drug interactions, or adverse effects. This content should not be used during a medical emergency or for the diagnosis or treatment of any medical condition. Please consult your doctor or other qualified health care provider if you have any questions about a medical condition, or before taking any drug, changing your diet or commencing or discontinuing any course of treatment. Do not ignore or delay obtaining professional medical advice because of information accessed through ASN. Call 911 or your doctor for all medical emergencies.

Since 1966, ASN has been leading the fight to prevent, treat, and cure kidney diseases throughout the world by educating health professionals and scientists, advancing research and innovation, communicating new knowledge, and advocating for the highest quality care for patients. ASN has nearly 18,000 members representing 112 countries. For more information, please visit http://www.asn-online.org or contact the society at 202-640-4660.

Dark chocolate improves vision within 2 hours

 

Contrast sensitivity and visual acuity were significantly higher 2 hours after consumption of a dark chocolate bar compared with a milk chocolate bar, but the duration of these effects and their influence on real-world performance await further testing.

Rabin JC, Karunathilake N, Patrizi K. Effects of Milk vs Dark Chocolate Consumption on Visual Acuity and Contrast Sensitivity Within 2 Hours: A Randomized Clinical Trial. JAMA Ophthalmol. Published online April 26, 2018. doi:10.1001/jamaophthalmol.2018.0978

Found: A new form of DNA in our cells (**NOT** a double-stranded DNA double helix)

Public Release: 23-Apr-2018

 

Scientists have tracked down an elusive ‘tangled knot’ of DNA

Garvan Institute of Medical Research

IMAGE

IMAGE: This is an artist’s impression of the i-motif DNA structure inside cells, along with the antibody-based tool used to detect it

Credit: Chris Hammang

It’s DNA, but not as we know it.

In a world first, Australian researchers have identified a new DNA structure – called the i-motif – inside cells. A twisted ‘knot’ of DNA, the i-motif has never before been directly seen inside living cells.

The new findings, from the Garvan Institute of Medical Research, are published today in the leading journal Nature Chemistry.

Deep inside the cells in our body lies our DNA. The information in the DNA code – all 6 billion A, C, G and T letters – provides precise instructions for how our bodies are built, and how they work.

The iconic ‘double helix’ shape of DNA has captured the public imagination since 1953, when James Watson and Francis Crick famously uncovered the structure of DNA. However, it’s now known that short stretches of DNA can exist in other shapes, in the laboratory at least – and scientists suspect that these different shapes might play an important role in how and when the DNA code is ‘read’.

The new shape looks entirely different to the double-stranded DNA double helix.

“When most of us think of DNA, we think of the double helix,” says Associate Professor Daniel Christ (Head, Antibody Therapeutics Lab, Garvan) who co-led the research. “This new research reminds us that totally different DNA structures exist – and could well be important for our cells.”

“The i-motif is a four-stranded ‘knot’ of DNA,” says Associate Professor Marcel Dinger (Head, Kinghorn Centre for Clinical Genomics, Garvan), who co-led the research with A/Prof Christ.

“In the knot structure, C letters on the same strand of DNA bind to each other – so this is very different from a double helix, where ‘letters’ on opposite strands recognise each other, and where Cs bind to Gs [guanines].”

Although researchers have seen the i-motif before and have studied it in detail, it has only been witnessed in vitro – that is, under artificial conditions in the laboratory, and not inside cells.

In fact, scientists in the field have debated whether i-motif ‘knots’ would exist at all inside living things – a question that is resolved by the new findings.

To detect the i-motifs inside cells, the researchers developed a precise new tool – a fragment of an antibody molecule – that could specifically recognise and attach to i-motifs with a very high affinity. Until now, the lack of an antibody that is specific for i-motifs has severely hampered the understanding of their role.

Crucially, the antibody fragment didn’t detect DNA in helical form, nor did it recognise ‘G-quadruplex structures’ (a structurally similar four-stranded DNA arrangement).

With the new tool, researchers uncovered the location of ‘i-motifs’ in a range of human cell lines. Using fluorescence techniques to pinpoint where the i-motifs were located, they identified numerous spots of green within the nucleus, which indicate the position of i-motifs.

“What excited us most is that we could see the green spots – the i-motifs – appearing and disappearing over time, so we know that they are forming, dissolving and forming again,” says Dr Mahdi Zeraati, whose research underpins the study’s findings.

The researchers showed that i-motifs mostly form at a particular point in the cell’s ‘life cycle’ – the late G1 phase, when DNA is being actively ‘read’. They also showed that i-motifs appear in some promoter regions (areas of DNA that control whether genes are switched on or off) and in telomeres, ‘end sections’ of chromosomes that are important in the aging process.

Dr Zeraati says, “We think the coming and going of the i-motifs is a clue to what they do. It seems likely that they are there to help switch genes on or off, and to affect whether a gene is actively read or not.”

“We also think the transient nature of the i-motifs explains why they have been so very difficult to track down in cells until now,” adds A/Prof Christ.

A/Prof Marcel Dinger says, “It’s exciting to uncover a whole new form of DNA in cells – and these findings will set the stage for a whole new push to understand what this new DNA shape is really for, and whether it will impact on health and disease.”

###

Media enquiries: Anna Sweeney (Garvan) – a.sweeney@garvan.org.au – 02 9295 8126/0437 282 467

Featured paper

Mahdi Zeraati, David B. Langley, Peter Schofield, Aaron L. Moye, Romain Rouet, William E. Hughes, Tracey M. Bryan, Marcel E. Dinger and Daniel Christ. I-motif DNA structures are formed in the nuclei of human cells. Nature Chemistry 2018 DOI: 10.1038/s41557-018-0046-3

Support

This work was supported by the National Health and Medical Research Council (Australia) and the Australian Research Council.

About the Garvan Institute

The Garvan Institute of Medical Research is one of Australia’s largest medical research institutions and is at the forefront of next-generation genomic DNA sequencing in Australia. Garvan’s main research areas are: cancer, diabetes and metabolism, genomics and epigenetics, immunology and inflammation, osteoporosis and bone biology, and neuroscience. Garvan’s mission is to make significant contributions to medical science that will change the directions of science and medicine and have major impacts on human health. http://www.garvan.org.au

Multiple sclerosis may be linked to sheep disease toxin

Public Release: 22-Apr-2018

 

University of Exeter

Exposure to a toxin primarily found in sheep could be linked to the development of multiple sclerosis (MS) in humans, new research suggests. Carried out by the University of Exeter and MS Sciences Ltd, the study found that people with MS are more likely than other people to have antibodies against epsilon toxin (ETX), suggesting they may have been exposed to the toxin at some time.

ETX, produced in the gut of livestock by the bacterium Clostridium perfringens, damages the animals’ brains and can kill them.

While the toxin can also occur in the gut of other animals, and even in soil, it has mostly been studied as the cause of a type of blood poisoning in sheep, known as enterotoxaemia.

“Our research suggests that there is a link between epsilon toxin and MS,” said Professor Rick Titball, of the University of Exeter.

“The causes of MS are still not fully understood and, while it’s possible that this toxin plays a role, it’s too early to say for certain.

“More research is now needed to understand how the toxin might play a role in MS, and how these findings might be used to develop new tests or treatments.”

Following reports that some MS patients in the US had antibodies against epsilon toxin, the Exeter researchers tested UK patients for such antibodies.

Using two different methods, the researchers found that 43% of MS patients were positive for antibodies to epsilon toxin, compared with 16% of people in a control group.

“There is a growing body of wider evidence that points to a hypothesis linking MS and ETX, and we are confident that these significant findings from our latest study will help people get even closer to an answer for the elusive triggers of MS,” said Simon Slater, Director of MS Sciences Ltd.

“If the link between epsilon toxin and MS is proven, then this would suggest that vaccination would be an effective treatment for its prevention or in the early stages of the disease.

“Interestingly, although epsilon toxin is known to be highly potent, no human vaccine has ever been developed.”

MS, most commonly diagnosed in people in their 20s and 30s, can affect the brain, causing a wide range of potential symptoms including problems with vision, arm or leg movement, sensation and balance.

It’s estimated that there are more than 100,000 people diagnosed with MS in the UK.

###

The research was funded by MS Sciences Ltd and the National Institute for Health Research Exeter Clinical Research Facility.

Samples for the research were provided by Barts Health NHS Trust, Imperial College London and the University of Exeter Medical School.

The paper, published in the Multiple Sclerosis Journal, is entitled: “Evidence of Clostridium perfringens epsilon toxin associated with multiple sclerosis.”

Skin cancers linked with up to a 92% reduced risk of Alzheimer’s disease

Public Release: 19-Apr-2018

 

Wiley

Previous studies have demonstrated a decreased risk of Alzheimer’s disease (AD) in individuals with various cancers, including non-melanoma skin cancers (including squamous cell cancers and basal cell cancers). A new Journal of the European Academy of Dermatology & Venereology study finds that this inverse relationship also holds true for malignant melanoma.

The study included patients aged 60-88 years with a clinical follow-up of at least 1 year and no diagnosis of AD or skin cancer at the beginning of the study. Of 1,147 patients who were later diagnosed with malignant melanoma, 5 were diagnosed with subsequent AD. Of 2,506 who were diagnosed with basal cell cancer, 5 had a subsequent AD diagnosis, and of 967 who were diagnosed with squamous cell cancer, only 1 had a subsequent AD diagnosis.

After adjustments, a diagnosis of malignant melanoma was associated with a 61% reduced risk of developing AD. For basal cell and squamous cell carcinomas, the reduced risks were 82% and 92%, respectively.

Rats sniff out TB in children

Public Release: 9-Apr-2018

 

Research shows that rats can detect tuberculosis in children with higher accuracy than standard microscopy tests

Springer

Rats are able to detect whether a child has tuberculosis (TB), and are much more successful at doing this than a commonly used basic microscopy test. These are the results of research led by Georgies Mgode of the Sokoine University of Agriculture in Tanzania. The study, published by Springer Nature in Pediatric Research, shows that when trained rats were given children’s sputum samples to sniff, the animals were able to pinpoint 68 percent more cases of TB infection than were detected through a standard smear test.

Inspiration for investigating the diagnosis of TB through smell came from anecdotal evidence that people suffering from the potentially fatal lung disease emit a specific odour. According to Mgode, current TB detection methods are far from perfect, especially in under-resourced countries in Sub-Saharan Africa and South East Asia where the disease is prevalent, and where a reasonably cheap smear test is commonly used. Problems with this type of test are that its accuracy varies depending on the quality of the sputum sample, and that very young children are often unable to provide enough sputum to be analysed.

“As a result, many children with TB are not bacteriologically confirmed or even diagnosed, which then has major implications for their possible successful treatment,” explains Mgode. “There is a need for new diagnostic tests to better detect TB in children, especially in low and middle-income countries.”

Previous work pioneered in Tanzania and Mozambique focussed on training African giant pouched rats (Cricetomys ansorgei) to pick up the scent of molecules released by the TB-causing Mycobacterium tuberculosis bacterium in sputum. The training technique is similar to one used to teach rats to detect vapours released by landmine explosives. In the case of TB, when a rat highlights a possibly infected sample, it is analysed further using a WHO-endorsed concentrated microscopy technique to confirm a positive diagnosis.

Sputum samples were obtained from 982 children under the age of five who had already been tested using a microscopy test at clinics in the Tanzanian city of Dar es Salaam. From the smear tests, 34 children were confirmed to have TB. When the same samples were placed out for the rats to examine, a further 57 cases were detected and then confirmed after being examined under a more advanced light emitting diode fluorescence microscope.

The news about the additional cases confirmed by endorsed concentrated smear microscopy was passed on to the relevant clinics, and efforts were made to track down infected patients so that they could start their much-needed treatment.

“This intervention involving TB screening by trained rats and community based patient tracking of new TB patients missed by hospitals enables treatment initiation of up to 70%. This is a significant proportion given that these additional patients were considered TB negative in hospitals, hence were initially left untreated,” adds Mgode.

###

Reference: Mgode, G.F. et al (2018). Pediatric tuberculosis detection using trained African giant pouched rats, Pediatric Research DOI:10.1038/pr.2018.40

Ancient origins of viruses discovered

Public Release: 4-Apr-2018

 

New study transforms understanding of virus origins and evolution

University of Sydney

Research published today in Nature has found that many of the viruses infecting us today have ancient evolutionary histories that date back to the first vertebrates and perhaps the first animals in existence.

The study, a collaboration between the University of Sydney, the China Center for Disease Control and Prevention and the Shanghai Public Health Clinical Centre, looked for RNA viruses in 186 vertebrate species previously ignored when it came to viral infections.

The researchers discovered 214 novel RNA viruses (where the genomic material is RNA rather than DNA) in apparently healthy reptiles, amphibians, lungfish, ray-finned fish, cartilaginous fish and jawless fish.

“This study reveals some groups of virus have been in existence for the entire evolutionary history of the vertebrates – it transforms our understanding of virus evolution,” said Professor Eddie Holmes, of the Marie Bashir Institute for Infectious Diseases & Biosecurity at the University of Sydney.

“For the first time we can definitively show that RNA viruses are many millions of years old, and have been in existence since the first vertebrates existed.

“Fish, in particular, carry an amazing diversity of viruses, and virtually every type of virus family detected in mammals is now found in fish. We even found relatives of both Ebola and influenza viruses in fish.”

However, Professor Holmes was also quick to emphasise that these fish viruses do not pose a risk to human health and should be viewed as a natural part of virus biodiversity.

“This study emphasises just how big the universe of viruses – the virosphere – really is. Viruses are everywhere.

“It is clear that there are still many millions more viruses still to be discovered,” he said.

The newly discovered viruses appeared in every family or genus of RNA virus associated with vertebrate infection, including those containing human pathogens such as influenza virus.

Because the evolutionary histories of the viruses generally matched those of their vertebrate hosts, the researchers were able to conclude that these viruses had long evolutionary histories.

We’ll pay more for unhealthy foods we crave, neuroscience research finds

Public Release: 2-Apr-2018

 

New York University

We’ll pay more for unhealthy foods when we crave them, new neuroscience research finds. The study also shows that we’re willing to pay disproportionately more for higher portion sizes of craved food items.

The research, which appears in the journal Proceedings of the National Academy of Sciences (PNAS), identifies an obstacle to healthy living.

“Our results indicate that even if people strive to eat healthier, craving could overshadow the importance of health by boosting the value of tempting, unhealthy foods relative to healthier options,” explains Anna Konova, a postdoctoral researcher in NYU’s Center for Neural Science and the paper’s lead author. “Craving, which is pervasive in daily life, may nudge our choices in very specific ways that help us acquire those things that made us feel good in the past–even if those things may not be consistent with our current health goals.”

The study’s other co-authors were Kenway Louie, an NYU research assistant professor, and Paul Glimcher, an NYU professor and director of NYU’s Institute for the Interdisciplinary Study of Decision Making.

There is growing interest across several sectors–marketing, psychology, economics, and medicine–in understanding how our psychological states and physiological needs affect our behavior as consumers. Of particular concern is craving, which has long been recognized as a state of mind that contributes to addiction and, in recent years, to eating disorders and obesity.

Yet, the researchers note, little is known about the nature of craving and its impact on our choices and behavior.

In their PNAS study, the scientists conducted a series of experiments that asked subjects to indicate how much they’d pay for certain snack foods after they developed a craving for one of them–significant differences in a desire for a specific food item (e.g., a Snickers or granola bar) before and after exposure to the item constituted cravings.

The results showed that people were willing to pay more for the exact same snack food item after they were exposed to it and asked to recall specific memories of consuming it, relative to before this exposure. Notably, this occurred even when the study’s subjects were equally hungry before and after the exposure, suggesting that craving and hunger are partly distinct experiences.

“In other words, craving Snickers does not make you hungrier; it makes you desire Snickers specifically,” explains Louie, who adds that there was also a spillover effect as it applied, to some degree, to similar food items that subjects were never exposed to (e.g., other chocolate, nut, and caramel candy bars).

Moreover, the researchers found stronger effects–bigger changes in the willingness to pay for an item the subjects craved–when the items were higher-calorie, higher-fat/sugar content foods, such as a chocolate bar or cheese puffs, relative to healthier options (e.g., a granola bar).

Finally, the experiments revealed a connection among craving, portion, and price. That is, people were willing to pay disproportionately more for higher portion sizes of the craved items.

“It appears that craving boosts or multiplies the economic value of the craved food,” says Konova.

###

This work was supported by grants from the National Institute on Drug Abuse (R01DA038063 and F32DA039648), part of the National Institutes of Health, and the Brain and Behavior Research Foundation (NARSAD Young Investigator Grant #25387).

DOI: 10.1073/pnas.1714443115

The Six Inch Tall ‘Atacama Skeleton’ was indeed human

Public Release: 22-Mar-2018

Once-mysterious ‘Atacama Skeleton’ illuminates genetics of bone disease

UCSF, Stanford scientists sequence genome of ‘Ata’ and find new mutations

University of California – San Francisco

The skeleton, discovered in a leather pouch behind an abandoned church, was pristine: a tiny figure, just six inches long, with a cone-shaped head, 10 pairs of ribs, and bones that looked like those of an eight-year-old child. Found in the Atacama Desert of Chile and later affectionately nicknamed “Ata,” the skeleton made its way onto the black market for archeological finds and then to a collector in Spain who thought it might be the remains of an extraterrestrial being.

But a forensic analysis of Ata’s genome by scientists at UC San Francisco and Stanford University has proved beyond a doubt that it is human. Ata has the DNA of a modern human female with the mix of Native American and European ancestral markers one would expect from someone who lived near the place where she was found. And her arresting appearance, which scientists refer to as a phenotype, can most likely be explained by a handful of rare genetic mutations–some already known, others newly discovered–that are linked to dwarfism and other bone and growth disorders.

Their discoveries, published Thursday, March 22, 2018, in Genome Research, do more than lay to rest the fable of Ata’s extraterrestrial origins. They also illustrate how far open-source genetic data has come in enabling the sort of needle-in-a-haystack analysis that can pinpoint the handful of mutated genes–out of more than 2.7 million single-nucleotide variants (SNVs) in Ata’s genome–that were most likely to be associated with the unusual shape of her body.

“The bioinformatics analyses in the paper showcase the power and wealth of information available in the public domain that led to the discovery of novel and rare deleterious variants in genes associated with Ata’s phenotype,” said Sanchita Bhattacharya, a bioinformatics researcher at the UCSF Institute for Computational Health Sciences (ICHS). “The analysis was even more challenging with a very limited amount of information about the specimen, and lack of family history, which makes it a unique case.”

Bhattacharya used the Human Phenotype Ontology (HPO), a database that links genomic data to the abnormal phenotypes found in human disease, everything from atrial septal defect, or a hole in the chambers of the heart, to musculoskeletal abnormalities.

In an initial analysis Bhattacharya found 64 gene variants that seemed likely to be damaging. She fed them into the HPO database, and to her amazement, most of the possible phenotypes the program homed in on had to do with the skeletal system, including “proportionate short stature” and “11 pairs of ribs.” Ata had 10 pairs, a phenotype that had never been observed.

“The moment I saw it, I could see there was something interesting going on there,” Bhattacharya said. “It was a little amount of information, and I am not a bone expert. This was a very blinded analysis.”

The results revealed four new SNVs–a type of genetic mutation at the individual level–in genes that were known to cause bone diseases, like scoliosis or dislocations, as well as two more SNVs in genes involved in producing collagen.

While esoteric, the analysis of Ata’s genome points toward the clinical genetics of the future, said Atul Butte, MD, PhD, who directs ICHS and is the Priscilla Chan and Mark Zuckerberg Distinguished Professor at UCSF.

With rapidly accumulating genetic data, Butte said, scientists can take a “backwards” approach to diagnosis. Instead of starting with a description of the disease and looking for a mutated gene to explain it, they start with the patient’s raw genetic material to see how it differs from a normal, or “reference” set of samples. The genetic variations that pop out of this comparison then reveal, in an unbiased way, what processes are at work in the patient to create disease.
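The “backwards” workflow Butte describes can be illustrated with a minimal sketch: start from a patient’s variants, keep only the rare, predicted-damaging ones, then look up the phenotypes those genes are associated with. The variant records, frequencies, and gene-to-phenotype map below are invented examples; the phenotype lookup merely stands in for a real resource such as the Human Phenotype Ontology used in the study.

```python
# Illustrative sketch of variant-first ("backwards") diagnosis.
# All data here is made up for demonstration purposes.

variants = [
    {"gene": "COL1A1", "pop_freq": 0.0001, "damaging": True},
    {"gene": "BRCA2",  "pop_freq": 0.2500, "damaging": False},  # common -> filtered out
    {"gene": "FLNB",   "pop_freq": 0.0002, "damaging": True},
]

# Hypothetical gene -> phenotype lookup, standing in for a database
# like the Human Phenotype Ontology mentioned in the article.
phenotypes = {
    "COL1A1": "abnormal collagen / bone fragility",
    "FLNB":   "skeletal dysplasia",
}

# Keep variants that are both rare in the reference population
# and predicted to be damaging.
candidates = [
    v["gene"] for v in variants
    if v["pop_freq"] < 0.01 and v["damaging"]
]

print([phenotypes[g] for g in candidates])
```

In the real analysis the filtering criteria and annotation sources are far richer, but the logic is the same: the genome, not a prior disease hypothesis, drives which phenotypes are considered.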

“Analyzing a puzzling sample like the Ata genome can teach us how to handle current medical samples, which may be driven by multiple mutations,” Butte said. “When we study the genomes of patients with unusual syndromes, there may be more than one gene or pathway involved genetically, which is not always considered.”

Garry Nolan, PhD, a professor of microbiology and immunology at the Stanford University School of Medicine, began the scientific exploration of Ata in 2012, when a friend called saying he might have found an “alien.”

Nolan believes further research into Ata’s precocious bone aging could one day benefit patients. “Maybe there’s a way to accelerate bone growth in people who need it, people who have bad breaks,” he said. “Nothing like this had been seen before. Certainly, nobody had looked into the genetics of it.”

But Nolan also said that he hopes, one day, little Ata will be given a proper burial. Far from being a visitor from another planet, Ata’s genome marked her as South American, with genetic variations that identified her as being from the Andean region inhabited by the Chilean Chilote Indians. Judging from the skeleton’s intact condition, he said, it is probably no more than 40 years old.

“We now know that it’s a child, and probably either a pre- or post-term birth and death,” he said. “I think it should be returned to the country of origin and buried according to the customs of the local people.”

###

Other authors of the study include, Matthew Kan and Shann-Ching Chen, of UCSF; Alexandra Sockell, Felice Bava, Xuhuai Ji, Ralph Lachman and Carlos Bustamante, of Stanford; Jian Li, Narges Asadi and Hugo Lam, of Roche Sequencing Solutions; Emery Smith of Ultra Intelligence Corporation; and Maria Avila-Arcos, of the National Autonomous University of Mexico.

About UCSF: UC San Francisco (UCSF) is a leading university dedicated to promoting health worldwide through advanced biomedical research, graduate-level education in the life sciences and health professions, and excellence in patient care. It includes top-ranked graduate schools of dentistry, medicine, nursing and pharmacy; a graduate division with nationally renowned programs in basic, biomedical, translational and population sciences; and a preeminent biomedical research enterprise. It also includes UCSF Health, which comprises top-ranked hospitals, UCSF Medical Center and UCSF Benioff Children’s Hospitals in San Francisco and Oakland – and other partner and affiliated hospitals and healthcare providers throughout the Bay Area. Please visit http://www.ucsf.edu/news.

Follow UCSF

Major US prostate cancer study, heavily flawed and maybe useless

Public Release: 21-Mar-2018

Analysis shows influential US prostate study not representative of real-world patients

European Association of Urology

Copenhagen: An analysis of 3 US cancer databases has shown that a major US study comparing surgery with observation in early prostate cancer patients, the PIVOT study, used patients who didn’t properly reflect the average US patient. Researchers found that patients in the PIVOT trial were between 3 and 8 times more likely to die than real-world patients. This may call into question the conclusions of the study, which are now being implemented in the US and worldwide. The analysis was presented at the European Association of Urology congress (EAU18) in Copenhagen on 17 March, following publication as a letter in the peer-reviewed journal European Urology1.

The PIVOT2 study was a near 20-year study of 731 men with low-, intermediate-, and high-risk prostate cancer. The study was reported in a paper in the NEJM in 20173, with the most important finding being that there was almost no difference in the overall mortality between patients undergoing surgery and those who opted for observation (although those treated reported more side-effects).

Presenting in Copenhagen, Dr Firas Abdollah (Detroit) said: “The direct clinical implication of the PIVOT study is that we should abandon surgery in virtually all prostate cancer patients, and limit our management to observation. However, in most experts’ opinion, this would result in a significant increase in the number of men with metastatic prostate cancer, and in those who will succumb to the disease.”

The PIVOT study enrolled men with localized prostate cancer (median PSA value, 7.8 ng per millilitre) who were then randomized to radical prostatectomy or observation at Department of Veterans Affairs and National Cancer Institute medical centres.

A new appraisal of the PIVOT study carried out by scientists at the Henry Ford Hospital, Detroit, compared the characteristics of the patients used in the PIVOT study with 3 large US databases, to see if the PIVOT database really reflected ‘real-world’ prostate cancer patients. They compared PIVOT with:

  • 60,089 men from the Surveillance, Epidemiology, and End Results (SEER; population-based registry) between 2000-2004
  • 63,303 men from the National Cancer Database (NCDB; hospital-based registry) from 2004-2005
  • 2,847 men diagnosed with prostate cancer in the PLCO trial between 1993 and 2001

They found that

  • The men in the PIVOT study were older and sicker than would be found in a normal population, which might have biased the results of the trial. Indeed, overall mortality in the PIVOT study was 64% over 12.7 years, whereas in the other databases it was between 8 and 23% over a similar timescale (7.5-12.3 years).
  • In addition, the men in the PIVOT trial had a mean age of 67 at diagnosis, compared with 65.8 (PLCO), 61.3 (SEER) and 60.2 (NCDB).

Lead author Dr Firas Abdollah said: “Our work shows that the PIVOT trial used a sample of patients who were not representative of the real population affected by prostate cancer. They were both older and sicker than we would have expected. We don’t have the data to say what comparing like for like would give us, although I think everyone would be surprised if it didn’t tip the survival data more towards surgical intervention. What this really means is that we need to wait until a definitive study can show the relative benefits of intervention versus observation.”

Commenting, Professor Hein Van Poppel (Leuven, Belgium), EAU Adjunct Secretary-general said:

“It was clear from the first PIVOT analysis in 2012, that surgery (radical prostatectomy) had an advantage over waiting in patients with a poor prognosis. Now this evaluation of the dataset used in PIVOT suggests that the balance needs to change even in early-stage prostate cancer patients. This raises significant questions over just how relevant PIVOT is to real prostate cancer patients, and we need to seriously re-evaluate the PIVOT study, before taking implementation any further.”

Professor van Poppel was not involved in this research – this is an independent comment.

###

1 http://www.europeanurology.com/article/S0302-2838(17)30985-5/fulltext

2 Prostate Cancer Intervention versus Observation Trial (PIVOT)

3 Follow-up of Prostatectomy versus Observation for Early Prostate Cancer, Wilt et al, NEJM 2017, http://www.nejm.org/doi/full/10.1056/NEJMoa1615869

There was no external funding for this research.

LSD blurs boundary between self and other

Public Release: 19-Mar-2018

 

Human brain imaging study of the drug identifies a serotonin receptor system as potential target for treating social impairments in depression, schizophrenia

Society for Neuroscience

 

IMAGE: LSD reduced activity in the posterior cingulate cortex and the temporal cortex, brain areas important for establishing one’s sense of self.

Credit: Preller et al., JNeurosci (2018)

A human brain imaging study published in JNeurosci finds that the hallucinogen lysergic acid diethylamide (LSD) alters the activity of brain regions involved in differentiating between oneself and another person.

Katrin Preller, Franz Vollenweider (University of Zurich) and colleagues investigated the role of the serotonin 2A receptor in social interaction, which is impaired in several psychiatric disorders. Human participants received either LSD; ketanserin, a drug that blocks the effects of LSD; or a placebo prior to engaging in a gaze-following game with a virtual human-like character.

By combining functional magnetic resonance imaging and eye-tracking, the researchers found that LSD interfered with participants’ ability to coordinate attention with the virtual character on a particular object on the screen. During this social task, LSD reduced activity in the posterior cingulate cortex and the temporal cortex, brain areas important for establishing one’s sense of self, and appeared to blur the line between the experimental conditions where either the participant or the virtual character took the lead in directing attention. These effects were blocked by ketanserin, indicating that this receptor system may be a target for treating social impairments in disorders that involve an increased self-focus, as in depression, or loss of the sense of self, as in schizophrenia.

###

Article: Role of the 5-HT2A receptor in self- and other-initiated social interaction in LSD-induced states – a pharmacological fMRI study

DOI: https://doi.org/10.1523/JNEUROSCI.1939-17.2018

Corresponding author:

Katrin Preller
University of Zurich, Switzerland
preller@bli.uzh.ch

About JNeurosci

JNeurosci, the Society for Neuroscience’s first journal, was launched in 1981 as a means to communicate the findings of the highest quality neuroscience research to the growing field. Today, the journal remains committed to publishing cutting-edge neuroscience that will have an immediate and lasting scientific impact, while responding to authors’ changing publishing needs, representing breadth of the field and diversity in authorship.

About The Society for Neuroscience

The Society for Neuroscience is the world’s largest organization of scientists and physicians devoted to understanding the brain and nervous system. The nonprofit organization, founded in 1969, now has nearly 37,000 members in more than 90 countries and over 130 chapters worldwide.

Cancer comes back all jacked up on stem cells

Public Release: 19-Mar-2018

University of Colorado Anschutz Medical Campus

 

IMAGE: Antonio Jimeno, MD, PhD, and University of Colorado Cancer Center colleagues analyze three tumor samples collected over time from a single patient to show how cancer evolves to resist treatment.

Credit: University of Colorado Cancer Center

After a biopsy or surgery, doctors often get a molecular snapshot of a patient’s tumor. This snapshot is important – knowing the genetics that cause a cancer can help match a patient with a genetically-targeted treatment. But recent work increasingly shows that tumors are not static – the populations of cells that make up a tumor evolve over time in response to treatment, often in ways that lead to treatment resistance. Instead of being defined by a snapshot, tumors are more like a movie. This means that a tumor that recurs after treatment may be much different than the tumor originally seen in a biopsy.

Which is why, as reported in the journal Clinical Cancer Research, it was a rare opportunity to collect three tumor samples over the course of three surgeries from a patient with salivary gland cancer.

“People talk about molecular evolution of cancer and we were able to show it in this patient. With these three samples, we could see across time how the tumor developed resistance to treatment,” says Daniel Bowles, MD, clinical and translational investigator at the University of Colorado Cancer Center and Head of Cancer Research at the Denver Veterans Administration Medical Center.

The major change had to do with the proportion of the tumor made up of cancer stem cells, often seen as the most capable of driving growth of the disease: A sample taken during the patient’s first surgery contained 0.2 percent cancer stem cells; a sample taken during the patient’s third surgery contained 4.5 percent cancer stem cells. Additionally, the later tumor had overall 50 percent more cancer-driving mutations, and lower activity of genes meant to suppress cancer.

“By the third surgery, the tumor was invasive and aggressive,” says Stephen Keysar, PhD, research assistant professor and basic investigator in the lab of senior author Antonio Jimeno, MD, PhD. Not only did the cellular makeup of the tumor change, increasing in the percentage of cancer stem cells, but, “all things being equal, if you compare a stem cell from the first surgery to stem cells from the third, the cells themselves became more aggressive,” says Keysar.

Bowles compares cancer treatment to attacking a weed: “Maybe what’s happening is the therapies are exfoliating the plant but not affecting the root,” he says. In this conceptualization, cancer therapies may kill the bulk of the cells that make up a tumor, but unless they affect the cancer stem cells – the “root” – the tumor may return.

“When you treat a tumor and it’s gone for a couple years and then comes back, it’s likely that a population of cancer stem cells survived treatment. These stem cells can then restart the cancer much later,” Keysar says.

Obtaining enough tumor tissue to analyze required growing patient samples on mice. This effort, supported by National Institutes of Health and philanthropic funds, led to the development of eight unique patient cell lines, some representing the first models of these salivary cancer subtypes.

“Importantly, as these models are based on human tumors, they can be used in the future to explore at the cellular and molecular level how specific genetic alterations regulate cancer development and resistance to therapy,” says collaborator Mary Reyland, PhD, professor in the CU School of Medicine Department of Pathology.

“In this relatively simple but groundbreaking research work, we integrated molecular and cancer stem cell biology to show that tumors adapt and ‘tool-up’ to overcome therapies, leading to relapse in our patients. By pairing two young researchers with complementary expertise, and developing complex animal models, we were able to demonstrate the evolution of salivary cancers and the tumorigenic cells that drive them,” Jimeno says.

“Cancers don’t ever come back better. At least I’ve never seen it,” Bowles says. “And now we know one important reason why.”

Death rates mysteriously skyrocket in England and Wales

Public Release: 14-Mar-2018

Health chiefs failing to investigate rising deaths in England and Wales, argue experts

Latest figures show over 10,000 extra deaths in first weeks of 2018 compared with previous years

Health chiefs are failing to investigate a clear pattern of rising death rates and worsening health outcomes in England and Wales, argue experts in The BMJ today.

Lucinda Hiam at the London School of Hygiene & Tropical Medicine and Danny Dorling at the University of Oxford say weekly mortality figures show 10,375 additional deaths (a rise of 12.4%) in England and Wales in the first seven weeks of 2018 compared with the previous five years.
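As a rough check on these figures, the baseline implied by the report can be recovered from the stated rise. This is a back-of-envelope sketch only, assuming the 12.4% is measured against the average of the previous five years for the same seven-week period:

```python
# Back-of-envelope check of the reported mortality figures.
# Assumption: the 12.4% rise is relative to the five-year average
# for the first seven weeks of the year.
extra_deaths = 10_375     # additional deaths, first 7 weeks of 2018
relative_rise = 0.124     # reported 12.4% increase

# Implied five-year-average baseline and implied 2018 total
baseline = extra_deaths / relative_rise
total_2018 = baseline + extra_deaths

print(f"implied baseline: {baseline:,.0f} deaths")
print(f"implied 2018 total: {total_2018:,.0f} deaths")
```

Under that assumption, the two numbers jointly imply a five-year average of roughly 83,700 deaths over the period and a 2018 total of roughly 94,000, which is internally consistent with the figures reported.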

This rise cannot be explained by ageing of the population, a flu epidemic, or cold weather – and no official explanation has been forthcoming as to why death rates have continued to be so high relative to previous trends, they write.

However, they note that the first seven weeks of 2018 were unusual in terms of the operation of the NHS.

On 2 January, after “an unprecedented step by NHS officials,” thousands of non-urgent operations were cancelled, a clear sign of a system struggling to cope. Many hospitals were already at or beyond their safe working levels, “with high numbers of frail patients stuck on wards for want of social care,” and a rise in influenza cases had begun.

However, they then show that influenza can only have accounted for a very small part of the overall rise in mortality in early 2018.

The past five years have been challenging in terms of health outcomes in the UK, they add. For example, spending on health and social care year on year has increased at a much slower rate than in previous years, while outcomes in a large number of indicators have deteriorated, including a very rapid recent increase in the numbers of deaths among mental health patients in care in England and Wales.

They point out that the Office for National Statistics has in the past 12 months reduced its projections of future life expectancy for both men and women in the UK by almost a year each and, in doing so, has estimated that more than a million lives will now end earlier than expected.

Mortality in infants born into the poorest families in the UK has also risen significantly since 2011.

Hiam and Dorling argue that there remains “a clear lack of consensus” over the reasons for the rise in deaths – and say they and others have already called for an urgent investigation by the Health Select Committee of the House of Commons.

“The latest figures for this year make the case for an investigation stronger and more urgent with each passing day,” they conclude.

One in four Americans suffer when exposed to common chemicals

Public Release: 14-Mar-2018

 


University of Melbourne

University of Melbourne research reveals that one in four Americans report chemical sensitivity, with nearly half this group medically diagnosed with Multiple Chemical Sensitivities (MCS), suffering health problems from exposure to common chemical products and pollutants such as insect spray, paint, cleaning supplies, fragrances and petrochemical fumes.

The research was conducted by Anne Steinemann, Professor of Civil Engineering and Chair of Sustainable Cities from the University of Melbourne School of Engineering, and published in the Journal of Occupational and Environmental Medicine. Professor Steinemann is an international expert on environmental pollutants, air quality, and health effects.

Professor Steinemann found the prevalence of chemical sensitivity has increased more than 200 per cent and diagnosed MCS has increased more than 300 per cent among American adults in the past decade. Across America, an estimated 55 million adults have chemical sensitivity or MCS.
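The prevalence and population figures can be cross-checked with simple arithmetic. This sketch assumes the “one in four” applies to the same US adult population from which the 55 million estimate is drawn:

```python
# Rough consistency check of the reported prevalence figures.
# Assumption: "one in four" refers to the US adult population
# underlying the 55 million estimate.
estimated_affected = 55_000_000   # adults with chemical sensitivity or MCS
prevalence = 0.25                 # one in four

implied_adult_population = estimated_affected / prevalence
print(f"implied US adult population: {implied_adult_population/1e6:.0f} million")
```

The two figures jointly imply an adult population of about 220 million, close to US census counts for the period, so the estimates are broadly consistent with each other.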

“MCS is a serious and potentially disabling disease that is widespread and increasing in the US population,” Professor Steinemann said.

The study used an online survey with a national random sample of 1,137 people, representative of age, gender and region, from a large web-based panel held by Survey Sampling International (SSI).

The study found that, when exposed to problematic sources, people with MCS experience a range of adverse health effects, from migraines and dizziness to breathing difficulties and heart problems. For 76 per cent of those affected, the effects are severe enough to be disabling.

“People with MCS are like human canaries. They react earlier and more severely to chemical pollutants, even at low levels,” Professor Steinemann said.

The study also found that 71 per cent of people with MCS are asthmatic, and 86.2 per cent with MCS report health problems from fragranced consumer products, such as air fresheners, scented laundry products, cleaning supplies, fragranced candles, perfume and personal care products.

In addition, an estimated 22 million Americans with MCS have lost work days or a job in the past year due to illness from exposure to fragranced consumer products in the workplace.

To reduce health risks and costs, Professor Steinemann recommends choosing products without any fragrance, and implementing fragrance-free policies in workplaces, health care facilities, schools and other indoor environments.

###

Download the full article, free of charge, on Professor Steinemann’s website: http://www.drsteinemann.com/publications.html (top article) or the Journal of Occupational and Environmental Medicine: https://journals.lww.com/joem/Fulltext/2018/03000/National_Prevalence_and_Effects_of_Multiple.17.aspx

Toothpaste shown to speed enamel erosion?

 


Enamel surface loss (SL) after each cycle, for anti-erosive toothpastes (light grey lines represent the control groups).

Public Release: 13-Mar-2018

Toothpaste alone does not prevent dental erosion or hypersensitivity

An analysis of nine toothpastes found that none of them protects enamel or prevents erosive wear. Specialists stress that diet and treatment by a dentist are key to avoiding the problems caused by dentin exposure.

Fundação de Amparo à Pesquisa do Estado de São Paulo

The rising prevalence of dental erosion and dentin hypersensitivity has led to the emergence of more and more toothpastes on the market that claim to treat these problems. While no such toothpaste existed 20 years ago, many brands with different claimed attributes are on offer today.

However, a study conducted at the University of Bern in Switzerland with the participation of a researcher supported by a scholarship from the São Paulo Research Foundation – FAPESP showed that none of the nine analyzed toothpastes was capable of mitigating enamel surface loss, a key factor in tooth erosion and dentin hypersensitivity.

“Research has shown that dentin must be exposed with open tubules in order for there to be hypersensitivity, and erosion is one of the causes of dentin exposure. This is why, in our study, we analyzed toothpastes that claim to be anti-erosive and/or desensitizing,” said Samira Helena João-Souza, a PhD scholar at the University of São Paulo’s School of Dentistry (FO-USP) in Brazil and first author of the article.

According to the article, published in Scientific Reports, all of the tested toothpastes caused differing amounts of enamel surface loss, and none afforded protection against enamel erosion and abrasion.

The authors of the study stressed that these toothpastes perform a function but that they should be used as a complement, not as a treatment, strictly speaking. According to João-Souza, at least three factors are required: treatment prescribed by a dentist, use of an appropriate toothpaste, and a change in lifestyle, especially diet.

“Dental erosion is multifactorial. It has to do with brushing and, above all, with diet. Food and drink are increasingly acidic as a result of industrial processing,” she said.

The researcher highlights that dental erosion is a chronic loss of dental hard tissue caused by acid without bacterial involvement – unlike caries, which is bacteria-related. When it is associated with mechanical action, such as brushing, it results in erosive wear. In these situations, patients typically experience discomfort when they drink or eat something cold, hot or sweet.

“They come to the clinic with the complaint that they have caries, but actually, the problem is caused by dentin exposure due to improper brushing with [a] very abrasive toothpaste, for example, combined with frequent consumption of large amounts of acidic foods and beverages,” said Professor Ana Cecília Corrêa Aranha, João-Souza’s supervisor and a co-author of the article.

“In our clinical work, we see patients with this problem in the cervical region between [the] gum and tooth. The enamel in this region is thinner and more susceptible to the problem,” she added.

Methodology

 

The scientists tested eight anti-erosive and/or desensitizing toothpastes and one control toothpaste, all of which are available from pharmacies and drugstores in Brazil or Europe.

The research simulated the effect on tooth enamel of brushing once a day, combined with exposure to an acid solution, over five consecutive days. The study used human premolars donated for scientific research purposes, artificial saliva, and an automatic brushing machine.

“We used a microhardness test to calculate enamel loss due to brushing with the toothpastes tested. The chemical analysis consisted of measuring toothpaste pH and levels of tin, calcium, phosphate and fluoride,” João-Souza explained.

The physical analysis consisted of weighing the abrasive particles contained in the toothpastes, measuring their size, and testing wettability – the ease with which toothpaste mixed with artificial saliva could be spread on the tooth surface.

“During brushing with these toothpastes mixed with artificial saliva, we found that the properties of the toothpastes were different, so we decided to broaden the scope of the analysis to include chemical and physical factors. This [broadening] made the study more comprehensive,” João-Souza said.

Statistically similar

All of the analyzed toothpastes caused progressive tooth surface loss over the five-day period. “None of them was better than the others. Which one is indicated will depend on each case. The test showed that some [toothpastes] caused less surface loss than others, but they all resembled the control toothpaste on this criterion. Statistically, they were all similar, although numerically, there were differences,” Aranha said.

“We’re now working on other studies relating to dentin in order to think about possibilities, given that none of these toothpastes was found capable of preventing dental erosion or dentin hypersensitivity, which is a cause of concern.”

The researchers plan to begin a more specific in vivo study that will also include pain evaluations.

###

About São Paulo Research Foundation (FAPESP)

The São Paulo Research Foundation (FAPESP) is a public institution with the mission of supporting scientific research in all fields of knowledge by awarding scholarships, fellowships and grants to investigators linked with higher education and research institutions in the State of São Paulo, Brazil. FAPESP is aware that the very best research can only be done by working with the best researchers internationally. Therefore, it has established partnerships with funding agencies, higher education, private companies, and research organizations in other countries known for the quality of their research and has been encouraging scientists funded by its grants to further develop their international collaboration. For more information: http://www.fapesp.br/en.

Is your stress changing my brain?

Public Release: 8-Mar-2018

 

University of Calgary researchers discover stress isn’t just contagious; it alters the brain on a cellular level

University of Calgary

In a new study in Nature Neuroscience, Jaideep Bains, PhD, and his team at the Cumming School of Medicine’s Hotchkiss Brain Institute (HBI) at the University of Calgary have discovered that stress transmitted from others can change the brain in the same way that real stress does. The study, in mice, also shows that the effects of stress on the brain are reversed in female mice following a social interaction. This was not true for male mice.

“Brain changes associated with stress underpin many mental illnesses including PTSD, anxiety disorders and depression,” says Bains, professor in the Department of Physiology and Pharmacology and member of the HBI. “Recent studies indicate that stress and emotions can be ‘contagious’. Whether this has lasting consequences for the brain is not known.”

The Bains research team studied the effects of stress in pairs of male or female mice. They removed one mouse from each pair, exposed it to a mild stress, and then returned it to its partner. They then examined the responses of a specific population of cells in each mouse, namely the CRH neurons that control the brain’s response to stress. This revealed that networks in the brains of both the stressed mouse and its naïve partner had been altered in the same way.

The study’s lead author, Toni-Lee Sterley, a postdoctoral associate in Bains’ lab said, “What was remarkable was that CRH neurons from the partners, who were not themselves exposed to an actual stress, showed changes that were identical to those we measured in the stressed mice.”

Next, the team used optogenetic approaches to engineer these neurons so that they could either turn them on or off with light. When the team silenced these neurons during stress, they prevented changes in the brain that would normally take place after stress. When they silenced the neurons in the partner during its interaction with a stressed individual, the stress did not transfer to the partner. Remarkably, when they activated these neurons using light in one mouse, even in the absence of stress, the brain of the mouse receiving light and that of the partner were changed just as they would be after a real stress.

The team discovered that the activation of these CRH neurons causes the release of a chemical signal, an ‘alarm pheromone’, from the mouse that alerts the partner. The partner who detects the signal can in turn alert additional members of the group. This propagation of stress signals reveals a key mechanism for transmission of information that may be critical in the formation of social networks in various species.

Another advantage of social networks is their ability to buffer the effects of adverse events. The Bains team also found evidence for buffering of stress, but this was selective. They noticed that in females the residual effects of stress on CRH neurons were cut almost in half following time with unstressed partners. The same was not true for males.

Bains suggests that these findings may also be present in humans. “We readily communicate our stress to others, sometimes without even knowing it. There is even evidence that some symptoms of stress can persist in family and loved ones of individuals who suffer from PTSD. On the flip side, the ability to sense another’s emotional state is a key part of creating and building social bonds.”

This research from the Bains lab indicates that stress and social interactions are intricately linked. The consequences of these interactions can be long-lasting and may influence behaviours at a later time.

###

In addition to Sterley and Bains, the paper’s other authors are Dinara Baimoukhametova, Tamás Füzesi, Agnieszka Zurek, Nuria Daviu, Neilen Rasiah and David Rosenegger.

The study was made possible through the generous contribution of longtime HBI donor Mr. Sanders Lee.

The research was supported by several funding sources including the Canadian Institutes for Health Research, Brain Canada, and Alberta Innovates.


US cancer treatment guidelines ‘often based on weak evidence’

Public Release: 7-Mar-2018

 

Findings question the underlying evidence for current guidelines

BMJ

Cancer treatment guidelines produced by the US National Comprehensive Cancer Network (NCCN) are often based on low quality evidence or no evidence at all, finds a study published by The BMJ today.

The researchers, led by Dr Vinay Prasad at Oregon Health & Science University, say their findings “raise concern that the NCCN justifies the coverage of costly, toxic cancer drugs based on weak evidence.”

NCCN guidelines are developed by a panel of cancer experts who make recommendations based on the best available evidence.

These recommendations are used by US private health insurers and social insurance schemes to make coverage decisions, and guide global cancer practice, but it is not clear how the evidence is gathered or reviewed.

In the US, the Food and Drug Administration (FDA) approves all new drugs and grants new indications for drugs already on the market. The NCCN makes recommendations both within and outside of FDA approvals, but patterns of NCCN recommendations beyond FDA approvals have not been analysed.

So Dr Prasad and his team compared FDA approvals of cancer drugs with NCCN recommendations in March 2016 for a contemporary sample of drugs. When the NCCN made recommendations beyond the FDA’s approvals, the evidence used to support those recommendations was evaluated.

A total of 47 new cancer drugs were approved by the FDA for 69 indications over the study period, whereas the NCCN recommended these drugs for 113 indications, of which 69 (62%) overlapped with the 69 FDA approved indications and 44 (39%) were additional recommendations.

Only 10 (23%) of these additional recommendations were based on evidence from randomised controlled trials, and seven (16%) were based on evidence from phase III studies. Most relied on small, uncontrolled studies or case reports, or offered no evidence at all.

And almost two years after their analysis, the researchers found that only six (14%) of the additional recommendations by the NCCN had received FDA approval.

“The NCCN frequently makes additional recommendations for the use of drugs beyond approvals of the FDA and when it does so, it often fails to cite evidence or relies on low levels of evidence,” write the authors.

“Few of these additional recommendations subsequently lead to drug approval,” they add. “If there is additional evidence in support of these recommendations the NCCN should improve its process and cite all evidence used.”

This is an observational study, so no firm conclusions can be drawn about cause and effect, and the researchers point to some limitations. However, they say, given that NCCN endorsement is linked to reimbursement by many commercial insurers and social insurance schemes, “our results suggest that payers may be covering cancer drugs with varying and scientifically less robust justification.”

Finally, they point out that 86% of NCCN guidelines members have financial ties to the pharmaceutical industry, with 84% receiving personal payments and 47% receiving research payments.

“The presence of conflicted physicians has been shown to lead to more optimistic conclusions regarding disputed practices,” they say. “Thus our findings raise concern about the nature of the recommendations offered by these individuals.”

###

High total cholesterol in late old age may be marker of protective factor

Public Release: 5-Mar-2018

Risk of cognitive decline reduced for people 85 and older with high cholesterol

High total cholesterol in late old age may be marker of protective factor

The Mount Sinai Hospital / Mount Sinai School of Medicine

 

People aged 85 and older whose total cholesterol had increased from their levels at midlife had a reduced risk for marked cognitive decline, compared with those a decade younger whose cholesterol was similarly elevated, Mount Sinai researchers report in a new study.

The results of the study will be available online from Alzheimer’s & Dementia: The Journal of the Alzheimer’s Association as an article-in-press corrected proof on Monday, March 5, at 10 a.m.

The researchers found that people aged 85-94 with good cognitive function whose total cholesterol increased from midlife had a 32 percent reduced risk for marked cognitive decline over the next ten years, compared with people aged 75-84, who had a 50 percent increased risk.

The researchers said that the results did not suggest that those 85 and older should increase their cholesterol for better cognitive health, but rather that those in that age cohort with good cognition and high cholesterol probably also had some protective factor that someday could be identified and studied.

The research team evaluated the association of five total cholesterol values with a substantial decline in cognitive function from normal function, called marked cognitive decline. The five values were midlife (average age 40) total cholesterol, late-life (average age 77) total cholesterol, mean total cholesterol since midlife, linear change since midlife (in other words, whether it was increasing or decreasing), and quadratic change since midlife (whether the linear change was accelerating or decelerating). Data were obtained from the original Framingham Heart Study, a long-term, ongoing cardiovascular cohort study on residents of Framingham, Massachusetts. That study began in 1948 with 5,209 adult subjects and is now on its third generation of participants.

The team assessed whether marked cognitive decline was associated with the five cholesterol values, and whether the associations with those values changed depending on the age of cognitive assessment. They found that several cholesterol values, including a high last cholesterol, increasing levels, and decreasing acceleration, were associated with increased risk of a marked cognitive decline. However, as the outcome age increased, some associations were reduced, or even reversed. Furthermore, in the subgroup of cognitively healthy 85-94 year olds, a high midlife cholesterol level was associated with a reduced risk for marked cognitive decline. This contrasts with samples in other studies that have focused on elderly subjects primarily below age 75, where midlife cholesterol was associated with increased risk of cognitive decline.

“Our results have important implications for researching genetic and other factors associated with successful cognitive aging,” said the study’s first author, Jeremy Silverman, PhD, Professor of Psychiatry, Icahn School of Medicine at Mount Sinai. “The data are consistent with our protected survivor model – among individuals who survive to very old age with intact cognition, those with high risk factor levels are more likely to possess protective factors than those with lower risk factor levels. Long-lived individuals who are cognitively intact despite high risk should be targeted in research studies seeking protective factors, which could help identify future drugs and therapies to treat dementia and Alzheimer’s disease.”

Dr. Silverman notes that these results do not imply that those 85 and older should increase their cholesterol. His research team will next study other risk factors for cognitive decline, including body mass index and blood pressure.

“We don’t think high cholesterol is good for cognition at 85, but its presence might help us identify those who are less affected by it. We hope to identify genes or other protective factors for cognitive decline by focusing on cognitively healthy very old people, who are more likely to carry protective factors.”

###

The Mount Sinai Health System is New York City’s largest integrated delivery system encompassing seven hospital campuses, a leading medical school, and a vast network of ambulatory practices throughout the greater New York region. Mount Sinai’s vision is to produce the safest care, the highest quality, the highest satisfaction, the best access and the best value of any health system in the nation. The System includes approximately 7,100 primary and specialty care physicians; 10 joint-venture ambulatory surgery centers; more than 140 ambulatory practices throughout the five boroughs of New York City, Westchester, Long Island, and Florida; and 31 affiliated community health centers. Physicians are affiliated with the renowned Icahn School of Medicine at Mount Sinai, which is ranked among the highest in the nation in National Institutes of Health funding per investigator. The Mount Sinai Hospital is ranked No. 18 on U.S. News & World Report’s “Honor Roll” of top U.S. hospitals; it is one of the nation’s top 20 hospitals in Cardiology/Heart Surgery, Diabetes/Endocrinology, Gastroenterology/GI Surgery, Geriatrics, Nephrology, and Neurology/Neurosurgery, and in the top 50 in four other specialties in the 2017-2018 “Best Hospitals” issue. Mount Sinai’s Kravis Children’s Hospital also is ranked in six out of ten pediatric specialties by U.S. News & World Report. The New York Eye and Ear Infirmary of Mount Sinai is ranked 12th nationally for Ophthalmology and 50th for Ear, Nose, and Throat, while Mount Sinai Beth Israel, Mount Sinai St. Luke’s and Mount Sinai West are ranked regionally. For more information, visit http://www.mountsinai.org/, or find Mount Sinai on Facebook, Twitter and YouTube.

US healthcare system needs coordinated response to potential pediatric pandemics

Public Release: 5-Mar-2018

 

Children’s Hospital Los Angeles

 


IMAGE: This is an image of Jeffrey Upperman, MD, and Rita V. Burke, PhD, MPH.

Credit: Photo courtesy of Children’s Hospital Los Angeles

Researchers at Children’s Hospital Los Angeles (CHLA) have identified gaps in the United States healthcare system that make it inadequately prepared for the surge in pediatric patients that could result from an infectious disease pandemic. Their study, published in the American Journal of Disaster Medicine, proposes a structured and coordinated response for such a crisis, with national guidelines reflected in regional response systems.

While various plans for such a threat already exist, the researchers identified significant gaps that compromise the healthcare system’s capacity to respond to the particular needs of children in the face of a pandemic or major disaster.

After reviewing 162 journal articles culled from 1,787 articles published between 1970 and 2017 about infectious disease pandemics affecting the U.S. pediatric population, the researchers proposed a structure for plans designed to handle the surge that could result from such a pandemic.

Incidents in the past decade — including the H1N1 influenza pandemic of 2009 and the 2014 Ebola outbreak — have raised questions about the nation’s security in the face of health threats. With increased globalization and the innate adaptability of infectious disease agents, these dangers are likely to increase in the future, posing particular risks to children.

The study found that neither national nor regional plans properly account for the particular needs of children, who make up nearly a quarter of the population. Treating children requires special care and resources to account for their physical and psychological needs, which differ from those of adults, says the report, which notes that most preparedness plans are too heavily skewed toward the needs of adult patients.

“Children represent 23 percent of the population, yet most of our disaster plans are aimed at the ‘average’ patient, assuming that patient is an adult, and neglect the specific needs of children. That’s a mistake we need to correct,” said Jeffrey Upperman, MD, FACS, FAAP, Director of the Trauma Program at CHLA and one of the authors. Upperman is also an associate professor of Surgery at the Keck School of Medicine of USC.

Since pediatric patients require different treatment than adults, the study notes, preparedness plans should include protocols specifically addressing the needs of children.

In reviewing current practices and proposing improvements, the researchers addressed four overarching categories: structure, staff, stuff (shorthand for resources), and space:

Structure. The study identified a need for a highly structured, clearly defined and well-integrated delivery system in case of medical surge events. In particular, they found, regional systems need to support coordination of care among hospitals, taking into consideration the capabilities and capacities of the various hospitals within a given region.

Staff. The study emphasizes the need to educate those responsible for the care of pediatric patients in a pandemic, including physicians, nurses, and volunteer professionals. It also suggests measures to address shortages of healthcare workers in the event of a surge in pediatric patients.

Stuff. Only 53 percent of U.S. emergency departments admit children, and only 10 percent have pediatric intensive care units. Many facilities also do not maintain adequate stock levels of the materials that would be required to meet the needs of pediatric patients in the surge that would result from a pandemic. The study cited the necessity for hospitals to address these shortcomings.

Space. Hospitals must be prepared to accommodate surges that could triple their typical patient capacity, taking measures such as converting private rooms to shared rooms or making use of hallways, lobbies, and other spaces for patients.

“The theme underlying all of these categories is relationships,” said Rita V. Burke, assistant professor of Research Surgery and Preventive Medicine at CHLA and Keck School of Medicine of USC, and one of the study’s authors. “Establishing connections and lines of communication among hospitals in a region or a local area is essential to being prepared for this type of challenge to our healthcare system.”

While various measures are being taken in certain areas, the study calls the steps “fragmented and insufficient” to meet the potential needs of children in the event of a pandemic. In the coming years, new strains of influenza are likely to arise, and children are always at higher risk of complications related to influenza.

###

Additional contributors: Christy Anthony, MD, Tito Joe Thomas, BS, Bridget M. Berg, MPH, Children’s Hospital Los Angeles

About Children’s Hospital Los Angeles

Children’s Hospital Los Angeles has been ranked the top children’s hospital in California and sixth in the nation for clinical excellence by the prestigious U.S. News & World Report Honor Roll. Children’s Hospital is home to The Saban Research Institute, one of the largest and most productive pediatric research facilities in the United States. Children’s Hospital is also one of America’s premier teaching hospitals through its affiliation since 1932 with the Keck School of Medicine of the University of Southern California.

Food abundance driving conflict in Africa, not food scarcity

Public Release: 1-Mar-2018

 

Dartmouth College

In Africa, food abundance may be driving violent conflict rather than food scarcity, according to a study published in the American Journal of Agricultural Economics, a publication of the Agricultural & Applied Economics Association.

The study refutes the notion that climate change will increase the frequency of civil war in Africa as a result of food scarcity triggered by rising temperatures and drought. Most troops in Africa are unable to sustain themselves due to limited access to logistics and state support, and must live off locally sourced food. The findings reveal that armed actors are often drawn to areas with abundant food resources, where they aim to exert control over those resources.

To examine how the availability of food may have affected armed conflict in Africa, the study relies on PRIO-Grid data from over 10,600 grid cells in Africa from 1998 to 2008, new agricultural yield data from EarthStat, and the Armed Conflict Location and Event Dataset, which documents incidents of political violence, including those with and without casualties. These data were used to estimate how annual local wheat and maize yields (two staple crops) at the village/town level may have affected the frequency of conflict. To capture only the effects of agricultural productivity on conflict, rather than the reverse, the analysis incorporates the role of droughts using the Standardized Precipitation Index, which aggregates monthly precipitation by cell year.

The study identifies four categories in which conflicts may arise over food resources in Africa, which reflect the interests and motivations of the respective group:

  • State and military forces that do not receive regular support from the state are likely to gravitate towards areas where food resources are abundant in order to feed themselves.
  • Rebel groups and non-state actors opposing the government may be drawn to food-rich areas, where they can exploit the resources for profit.
  • Self-defense militias and civil defense forces representing agricultural communities in rural regions may protect their communities against raiders and expand their control into other areas with arable land and food resources.
  • Militias representing pastoralist communities live in mainly arid regions and are highly mobile, following their cattle or other livestock rather than relying on crops. To replenish herds or obtain food crops, they may raid agriculturalist communities.

These actors may resort to violence to seek access to food, as the communities they represent may not have enough food resources or the economic means to purchase livestock or drought-resistant seeds. Although droughts can lead to violence in urban areas, the study found this was not the case in rural areas, where the majority of armed conflicts occurred where food crops were abundant. Food scarcity can actually have a pacifying effect.

“Examining food availability and the competition over such resources, especially where food is abundant, is essential to understanding the frequency of civil war in Africa,” says Ore Koren, a U.S. foreign policy and international security fellow at Dartmouth College and Ph.D. candidate in political science at the University of Minnesota. “Understanding how climate change will affect food productivity and access is vital; yet, predictions of how drought may affect conflict may be overstated in Africa and do not get to the root of the problem. Instead, we should focus on reducing inequality and improving local infrastructure, alongside traditional conflict resolution and peace building initiatives,” explains Koren.

###

Koren is available for comment at: ore.david.koren@dartmouth.edu.

Broadcast studios: Dartmouth has TV and radio studios available for interviews. For more information, visit: http://communications.dartmouth.edu/media/broadcast-studios

‘Obesity paradox’ debunked

Public Release: 28-Feb-2018

 

Obese people really don’t live longer than normal weight people with heart disease, they’re just diagnosed at a younger age

Northwestern University

  • Maintaining a normal weight can postpone cardiovascular disease and reduce overall risk of it
  • A healthy weight lengthens your lifespan and ‘healthspan’
  • Obesity paradox confused patients who ask: ‘Why do I need to lose weight?’

CHICAGO — Put down that second helping of chocolate cake.

A new study debunks the “obesity paradox,” a counterintuitive finding that showed people who have been diagnosed with cardiovascular disease live longer if they are overweight or obese compared with people who are normal weight at the time of diagnosis.

Obese people live shorter lives and have a greater proportion of life with cardiovascular disease, reports a new Northwestern Medicine study.

The paper will be published Feb. 28 in JAMA Cardiology.

The new study shows similar longevity for normal weight and overweight people, but overweight people carried a higher risk of developing cardiovascular disease during their lifespan and spent more years living with it.

This is the first study to provide a lifespan perspective on the risks of developing cardiovascular disease and dying after a diagnosis of cardiovascular disease for normal weight, overweight and obese individuals.

“The obesity paradox caused a lot of confusion and potential damage because we know there are cardiovascular and non-cardiovascular risks associated with obesity,” said Dr. Sadiya Khan, an assistant professor of medicine at Northwestern University Feinberg School of Medicine and a Northwestern Medicine cardiologist.

“I get a lot of patients who ask, ‘Why do I need to lose weight, if research says I’m going to live longer?”’ Khan said. “I tell them losing weight doesn’t just reduce the risk of developing heart disease, but other diseases like cancer. Our data show you will live longer and healthier at a normal weight.”

Obesity is defined as having a Body Mass Index (BMI) of 30 to 39.9; overweight is 25 to 29.9. BMI is a person’s weight in kilograms divided by the square of his or her height in meters. A 5’4″ person who weighs 160 pounds, for example, would be considered overweight; a 5’4″ person who weighs 190 pounds is considered obese.
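Translating that definition into arithmetic (a quick sketch; the thresholds are the ones quoted in the release, and the factor 703 is the standard conversion for computing BMI from pounds and inches):

```python
def bmi(weight_lb, height_in):
    """Body mass index: weight (kg) / height (m)^2, computed from
    pounds and inches via the standard conversion factor 703."""
    return 703 * weight_lb / height_in ** 2

def category(b):
    # Thresholds as quoted in the release
    if 25 <= b < 30:
        return "overweight"
    if 30 <= b < 40:
        return "obese"
    return "other"

# The release's examples: a 5'4" (64 in) person at 160 lb vs. 190 lb
cat_160 = category(bmi(160, 64))  # overweight (BMI ~27.5)
cat_190 = category(bmi(190, 64))  # obese (BMI ~32.6)
```

This reproduces the article's two examples: 160 pounds at 5'4" lands at a BMI of about 27.5 (overweight), while 190 pounds lands at about 32.6 (obese).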

Higher odds of a stroke, heart attack, heart failure or dying from heart disease, according to the study:

  • The likelihood of having a stroke, heart attack, heart failure or cardiovascular death in overweight middle-aged men 40 to 59 years old was 21 percent higher than in normal weight men. The odds were 32 percent higher in overweight women than normal weight women.
  • The likelihood of having a stroke, heart attack, heart failure or cardiovascular death in obese middle-aged men 40 to 59 years old was 67 percent higher than in normal weight men. The odds were 85 percent higher in obese women than normal weight women.
  • Normal weight middle-aged men also lived 1.9 years longer than obese men and six years longer than morbidly obese men. Normal weight men had similar longevity to overweight men.
  • Normal weight middle-aged women lived 1.4 years longer than overweight women, 3.4 years longer than obese women and six years longer than morbidly obese women.

“A healthy weight promotes healthy longevity or longer healthspan in addition to lifespan, so that greater years lived are also healthier years lived,” Khan said. “It’s about having a much better quality of life.”

The study examined individual level data from 190,672 in-person examinations across 10 large prospective cohorts with an aggregate of 3.2 million years of follow-up. All of the participants were free of cardiovascular disease at baseline and had objectively measured height and weight to assess BMI. Over follow-up, researchers assessed for cardiovascular disease overall and by type, including coronary heart disease, stroke, heart failure and cardiovascular death, as well as non-cardiovascular death.

###

Other Northwestern authors include Dr. Donald Lloyd-Jones, senior author, and co-authors Dr. Hongyan Ning, Dr. John T. Wilkins, Norrina Allen, Mercedes Carnethon, and Dr. Ranya N. Sweis.

The study was supported by grants R21 HL085375 and F32 HL129695 from the National Institutes of Health/National Heart, Lung, and Blood Institute.

Health staff ‘too stressed’ to deal with disasters

Public Release: 26-Feb-2018

 

Research finds day-to-day workloads and targets leave healthcare services vulnerable

Anglia Ruskin University

Increasing stress and a lack of motivation among healthcare staff could result in hospitals having to shut down in the wake of a major incident such as flooding or an earthquake, according to new research published in the journal Procedia Engineering.

The research, led by Anglia Ruskin University, examined studies from across the world. It found that the capacity of clinical and non-clinical staff in hospitals and clinics to deal with incidents such as floods, earthquakes or other natural hazards is severely limited by high workloads and challenging targets, which result in high levels of psychological stress.

The findings also suggested that some staff are left feeling unmotivated and unattached to their workplace, meaning they are less likely to take the initiative in such a scenario and may even avoid coming into work. Only 21% of participants in the research expressed complete satisfaction with their jobs and workplace.

Dr Nebil Achour, lead author and Senior Lecturer in Healthcare Management at Anglia Ruskin University, said: “Healthcare services in many countries across the world are under severe strain, which leaves little opportunity for staff to be trained in disaster resilience. Yet healthcare is among the most critical services in any country during and after a major incident has occurred.

“Staff suffer from increasing workload and stricter performance measures with less flexibility. This has caused psychological and physical stress and makes them unable to respond to any further stress associated with major hazards.

“Many staff members do not feel attached to their workplace and do not feel that they have enough flexibility to take the initiative and lead their own way. This in turn also makes them less motivated to learn the extra skills needed to deal with a catastrophic event.

“Combined, these factors expose healthcare services to major risk of staff shortage and thus inoperability when a major hazard does strike.”

When it comes to our brains, there’s no such thing as normal

Public Release: 20-Feb-2018

 

Cell Press

There’s nothing wrong with being a little weird. Because we think of psychological disorders on a continuum, we may worry when our own ways of thinking and behaving don’t match up with our idealized notion of health. But some variability can be healthy and even adaptive, say researchers in a review published February 20th in Trends in Cognitive Sciences, even though it can also complicate attempts to identify standardized markers of pathology.

“I would argue that there is no fixed normal,” says clinical psychologist and senior author Avram Holmes of Yale University. “There’s a level of variability in every one of our behaviors.” Healthy variation is the raw material that natural selection feeds on, but there are plenty of reasons why evolution might not arrive at one isolated perfect version of a trait or behavior. “Any behavior is neither solely negative nor solely positive. There are potential benefits for both, depending on the context you’re placed in,” he says.

For instance, impulsive sensation seeking, a willingness to take risks in order to have new and exciting experiences that has its roots in our evolutionary history as foragers, is often thought of negatively. Increased sensation seeking is associated with things like substance abuse, criminality, risky sexual behavior, and physical injury. “But if you flip it on its head and look at potential positive outcomes, those same individuals may also thrive in complex and bustling environments where it’s appropriate for them to take risks and seek thrills,” he says. They often have more social support, are more outgoing, and exercise more.

The same is true for anxiety. “You might be more inhibited in social situations and you may find it harder to build friendships,” Holmes says. “However, that same anxiety, if you think of it in a workplace setting, is what motivates you to prepare for a big presentation. If you’re in school, that’s the same anxiety that motivates you to study for an exam.” He also notes that we have more control over the contexts we’re in than we tend to think we do, which means that it’s very possible to end up in an environment that favors the way our brains work.

But if variation in any given trait is normal, that does raise questions about what makes for disordered behavior, which he stresses is very much a real phenomenon. “It may be the case that if you focus on a single phenotype, there isn’t a specific line that separates health from disease, and that we must consider multiple phenotypes simultaneously,” he says.

This makes it much more complicated to try to find biomarkers for psychological illness, something that Holmes has worked on throughout his career. The usual approach is to break down a disorder into its component pieces, find a specific associated genetic marker or biological process for a certain piece, and then look for that marker or process in the general population to see if it can predict the disorder. The problem, he says, is that “one single phenotype in isolation is never going to be necessary nor sufficient to cause an illness.”

“What we want to try to do is build multivariate approaches that consider multiple domains of human behavior simultaneously, to see if we can boost our power in predicting eventual outcomes for folks,” he says. Large, open-source datasets have been collected in recent years that can be used in these efforts, but Holmes notes that the work will almost certainly require collaboration between different labs and institutions–some of which is already underway.

What this does mean, though, is that it really isn’t appropriate to think of ourselves in terms of a single trait that’s either good or bad, healthy or unhealthy. “This is a broader issue with our society,” he says, “but we’re all striving towards some artificial, archetypal ideal, whether it’s physical appearance or youthfulness or intelligence or personality. But we need to recognize the importance of variability, both in ourselves and in the people around us. Because it does serve an adaptive purpose in our lives.”

###

This work was supported by the National Institute of Mental Health.

Trends in Cognitive Sciences, Holmes et al.: “The myth of optimality in clinical neuroscience” http://www.cell.com/trends/cognitive-sciences/fulltext/S1364-6613(17)30268-1

Trends in Cognitive Sciences (@TrendsCognSci), published by Cell Press, is a monthly review journal that brings together research in psychology, artificial intelligence, linguistics, philosophy, computer science, and neuroscience. It provides a platform for the interaction of these disciplines and the evolution of cognitive science as an independent field of study. Visit: http://www.cell.com/trends/cognitive-sciences. To receive Cell Press media alerts, please contact press@cell.com.

Can your cardiac device be hacked?

Public Release: 20-Feb-2018

 

ACC Electrophysiology Council discusses potential dangers and offers advice to patients and physicians

American College of Cardiology

WASHINGTON (Feb. 20, 2018) — Medical devices, including cardiovascular implantable electronic devices, could be at risk of hacking. In a paper published online today in the Journal of the American College of Cardiology, the American College of Cardiology’s Electrophysiology Council examines the potential risk to patients and outlines how to improve cybersecurity in these devices.

Cybersecurity in the medical field refers to the integration of medical devices, computer networks and software. While there have been no actual clinical reports of malicious or inadvertent hacking or malware attacks affecting cardiac devices, recent reports have highlighted this possibility. Motives for hacking include political, financial, social and personal ones. Devices can be hacked locally or remotely. The Food and Drug Administration has issued both pre-market and post-market guidance for the security of medical devices, and legislative proposals related to medical device security have been advanced in the U.S. Congress.

“True cybersecurity begins at the point of designing protected software from the outset, and requires the integration of multiple stakeholders, including software experts, security experts and medical advisors,” said Dhanunjaya R. Lakkireddy MD, professor of medicine at the University of Kansas Hospital, a member of the Electrophysiology Council and the corresponding author of the paper.

Medical devices have been targets of hacking for over a decade. The increasing number of medical devices using software has created the need to protect devices from intentional harmful interference on their normal functioning. Advanced wireless communications between health care providers and patients’ devices have created the theoretical possibility for the deactivation of features, the alteration of programming, and the delaying, interfering or interrupting of communications.

There are a number of possible clinical consequences that may result from the hacking of a cardiac device. In patients with pacemakers, concerns mostly consist of oversensing or battery depletion. For patients with implantable cardioverter-defibrillators (ICDs), it is possible for hackers to interrupt wireless communications, inhibiting the value of telemonitoring and allowing any clinically relevant events to go undetected by the system. Oversensing may inhibit pacing or result in inappropriate or life-threatening shocks. Battery depletion can lead to a device being unable to deliver therapies during life-threatening arrhythmias.

“At this time, there is no evidence that one can reprogram a cardiovascular implantable electronic device or change device settings in any form,” Lakkireddy said. “The likelihood of an individual hacker successfully affecting a cardiovascular implantable electronic device or being able to target a specific patient is very low. A more likely scenario is that of a malware or ransomware attack affecting a hospital network and inhibiting communication.”

The council said that cybersecurity needs should also be addressed during product testing both pre- and post-market. Because cyber vulnerabilities can emerge quickly, strong post-market processes must be in place to monitor the environment for new vulnerabilities and to respond in a timely manner. The council suggests that firmware updates may be useful in devices with possible vulnerabilities. Physicians who manage cardiac devices should be aware of both documented and possible cybersecurity risks. Systems should be established to communicate updates in these areas quickly and in an understandable way to the rest of the clinical team that manages patients with devices.

The council members said they do not feel that enhanced monitoring or elective device replacement is necessary at this time.

“Given the lack of evidence that hacking of cardiac devices is a relevant clinical problem, coupled with evidence of the benefits of remote monitoring, one should exercise caution in depriving a patient of the clear benefit of remote monitoring,” Lakkireddy said.

###

The American College of Cardiology is the professional home for the entire cardiovascular care team. The mission of the College and its more than 52,000 members is to transform cardiovascular care and to improve heart health. The ACC leads in the formation of health policy, standards and guidelines. The College operates national registries to measure and improve care, offers cardiovascular accreditation to hospitals and institutions, provides professional medical education, disseminates cardiovascular research and bestows credentials upon cardiovascular specialists who meet stringent qualifications. For more, visit acc.org.

The Journal of the American College of Cardiology ranks among the top cardiovascular journals in the world for its scientific impact. JACC is the flagship for a family of journals — JACC: Cardiovascular Interventions, JACC: Cardiovascular Imaging, JACC: Heart Failure, JACC: Clinical Electrophysiology and JACC: Basic to Translational Science — that pride themselves on publishing the top peer-reviewed research on all aspects of cardiovascular disease. Learn more at JACC.org.

Stem cell vaccine immunizes lab mice against multiple cancers

Public Release: 15-Feb-2018

Cell Press

 


Caption

This visual abstract depicts how cancer immunity against multiple types of cancer can be achieved using an easily generable iPSC-based cancer vaccine. This immunity is based on overlapping epitopes between iPSCs and cancer cells and can also be achieved by reactivating the immune system as an adjuvant.

Credit: Kooreman and Kim et al./Cell Stem Cell

Stanford University researchers report that injecting mice with inactivated induced pluripotent stem cells (iPSCs) launched a strong immune response against breast, lung, and skin cancers. The vaccine also prevented relapses in animals that had tumors removed. The work appears in the journal Cell Stem Cell on Feb. 15.

iPSCs are generated from adult cells genetically reprogrammed to mimic embryonic stem cells’ ability to become any type of cell in the body.

In the study, 75 mice received versions of the iPSC vaccine created from iPSCs that have been inactivated by irradiation. Within four weeks, 70 percent of the vaccinated mice fully rejected newly introduced breast cancer cells, while the remaining 30 percent had significantly smaller tumors. The effectiveness of the iPSC vaccine was also validated for lung and skin cancers.

Lead author Joseph C. Wu at Stanford’s Cardiovascular Institute and Institute for Stem Cell Biology and Regenerative Medicine and colleagues found that many of the antigens present on iPSCs are also present on cancer cells. When lab mice were vaccinated with iPSCs, their immune systems built an immune response to the antigens on the iPSCs. Because of key similarities between the iPSCs and cancer cells, the animals simultaneously built an immune response against cancer.

The iPSCs seemed to “prime their immune systems to eradicate tumor cells,” Wu says.

To be effective, anti-cancer vaccines must introduce one or more antigens into the body that activate T cells or produce antibodies capable of recognizing and binding to antigens on the surfaces of cancer cells.

One of the biggest challenges for cancer immunotherapies is the limited number of antigens that can be presented to the immune system at a given time. The Stanford study uses an animal’s own cells to create an iPSC-based cancer vaccine that simultaneously targets multiple tumor antigens. Using whole iPSCs eliminates the need to identify the most optimal antigen to target in a particular type of cancer.

“We present the immune system with a larger number of tumor antigens found in iPSCs, which makes our approach less susceptible to immune evasion by cancer cells,” Wu says. The researchers also combined iPSCs with an immunity booster–a snippet of bacterial DNA called CpG that has been deemed safe in human trials. Stanford oncologist and study co-author Ronald Levy previously found CpG to be a potent tumor-fighting agent.

In the future, a patient’s skin or blood cells may be re-programmed into iPSCs and administered as an anti-cancer vaccine or as a follow-up booster after surgery, chemotherapy, or radiation therapy.

“What surprised us most was the effectiveness of the iPSC vaccine in re-activating the immune system to target cancer,” Wu says. “This approach may have clinical potential to prevent tumor recurrence or target distant metastases.”

###

This work was supported by the California Institute of Regenerative Medicine (CIRM) and the National Institutes of Health (NIH).

Cell Stem Cell, Kooreman and Kim et al.: “Autologous iPSC-Based Vaccines Elicit Anti-tumor Responses In Vivo” http://www.cell.com/cell-stem-cell/fulltext/S1934-5909(18)30016-X

Cell Stem Cell (@CellStemCell), published by Cell Press, is a monthly journal that publishes research reports describing novel results of unusual significance in all areas of stem cell research. Each issue also contains a wide variety of review and analysis articles covering topics relevant to stem cell research ranging from basic biological advances to ethical, policy, and funding issues. Visit: http://www.cell.com/cell-stem-cell. To receive Cell Press media alerts, contact press@cell.com.


New CRISPR-Cas9 tool edits both RNA and DNA precisely, U-M team reports

Public Release: 15-Feb-2018

 

A Cas9 protein discovered in meningitis bacteria can act as precise ‘scissors’ for both types of genetic material, cutting at a desired spot guided by CRISPR RNAs

Michigan Medicine – University of Michigan

 

IMAGE

IMAGE: A depiction of how the NmeCas9 protein can be used as part of the CRISPR technique to edit single-stranded RNA.

Credit: University of Michigan, Yan Zhang laboratory

ANN ARBOR, Mich. — A tool that has already revolutionized disease research may soon get even better, thanks to an accidental discovery in the bacteria that cause many of the worst cases of meningitis.

Called CRISPR-Cas9, the tool acts as molecular scissors — able to cut DNA at exactly the spot it’s asked to. But it can’t cut the other kind of genetic material found in cells, called RNA.

Now, researchers at the University of Michigan have discovered a single protein that can perform CRISPR-style, precise programmable cutting on both DNA and RNA. This protein is among the first few Cas9s to work on both types of genetic material without artificial helper components.

Their initial biochemical study in laboratory test tubes, published in the journal Molecular Cell, shows the promise of the new CRISPR approach using the protein called NmeCas9. It’s derived from Neisseria meningitidis, the bacteria that cause some of the most severe and deadly cases of meningitis each year.

Now, the team is working to test the tool in living bacteria cells, to see if NmeCas9 achieves the same effect that they saw in test tubes. They hope to eventually progress to human cells. If it works, NmeCas9 could help expand the role of CRISPR in studying – and perhaps intervening – in many diseases.

“The fact that our protein has dual function – able to target both DNA and RNA – gives us the opportunity to develop platforms to do dual targeting,” says Yan Zhang, Ph.D., the U-M assistant professor of Biological Chemistry who led the research team. “It may make it possible to perform CRISPR cutting on both RNA and DNA at once, or alternatively just on single-stranded messenger RNA without affecting genomic regions at all.”

In cells, the DNA contained in chromosomes acts as the permanent encyclopedia of instructions for making everything the cell needs. But to actually make anything, cells need RNA transcribed from the chromosomes.

One of RNA’s most important functions in cells is the “photocopying” of stretches of DNA, so that machines within the cell can read the instructions and make proteins. Many diseases arise from problems with cellular RNAs.

Zhang, and co-first authors Beth A. Rousseau and Zhonggang Hou, Ph.D., developed and tested the NmeCas9 protein in their lab at the U-M Medical School.

Magic scissors

The CRISPR “scissor” technique has transformed research in just five years. It has made it possible for hundreds of teams of scientists to snip out portions of chromosome that are mutated, or to see what happens when a certain gene isn’t there.

To understand CRISPR in simple terms, imagine a pair of scissors that have one side of a zipper attached to the tip of the blades. In order to cut a stretch of DNA at exactly the right spot, the zipper has to match up exactly with a stretch of DNA leading up to that spot – forming a tight bond that positions the scissors in just the right place.

In CRISPR, the “zipper” is made of specially designed RNA, and the “scissor” effect comes from harnessing the natural cutting action of a protein, or enzyme, called Cas9. The CRISPR revolution has made it possible to design unique RNA zippers that can attach to specific genes that play a role in a disease, and cut them out.
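The zipper-and-scissors picture can be sketched as a toy string search (purely illustrative: the sequences and the simplified matching rule below are hypothetical and not from the paper, which concerns NmeCas9's biochemistry):

```python
def find_cut_site(dna, guide):
    """Toy Cas9 model: find the guide-matching protospacer followed by
    an 'NGG' PAM, and return the cut index 3 bp upstream of the PAM."""
    n = len(guide)
    for i in range(len(dna) - n - 2):
        # The "zipper": the guide must match the DNA exactly here,
        # and the next-but-one two bases must be the "GG" of the PAM.
        if dna[i:i + n] == guide and dna[i + n + 1:i + n + 3] == "GG":
            return i + n - 3  # Cas9 makes a blunt cut 3 bp before the PAM
    return None

guide = "GACTTCATCAGGTTTTAACG"         # hypothetical 20-nt guide sequence
dna = "AACC" + guide + "TGG" + "ATTC"  # target followed by an NGG PAM ("TGG")
cut = find_cut_site(dna, guide)
left, right = dna[:cut], dna[cut:]     # the two fragments after cutting
```

The guide positions the cut: without an exact match plus an adjacent PAM, the function returns no cut site at all, which is the specificity the analogy is meant to convey.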

The first human clinical trials using CRISPR to cut a flawed section of DNA are reported to be under way in China, and preparing to begin in the U.S. Research is also under way to see if human embryos containing disease-related genetic mutations can be changed through CRISPR, though there is controversy about the ethical implications of this practice, known as “germline editing.”

Ambidextrous scissors – by accident

The new technique developed by Zhang and her team aims to produce a pair of universal genetic scissors. And, because NmeCas9 is a much smaller protein than other Cas9 proteins used in CRISPR editing, they hope it will be more useful.

The discovery of NmeCas9’s dual-cutting ability happened by accident, while the team was studying the protein’s basic function in cutting DNA. The U-M team was using RNA as a comparison, or control, sample – but noticed that it was getting cut too.

Digging deeper, they discovered the dual-cutting function of NmeCas9, and began testing it biochemically.

In addition to their discovery, they’re aware that two other groups are either preparing to report or have just reported Cas9 proteins from other bacteria that can carry out RNA targeting without any stimulatory co-factors, unlike previous RNA-editing CRISPR-Cas9 techniques.

“If NmeCas9 works in live cells as it has in vitro, we can develop it as a tool to edit the messenger RNA transcript, which means we might be able to block a gene product without manipulating the gene itself,” says Zhang. “We might also be able to harness it as a research tool to deliver fluorescent markers to specific RNA sequences, or to block events like RNA splicing. All that has been achieved with CRISPR Cas9 to manipulate the chromosomes, we might be able to do at the RNA level.”

###

The work was funded by the National Institutes of Health (GM117268) and by the U-M Medical School’s Biological Sciences Scholars Program, which supports outstanding biomedical research scientists during the first years of their careers at U-M. In addition to Zhang, Rousseau and Hou, the study’s authors also include Max Gramelspacher.

Reference: Molecular Cell, 10.1016/j.molcel.2018.01.025 http://www.cell.com/molecular-cell/fulltext/S1097-2765(18)30054-6

BU: One or more soda a day could decrease chances of getting pregnant

Public Release: 13-Feb-2018

 

Boston University School of Medicine

The amount of added sugar in the American diet has increased dramatically over the last 50 years. Much of that increase comes from higher intake of sugar-sweetened beverages, which constitute approximately one-third of the total added sugar consumption in the American diet. While consumption of these beverages has been linked to weight gain, type 2 diabetes, early menstruation, and poor semen quality, few studies have directly investigated the relationship between sugary drinks and fertility.

Now, a new study led by Boston University School of Public Health (BUSPH) researchers has found that the intake of one or more sugar-sweetened beverages per day, by either partner, is associated with a decreased chance of getting pregnant.

The study was published in Epidemiology.

“We found positive associations between intake of sugar-sweetened beverages and lower fertility, which were consistent after controlling for many other factors, including obesity, caffeine intake, alcohol, smoking, and overall diet quality,” says lead author Elizabeth Hatch, professor of epidemiology. “Couples planning a pregnancy might consider limiting their consumption of these beverages, especially because they are also related to other adverse health effects.”

About 15 percent of couples in North America experience infertility. Identifying modifiable risk factors for infertility, including diet, could help couples conceive more quickly and reduce the psychological stress and financial hardship related to fertility treatments, which are associated with more than $5 billion in annual US healthcare costs.

Through the Pregnancy Study Online (PRESTO), an ongoing web-based prospective cohort study of North American couples, the researchers surveyed 3,828 women aged 21 to 45 living in the United States or Canada and 1,045 of their male partners. Participants completed a comprehensive baseline survey on medical history, lifestyle factors, and diet, including their intake of sugar-sweetened beverages. Female participants then completed a follow-up questionnaire every two months for up to 12 months or until pregnancy occurred.

Both female and male intake of sugar-sweetened beverages was associated with 20 percent reduced fecundability, the average monthly probability of conception. Females who consumed at least one soda per day had 25 percent lower fecundability; male consumption was associated with 33 percent lower fecundability. Intake of energy drinks was related to even larger reductions in fertility, although the results were based on small numbers of consumers. Little association was found between intake of fruit juices or diet sodas and fertility.
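Because fecundability is a monthly probability, even a modest reduction compounds over repeated cycles. The sketch below shows the arithmetic: the 25 percent reduction for daily female soda consumers is the study's estimate, but the 20 percent baseline monthly probability is an assumed illustrative value, not a figure from the paper.

```python
# How a reduction in fecundability (monthly probability of conception)
# compounds over a year of trying. The 25% reduction comes from the study;
# the 20% baseline monthly probability is assumed for illustration only.

def prob_pregnant_within(months: int, monthly_fecundability: float) -> float:
    """Cumulative probability of conception within `months` cycles."""
    return 1 - (1 - monthly_fecundability) ** months

baseline = 0.20                  # assumed baseline monthly probability
reduced = baseline * (1 - 0.25)  # 25% lower fecundability (study estimate)

print(round(prob_pregnant_within(12, baseline), 3))
print(round(prob_pregnant_within(12, reduced), 3))
```

Under these assumptions the 12-month conception probability drops from roughly 93 percent to roughly 86 percent, which is why per-cycle effects of this size matter at the population level.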

“Given the high levels of sugar-sweetened beverages consumed by reproductive-aged couples in North America, these findings could have important public health implications,” the authors concluded.

###

About Boston University School of Public Health:

Boston University School of Public Health, founded in 1976, offers master’s- and doctoral-level education in public health. The faculty in six departments (biostatistics; community health sciences; environmental health; epidemiology; global health; and health law, policy & management) conduct policy-changing public health research around the world, with the mission of improving the health of populations, especially the disadvantaged, underserved, and vulnerable, locally, nationally, and internationally.

Opioid use increases risk of serious infections

Public Release: 12-Feb-2018

 

Vanderbilt University Medical Center

Opioid users have a significantly increased risk of infections severe enough to require treatment at the hospital, such as pneumonia and meningitis, as compared to people who don’t use opioids.

The Vanderbilt University Medical Center study, released today by the Annals of Internal Medicine, found that people who use opioids have 1.62 times the risk of invasive pneumococcal diseases.

Invasive pneumococcal diseases are serious infections caused by the Streptococcus pneumoniae bacteria, with mortality ranging from 5 percent to 20 percent. These invasive diseases include a range of illnesses such as meningitis, bacteremia and invasive pneumonia.
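The 1.62 figure is a relative risk: the incidence of disease among opioid users divided by the incidence among non-users. The counts below are made up purely to illustrate the arithmetic; they are not the study's data.

```python
# Relative risk = incidence in the exposed group / incidence in the
# unexposed group. The counts here are hypothetical, chosen only to show
# how a relative risk of 1.62 arises; they are not from the VUMC study.

def relative_risk(cases_exposed: int, n_exposed: int,
                  cases_unexposed: int, n_unexposed: int) -> float:
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# Hypothetical: 162 cases per 100,000 opioid users vs 100 per 100,000 non-users
rr = relative_risk(162, 100_000, 100, 100_000)
print(round(rr, 2))  # 1.62
```

Equivalently, a relative risk of 1.62 means a 62 percent higher risk in the exposed group.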

“The association between opioid use and the risk of invasive pneumococcal diseases was strongest for opioids used at high doses, those classified as high potency and long-acting, which would be the extended release or controlled release formulations,” said lead author Andrew Wiese, PhD, MPH, postdoctoral research fellow in the Department of Health Policy at Vanderbilt University School of Medicine.

“We also found that opioids previously described as immunosuppressive in prior experimental studies conducted in animals had the strongest association with invasive pneumococcal diseases in humans,” he said.

Wiese and colleagues studied Tennessee Medicaid data to measure daily prescription opioid exposure for each study individual and combined that information with data from the Active Bacterial Core (ABC) surveillance system, a VUMC laboratory- and population-based surveillance system conducted in partnership with the Tennessee Department of Health and the Centers for Disease Control and Prevention (CDC) to monitor invasive infectious diseases in Tennessee.

“A unique feature of the study is the use of laboratory-confirmed infections as study outcomes. The sources of data allowed us to reconstruct and compare the history of opioid exposures in those subjects with and without invasive pneumococcal diseases,” said Carlos G. Grijalva, MD, MPH associate professor of Health Policy, and senior author of the study.

The increase in opioid use in the U.S. over the past several years has led to increased interest in both well-known and previously under-recognized adverse effects associated with opioid use.

“Previous studies conducted in animal models had demonstrated that certain opioids can cause immunosuppression and render experimental animals susceptible to infections. However, the clinical implications of those observations in humans were unclear,” said Grijalva.

In an accompanying editorial, Sascha Dublin, MD, and Michael von Korff, ScD, from the Kaiser Permanente Washington Health Research Institute indicate that this research provides, “…cautionary evidence of a higher infection risk with prescription opioid use, suggesting the need for prudent steps to protect patients…” They further emphasize the need for judicious prescribing of opioids and conclude that “opioid prescribing should be consistently cautious and closely monitored among all patients, especially those at increased risk for infections, who may be particularly susceptible to harm.”

“The findings from our study are clinically relevant. Providers should consider these results when making pain management decisions,” added Wiese.

###

Others with Vanderbilt who contributed to the study were Marie Griffin, MD, MPH; William Schaffner, MD; C. Michael Stein, MB, ChB; Robert Greevy, PhD; and Edward Mitchel, Jr., MS.

This research was supported by grants from the National Institutes of Health – National Institute on Aging [R03-AG042981 and R01-AG043471] and TL1TR000447.

Using injectable self-assembled nanomaterials for sustained delivery of drugs

Public Release: 12-Feb-2018

 

New injectable delivery system can slowly release drug carriers for months

Northwestern University

Because they can be programmed to travel the body and selectively target cancer and other sites of disease, nanometer-scale vehicles called nanocarriers can deliver higher concentrations of drugs to bombard specific areas of the body while minimizing systemic side effects. Nanocarriers can also deliver drugs and diagnostic agents that are typically not soluble in water or blood as well as significantly decrease the effective dosage.

Although this method might seem ideal for treating diseases, nanocarriers are not without their challenges.

“Controlled, sustained delivery is advantageous for treating many chronic disorders, but this is difficult to achieve with nanomaterials without inducing undesirable local inflammation,” said Northwestern University’s Evan Scott. “Instead, nanomaterials are typically administered as multiple separate injections or as a transfusion that can take longer than an hour. It would be great if physicians could give one injection, which continuously released nanomaterials over a controlled period of time.”

Now Scott, an assistant professor of biomedical engineering in Northwestern’s McCormick School of Engineering, has developed a new mechanism that makes that controlled, sustained delivery possible.

Scott’s team designed a nanocarrier formulation that — after quickly forming into a gel inside the body at the site of injection — can continuously release nanoscale drug-loaded vehicles for months. The gel itself re-assembles into the nanocarriers, so after all of the drug has been delivered, no residual material is left to induce inflammation or fibrous tissue formation. This system could, for example, enable single-administration vaccines that do not require boosters as well as a new way to deliver chemotherapy, hormone therapy, or drugs that facilitate wound healing.

Supported by the National Science Foundation and National Institutes of Health, the research was published online today, February 12 in the journal Nature Communications. Nicholas Karabin, a graduate student in Scott’s laboratory, was the paper’s first author. Northwestern Engineering’s Kenneth Shull, professor of materials science and engineering, also contributed to the work. A member of Northwestern’s Simpson Querrey Institute for BioNanotechnology and Chemistry of Life Processes Institute, Scott was corresponding author and led the nanoparticle development and in vivo validation.

Currently, the most common sustained nanocarrier delivery systems hold nanomaterials within polymer matrices. These networks are implanted into the body, where they slowly release the trapped drug carriers over a period of time. The problem arises after the delivery is complete: the networks remain inside the body, often eliciting a foreign-body response. The leftover network can cause discomfort and chronic inflammation in the patient.

To bypass this issue, Scott developed a nanocarrier using self-assembled, filament-shaped nanomaterials, which are loaded with a drug or imaging agent. When crosslinked together, the filaments form a hydrogel network that is similar to structural tissue in the human body. After the filaments are injected into the body, the resulting hydrogel network functions as a drug depot that slowly degrades by breaking down into spherical nanomaterials called micelles, which are programmed to travel to specific targets. Because the network morphs into the drug-delivery system, nothing is left behind to cause inflammation.

“All of the material holds the drug and then delivers the drug,” Scott explained. “It degrades in a controlled fashion, resulting in nanomaterials that are of equal shape and size. If we load a drug into the filaments, the micelles take the drug and leave with it.”

After testing the system both in vitro and in vivo in an animal model, Scott’s team demonstrated they could administer a subcutaneous injection that slowly delivered nanomaterials to cells in lymph nodes for over a month in a controlled fashion.
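A simple way to picture month-long sustained release is first-order degradation: a fixed fraction of the remaining hydrogel depot converts to drug-carrying micelles each day. The sketch below is a generic kinetic model, not the paper's; the ten-day half-life is an assumed illustrative value, not a measured parameter from the study.

```python
# Minimal sketch of sustained release modeled as first-order degradation
# of the hydrogel depot. The half-life is an assumed illustrative value,
# not a parameter reported by the Northwestern study.
import math

def fraction_released(days: float, half_life_days: float = 10.0) -> float:
    """Fraction of the depot released after `days`, for a given half-life."""
    k = math.log(2) / half_life_days   # first-order rate constant
    return 1 - math.exp(-k * days)

print(round(fraction_released(30), 3))  # 0.875 — about 87% released in a month
```

Tuning the crosslinking density would, in this picture, shift the rate constant and stretch or compress the release window.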

Scott said the system can be used for other nanostructures in addition to micelles. For example, the system could include vesicle-shaped nanoparticles, such as liposomes or polymersomes, that have drugs, proteins, or antibodies trapped inside. Different vesicles could carry different drugs and release them at different rates once inside the body.

“Next we are looking for ways to tailor the system to the needs of specific diseases and therapies,” Scott said. “We’re currently working to find ways to deliver chemotherapeutics and vaccines. Chemotherapy usually requires the delivery of multiple toxic drugs at high concentrations, and we could deliver all of these drugs in one injection at much lower dosages. For immunization, these injectable hydrogels could be administered like standard vaccines, but stimulate specific cells of the immune system for longer, controlled periods of time and potentially avoid the need for boosters.”

###