Florida monarch butterfly populations have dropped 80 percent since 2005

Public Release: 8-Nov-2018


Florida Museum of Natural History

GAINESVILLE, Fla. — A 37-year survey of monarch populations in North Central Florida shows that caterpillars and butterflies have been declining since 1985 and have dropped by 80 percent since 2005.

This decrease parallels monarchs’ dwindling numbers in their overwintering grounds in Mexico, said study co-author Jaret Daniels, program director and associate curator of the Florida Museum of Natural History’s McGuire Center for Lepidoptera and Biodiversity.

“It’s alarming in a number of different ways,” said Daniels, who is also an associate professor in the University of Florida’s department of entomology and nematology. “This study shows the tight connection between monarchs and milkweed and highlights very dramatic losses in abundance in Florida that further confirm the monarch is declining.”

While the drivers of the decline are not clear, the researchers said shrinking native milkweed populations and increased glyphosate use in the Midwest are part of the problem.

Glyphosate, an herbicide often applied to agricultural fields to eliminate weeds, is lethal to milkweed, the monarchs’ host plant. Less milkweed means less habitat for monarchs, said study co-author Ernest Williams, professor emeritus of biology at Hamilton College in New York.

“A broad pattern is that 95 percent of corn and soybean products grown in the U.S. are Roundup Ready crops that resist glyphosate,” Williams said. “That has a national impact. What’s really needed are patches of native vegetation and nectar sources without pesticides. It’s not just for monarchs but all pollinators.”

In the longest location-based monarch monitoring effort to date, a multi-institute team led by world-renowned monarch expert Lincoln Brower, who died earlier this year, closely followed spring monarch numbers in an herbicide-free cattle pasture in Cross Creek, about 20 miles southeast of Gainesville. The team examined milkweed plants for caterpillars and captured adult butterflies for 37 years, a period spanning more than 140 generations of monarchs.

They found that monarchs’ springtime departure from Mexico is timed to coincide with optimal growth of milkweed in the southeastern U.S. While adult monarch butterflies can feed from a variety of plants, their young depend on milkweed as their sole source of nutrition, storing up the plant’s toxins to ward off predators.

Monarchs lay hundreds of eggs on milkweed over their brief lifetimes, but just over 2 percent of eggs survive to become fully grown caterpillars.

If monarchs get to their breeding grounds too early, they run the risk of their host plants being killed by frosts – too late and the plants may not be able to support their young. To maximize their offspring’s chances of survival, the butterflies must time their arrival in the U.S. within a three-week window, Daniels said, an impressive feat for insects with lifespans between six and eight weeks.

This delicate matchup could be disrupted by climate change, which can skew plants’ springtime schedules.

“Since it’s such tight timing, it would be devastating to the monarch,” he said.

Florida is an important stopover for monarchs returning north from Mexico, as spring breeding in southern states leads to the butterflies’ recolonization of the upper U.S. and Canada. Monarchs rely on Florida for its abundance of milkweed and warm climate to lay the eggs that will help replenish the eastern population in the U.S., Daniels said.

“Florida is kind of a staging ground for the recolonization of much of the East Coast,” he said. “If these populations are low, then the northern populations are going to be at a similar abundance level.”

But although monarchs are a well-studied species, consistent long-term studies of changes in their spring breeding are rare, Williams said.

“Long-term studies like this are important because they point to larger trends,” he said. “Before 2005, there was more fluctuation in the data. Since 2005, the rate of decline has been steady.”

Daniels said that increasing pesticide-free native milkweed populations in Florida yards and on roadsides is a step in the right direction to prevent monarchs from requiring protection under the Endangered Species Act.

But, he emphasized, not any milkweed will do.

Asclepias curassavica, or tropical milkweed, is a commercialized, non-native tropical species that has become popular with growers because of its color and year-round vegetation. But tropical milkweed can become an “ecological trap” for monarchs, coaxing them into breeding in unusual areas during the winter months – areas far enough north of Mexico to remain prone to freeze events throughout the winter and early spring, Daniels said.

Prolonged breeding can also lead to an increase in a protozoan parasite that infects monarchs.

“It’s not a hard-and-fast rule of not using that plant, but we want to be cautious about potential implications,” Daniels said. “It’s invariably better to use natives across the board.”

Florida is home to about 21 native species of milkweed. Daniels recommends either Asclepias incarnata, also called swamp milkweed, or Asclepias tuberosa, commonly known as butterflyweed. Asclepias humistrata, or pinewoods milkweed, is also common throughout northern Florida and essential to monarch recolonization.

“It’s not as simple as saying, ‘we plant milkweed and the monarch will be saved,'” he said. “We should think of this as an ecological issue. There are a lot of complexities to any organism and any system.”

Daniels said the team will continue monitoring monarch populations in Florida. He highlighted the willingness of Cross Creek property owners to give the research team access to the pastures each spring for 37 years as a key factor in the study’s success.

“It shows the importance of the public and private relationship when it comes to research,” he said. “They’ve been fantastic collaborators.”

The study’s lead author, Brower, died shortly before its publication. A lifelong butterfly expert, Brower was instrumental in finding monarch overwintering colonies in Mexico, the researchers said. This is his final publication.

“He really was the grand old man of monarchs,” said Williams. “Nobody has done more for monarchs.”

Williams said Brower had a knack for bringing people together and worked with more than 160 collaborators throughout his career.

According to his obituary in The New York Times, Brower began studying monarchs in the 1950s and made his first trip to the fir forests in Mexico where the butterflies spend the winter in 1977. In the 1980s, Brower worked with the Mexican government to protect these forests from deforestation.

“The best thing we can do is to continue his mission and continue to study and work to conserve the monarch,” Daniels said. “I think he would be proud of that mission.”

Leading researchers call for a ban on widely used insecticides

Public Release: 9-Nov-2018


Use of organophosphates has lessened, but risks to early brain development still too high

University of California – Davis Health

Public health experts have found there is sufficient evidence that prenatal exposure to widely used insecticides known as organophosphates puts children at risk for neurodevelopmental disorders.

In a scientific review and call to action published in PLOS Medicine, the researchers call for immediate government intervention to phase out all organophosphates.

“There is compelling evidence that exposure of pregnant women to very low levels of organophosphate pesticides is associated with lower IQs and difficulties with learning, memory or attention in their children,” said lead author Irva Hertz-Picciotto, professor of public health sciences, director of the UC Davis Environmental Health Sciences Center and researcher with the UC Davis MIND Institute.

“Although a single organophosphate — chlorpyrifos — has been in the national spotlight, our review implicates the entire class of these compounds,” Hertz-Picciotto added.

Originally developed as nerve gases and weapons of war, organophosphates today are used to control insects at farms, golf courses, shopping malls and schools. They kill pests by blocking nerve signaling.

People can come into contact with these chemicals through the food they eat, the water they drink and the air they breathe. As a result, organophosphate pesticides are detected in the vast majority of U.S. residents, according to Hertz-Picciotto.

Elevated risks even with low-level exposures

While existing limits on organophosphates have reduced exposures, the review authors said this isn’t enough. Based on more than 30 epidemiologic studies and scores of experimental studies in animals and cell cultures, they believe the evidence is clear: Exposure to organophosphates before birth, even at levels currently considered safe, is associated with poorer cognitive, behavioral and social development.

“It should be no surprise that studies confirm that these chemicals alter brain development, since they were originally designed to adversely affect the central nervous system,” Hertz-Picciotto said.

Despite growing evidence of harm and recommendations from scientific advisors to and scientists within the U.S. Environmental Protection Agency, many organophosphates remain in use. This may be in part because low-level, ongoing exposures typically don’t cause visible, short-term clinical symptoms, leading to the incorrect assumption that these exposures are inconsequential, according to Hertz-Picciotto.

“Acute poisoning is tragic, of course, however the studies we reviewed suggest that the effects of chronic, low-level exposures on brain functioning persist through childhood and into adolescence and may be lifelong, which also is tragic,” Hertz-Picciotto explained.

Recommendations to protect children

In addition to conducting the scientific review, the authors offered recommendations for substantially reducing organophosphate exposures, including:

  • Removing organophosphates from agricultural and non-agricultural uses and products
  • Proactively monitoring sources of drinking water for organophosphate levels
  • Establishing a system for reporting pesticide use and illnesses

Until a ban can occur, the reviewers recommend:

  • Greater medical and nursing education on organophosphates to improve treatment for and patient education on avoiding exposures
  • Training for agricultural workers in their languages on proper handling and application of organophosphate pesticides
  • Increased use of less-toxic alternatives and a transition toward sustainable pest-control measures

###

In addition to Hertz-Picciotto, the review team included Jennifer Sass of the Natural Resources Defense Council and George Washington University, Stephanie Engel of the University of North Carolina at Chapel Hill, Deborah Bennett of UC Davis Health, Asa Bradman and Brenda Eskenazi of UC Berkeley, Bruce Lanphear of Simon Fraser University and Robin Whyatt of Columbia University.

A copy of their review, titled “Organophosphate Exposure During Pregnancy and Child Neurodevelopment: Recommendations for Essential Policy Reforms,” is online.

About Project TENDR: The scientific review was funded by Project TENDR — Targeting Environmental Neuro-Development Risks — an alliance of more than 50 of the nation’s top scientists, health professionals and children’s advocates. Their aim is to reduce exposures to chemicals and pollutants during fetal development and early childhood that can contribute to neurodevelopmental disorders.

About the Environmental Health Sciences Center: Established in 2015 by Hertz-Picciotto, the UC Davis Environmental Health Sciences Center links experts across scientific disciplines for studies on the effects of environmental chemicals, pollutants, events and disasters on disease and disability. The ultimate goal is to foster new approaches and policies that protect communities from harmful exposures.

Immigration to the United States changes a person’s microbiome

Public Release: 1-Nov-2018


Cell Press

IMAGE: US immigrants may lose the ability to digest certain types of plants, such as this unidentified jungle fern gathered for food by Karen villagers in Thailand.

Credit: Pajau Vangay

Researchers at the University of Minnesota and the Somali, Latino, and Hmong Partnership for Health and Wellness have new evidence that the gut microbiota of immigrants and refugees rapidly Westernize after a person’s arrival in the United States. The study of communities migrating from Southeast Asia to the U.S., published November 1 in the journal Cell, could provide insight into some of the metabolic health issues, including obesity and diabetes, affecting immigrants to the country.

“We found that immigrants begin losing their native microbes almost immediately after arriving in the U.S. and then acquire alien microbes that are more common in European-American people,” says senior author Dan Knights, a computer scientist and quantitative biologist at the University of Minnesota. “But the new microbes aren’t enough to compensate for the loss of the native microbes, so we see a big overall loss of diversity.”

It has been shown before that people in developing nations have a much greater diversity of bacteria in their gut microbiome, the population of beneficial microbes living in humans’ intestines, than people living in the U.S. “But it was striking to see this loss of diversity actually happening in people who were changing countries or migrating from a developing nation to the U.S.,” he says.

The research was conducted with assistance from–and inspired by–Minnesota’s large community of refugees and immigrants from Southeast Asia, particularly the Hmong and Karen peoples, ethnic minorities that originally were from China and Burma and that today have communities in Thailand. The study used a community-based participatory research approach: members of the Hmong and Karen communities in both Minnesota and Thailand were involved in designing the study, recruiting participants, and educating their communities about the findings.

“Obesity was a concern that was coming up a lot for the Hmong and Karen communities here. In other studies, the microbiome had been related to obesity, so we wanted to know if there was potentially a relationship in immigrants and make any findings relevant and available to the communities. These are vulnerable populations, so we definitely try to make all of our methods as sensitive to that as possible and make sure that they have a stake in the research,” says first author Pajau Vangay.

Knights, Vangay, and their team compared the gut microbiota of Hmong and Karen people still living in Thailand; Hmong and Karen people who had immigrated to the U.S.; the children of those immigrants; and Caucasian American controls. They also were able to follow a group of 19 Karen refugees as they relocated from Thailand to the U.S., which meant they could track how the refugees’ gut microbiomes changed longitudinally in their first six to nine months in the U.S.

And the researchers did find that significant changes happened that fast: in those first six to nine months, the Western-associated bacterial genus Bacteroides began to displace the non-Western genus Prevotella. This Westernization then continued over the course of the first decade in the U.S., and overall microbiome diversity decreased the longer the immigrants had been in the country. The participants’ food logs suggested that eating a more Western diet played a role in perturbing the microbiome but could not explain all of the changes.
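The Bacteroides-versus-Prevotella shift described here is often summarized as a log ratio of the two genera’s relative abundances. A minimal sketch in Python, using invented abundance values for a hypothetical participant (not data from the study):

```python
import math

# Sketch of the Bacteroides-vs.-Prevotella comparison as a log ratio,
# a common "Westernization" index for gut microbiomes. The relative
# abundances below are invented for illustration, not study data.

def log_bp_ratio(bacteroides, prevotella):
    """log10 ratio of Bacteroides to Prevotella relative abundance."""
    return math.log10(bacteroides / prevotella)

# Hypothetical relative abundances after arrival in the U.S.:
samples = [
    ("arrival",  0.05, 0.60),   # Prevotella-dominated, non-Western profile
    ("6 months", 0.20, 0.35),
    ("9 months", 0.30, 0.25),   # Bacteroides now dominant
]

for label, b, p in samples:
    print(f"{label}: log10(B/P) = {log_bp_ratio(b, p):+.2f}")
```

A negative value indicates a Prevotella-dominated (non-Western) profile; the ratio rising above zero tracks the displacement described above.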

The changes were even more pronounced in their children. “We don’t know for sure why this is happening. It could be that this has to do with actually being born in the USA or growing up in the context of a more typical US diet. But it was clear that the loss of diversity was compounded across generations. And that’s something that has been seen in animal models before, but not in humans,” says Knights.

Although the research didn’t establish a cause-and-effect relationship between the microbiome changes in immigrants and the immigrant obesity epidemic, it did show a correlation: greater westernization of the microbiome was associated with greater obesity.

Knights believes that this research has a lot to tell us about our health. “When you move to a new country, you pick up a new microbiome. And that’s changing not just what species of microbes you have, but also what enzymes they carry, which may affect what kinds of food you can digest and how your diet interacts with your health,” he says. “This might not always be a bad thing, but we do see that Westernization of the microbiome is associated with obesity in immigrants, so this could be an interesting avenue for future research into treatment of obesity, both in immigrants and potentially in the broader population.”

###

This research was supported by the Clinical and Translational Science Institute, the Healthy Foods, Healthy Lives Institute, the Office of Diversity, and the Graduate School at the University of Minnesota.

Cell, Vangay et al.: “U.S. immigration westernizes the human gut microbiome” https://www.cell.com/cell/fulltext/S0092-8674(18)31382-5

Cell (@CellCellPress), the flagship journal of Cell Press, is a biweekly journal that publishes findings of unusual significance in any area of experimental biology, including but not limited to cell biology, molecular biology, neuroscience, immunology, virology and microbiology, cancer, human genetics, systems biology, signaling, and disease mechanisms and therapeutics. Visit: http://www.cell.com/cell. To receive Cell Press media alerts, contact press@cell.com.

Bitcoin can push global warming above 2 degrees C in a couple decades

Public Release: 29-Oct-2018


It alone could produce enough emissions to raise global temperatures by 2°C as soon as 2033

University of Hawaii at Manoa

A new study published in the peer-reviewed journal Nature Climate Change finds that if Bitcoin is implemented at similar rates at which other technologies have been incorporated, it alone could produce enough emissions to raise global temperatures by 2°C as soon as 2033.

“Bitcoin is a cryptocurrency with heavy hardware requirements, and this obviously translates into large electricity demands,” said Randi Rollins, a master’s student at the University of Hawaii at Manoa and coauthor of the paper.

Bitcoin and other cryptocurrencies are forms of currency that exist digitally through encryption. Bitcoin purchases create transactions that are recorded and processed by a group of individuals referred to as miners. Miners group every Bitcoin transaction made during a specific timeframe into a block; blocks are then added to the chain, which is the public ledger. To verify transactions, miners compete to solve a computationally demanding proof-of-work puzzle in exchange for bitcoins, a process that requires large amounts of electricity.

The electricity requirements of Bitcoin have created considerable difficulties, and extensive online discussion, about where to put the facilities or rigs that compute the proof-of-work of Bitcoin. A somewhat less discussed issue is the environmental impact of producing all that electricity.

A team of UH Manoa researchers analyzed information such as the power efficiency of computers used by Bitcoin mining, the geographic location of the miners who likely computed the Bitcoin, and the CO2 emissions of producing electricity in those countries. Based on the data, the researchers estimated that the use of bitcoins in the year 2017 emitted 69 million metric tons of CO2.

Researchers also studied how other technologies have been adopted by society, and created scenarios to estimate the cumulative emissions of Bitcoin should it grow at the rate that other technologies have been incorporated.

The team found that if Bitcoin is incorporated, even at the slowest rate at which other technologies have been incorporated, its cumulative emissions will be enough to warm the planet above 2°C in just 22 years. If incorporated at the average rate of other technologies, it is closer to 16 years.
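The scenario logic here — scaling a base-year emissions estimate along a technology-adoption curve and finding when cumulative emissions exhaust a carbon budget — can be sketched as follows. Only the 69 Mt CO2 figure for 2017 comes from the study; the carbon budget, adoption fraction, and growth rate below are illustrative placeholders, and the study itself used empirical adoption curves for past technologies rather than the logistic curve assumed here:

```python
import math

# Illustrative sketch of the cumulative-emissions scenario described above.
# All parameters except the 69 Mt CO2 estimate for 2017 are hypothetical
# placeholders, not values from the study.

BASE_YEAR = 2017
BASE_EMISSIONS_MT = 69.0   # study's estimate of Bitcoin's 2017 CO2 emissions (Mt)
BUDGET_MT = 100_000.0      # assumed cumulative budget consistent with 2 C (Mt)
F0 = 0.01                  # assumed Bitcoin adoption fraction in the base year
RATE = 0.4                 # assumed logistic adoption growth rate per year

# Year at which the logistic curve reaches 50% adoption, anchored at F0 in 2017.
MIDPOINT = BASE_YEAR + math.log(1.0 / F0 - 1.0) / RATE

def adoption(year):
    """Logistic adoption fraction at a given year."""
    return 1.0 / (1.0 + math.exp(-RATE * (year - MIDPOINT)))

def annual_emissions(year):
    """Annual emissions (Mt CO2), assumed proportional to adoption."""
    return BASE_EMISSIONS_MT * adoption(year) / adoption(BASE_YEAR)

def year_budget_exhausted():
    """First year in which cumulative emissions exceed the assumed budget."""
    total = 0.0
    for year in range(BASE_YEAR, 2200):
        total += annual_emissions(year)
        if total > BUDGET_MT:
            return year
    return None
```

Faster assumed adoption rates pull the crossing year earlier, which is the mechanism behind the 16-versus-22-year spread the team reports.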

“Currently, the emissions from transportation, housing and food are considered the main contributors to ongoing climate change. This research illustrates that Bitcoin should be added to this list,” said Katie Taladay, a UH Manoa master’s student and coauthor of the paper.

“We cannot predict the future of Bitcoin, but if implemented at a rate even close to the slowest pace at which other technologies have been incorporated, it will spell very bad news for climate change and the people and species impacted by it,” said Camilo Mora, associate professor of Geography in the College of Social Sciences at UH Manoa and lead author of the study.

“With the ever-growing devastation created by hazardous climate conditions, humanity is coming to terms with the fact that climate change is as real and personal as it can be,” added Mora. “Clearly, any further development of cryptocurrencies should critically aim to reduce electricity demand, if the potentially devastating consequences of 2°C of global warming are to be avoided.”

###

MORE ABOUT THE DEPARTMENT OF GEOGRAPHY

Geography (GEOG) is the study of places and the relationships between people and their environment. Geographers explore both the biophysical processes of Earth’s surface and the human societies spread across it. The discipline examines how cultural and political-economic institutions interact with the natural environment, how locations and places can affect people, and how these relationships change over time. The Department of Geography in the UH Manoa College of Social Sciences offers students interdisciplinary academic training to study many critical issues facing modern society, such as globalization and its regional implications, climate change and its effects, resource use and sustainability, cultural change and environmental consequences, geopolitics, and the use of geospatial technologies.

MORE ABOUT THE COLLEGE OF SOCIAL SCIENCES

Marked by leadership, excellence and innovation, the College of Social Sciences (CSS) at the University of Hawai’i at Manoa provides students with a culturally diverse experience that transforms them into bold, engaged global citizens who effect change, break down barriers, touch lives and succeed in a multi-cultural context. Its student-centered environment is dedicated to providing students with a vibrant academic climate that affords exciting, intense interaction among students and faculty as they address fundamental questions about human behavior. Featuring outstanding scholarship through internships, active and service learning approaches to teaching, and an international focus particularly in the Asia Pacific region, it prepares students to become leaders in public and private enterprises throughout Hawai’i and Asia.

MORE ABOUT THE UNIVERSITY OF HAWAII AT MANOA

The University of Hawaii at Manoa serves more than 18,000 students pursuing more than 240 different degrees. Coming from every Hawaiian island, every state in the nation, and more than 140 countries, UH Manoa students thrive in an enriching environment for the global exchange of ideas. For more information, visit http://manoa.hawaii.edu. Follow us on Facebook http://www.facebook.com/uhmanoa and Twitter http://twitter.com/UHManoaNews.

Study reveals how soil bacteria are primed to consume greenhouse gas

Public Release: 29-Oct-2018


University of East Anglia

New research has revealed that some soil bacteria are primed ready to consume the potent greenhouse gas nitrous oxide when they experience life without oxygen in the environment.

Previously it was thought that bacteria had to first sense nitrous oxide, also known as ‘laughing gas’, before they could breathe and consume it in place of oxygen.

Researchers from the University of East Anglia (UEA) and the Norwegian University of Life Sciences have discovered that in fact bacteria hedge their bets and gamble on nitrous oxide being present in their environment, and so keep the systems for nitrous oxide destruction active – and even deliberately distribute them within new cells – to give them a chance to survive low oxygen levels within the soil.

The European team, working as the Nitrous Oxide Research Alliance, focused on the denitrifying organism Paracoccus denitrificans. The authors, who published their findings in the journal Proceedings of the National Academy of Sciences (PNAS), say the results have important implications for controlling emissions and for using such bacteria as ‘sinks’ to remove nitrous oxide from the atmosphere.

Nitrous oxide accounts for approximately 10 per cent of all greenhouse gas emissions but has around 300 times the global warming potential of carbon dioxide and stays in the atmosphere for about 120 years.
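The “around 300 times” comparison is a global warming potential (GWP), and the CO2-equivalence arithmetic behind it is straightforward. A short sketch, using a made-up emission quantity rather than a measured one:

```python
# One tonne of N2O causes roughly the warming of 300 tonnes of CO2 over a
# 100-year horizon. The emission quantity used below is a made-up example,
# not a measured value from the study.

GWP_100_N2O = 300.0   # approximate 100-year global warming potential of N2O

def n2o_to_co2e(n2o_megatonnes):
    """Convert a mass of N2O (Mt) to CO2-equivalent (Mt) via its GWP."""
    return n2o_megatonnes * GWP_100_N2O

print(n2o_to_co2e(1.0))  # 1 Mt of N2O is equivalent to 300.0 Mt of CO2
```

This is why even small absolute quantities of nitrous oxide matter for the overall greenhouse gas budget.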

It also destroys the ozone layer with similar potency to chlorofluorocarbons (CFCs), so even a small fraction of nitrous oxide emitted into the atmosphere can have far-reaching consequences for the environment.

The level of nitrous oxide in the atmosphere has increased in line with global population growth, as it is generated mainly through the biodegradation of synthetic nitrate-based fertilizers in agricultural soils by microorganisms.

In the UK the work was led by Dr Andrew Gates and Prof David Richardson, working with PhD student Manuel Soriano-Laguna, all from UEA’s School of Biological Sciences.

Dr Gates, lecturer in bacterial bioenergetics, said: “Despite efforts to address carbon dioxide emissions, nitrous oxide is now emerging as a pressing global concern and requires researchers with different skill sets to come together from around the world to prevent the next wave of climate change.

“This work will help inform policy makers of the potential to exploit bacteria as sinks for this powerful climate-active gas. Our findings show that bet hedging is prominent below 20°C and may be widespread in soil organisms, so this natural phenomenon can be used to our advantage to control nitrous oxide emissions and combat climate change. Bet hedging can now be tested for in other organisms and in natural environments.”

###

The research was funded by the European Union through a Marie Curie International Training Network, which aims to train the next generation of researchers to tackle emerging global challenges.

‘Bet-hedging strategy for denitrifying bacteria curtails their release of N2O’, Pawel Lycus, Manuel Jesús Soriano-Laguna, Morten Kjos, David John Richardson, Andrew James Gates, Daniel Aleksanteri Milligan, Åsa Frostegård, Linda Bergaust, and Lars Reier Bakken, is published in PNAS on October 29.

Neurology: Space travel alters the brain

Public Release: 24-Oct-2018


Ludwig-Maximilians-Universität München

Spending long periods in space not only leads to muscle atrophy and reductions in bone density, it also has lasting effects on the brain. However, little is known about how different tissues of the brain react to exposure to microgravity, and it remains unclear whether and to what extent the neuroanatomical changes so far observed persist following return to normal gravity. Initiated and guided by a team of neuroscientists (headed by Prof. Floris L. Wuyts) based at the University of Antwerp, and in close cooperation with Russian colleagues, LMU neurologist Professor Peter zu Eulenburg has completed the first long-term study in Russian cosmonauts. In this study, which appears in the New England Journal of Medicine, they show that differential changes in the three main tissue volumes of the brain remain detectable for at least half a year after the end of their last mission.

The study was carried out on ten cosmonauts, each of whom had spent an average of 189 days on board the International Space Station (ISS). The authors used magnetic resonance imaging (MRI) to image the brains of the subjects both before and shortly after the conclusion of their long-term missions. In addition, seven members of the cohort were re-examined seven months after their return from space. “This is actually the first study in which it has been possible to objectively quantify changes in brain structures following a space mission, including an extended follow-up period,” zu Eulenburg points out.

The MRI scans performed in the days after the return to Earth revealed that the volume of the grey matter (the part of the cerebral cortex that mainly consists of the cell bodies of the neurons) was reduced compared to the baseline measurement before launch. In the follow-up scans done seven months later, this effect was partly reversed, but nevertheless still detectable. In contrast, the volume of the cerebrospinal fluid, which fills the inner and outer cavities of the brain, increased within the cortex during long-term exposure to microgravity. This increase continued in the outer spaces that cover the brain after the return to Earth, while the cerebrospinal fluid spaces within the brain returned to near normal size. The white matter tissue volume (those parts of the brain that are primarily made up of nerve fibers) appeared unchanged immediately after landing. However, the subsequent examination six months later showed a widespread reduction in volume relative to both earlier measurements. The researchers postulate that over the course of a longer stint in space, white matter volume may slowly be replaced by an influx of cerebrospinal fluid; upon return to Earth, this process gradually reverses, resulting in a relative reduction of white matter volume.

“Taken together, our results point to prolonged changes in the pattern of cerebrospinal fluid circulation over a period of at least seven months following the return to Earth,” says zu Eulenburg. “However, whether or not the extensive alterations shown in the grey and the white matter lead to any changes in cognition remains unclear at present,” he adds. So far the only clinical indication for detrimental effects is a reduction in visual acuity that was demonstrated in several long-term space travelers. These changes may very well be attributable to the increased pressure exerted by the cerebrospinal fluid on the retina and the optic nerve. The governing cause for the widespread structural changes in the brain following long spaceflights might lie in minimal pressure changes within the body’s various water columns under conditions of microgravity that have a cumulative effect over time. According to the authors, to minimize the risks associated with long-term missions and to characterize any clinical significance of their structural findings, further studies using a wider range of diagnostic methods are deemed essential.

###

The New England Journal of Medicine 2018

Publication:

Brain Tissue-Volume Changes in Cosmonauts
Angelique Van Ombergen, Peter zu Eulenburg, Floris L. Wuyts, et al.
NEJM 2018

Contact:

Prof. Dr. med. Peter zu Eulenburg
Functional Imaging
German Center for Vertigo and Balance Disorders
Phone: +49 (0)89 / 4400-74822
Fax: +49 (0)89 / 4400-74801
Email: peter.zu.eulenburg@med.uni-muenchen.de

Web: https://www.gsn.uni-muenchen.de/people/faculty/associate/peter-zu-eulenburg/index.html

Mammals cannot evolve fast enough to escape current extinction crisis

Public Release: 15-Oct-2018

Humans are exterminating animal species so fast that evolution can’t keep up; Unless conservation efforts are improved, so many mammal species will die out during the next 50 years that nature will need 3-5 million years to recover, a new study shows

Aarhus University

IMAGE: An illustration of how the smaller mammals will have to evolve and diversify over the next 3-5 million years to make up for the loss of the large mammals.

Credit: Matt Davis, Aarhus University

We humans are exterminating animal and plant species so quickly that nature’s built-in defence mechanism, evolution, cannot keep up. An Aarhus-led research team calculated that if current conservation efforts are not improved, so many mammal species will become extinct during the next five decades that nature will need 3-5 million years to recover.

There have been five upheavals over the past 450 million years when the environment on our planet has changed so dramatically that the majority of Earth’s plant and animal species became extinct. After each mass extinction, evolution has slowly filled in the gaps with new species.

The sixth mass extinction is happening now, but this time the extinctions are not being caused by natural disasters; they are the work of humans. A team of researchers from Aarhus University and the University of Gothenburg has calculated that the extinctions are moving too rapidly for evolution to keep up.

If mammals diversify at their normal rates, it will still take them 5-7 million years to restore biodiversity to its level before modern humans evolved, and 3-5 million years just to reach current biodiversity levels, according to the analysis, which was published recently in the prestigious scientific journal, PNAS.

Some species are more distinct than others

The researchers used their extensive database of mammals, which includes not only species that still exist, but also the hundreds of species that lived in the recent past and became extinct as Homo sapiens spread across the globe. This meant that the researchers could study the full impact of our species on other mammals.

However, not all species have the same significance. Some extinct animals, such as the Australian leopard-like marsupial lion Thylacoleo, or the strange South American Macrauchenia (imagine a llama with an elephant trunk), were evolutionarily distinct lineages with only a few close relatives. When these animals became extinct, they took whole branches of the evolutionary tree of life with them. We not only lost these species, we also lost the unique ecological functions and the millions of years of evolutionary history they represented.

“Large mammals, or megafauna, such as giant sloths and sabre-toothed tigers, which became extinct about 10,000 years ago, were highly evolutionarily distinct. Since they had few close relatives, their extinctions meant that entire branches of Earth’s evolutionary tree were chopped off,” says palaeontologist Matt Davis from Aarhus University, who led the study. And he adds:

“There are hundreds of species of shrew, so they can weather a few extinctions. There were only four species of sabre-toothed tiger; they all went extinct.”

Long waits for replacement rhinos

Regenerating 2.5 billion years of evolutionary history is hard enough, but today’s mammals are also facing increasing rates of extinction. Critically endangered species such as the black rhino are at high risk of becoming extinct within the next 50 years. Asian elephants, one of only two surviving species of a once mighty mammalian order that included mammoths and mastodons, have less than a 33 percent chance of surviving past this century.

The researchers incorporated these expected extinctions in their calculations of lost evolutionary history and asked themselves: Can existing mammals naturally regenerate this lost biodiversity?

Using powerful computers, advanced evolutionary simulations and comprehensive data about evolutionary relationships and body sizes of existing and extinct mammals, the researchers were able to quantify how much evolutionary time would be lost from past and potential future extinctions as well as how long recovery would take.

The researchers came up with a best-case scenario of the future, where humans have stopped destroying habitats and eradicating species, reducing extinction rates to the low background levels seen in fossils. However, even with this overly optimistic scenario, it will take mammals 3-5 million years just to diversify enough to regenerate the branches of the evolutionary tree that they are expected to lose over the next 50 years. It will take more than 5 million years to regenerate what was lost from giant Ice Age species.
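The recovery-time estimates rest on phylogenetic diversity: the total branch length of the evolutionary tree that a set of species represents. As a rough illustration of why evolutionarily distinct species matter more than members of large clades (this is a toy sketch with invented branch lengths, not the study’s actual simulation):

```python
# Toy illustration (not the study's model): phylogenetic diversity (PD) is the
# sum of branch lengths on an evolutionary tree. Losing an evolutionarily
# distinct species (a long, unshared branch) removes far more PD than losing
# one species from a large, closely related group. All numbers are hypothetical.

# species -> (unique_branch_Myr, shared_clade_branch_Myr, clade)
species = {
    "macrauchenia": (55.0, 5.0, "litopterns"),  # last member of its lineage
    "shrew_a": (1.0, 20.0, "shrews"),
    "shrew_b": (1.5, 20.0, "shrews"),
    "shrew_c": (2.0, 20.0, "shrews"),
}

def pd_loss(extinct, tree):
    """PD lost when `extinct` species are pruned: their unique branches always
    go; a shared clade branch is lost only if the whole clade goes extinct."""
    loss = sum(tree[s][0] for s in extinct)
    for clade in {tree[s][2] for s in tree}:
        members = {s for s in tree if tree[s][2] == clade}
        if members <= set(extinct):  # entire clade pruned
            loss += tree[next(iter(members))][1]
    return loss

print(pd_loss({"macrauchenia"}, species))  # whole branch lost: 60.0 Myr
print(pd_loss({"shrew_a"}, species))       # clade survives: only 1.0 Myr
```

On these made-up numbers, one Macrauchenia-like extinction erases 60 times the evolutionary history of one shrew extinction, which is why regenerating it would take evolution millions of years.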

Prioritizing conservation work

“Although we once lived in a world of giants: giant beavers, giant armadillos, giant deer, etc., we now live in a world that is becoming increasingly impoverished of large wild mammalian species. The few remaining giants, such as rhinos and elephants, are in danger of being wiped out very rapidly,” says Professor Jens-Christian Svenning from Aarhus University, who heads a large research program on megafauna, which includes the study.

The research team doesn’t have only bad news, however. Their data and methods could be used to quickly identify endangered, evolutionarily distinct species, so that we can prioritise conservation efforts, and focus on avoiding the most serious extinctions.

As Matt Davis says: “It is much easier to save biodiversity now than to re-evolve it later.”

###

Further info:

Matt Davis is part of the Carlsberg Foundation’s Semper Ardens project “Megafauna Ecosystem Ecology from the Deep Prehistory to a Human-Dominated Future” and the Centre for Biodiversity in A Changing World (BIOCHANGE) at the Department of Bioscience at Aarhus University, both headed by Professor Jens-Christian Svenning. The third member of the group is Søren Faurby from Gothenburg University.

Fluoride levels in pregnant women in Canada show drinking water is primary source of exposure

Public Release: 10-Oct-2018

York University

TORONTO, October 10, 2018 – A new study led by York University researchers has found that fluoride levels in urine are twice as high for pregnant women living in Canadian cities where fluoride is added to public drinking water as for those living in cities that do not add fluoride to public water supplies.

The study “Community Water Fluoridation and Urinary Fluoride Concentrations in a National Sample of Pregnant Women in Canada” was published today in Environmental Health Perspectives. It is the first study in North America to examine how fluoride in water contributes to urinary fluoride levels in pregnant women. The research was conducted as part of a larger study funded by the National Institute of Environmental Health Sciences, part of the National Institutes of Health (NIH) investigating whether early life exposure to fluoride affects the developing brain.

“We found that fluoride in drinking water was the major source of exposure for pregnant women living in Canada. Women living in fluoridated communities have two times the amount of fluoride in their urine as women living in non-fluoridated communities,” said Christine Till, an associate professor of Psychology in York’s Faculty of Health and lead author on the study.

The Maternal Infant Research on Environmental Chemicals (MIREC) study recruited 2,001 pregnant women between 2008 and 2011. The women lived in 10 large cities across Canada. Seven of the cities (Toronto, Hamilton, Ottawa, Sudbury, Halifax, Edmonton and Winnipeg) added fluoride to municipal water while three (Vancouver, Montreal and Kingston) did not.

Urine samples were collected during each trimester of pregnancy for more than 1,500 women. Fluoride levels were obtained from the municipal water treatment plants that supplied water to each woman’s home. Information about each woman’s demographics, lifestyle and medical history was also collected.

In addition to fluoridated water, sources of fluoride can include toothpastes, mouth rinses, as well as processed beverages and food, especially those made with fluoridated water. Beyond water, products such as tea have previously been found to have high concentrations of natural fluoride.

In this study, fluoride level in water was the main determinant of fluoride level in the women’s urine. Higher consumption of black tea was also correlated with higher levels of urinary fluoride in pregnant women.

The levels of fluoride among pregnant women living in fluoridated communities in Canada were similar to levels reported in a prior study of pregnant women living in Mexico City, where fluoride is added to table salt.

“This finding is concerning because prenatal exposure to fluoride in the Mexican sample has been associated with lower IQ in children. New evidence published today in Environment International also reported an association between higher levels of fluoride in pregnancy and inattentive behaviours among children in the same Mexican sample,” said Till.

The research team, including experts from Simon Fraser University, Université Laval, Indiana University, University of Montreal and Cincinnati Children’s Hospital, is investigating whether prenatal exposure to fluoride in Canadian children results in IQ deficits, similar to the Mexican study.

Fluoride has been added to public drinking water in Canadian and American communities since the 1940s as a means of preventing tooth decay. Today, about 40 percent of Canadians and 74 percent of the U.S. population on public water supplies receive fluoridated water.

###

York University champions new ways of thinking that drive teaching and research excellence. Our students receive the education they need to create big ideas that make an impact on the world. Meaningful and sometimes unexpected careers result from cross-disciplinary programming, innovative course design and diverse experiential learning opportunities. York students and graduates push limits, achieve goals and find solutions to the world’s most pressing social challenges, empowered by a strong community that opens minds. York U is an internationally recognized research university – our 11 faculties and 25 research centres have partnerships with 200+ leading universities worldwide. Located in Toronto, York is the third largest university in Canada, with a strong community of 53,000 students, 7,000 faculty and administrative staff, and more than 300,000 alumni.

York U’s fully bilingual Glendon Campus is home to Southern Ontario’s Centre of Excellence for French Language and Bilingual Postsecondary Education.

Media Contacts:

Anjum Nayyar, York University Media Relations, 416-736-2100 ext. 44543, anayyar@yorku.ca

Alternatives to pesticides — Researchers suggest popular weeds

Public Release: 3-Oct-2018

 

Exeley Inc.

Nematodes, also known as roundworms, are among the most numerous animals on Earth. Simple as they seem, many live as parasites of plants, insects and animals; their habitats also include soil, the sea and fresh water. Researchers study nematodes because many of them are beneficial in medicine, agriculture or veterinary science, while others damage crops by feeding on them or by transmitting viruses and bacterial diseases into plant cells.

A major issue for researchers is the control of plant parasitic nematodes and the protection of crop systems. Since pesticides commonly used to combat plant attacking nematodes are often highly toxic and dangerous for the environment and non-target organisms, researchers try to find natural alternatives.

Studies suggest that botanical soil amendments with weeds may help to combat nematodes. In their article “Nematicidal Weeds, Solanum nigrum and Datura stramonium,” published in the Journal of Nematology, Dr. Nikoletta Ntalli and colleagues analysed two widespread, globally distributed weeds — Solanum nigrum Linn. and Datura stramonium Linn., commonly known as black nightshade and jimsonweed — for their ability to control nematodes.

The researchers reported for the first time that S. nigrum and D. stramonium extracts inhibit egg hatch and paralyze Meloidogyne sp., the nematode genus controlled in the study. Additionally, when nematode-infested soil was treated with S. nigrum seed paste, the number of females per gram of tomato roots was the lowest ever reported.

The findings point toward new, ecological tactics for controlling plant-parasitic nematodes, which are especially difficult to manage because they are sheltered inside the soil. The chemical industry rarely invests in developing new nematicides, and most of the old formulations are now banned for toxicological and ecotoxicological reasons. Alternative, ecological control measures have already been introduced in many countries, since new laws compel farmers to seek them. The significance of the study and the methods it tested lies not only in their proven efficacy but, more importantly, in their cost-effectiveness and ease of use. Since both S. nigrum and D. stramonium are widely distributed weed species, often already present on site, incorporating them into the soil offers a promising alternative to commonly used pesticides.

###

The original article is fully available for reading at: https://www.exeley.com/journal_of_nematology/doi/10.21307/jofnem-2018-017 DOI: 10.21307/jofnem-2018-017

About Journal of Nematology journal:

Journal of Nematology is the official publication of the Society of Nematologists since 1969. The journal publishes original papers on all aspects of basic, applied, descriptive, theoretical or experimental nematology and adheres to strict peer-review policy. Other categories of papers include invited reviews, research notes, abstracts of papers presented at annual meetings, and special publications as appropriate.

https://www.exeley.com/journal/journal_of_nematology

About Exeley:

Exeley Inc. is a New York based company that focuses on offering innovative publishing services to Open Access publications worldwide. Amongst the technological solutions delivered, the functionalities include: responsive webpage design, full-text XML, integration with social media sites and performance metrics including altmetrics.

Exeley Inc.
https://www.exeley.com

Bees’ medicine chest should include sunflower pollen, UMass Amherst study suggests

Public Release: 26-Sep-2018

 

Ecologist Lynn Adler at UMass Amherst and others found that eating sunflower pollen dramatically and consistently reduced a protozoan infection in bumble bees

University of Massachusetts at Amherst

AMHERST, Mass. – A new study by Jonathan Giacomini and his former advisor, evolutionary ecologist Lynn Adler at the University of Massachusetts Amherst, with others, found that eating sunflower pollen dramatically and consistently reduced a protozoan pathogen infection in bumble bees and reduced a microsporidian pathogen of the European honey bee. The results raise the possibility that sunflowers may provide a simple way to improve the health of economically and ecologically important pollinators.

Adler explains, “There is a whole literature of plants that use chemicals to protect themselves against insects. Monarch butterflies are the prime example; they are insensitive to the poison in milkweed plants they eat, but it makes them poisonous in turn to their predators. For some reason, no one had thought about this in terms of bees, but they do eat plant pollen and nectar.”

The ecologist says she has long been interested in whether “floral rewards” like nectar and pollen might have protective effects against disease and parasites for pollinator insects, but her own experimental results with nectar were inconsistent. She points out that, “compared to nectar, pollen has far more concentrated defensive chemicals.”

The current study began as an honors undergraduate experiment by first author Jonathan Giacomini, who is now a graduate student of co-author Rebecca Irwin at North Carolina State University. Details are in the current issue of Scientific Reports.

The researchers studied a common parasitic infection of bumble bees caused by the protozoan Crithidia bombi and conducted a pilot study of the microsporidian Nosema ceranae in honey bees. Adler says up to 80 percent of bumble bees found in western Massachusetts where the study took place are infected with Crithidia. It has been shown to impair learning and foraging, reduce lifespan and the likelihood of queens founding new colonies. “In the lab it may be benign,” she notes, “but with the stress of outdoor life it becomes more of a problem for bees.”

“From the plant’s point of view, pollen is a gamete and you want the pollinator to come and eat your nectar and carry the pollen to fertilize another plant, but pollinators do eat pollen because it’s a great protein and fat source, which is difficult for plant eaters to find. Any vegetarian will tell you that,” Adler quips.

For this work, the researchers infected groups of bees and fed them one of four different pollen diets: sunflower, buckwheat, rapeseed or a fourth mix of all three, then measured Crithidia infection in the insects.

Adler reports, “Sunflower pollen was amazing. When the bees ate sunflower pollen, most of them just didn’t have detectable Crithidia counts. It was like nothing we’d ever seen with nectar; it was really dramatic and very exciting.” She also reports that mortality after one week did not differ among bees receiving the different treatments. In the small bee colonies they studied, all bees had better reproduction when reared on sunflower rather than buckwheat pollen, but slightly lower adult survival.

“We realize that this is not a cure to all bee diseases, and we haven’t looked at many other pathogens yet, but the effects we have seen are dramatic,” the ecologist notes.

Further, “Bees do not do well eating just sunflower, but it may be part of a solution. They are going to need other and varied food for their health, but the cool thing about sunflowers is that they are grown agriculturally and are also a common wildflower native to North America. Incorporating them into pollinator gardens is a simple thing we can do to help bees.”

The team also sampled bees from farms around Amherst and measured how much sunflower was growing at the farm and the bees’ Crithidia load. “There were a lot of uncontrolled variables, but still the more sunflowers were grown at the farm, the lower the Crithidia load for the bees at that farm,” Adler says.

The authors suggest that “given consistent effects of sunflower in reducing pathogens, planting sunflower in agroecosystems and native habitat may provide a simple solution to reduce disease and improve the health of economically and ecologically important pollinators.”

###

This work was supported by the National Science Foundation, the U.S. Department of Agriculture and UMass Amherst’s Commonwealth Honors College.

Sunflower pollen has medicinal, protective effects on bees

Public Release: 26-Sep-2018

North Carolina State University

IMAGE

IMAGE: Honey bees fed a diet of sunflower pollen show dramatically lower rates of infection by a specific pathogen.

Credit: Jonathan Giacomini, NC State University

With bee populations in decline, a new study offers hope for a relatively simple mechanism to promote bee health and well-being: providing bees access to sunflowers.

The study, conducted by researchers at North Carolina State University and the University of Massachusetts Amherst, showed that two different species of bees fed a diet of sunflower pollen had dramatically lower rates of infection by specific pathogens. Bumble bees on the sunflower diet also had generally better colony health than bees fed on diets of other flower pollens.

The study showed that sunflower pollen reduced infection by a particular pathogen (Crithidia bombi) in bumble bees (Bombus impatiens). Sunflower pollen also protected European honey bees (Apis mellifera) from a different pathogen (Nosema ceranae). These pathogens have been implicated in slowing bee colony growth rates and increasing bee death.

The study also showed a deleterious effect, however, as honey bees on the sunflower diet had mortality rates roughly equivalent to honey bees not fed a pollen diet and four times higher than honey bees fed buckwheat pollen. This mortality effect was not observed in bumble bees.

Jonathan Giacomini, a Ph.D. student in applied ecology at NC State and corresponding author of a paper describing the research, said that bees already seem adept at collecting sunflower pollen. Annually, some two million acres in the United States and 10 million acres in Europe are devoted to sunflowers, he said, making sunflower pollen a ready and relevant bee food.

“We’ve tried other monofloral pollens, or pollens coming from one flower, but we seem to have hit the jackpot with sunflower pollen,” said co-senior author Rebecca Irwin, a professor of applied ecology at NC State. “None of the others we’ve studied have had this consistent positive effect on bumble bee health.”

Sunflower pollen is low in protein and some amino acids, so it should not be considered as a standalone meal for bee populations, Irwin said. “But sunflower could be a good addition to a diverse wildflower population for bees,” she said, especially generalists like bumble bees and honey bees.

The NC State researchers are now planning to follow up on the study to examine whether other species of bees show the positive effects of sunflower pollen and to gauge the mechanism behind the mostly positive effects of sunflower pollen.

“We don’t know if sunflower pollen is helping the host bees fight off pathogens or if sunflower pollen does something to the pathogens,” Irwin said. Future research is aimed at figuring this out.

###

Co-lead author Lynn S. Adler, Jessica Leslie and Evan C. Palmer-Young from the University of Massachusetts Amherst co-authored the paper, as did NC State’s David Tarpy. The study was funded by the U.S. Department of Agriculture (grants USDA-AFRI 2013-02536 and USDA/CSREES MAS000411), the National Science Foundation (grants NSF-DEB-1258096/1638866, REU supplement NSF DEB-1415507), the N.C. Agricultural Foundation, and NC State.

Note to editors: An abstract of the paper follows.

Medicinal value of sunflower pollen against bee pathogens

Authors: Jonathan J. Giacomini, David R. Tarpy and Rebecca E. Irwin, North Carolina State University; Jessica Leslie, Evan C. Palmer-Young and Lynn S. Adler, University of Massachusetts Amherst

Published: Sept. 26, 2018 in Scientific Reports

DOI: 10.1038/s41598-018-32681-y

Abstract: Global declines in pollinators, including bees, can have major consequences for ecosystem services. Bees are dominant pollinators, making it imperative to mitigate declines. Pathogens are strongly implicated in the decline of native and honey bees. Diet affects bee immune responses, suggesting the potential for floral resources to provide natural resistance to pathogens. We discovered that sunflower (Helianthus annuus) pollen dramatically and consistently reduced a protozoan pathogen (Crithidia bombi) infection in bumble bees (Bombus impatiens) and also reduced a microsporidian pathogen (Nosema ceranae) of the European honey bee (Apis mellifera), indicating the potential for broad antiparasitic effects. In a field survey, bumble bees from farms with more sunflower area had lower Crithidia infection rates. Given consistent effects of sunflower in reducing pathogens, planting sunflower in agroecosystems and native habitat may provide a simple solution to reduce disease and improve the health of economically and ecologically important pollinators.

Common weed killer linked to bee deaths

Public Release: 24-Sep-2018

University of Texas at Austin

The world’s most widely used weed killer may also be indirectly killing bees. New research from The University of Texas at Austin shows that honey bees exposed to glyphosate, the active ingredient in Roundup, lose some of the beneficial bacteria in their guts and are more susceptible to infection and death from harmful bacteria.

Scientists believe this is evidence that glyphosate might be contributing to the decline of honey bees and native bees around the world.

“We need better guidelines for glyphosate use, especially regarding bee exposure, because right now the guidelines assume bees are not harmed by the herbicide,” said Erick Motta, the graduate student who led the research, along with professor Nancy Moran. “Our study shows that’s not true.”

The findings are published this week in the journal Proceedings of the National Academy of Sciences.

Because glyphosate interferes with an important enzyme found in plants and microorganisms, but not in animals, it has long been assumed to be nontoxic to animals, including humans and bees. But this latest study shows that by altering a bee’s gut microbiome — the ecosystem of bacteria living in the bee’s digestive tract, including those that protect it from harmful bacteria — glyphosate compromises its ability to fight infection.

The researchers exposed honey bees to glyphosate at levels known to occur in crop fields, yards and roadsides. The researchers painted the bees’ backs with colored dots so they could be tracked and later recaptured. Three days later, they observed that the herbicide significantly reduced healthy gut microbiota. Of eight dominant species of healthy bacteria in the exposed bees, four were found to be less abundant. The hardest hit bacterial species, Snodgrassella alvi, is a critical microbe that helps bees process food and defend against pathogens.

The bees with impaired gut microbiomes also were far more likely to die when later exposed to an opportunistic pathogen, Serratia marcescens, compared with bees with healthy guts. Serratia is a widespread opportunistic pathogen that infects bees around the world. About half of bees with a healthy microbiome were still alive eight days after exposure to the pathogen, while only about a tenth of bees whose microbiomes had been altered by exposure to the herbicide were still alive.
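The survival gap reported above can be put in numeric terms. The sketch below uses hypothetical group sizes (the release gives only proportions, not sample counts) to show the arithmetic:

```python
# Rough numeric sketch of the reported outcome. Counts are illustrative,
# assumed from the proportions in the press release ("about half" vs.
# "about a tenth" alive 8 days after Serratia marcescens exposure).
n = 100                       # hypothetical bees per group
survivors_healthy_gut = 50    # "about half" still alive
survivors_impaired_gut = 10   # "about a tenth" still alive

def survival_rate(alive, total):
    return alive / total

healthy = survival_rate(survivors_healthy_gut, n)
impaired = survival_rate(survivors_impaired_gut, n)
print(f"healthy-gut survival:   {healthy:.0%}")    # 50%
print(f"impaired-gut survival:  {impaired:.0%}")   # 10%
# Relative risk of death for glyphosate-exposed vs. healthy-gut bees:
print(f"relative risk of death: {(1 - impaired) / (1 - healthy):.1f}x")  # 1.8x
```

On these assumed proportions, glyphosate-exposed bees were nearly twice as likely to die after pathogen challenge.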

“Studies in humans, bees and other animals have shown that the gut microbiome is a stable community that resists infection by opportunistic invaders,” Moran said. “So if you disrupt the normal, stable community, you are more susceptible to this invasion of pathogens.”

Based on their results, Motta and Moran recommend that farmers, landscapers and homeowners avoid spraying glyphosate-based herbicides on flowering plants that bees are likely to visit.

More than a decade ago, U.S. beekeepers began finding their hives decimated by what became known as colony collapse disorder. Millions of bees mysteriously disappeared, leaving farms with fewer pollinators for crops. Explanations for the phenomenon have included exposure to pesticides or antibiotics, habitat loss and bacterial infections. This latest study adds herbicides as a possible contributing factor.

“It’s not the only thing causing all these bee deaths, but it is definitely something people should worry about because glyphosate is used everywhere,” said Motta.

Native bumble bees have microbiomes similar to honey bees, so Moran said it’s likely that they would be affected by glyphosate in a similar way.

###

The paper’s third author is Kasie Raymann, a former postdoctoral researcher in Moran’s lab and now an assistant professor at the University of North Carolina at Greensboro.

Funding for this work was provided by the U.S. Department of Agriculture’s National Institute of Food and Agriculture; the U.S. National Institutes of Health; and Coordenação de Aperfeiçoamento de Pessoal de Nível Superior, Brasil (Coordination for the Improvement of Higher Education Personnel, a Foundation within the Ministry of Education in Brazil).

Europe plans to burn global forests as carbon-neutral renewable energy; scientists protest en masse

Public Release: 12-Sep-2018

Europe’s renewable energy directive poised to harm global forests

Princeton University, Woodrow Wilson School of Public and International Affairs

PRINCETON, N.J.–Europe’s decision to promote the use of wood as a “renewable fuel” will likely greatly increase Europe’s greenhouse gas emissions and cause severe harm to the world’s forests, according to a new paper published in Nature Communications.

European officials agreed on final language for a renewable energy directive earlier this summer that will almost double Europe’s use of renewable energy by 2030. Against the advice of 800 scientists, the directive now treats wood as a low-carbon fuel, meaning that whole trees or large portions of trees can be cut down deliberately to burn. Such uses go beyond papermaking wastes and other wood wastes, which have long been used for bioenergy, but not at this magnitude.

The paper, co-authored by eight scientists from the United States and Europe, estimates that this bioenergy provision in the Renewable Energy Directive will lead to vast new cutting of the world’s forests. This is because additional wood equal to all of Europe’s existing wood harvests will be needed just to supply 5 percent of Europe’s energy.

The paper also estimates that using wood for energy will likely result in a 10 to 15 percent increase in emissions from Europe’s energy use by 2050. This could occur by turning the 5 percent decrease in emissions required under the directive, achievable with solar or wind energy, into a 5 to 10 percent increase by using wood.
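The back-of-envelope arithmetic behind those figures can be laid out directly. The sketch below uses an illustrative emissions index (the release reports ranges, not absolute values):

```python
# Back-of-envelope sketch of the accounting point above. The baseline is an
# arbitrary index of Europe's current energy emissions; percentages are the
# ranges reported in the press release.
baseline = 100.0

wind_solar_2050 = baseline * (1 - 0.05)  # 5% decrease the directive targets
wood_low_2050 = baseline * (1 + 0.05)    # 5% increase if wood is used instead
wood_high_2050 = baseline * (1 + 0.10)   # up to 10% increase

# Net swing from counting wood as carbon-neutral rather than using wind/solar:
swing_low = wood_low_2050 - wind_solar_2050     # 10 index points
swing_high = wood_high_2050 - wind_solar_2050   # 15 index points
print(f"net swing: {swing_low:.0f}-{swing_high:.0f} points")  # net swing: 10-15 points
```

The 10 to 15 point swing is exactly the gap between the decrease the directive requires and the increase that burning wood would produce.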

Europe’s increased wood demand will require additional cutting in forests around the world, but the researchers explain the global impact is likely to be even greater by encouraging other countries to do the same. Already, tropical forest countries like Brazil and Indonesia have announced they, too, will try to reduce the effect of climate change by increasing their use of wood for bioenergy.

“Globally, if the world were to supply only an additional 2 percent of its energy from wood, it would need to double commercial wood harvests around the world, with harsh effects on forests,” said study lead author Tim Searchinger, a research scholar at Princeton University’s Woodrow Wilson School of Public and International Affairs.

Although wood is renewable, cutting down and burning wood for energy increases carbon in the atmosphere for decades to hundreds of years, depending on a number of factors, the researchers explained. Bioenergy use in this form takes carbon that would otherwise remain stored in a forest and puts it into the atmosphere. Because of various inefficiencies in both the harvesting and burning processes, far more carbon is emitted up smokestacks and into the air per kilowatt hour of electricity or heat than from burning fossil fuels, the authors explained.

While regrowing trees can eventually reabsorb the carbon, they do so slowly and, for years, may not absorb more carbon than the original forests would have continued to absorb. This results in long periods of time before bioenergy pays off the “carbon debt” of burning wood compared to fossil fuels.

The paper also explains why the European directive’s sustainability conditions would have little consequence. Even if trees are cut down “sustainably,” the wood is not carbon free or low carbon, because the carbon added to the atmosphere remains there for such long periods of time.

The directive also misapplies accounting rules for bioenergy originally created for the U.N. Framework Convention on Climate Change (UNFCCC). Under the rules of that treaty, countries that burn wood for energy can ignore the emissions, but countries where the trees were cut must count the carbon lost from the forest. Although this rule allows countries switching from coal to wood to ignore their true emissions, it balances out the global accounting, which is the sole purpose of those rules; it does not make bioenergy carbon free.

The system does not work for national energy laws, which will be required by the directive. If power plants have strong incentives to switch from coal to carbon-neutral wood, they will burn wood regardless of any real environmental consequences. Even if countries supplying the wood report emissions through UNFCCC, those emissions are not the power plants’ problem.

Finally, the paper highlights how the policy undermines years of efforts to save trees by recycling used paper instead of burning it for energy. Also, as the prices companies are required to pay for emitting carbon dioxide increase over time, the incorrect accounting of forest biomass Europe has adopted will make it more profitable to cut down trees to burn.

The paper’s warning that the use of wood will likely increase global warming for decades to centuries was also expressed by the European Academies Science Advisory Council in a commentary released June 15, 2018.

The paper, “Europe’s renewable energy directive poised to harm global forests,” first appeared online Sept. 12 in Nature Communications.

Statements by the Study’s Co-Authors:

Tim Beringer, Humboldt-Universität zu Berlin

“The directive reverses the global strategy of trying to subsidize countries to protect their forests and their carbon. Instead of rewarding countries and landowners to preserve forests and the carbon they store, this directive encourages companies to pay them for the carbon in their forests, but only on the condition that they cut the trees down and ship them to Europe to be burned.”

Bjart Holtsmark, Statistics Norway

“Although the directive encourages countries to harvest wood to burn, it does not require that they do. Countries should follow alternative strategies, focusing on solar in meeting European requirements for more renewable energy.”

Dan Kammen, University of California, Berkeley

“Compared with the vast majority of what counts as ‘bioenergy by harvesting wood,’ solar and wind have large advantages in land use efficiency and lower costs. The focus on wood is not only counter-productive for climate change but unnecessary.”

Eric Lambin, Stanford University and Université catholique de Louvain

“Treating wood as a carbon-neutral fuel is a simple policy decision with complex cascading effects on forest use, energy systems, wood trade, and biodiversity worldwide. Clearly, many of these effects have not received due attention.”

Wolfgang Lucht, Potsdam Institute for Climate Impact Research and Humboldt-Universität zu Berlin

“It makes no sense at all to save trees through recycling and then turn around to burn them for energy. There is nothing green, renewable, or environmentally friendly about that. Global forests are not disposable. The European Union should wake up and limit the role of bioenergy in the transition to renewable energies.”

Peter Raven, Missouri Botanical Garden

“Any increased demand for wood as fuel will have huge negative impacts on global biodiversity because many kinds of forests throughout the world, including the most biodiverse, will also end up being cut to satisfy the endless demand locally and to send to rich countries as they exhaust their own managed forest.”

Jean-Pascal van Ypersele, Université catholique de Louvain

“European citizens once more experienced the harsh effects of global warming this summer. In the name of reversing climate change, this counterproductive policy will increase deforestation and carbon emissions rather than contribute to decreasing them. More emissions will only make the summers even hotter for decades to centuries.”

Drought, groundwater loss sinks California land at alarming rate

Public Release: 29-Aug-2018

Cornell University

ITHACA, N.Y. – The San Joaquin Valley in central California, like many other regions in the western United States, faces drought and ongoing groundwater extraction that outpaces replenishment. As a result, the land is sinking by up to half a meter annually, according to a new Cornell University study in Science Advances.

Despite much higher-than-normal rainfall in early 2017, the large agricultural and metropolitan communities that rely on groundwater in central California experienced only a short respite from the ongoing drought. When the rain stopped, drought conditions returned and the ground continued to sink, according to the researchers.

“With the heavy storms in early 2017, Californians were hopeful that the drought was over,” said Kyle Murray, a Cornell doctoral candidate in the field of geophysics. “There was a pause in land subsidence over a large area, and even uplift of the land in some areas. But by early summer the subsidence continued at a similar rate we observed during the drought.”

Murray and Rowena Lohman, Cornell associate professor in earth and atmospheric sciences, examined satellite imagery of the San Joaquin Valley. In the farming region of the Tulare Basin in central California, growers have been extracting groundwater for more than a century, said the researchers. Winter rains in the valley and snowmelt from the surrounding mountains replenish the groundwater annually to some extent, but drought has parched the valley since 2011.

Previous studies had found that between 1962 and 2011, groundwater was depleted by an average of at least half a cubic mile per year. Using satellite-based measurements from 2012 to 2016, the researchers estimated that the Tulare Basin lost about 10 cubic miles of groundwater over that period.
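A back-of-the-envelope comparison makes the acceleration in these two estimates explicit (treating 2012-2016 as five full years is an assumption about the measurement window):

```python
# Comparing the two depletion estimates quoted above (cubic miles of groundwater).
historical_rate = 0.5   # per year, 1962-2011 (stated as a lower bound)
recent_total = 10.0     # total depletion, 2012-2016
recent_years = 5        # assumed: 2012 through 2016 inclusive

recent_rate = recent_total / recent_years
print(recent_rate)                    # 2.0 cubic miles per year
print(recent_rate / historical_rate)  # 4.0: roughly four times the earlier rate
```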

Fresno and Visalia border the Tulare Basin to the north, with Bakersfield to the south. About 250 agricultural products are grown there, with an estimated value of $17 billion annually, according to the U.S. Geological Survey. The valley holds about 75 percent of California’s irrigated agricultural land and supplies 8 percent of the United States’ agricultural output.

As an engineering problem, subsidence damages infrastructure, causes roads to crack and gives rise to sinkholes – expensive problems to fix, said Lohman. “One of the places where it really matters in California is the aqueduct system that brings water to the region. They’re engineered very carefully to have the correct slope to carry a given amount of water,” she said. “Now, one of the major aqueducts in that area is bowed and can’t deliver as much water. It’s been a huge engineering nightmare.”

Groundwater – as an agricultural and municipal resource – is incredibly important to communities in central California and elsewhere. Said Lohman: “The subsidence we see is a sign of how much the groundwater is being depleted. Eventually, the water quality and cost of extracting it could get to the point where it is effectively no longer available.”

###

Funding for this research was provided by NASA.


Shrimp heal injured fish

Public Release: 22-Aug-2018

James Cook University

James Cook University scientists in Australia have discovered that shrimp help heal injured fish.

PhD student David Vaughan is working on a project led by Dr Kate Hutson at JCU’s Centre for Sustainable Tropical Fisheries and Aquaculture.

He said it was important to know how the shrimp interact with fish, as the team is in the process of identifying the best shrimp species to use to clean parasites from farmed and ornamental fish.

“Between 30-50% of farmed fish in Southeast Asia, the largest fish-producing region in the world, are lost to parasites.

“We know that shrimp clean parasites from fish and if we can identify a species that does it efficiently, and does no harm, it offers a ‘greener’ alternative to chemicals,” he said.

Mr Vaughan said scientists knew injured fish visited shrimp ‘cleaning stations’ to have parasites removed – but the question was whether shrimp then took advantage of the injured fish and fed on their wounds.

He said the relationship between cleaner shrimp and their client fish was complicated, with the shrimp known to eat the mucus of the fish and the fish occasionally eating the shrimp.

The scientists used high-definition cameras to record the details of the interaction between the species.

“We found that shrimp did not aggravate existing injuries or further injure the fish,” said Mr Vaughan.

He said image analyses showed the cleaner shrimp actually reduced the redness of the injury.

“Injuries in fishes are susceptible to invasion by secondary pathogens like viruses and bacteria, and the reduction in redness by shrimp indicates that cleaner shrimp could reduce infections.”

Mr Vaughan said cleaner shrimp are also known to indirectly influence the health of client fishes by reducing their stress levels during cleaning, which also increases the fishes’ ability to heal.

Study links mothers’ pesticide levels with autism in children

Public Release: 16-Aug-2018

American Psychiatric Association

Washington, D.C. – A new study appearing online today in the American Journal of Psychiatry finds that elevated pesticide levels in pregnant women are associated with an increased risk of autism among their children.

Autism is a complex neurodevelopmental disorder with largely unknown causes. It is characterized by problems with communication, difficulty relating to people and events, and repetitive body movements or behaviors. The study examined whether elevated maternal levels of persistent organic pollutants are associated with autism among children. Persistent organic pollutants are toxic chemicals that adversely affect human health and the environment around the world.

The study examined levels of DDE (p,p’-dichlorodiphenyl dichloroethylene), a breakdown product of the pesticide DDT (dichlorodiphenyltrichloroethane). Although DDT and other persistent organic pollutants were widely banned in many countries decades ago, they persist in the food chain, resulting in continuous exposure among populations. These chemicals transfer across the placenta, resulting in potential prenatal exposure among nearly all children because of existing maternal body burdens.

The researchers evaluated DDE levels in maternal serum samples from the mothers of more than 750 children with autism and the mothers of matched control subjects, drawn from a national birth cohort study, the Finnish Prenatal Study of Autism. The odds of autism were significantly increased among children whose mothers’ DDE levels were elevated (defined as the 75th percentile or greater). In addition, the odds of autism with intellectual disability were increased more than twofold when maternal DDE levels were above this threshold. These results indicate an association rather than proof of causation, although the findings persisted after controlling for confounding factors.

The study also evaluated mothers’ levels of PCBs (polychlorinated biphenyls), chemicals used in industry, and found no association with autism in children.

The authors conclude that their findings “provide the first biomarker-based evidence that maternal exposure to insecticides is associated with autism among offspring.” Although further research is needed, this study contributes to the understanding of autism and has implications for preventing this disorder, the authors note.

###

American Psychiatric Association

The American Psychiatric Association, founded in 1844, is the oldest medical association in the country. The APA is also the largest psychiatric association in the world with more than 37,800 physician members specializing in the diagnosis, treatment, prevention and research of mental illnesses. APA’s vision is to ensure access to quality psychiatric diagnosis and treatment. For more information please visit http://www.psychiatry.org.

Blocking sunlight to cool Earth won’t reduce crop damage from global warming

Public Release: 8-Aug-2018

Solar geoengineering could reduce temperatures and heat stress, but also reduces photosynthesis

University of California – Berkeley

Injecting particles into the atmosphere to cool the planet and counter the warming effects of climate change would do nothing to offset the crop damage from rising global temperatures, according to a new analysis by University of California, Berkeley, researchers.

By analyzing the past effects of Earth-cooling volcanic eruptions, and the response of crops to changes in sunlight, the team concluded that any improvements in yield from cooler temperatures would be negated by lower productivity due to reduced sunlight. The findings have important implications for our understanding of solar geoengineering, one proposed method for helping humanity manage the impacts of global warming.

“Shading the planet keeps things cooler, which helps crops grow better. But plants also need sunlight to grow, so blocking sunlight can affect growth. For agriculture, the unintended impacts of solar geoengineering are equal in magnitude to the benefits,” said lead author Jonathan Proctor, a UC Berkeley doctoral candidate in the Department of Agricultural and Resource Economics. “It’s a bit like performing an experimental surgery; the side-effects of treatment appear to be as bad as the illness.”

“Unknown unknowns make everybody nervous when it comes to global policies, as they should,” said Solomon Hsiang, co-lead author of the study and Chancellor’s Associate Professor of Public Policy at UC Berkeley. “The problem in figuring out the consequences of solar geoengineering is that we can’t do a planetary-scale experiment without actually deploying the technology. The breakthrough here was realizing that we could learn something by studying the effects of giant volcanic eruptions that geoengineering tries to copy.”

Hsiang is director of UC Berkeley’s Global Policy Laboratory, where Proctor is a doctoral fellow.

Proctor and Hsiang will publish their findings online in the journal Nature on August 8.

Some people have pointed to past episodes of global cooling caused by gases emitted during massive volcanic eruptions, such as Mt. Pinatubo in the Philippines in 1991, and argued that humans could purposely inject sulfate aerosols into the upper atmosphere to artificially cool Earth and alleviate the greenhouse warming caused by increased levels of carbon dioxide. Aerosols – in this case, minute droplets of sulfuric acid – reflect a small percentage of sunlight back into space, reducing the temperature a few degrees.

“It’s like putting an umbrella over your head when you’re hot,” Proctor said. “If you put a global sunshade up, it would slow warming.”

Pinatubo, for example, injected about 20 million tons of sulfur dioxide into the atmosphere, reducing sunlight by about 2.5 percent and lowering the average global temperature by about half a degree Celsius (nearly 1 degree Fahrenheit).

The team linked maize, soy, rice and wheat production in 105 countries from 1979 to 2009 to global satellite observations of these aerosols to study their effect on agriculture. Pairing these results with global climate models, the team calculated that the loss of sunlight from a sulfate-based geoengineering program would cancel its intended benefit of protecting crops from damaging extreme heat.

“It’s similar to using one credit card to pay off another credit card: at the end of the day, you end up where you started without having solved the problem,” Hsiang said.

Some earlier studies suggested that aerosols might also improve crop yields by scattering sunlight, allowing more of the sun’s energy to reach interior leaves typically shaded by upper-canopy leaves. The new results suggest this scattering benefit is weaker than previously thought.

“We are the first to use actual experimental and observational evidence to get at the total impacts that sulfate-based geoengineering might have on yields,” Proctor said. “Before I started the study, I thought the net impact of changes in sunlight would be positive, so I was quite surprised by the finding that scattering light decreases yields.”

Despite the study’s conclusions, Proctor said, “I don’t think we should necessarily write off solar geoengineering. For agriculture, it might not work that well, but there are other sectors of the economy that could potentially benefit substantially.”

Proctor and Hsiang noted that their methods could be used to investigate the impact of geoengineering on other segments of the economy, human health and the functioning of natural ecosystems.

They did not address other types of geoengineering, such as capture and storage of carbon dioxide, or issues surrounding geoengineering, such as its impact on Earth’s protective ozone layer and who gets to set Earth’s thermostat.

“Society needs to be objective about geoengineering technologies and develop a clear understanding of the potential benefits, costs and risks,” Proctor said. “At present, uncertainty about these factors dwarfs what we understand.”

The authors emphasize the need for more research into the human and ecological consequences of geoengineering, both good and bad.

“The most certain way to reduce damages to crops and, in turn, people’s livelihood and well-being, is reducing carbon emissions,” Proctor said.

“Perhaps what is most important is that we have respect for the potential scale, power and risks of geoengineering technologies,” Hsiang said. “Sunlight powers everything on the planet, so we must understand the possible outcomes if we are going to try to manage it.”

###

Other co-authors are Jennifer Burney of UC San Diego’s School of Global Policy and Strategy, Marshall Burke of Stanford University and Wolfram Schlenker of Columbia University’s School of International and Public Affairs and the Earth Institute. The research was supported by a National Science Foundation Grant (CNH-L 1715557) and a National Science Foundation Graduate Research Fellowship (DGE 1752814).

Climate taxes on agriculture could lead to more food insecurity than climate change itself

Public Release: 30-Jul-2018

 

International Institute for Applied Systems Analysis

New IIASA-led research has found that a single climate mitigation scheme applied to all sectors, such as a global carbon tax, could have a serious impact on agriculture and result in far more widespread hunger and food insecurity than the direct impacts of climate change. Smarter, inclusive policies are necessary instead.

This research, published in Nature Climate Change, is the first international study to compare across models the effects of climate change on agriculture with the costs and effects of mitigation policies, and look at subsequent effects on food security and the risk of hunger.

The researchers, led by Tomoko Hasegawa, a researcher at IIASA and Japan’s National Institute for Environmental Studies (NIES), and Shinichiro Fujimori, an IIASA researcher and associate professor at Kyoto University, synthesized outputs of eight global agricultural models to analyze a range of scenarios to 2050. The scenarios covered different socioeconomic development pathways, including one in which the world pursues sustainability and one in which it follows current development trends, different levels of global warming, and the presence or absence of climate mitigation policies.

By 2050, the models suggest that climate change could be responsible for putting an extra 24 million people at risk of hunger on average, with some models suggesting up to 50 million extra could be at risk. However, if agriculture is included in very stringent climate mitigation schemes, such as a global carbon tax or a comprehensive emission trading system applying the same rules to all sectors of the economy, the increase in food prices would be such that 78 million more people would be at risk of hunger, with some models finding that up to 170 million more would be at risk.
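A quick ratio of the two average estimates above shows the scale of the difference (individual models span much wider ranges):

```python
# Average model estimates quoted above, in millions of extra people at risk of hunger.
at_risk_climate_only = 24   # from direct climate change impacts
at_risk_with_tax = 78       # under a uniform carbon tax / emission trading scheme

print(at_risk_with_tax / at_risk_climate_only)  # 3.25: over three times as many
```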

Some areas are likely to be much more vulnerable than others, such as sub-Saharan Africa and India.

There is a growing consensus that agriculture, one of the world’s major greenhouse gas emitters, must do more to share the burden of emissions reduction. The new research shows that without careful planning, the burden of mitigation policies on the sector is simply too great. All the models showed that measures such as a carbon tax raise the cost of food production: directly, through taxes on agricultural emissions and on emissions from land-use change, such as converting forest to expand agricultural land; and indirectly, through increased demand for biofuels, which compete with food production for land.

The researchers stress that their results should not be used to argue against greenhouse gas emissions reduction efforts. Climate mitigation efforts are vital. Instead, the research shows the importance of “smart”, targeted policy design, particularly in agriculture. When designing climate mitigation policies, policymakers need to scrutinize other factors and development goals more closely, rather than focusing only on the goal of reducing emissions.

“The findings are important to help realize that agriculture should receive a very specific treatment when it comes to climate change policies,” says Hasegawa. “Carbon pricing schemes will not bring any viable options for developing countries where there are highly vulnerable populations. Mitigation in agriculture should instead be integrated with development policies.”

The researchers suggest, for example, schemes encouraging more productive and resilient agricultural systems. The developing world’s ruminant livestock herds produce three-quarters of the world’s ruminant greenhouse gases, but only half of its milk and beef. Using efficient techniques and technology from the developed world would then simultaneously reduce greenhouse gas emissions, promote economic growth, reduce poverty (thereby improving health and living conditions), and improve food security. Another suggestion is complementary policies to counteract the impact of mitigation policies on vulnerable regions, for example, money raised from carbon taxes could be used for food aid programs in particularly hard-hit areas or countries.

“As agriculture is more and more directly associated with the discussion on global mitigation efforts, we hope the paper will show that differentiated solutions need to be found for this sector. As countries are all working at defining emission reduction pathways within the context of the Paris Agreement, it serves as a warning that other development objectives should be kept in mind to choose the right path towards sustainability,” says IIASA researcher and coauthor Hugo Valin.

###

Reference

Hasegawa T, Fujimori S, Havlík P, Valin H, Bodirsky BL, Doelman JC, Fellmann T, Kyle P et al. (2018). Risk of increased food insecurity under stringent global climate change mitigation policy. Nature Climate Change DOI: 10.1038/s41558-018-0230-x [pure.iiasa.ac.at/15389]

More information:

For access to the full dataset of results of the study, please consult the JRC DataM website: https://datam.jrc.ec.europa.eu/datam/public/pages/index.xhtml

Contacts:

Petr Havlik
Deputy Program Director
Ecosystems Services and Management
Tel: +43 2236 807 511
havlikpt@iiasa.ac.at

Hugo Valin
Research Scholar
Ecosystems Services and Management
Tel: +43 2236 807 405
valin@iiasa.ac.at

Helen Tunnicliffe
IIASA Press Office
Tel: +43 2236 807 316
Mob: +43 676 83 807 316
tunnicli@iiasa.ac.at

About IIASA:

The International Institute for Applied Systems Analysis (IIASA) is an international scientific institute that conducts research into the critical issues of global environmental, economic, technological, and social change that we face in the twenty-first century. Our findings provide valuable options to policymakers to shape the future of our changing world. IIASA is independent and funded by prestigious research funding agencies in Africa, the Americas, Asia, and Europe. http://www.iiasa.ac.at

About NIES:

The National Institute for Environmental Studies (NIES), founded in 1974, is an independent research institute based in Tsukuba, Japan, addressing crucial scientific questions in the fields of global and regional environmental problems. Researchers in both the natural and social sciences work together to study global change and its impacts on ecological, economic and social systems and to devise general strategies for achieving a low-carbon society. Through greenhouse gas emissions monitoring, data analysis, computer simulations and models, NIES provides decision makers with sound information about climate change and low-carbon development. In addition to publishing results in scientific journals, the institute advises regional, national and global organizations and authorities. http://www.nies.go.jp/gaiyo/index-e.html

About Kyoto University:

Kyoto University is one of Japan and Asia’s premier research institutions, founded in 1897 and responsible for producing numerous Nobel laureates and winners of other prestigious international prizes. A broad curriculum across the arts and sciences at both undergraduate and graduate levels is complemented by numerous research centers, as well as facilities and offices around Japan and the world. For more information please see: http://www.kyoto-u.ac.jp/en

Exposure to fracking chemicals and wastewater spurs fat cell development

Public Release: 21-Jun-2018

 

Researchers saw increases in the size and number of fat cells in laboratory models following exposure, even at diluted concentrations.

Duke University

DURHAM, N.C. – Exposure to fracking chemicals and wastewater promotes fat cell development, or adipogenesis, in living cells in a laboratory, according to a new Duke University-led study.

Researchers observed increases in both the size and number of fat cells after exposing living mouse cells in a dish to a mixture of 23 commonly used fracking chemicals. They also observed these effects after exposing the cells to samples of wastewater from fracked oil and gas wells and surface water believed to be contaminated with the wastewater. The findings appear June 21 in Science of the Total Environment.

“We saw significant fat cell proliferation and lipid accumulation, even when wastewater samples were diluted 1,000-fold from their raw state and when wastewater-affected surface water samples were diluted 25-fold,” said Chris Kassotis, a postdoctoral research associate at Duke’s Nicholas School of the Environment, who led the study.

“Rather than needing to concentrate the samples to detect effects, we diluted them and still detected the effects,” he said.

Previous lab studies by Kassotis and his colleagues have shown that rodents exposed during gestation to the mix of 23 fracking chemicals are more likely to experience metabolic, reproductive and developmental health impacts, including increased weight gain.

Kassotis said further research will be needed to assess whether similar effects occur in humans or animals who drink or come into physical contact with affected surface waters outside the laboratory.

More than 1,000 different chemicals are used for hydraulic fracturing across the United States, many of which have been demonstrated through laboratory testing to act as endocrine disrupting chemicals in both cell and animal models.

To conduct this study, Kassotis and colleagues collected samples of fracking wastewater and wastewater-contaminated surface water near unconventional (aka, fracked) oil and gas production sites in Garfield County, Colorado, and Fayette County, West Virginia, in 2014.

Laboratory cultures of mouse cells were then exposed to these waters at varying concentrations or dilutions over a two-week period. The researchers measured how fat cell development in the cultures was affected. They performed similar tests exposing cell models to a mix of 23 fracking chemicals.

Within each experiment, other cells were exposed to rosiglitazone, a pharmaceutical known to be highly effective at activating fat cell differentiation and causing weight gain in humans.

The results showed that the 23-chemical mix induced about 60 percent as much fat accumulation as the potent pharmaceutical; the diluted wastewater samples induced about 80 percent as much; and the diluted surface water samples induced about 40 percent as much.
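For reference, these responses can be laid out relative to the rosiglitazone positive control; the sample labels below are shorthand for the treatments described above:

```python
# Reported lipid accumulation, normalized to the rosiglitazone positive control.
relative_lipid_accumulation = {
    "23-chemical fracking mix": 0.60,
    "diluted wastewater":       0.80,
    "diluted surface water":    0.40,
}

# Rank the samples from strongest to weakest response.
for sample, fraction in sorted(relative_lipid_accumulation.items(),
                               key=lambda kv: kv[1], reverse=True):
    print(f"{sample}: {fraction:.0%} of the control's lipid accumulation")
```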

In all three cases, the number of pre-adipocytes, or precursor fat cells, that developed was much greater in cell models exposed to the chemicals or water samples than in those exposed to the rosiglitazone.

The tests also provided insights into the mechanisms that might be driving these effects.

“Activation of the hormone receptor PPAR-gamma, often called the master regulator of fat cell differentiation, occurred in some samples, while in other samples different mechanisms, such as inhibition of the thyroid or androgen receptor, seemed to be in play,” Kassotis explained.

###

Susan Nagel of the University of Missouri and Heather Stapleton of Duke’s Nicholas School co-authored the new study with Kassotis.

Primary funding came from the National Institute for Environmental Health Sciences. Additional funding came from the University of Missouri, a crowdfunding campaign via Experiment.com, and an EPA 520 STAR Fellowship Assistance Agreement.

CITATION: “Unconventional Oil and Gas Chemicals and Wastewater-Impacted Water Samples Promote Adipogenesis via PPARγ-Dependent and Independent Mechanisms in 3T3-L1 Cells,” Christopher D. Kassotis, Susan C. Nagel and Heather M. Stapleton; Science of the Total Environment, June 21, 2018. DOI: 10.1016/j.scitotenv.2018.05.030

University of Guelph study uncovers cause of pesticide exposure, Parkinson’s link

Public Release: 23-May-2018

Professor Scott Ryan has determined that low-level exposure to the pesticides disrupts cells in a way that mimics the effects of mutations known to cause Parkinson’s disease

University of Guelph

A new University of Guelph study has discovered why exposure to pesticides increases some people’s risk of developing Parkinson’s disease.

Previous studies have found an association between two commonly used agrochemicals (paraquat and maneb) and Parkinson’s disease.

Now U of G professor Scott Ryan has determined that low-level exposure to the pesticides disrupts cells in a way that mimics the effects of mutations known to cause Parkinson’s disease.

Adding the effects of the chemicals to a predisposition for Parkinson’s disease drastically increases the risk of disease onset, said Ryan.

“People exposed to these chemicals are at about a 250-percent higher risk of developing Parkinson’s disease than the rest of the population,” said Ryan, a professor in the Department of Molecular and Cellular Biology.

“We wanted to investigate what is happening in this susceptible population that results in some people developing the disease.”

Both are used on a variety of Canadian crops: paraquat is applied as crops grow, while maneb prevents post-harvest spoilage.

Published in the FASEB Journal, the journal of the Federation of American Societies for Experimental Biology, the study used stem cells from Parkinson’s patients carrying a mutation in the synuclein gene, which is highly associated with increased risk of Parkinson’s disease, as well as normal embryonic stem cells into which the risk-associated mutation was introduced by gene editing.

“Until now, the link between pesticides and Parkinson’s disease was based primarily on animal studies as well as epidemiological research that demonstrated an increased risk among farmers and others exposed to agricultural chemicals,” said Ryan. “We are one of the first to investigate what is happening inside human cells.”

From the two types of stem cells, Ryan and his team made dopamine-producing neurons — the specific neurons affected in Parkinson’s disease — and exposed them to the two agrochemicals.

When the cells were exposed to the agrochemicals, their energy-producing mitochondria were prevented from moving to where they were needed inside the cell, depleting the neurons of energy.

Neurons from the Parkinson’s patients, and those in which the genetic risk factor was introduced, were impaired at doses below the EPA-reported lowest-observed-effect level; higher doses were needed to impair function in normal neurons.

“People with a predisposition for Parkinson’s disease are more affected by these low level exposures to agrochemicals and therefore more likely to develop the disease,” said Ryan. “This is one of the reasons why some people living near agricultural areas are at a higher risk.”

He said the findings indicate that we need to reassess current acceptable levels for these two agrochemicals.

“This study shows that everyone is not equal, and these safety standards need to be updated in order to protect those who are more susceptible and may not even know it.”

Lightning in the eyewall of a hurricane beamed antimatter toward the ground

Public Release: 21-May-2018

First detection of the downward positron beam from a terrestrial gamma-ray flash was captured by an instrument flown through the eyewall of Hurricane Patricia in 2015

University of California – Santa Cruz

IMAGE

IMAGE: The ADELE mark II flew aboard NOAA’s Hurricane Hunter WP-3D Orion during the Atlantic hurricane season.

Credit: Gregory Bowers

Hurricane Patricia, which battered the west coast of Mexico in 2015, was the most intense tropical cyclone ever recorded in the Western Hemisphere. Amid the extreme violence of the storm, scientists observed something new: a downward beam of positrons, the antimatter counterpart of electrons, creating a burst of powerful gamma-rays and x-rays.

Detected by an instrument aboard NOAA’s Hurricane Hunter aircraft, which flew through the eyewall of the storm at its peak intensity, the positron beam was not a surprise to the UC Santa Cruz scientists who built the instrument. But it was the first time anyone had observed this phenomenon.

According to David Smith, a professor of physics at UC Santa Cruz, the positron beam was the downward component of an upward terrestrial gamma-ray flash that sent a short blast of radiation into space above the storm. Terrestrial gamma-ray flashes (TGFs) were first seen in 1994 by space-based gamma-ray detectors. They occur in conjunction with lightning and have now been observed thousands of times by orbiting satellites. A reverse positron beam was predicted by theoretical models of TGFs, but had never been detected.

“This is the first confirmation of that theoretical prediction, and it shows that TGFs are piercing the atmosphere from top to bottom with high-energy radiation,” Smith said. “This event could have been detected from space, like almost all the other reported TGFs, as an upward beam caused by an avalanche of electrons. We saw it from below because of a beam of antimatter (positrons) sent in the opposite direction.”

One unexpected implication of the study, published May 17 in the Journal of Geophysical Research: Atmospheres, is that many TGFs could be detected via the reverse positron beam using ground-based instruments at high altitudes. It’s not necessary to fly into the eye of a hurricane.

“We detected it at an altitude of 2.5 kilometers, and I estimated our detectors could have seen it down to 1.5 kilometers. That’s the altitude of Denver, so there are a lot of places where you could in theory see them if you had an instrument in the right place at the right time during a thunderstorm,” Smith said.

Despite the confirmation of the reverse positron beam, many questions remain unresolved about the mechanisms that drive TGFs. Strong electric fields in thunderstorms can accelerate electrons to near the speed of light, and these “relativistic” electrons emit gamma-rays when they scatter off of atoms in the atmosphere. The electrons can also knock other electrons off of atoms and accelerate them to high energies, creating an avalanche of relativistic electrons. A TGF, which is an extremely bright flash of gamma-rays, requires a large number of avalanches of relativistic electrons.

“It’s an extraordinary event, and we still don’t understand how it gets so bright,” Smith said.

The source of the positrons, however, is a well-known phenomenon in physics called pair production, in which a gamma ray interacts with the nucleus of an atom to create an electron and a positron. Because they have opposite charges, the two particles are accelerated in opposite directions by the electric field of the thunderstorm. The downward-moving positrons produce x-rays and gamma-rays in their direction of travel when they collide with atomic nuclei, just like the upward-moving electrons.
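The pair-production step described above has a hard energy floor that the release does not spell out: the gamma ray must carry at least the combined rest energy of the electron and positron it creates. A minimal numerical sketch (standard textbook physics, not a figure from the study):

```python
# Pair-production threshold: a gamma ray passing near an atomic nucleus
# can convert into an electron-positron pair only if its energy exceeds
# the pair's combined rest energy, 2 * m_e * c^2.
ELECTRON_REST_ENERGY_MEV = 0.511  # m_e * c^2, in MeV

threshold_mev = 2 * ELECTRON_REST_ENERGY_MEV
print(f"pair-production threshold: {threshold_mev:.3f} MeV")  # 1.022 MeV
```

Gamma rays in TGFs can reach tens of MeV, comfortably above this floor, which is why the storm's electric field has a supply of positrons to accelerate downward.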

“What we saw in the aircraft are the gamma-rays produced by the downward positron beam,” Smith said.

First author Gregory Bowers, now at Los Alamos National Laboratory, and coauthor Nicole Kelley, now at Swift Navigation, were both graduate students at UC Santa Cruz when they worked together on the instrument that made the detection. The Airborne Detector for Energetic Lightning Emissions (ADELE) mark II was designed to observe TGFs up close by measuring x-rays and gamma-rays from aircraft flown into or above thunderstorms.

Getting too close to a TGF could be hazardous, although the risk drops off rapidly with distance from the source. The gamma-ray dose at a distance of one kilometer would be negligible, Smith said. “It’s hypothetically a risk, but the odds are quite small,” he said. “I don’t ask pilots to fly into thunderstorms, but if they’re going anyway I’ll put an instrument on board.”

Smith’s group was the first to detect a TGF from an airplane using an earlier instrument, the ADELE mark I. In that case, the upward beam from the TGF was detected above a thunderstorm. For this study, the ADELE mark II flew aboard NOAA’s Hurricane Hunter WP-3D Orion during the Atlantic hurricane season.

###

In addition to Bowers, Smith, and Kelley, the coauthors of the paper include Forest Martinez-McKinney at UC Santa Cruz, Joseph Dwyer at the University of New Hampshire, and scientists at Duke University, Earth Networks, University of Washington, NOAA, and Florida Institute of Technology. This work was funded by the National Science Foundation.

What happens if pesticides and herbicides stop working?

Public Release: 17-May-2018

What happens if we run out?

Pesticide resistance needs attention and large-scale study, researchers say

North Carolina State University

IMAGE

IMAGE: For new answers to the problems of increased pesticide resistance, landscape-level study is needed, NC State researchers say.

Credit: Roger Winstead, NC State University

To slow the evolutionary progression of weeds and insect pests gaining resistance to herbicides and pesticides, policymakers should provide resources for large-scale, landscape-level studies of a number of promising but untested approaches for slowing pest evolution. Such landscape studies are now more feasible because of new genomic and technological innovations that could be used to compare the efficacy of strategies for preventing weed and insect resistance.

That’s the takeaway recommendation from a North Carolina State University review paper addressing pesticide resistance published today in the journal Science.

Pesticide resistance exacts a tremendous toll on the U.S. agricultural sector, costing some $10 billion yearly. Costs could also increasingly accrue on human lives. If insecticide-coated bed nets and complementary insecticide spraying failed to slow the transmission of malaria by pesticide-resistant mosquitoes, for example, the human health costs in places like Africa could be catastrophic.

“What is the impact on people if these herbicides and pesticides run out?” said Fred Gould, William Neal Reynolds Professor of Agriculture at NC State and the corresponding author of the paper. “Resistance to pesticides is rising in critical weed and insect species, threatening our ability to harness these pests. Weed species have evolved resistance to every class of herbicide in use, and more than 550 arthropods have resistance to at least one pesticide.”

Consider glyphosate, the powerhouse weed killer used ubiquitously in the United States to protect major crops like corn and soybeans. A bit more than 20 years ago, crops were genetically engineered to withstand glyphosate, allowing them to survive exposure to the chemical while weeds perished. By 2014, some 90 percent of planted U.S. corn, soybean and cotton crops were genetically modified to withstand glyphosate. Unfortunately, as the evolutionary arms race progresses, many weeds have figured out how to evolve resistance to glyphosate, making the chemical increasingly ineffective and forcing farmers to look for other or new solutions.

Some of these “new” solutions are actually old: the herbicides 2,4-D and dicamba, developed in the 1940s and 1960s, respectively, are currently getting a second look as possible widespread weed weapons.

“We’re working down the list of available tools to fight weeds and insect pests,” said Zachary Brown, assistant professor of agricultural and resource economics at NC State and a co-author of the paper. “It hasn’t been economically feasible to develop new herbicides to replace glyphosate, for example, so what’s old is becoming new again. But the current incentives don’t seem to be right for getting us off this treadmill.”

Besides ecology and economics, the authors stress that sociological and political perspectives also set up roadblocks to solving the problems of pest resistance. Cultural practices by farmers – whether they till their land or not, how they use so-called refuges in combination with genetically modified crop areas and even how often they rotate their crops – all play a big role in pest resistance.

“Any proposed solutions also need to include perspectives from the individual farmer, community and national levels,” said Jennifer Kuzma, Goodnight-NC GSK Foundation Distinguished Professor and co-author of the paper.

The authors propose large-scale studies that would test the efficacy of a particular pesticide resistance strategy in one large area – thousands of acres or more – and how weeds and crop yields compare to large “control” areas that don’t utilize that particular strategy. Farmers would receive incentives to participate; perhaps subsidies already allocated to farmers could be shifted to provide these participatory incentives, the authors suggest.

“In the end, are we going to outrun the pests or are they going to outrun us?” Gould said.

###

The work was funded by NC State’s Genetic Engineering and Society Center, by USDA National Institute of Food and Agriculture grants 2012-33522-19793 and 2016-33522-25640, and by USDA National Institute of Food and Agriculture HATCH project NC02520.

Note to editors: An abstract of the paper follows.

Wicked evolution: Can we address the sociobiological dilemma of pesticide resistance?

Authors: Fred Gould, Zachary Brown and Jennifer Kuzma, North Carolina State University

Published: May 17, 2018, in Science

DOI: 10.1126/science.aar3780

Abstract: Resistance to insecticides and herbicides has cost billions of U.S. dollars in the agricultural sector and could result in millions of lives lost to insect-vectored diseases. We mostly continue to use pesticides as if resistance is a temporary issue that will be addressed by commercialization of new pesticides with novel modes of action. However, current evidence suggests that insect and weed evolution may outstrip our ability to replace outmoded chemicals and other control mechanisms. To avoid this outcome, we must address the mix of ecological, genetic, economic, and sociopolitical factors that prevent implementation of sustainable pest management practices. We offer a proposition.

Earth’s orbital changes have influenced climate, life forms for at least 215 million years

Public Release: 7-May-2018

Gravity of Jupiter and Venus elongates Earth’s orbit every 405,000 years, Rutgers-led study confirms

Rutgers University

IMAGE

Caption

Rutgers University-New Brunswick Professor Dennis Kent with part of a 1,700-foot-long rock core through the Chinle Formation in Petrified Forest National Park in Arizona. The background includes boxed archives of cores from the Newark basin that were compared with the Arizona core.

Credit: Nick Romanenko/Rutgers University-New Brunswick

Every 405,000 years, gravitational tugs from Jupiter and Venus slightly elongate Earth’s orbit, an amazingly consistent pattern that has influenced our planet’s climate for at least 215 million years and allows scientists to more precisely date geological events like the spread of dinosaurs, according to a Rutgers-led study.

The findings are published online today in the Proceedings of the National Academy of Sciences.

“It’s an astonishing result because this long cycle, which had been predicted from planetary motions through about 50 million years ago, has been confirmed through at least 215 million years ago,” said lead author Dennis V. Kent, a Board of Governors professor in the Department of Earth and Planetary Sciences at Rutgers University-New Brunswick. “Scientists can now link changes in the climate, environment, dinosaurs, mammals and fossils around the world to this 405,000-year cycle in a very precise way.”

The scientists linked reversals in the Earth’s magnetic field – when compasses point south instead of north and vice versa – to sediments with and without zircons (minerals with uranium that allow radioactive dating) as well as to climate cycles.

“The climate cycles are directly related to how the Earth orbits the sun, and slight variations in sunlight reaching Earth lead to climate and ecological changes,” said Kent, who studies Earth’s magnetic field. “The Earth’s orbit changes from close to perfectly circular to about 5 percent elongated, especially every 405,000 years.”
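A back-of-the-envelope sketch can show why a 5 percent elongation matters (an illustration of the orbital geometry, not a calculation from the study): for an orbit of eccentricity e, the perihelion and aphelion distances are a·(1 − e) and a·(1 + e), and the sunlight received scales as the inverse square of distance.

```python
# Illustrative only: how much more sunlight Earth receives at its
# closest approach (perihelion) than at its farthest point (aphelion)
# for a given orbital eccentricity e. Insolation scales as 1 / r**2.
def insolation_swing(e: float) -> float:
    """Ratio of sunlight received at perihelion vs. aphelion."""
    return ((1 + e) / (1 - e)) ** 2

print(f"{insolation_swing(0.05):.2f}")  # ~1.22: about 22% more at perihelion
```

Even this modest swing, repeated on a metronome-like 405,000-year beat, is enough to leave a readable imprint in lake sediments and climate proxies.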

The scientists studied the long-term record of reversals in the Earth’s magnetic field in sediments in the Newark basin, a prehistoric lake that spanned most of New Jersey, and in sediments with volcanic detritus including zircons in the Chinle Formation in Petrified Forest National Park in Arizona. They collected a core of rock from the Triassic Period, some 202 million to 253 million years ago. The core is 2.5 inches in diameter and about 1,700 feet long, Kent said.

The results showed that the 405,000-year cycle is the most regular astronomical pattern linked to the Earth’s annual turn around the sun, he said.

Prior to this study, dates to accurately time when magnetic fields reversed were unavailable for 30 million years of the Late Triassic. That’s when dinosaurs and mammals appeared and the Pangea supercontinent broke up. The break-up led to the Atlantic Ocean forming, with the sea-floor spreading as the continents drifted apart, and a mass extinction event that affected dinosaurs at the end of that period, Kent said.

“Developing a very precise time-scale allows us to say something new about the fossils, including their differences and similarities in wide-ranging areas,” he said.

###

The study was conducted by National Science Foundation-funded scientists at Rutgers-New Brunswick; Lamont-Doherty Earth Observatory at Columbia University, where Kent is also an adjunct senior research scientist and where longtime research collaborator and co-author Paul E. Olsen works; and other institutions. Christopher J. Lepre, a lecturer in Rutgers’ Department of Earth and Planetary Sciences, and seven others co-authored the study, and the cores were sampled at the Rutgers Core Repository.

Flaw found in water treatment method: Process may generate harmful chemicals

Public Release: 2-May-2018

Johns Hopkins University

Public water quality has received a lot of attention in recent years as some disturbing discoveries have been made regarding lead levels in cities across the country. Now, a new study from the Johns Hopkins University pinpoints other chemicals in water that are worth paying attention to – and in fact, some of them may be created, ironically, during the water treatment process itself.

To rid water of compounds that are known to be toxic, water treatment plants now often use methods to oxidize them, turning them into other, presumably less harmful chemicals called “transformation products.” Though earlier studies have looked at the byproducts of water treatment processes like chlorination, not so much is known about the products formed during some of the newer processes, like oxidation with hydrogen peroxide and UV light, which are especially relevant in water reuse.

“Typically, we consider these transformation products to be less toxic, but our study shows that this might not always be the case,” says lead author Carsten Prasse, assistant professor in the Department of Environmental Health and Engineering at the Johns Hopkins Whiting School of Engineering and the university’s Bloomberg School of Public Health. “Our results highlight that this is only half of the story and that transformation products might play a very important part when we think about the quality of the treated water.”

Prasse, along with colleagues from the University of California, Berkeley, chose to look at phenols, a class of organic chemicals that are among the most common in the water supply, as they’re present in everything from dyes to personal care products to pharmaceuticals to pesticides as well as in chemicals that are naturally occurring in water.

To determine what compounds the phenols transform into during treatment, the team, whose results are published in Proceedings of the National Academy of Sciences, first oxidized phenols using peroxide radicals, a process often used by water treatment plants. Next, they borrowed a clever method from biomedicine: They added amino acids and proteins to the mix. Depending on what chemical reactions took place, Prasse and his team could do some backwards calculation to determine what compounds the phenols must have turned into in the earlier step.

They discovered that the phenols converted into products including 2-butene-1,4-dial, a compound that is known to have negative effects, including DNA damage, on human cells. Interestingly, furan, a toxic compound in cigarette smoke and car exhaust, is also converted into 2-butene-1,4-dial in the body, and it may be this conversion that’s responsible for its toxicity.

To test the specific effects of 2-butene-1,4-dial on biological processes more fully, the team exposed the compound to mouse liver proteins. They found that it affected 37 different protein targets, which are involved in a range of biological processes, from energy metabolism to protein and steroid synthesis.

One enzyme that 2-butene-1,4-dial was shown to bind is critical in apoptosis, or “cell suicide.” Inhibiting this enzyme in a living organism might lead to unchecked cell proliferation, or cancer growth. Other compounds that 2-butene-1,4-dial interferes with play key roles in metabolism. “There are a lot of potential health outcomes, like obesity and diabetes,” says Prasse. “There’s a known connection between pesticide exposure and obesity, and studies like ours may help to explain why this is.”

The results are exciting since this is the first time these methods have been applied to water treatment, Prasse says. In time, they may be expanded to screen for other types of compounds beyond phenols.

Water purification is extraordinarily challenging, since contaminants come from so many different sources – bacteria, plants, agriculture, wastewater – and it’s not always clear what’s being generated in the process. “We’re very good at developing methods to remove chemicals,” says Prasse. “Once the chemical is gone, the job, it would seem, is done. But in fact we don’t always know what removal of the chemical means: Does it turn into something else? Is that transformation product harmful?”

Prasse and his team point out that by the year 2050, it’s been estimated two-thirds of the global population will live in areas that rely on drinking water that contains the runoff from farms and wastewater from cities and factories. So safe and effective purification methods will be even more critical in the coming years.

“The next steps are to investigate how this method can be applied to more complex samples and study other contaminants that are likely to result in the formation of similar reactive transformation products,” says Prasse. “Here we looked at phenols. But we use household products that contain some 80,000 different chemicals, and many of these end up in wastewater. We need to be able to screen for multiple chemicals at once. That’s the larger goal.”

###

Coauthors on the study were Breanna Ford and Daniel K. Nomura of the Department of Nutritional Sciences and Toxicology at the University of California, Berkeley. The senior author was David L. Sedlak of the Department of Civil and Environmental Engineering at the University of California, Berkeley.

This research was supported by the National Institute for Environmental Health Sciences Superfund Research Program (Grant P42 ES004705) at the University of California, Berkeley.

Neonicotinoids may alter estrogen production in humans

Public Release: 26-Apr-2018

 

An INRS team publishes the first-ever in vitro study demonstrating the potential effects of these pesticides on human health in the journal Environmental Health Perspectives

Institut national de la recherche scientifique – INRS

Neonicotinoids are currently the most widely used pesticides in the world and frequently make headlines because of their harmful effects on honeybees and other insect pollinators. Now, a study published in the prestigious journal Environmental Health Perspectives indicates they may also have an impact on human health by disrupting our hormonal systems. The study, led by INRS professor Thomas Sanderson, suggests that more work must be done on the potential endocrine-disrupting effects of neonicotinoids.

The Quebec government has recently decided to more strictly regulate the use of certain pesticides, including neonicotinoids, which are widely used by Quebec farmers to control crop pests. Neonicotinoids act on insects’ nervous systems, killing them by paralysis. Very little research has been done on their effects on human health, but INRS Professor Sanderson and Ph.D. student Élyse Caron-Beaudoin have taken on the challenge.

Both researchers have long been interested in the mechanisms of endocrine disrupting chemicals and they wanted to determine whether neonicotinoids belong to this class of compounds. “Endocrine disrupters are natural or synthetic molecules that can alter hormone function,” says Caron-Beaudoin, the study’s main author. “They affect the synthesis, action, or elimination of natural hormones, which can lead to a wide variety of health effects.”

The research duo has developed methods to test the effect of neonicotinoids on the production of estrogens, essential hormones with several biological functions. By targeting aromatase, a key enzyme in the synthesis of estrogens, they were able to test the impact of three neonicotinoids on breast cancer cells in culture after exposure to concentrations similar to those found in the environment in agricultural areas.

The results of the study show an increase in aromatase expression and a unique change in the pattern in which aromatase was expressed, similar to that observed in the development of certain breast cancers. “However, as these results were obtained in a cellular model of breast cancer, we cannot necessarily conclude that exposure to pesticides at concentrations similar to those in the human environment would cause or promote cancer,” cautions Professor Sanderson. This study is the first evidence that neonicotinoids affect aromatase gene expression and may potentially alter estrogen production. Hormonal disturbance by these pesticides will need to be confirmed in future studies, but the results obtained by the INRS team indicate that caution should be exercised in the management and use of neonicotinoid insecticides.

###

ABOUT THE PAPER

This study is published under the title “Effects of neonicotinoid pesticides on promoter-specific aromatase (CYP19) expression in Hs578t breast cancer cells and the role of the VEGF pathway” in the peer-reviewed journal Environmental Health Perspectives (DOI: 10.1289/EHP2698). Principal investigator J. Thomas Sanderson received funding for this research from the Natural Sciences and Engineering Research Council of Canada, the California Breast Cancer Research Program and the Alternatives Research and Development Foundation. Authors Élyse Caron-Beaudoin and Rachel Viau received scholarships from the Fonds de recherche du Québec – Nature et technologies (FRQNT) and the Fondation universitaire Armand-Frappier INRS.

SOURCE:

Stéphanie Thibault, Communications Advisor
Communications and Governmental Affairs Office
Institut national de la recherche scientifique
Stephanie.Thibault@inrs.ca / +1 514-499-6612

AUTHOR AND CONTACT PERSON:
J. Thomas Sanderson, Professor
INRS-Institut Armand-Frappier
thomas.sanderson@iaf.inrs.ca / +1 450-687-5010

Geoengineering could create losers as well as winners for climate and wildfire risks

Public Release: 9-Apr-2018

 

University of Exeter

Artificially altering the climate system to limit global warming to 1.5C could increase the risks of wildfires in some areas, new research suggests.

While the international community is already aiming to limit global warming to below 2C by reducing greenhouse gas emissions, the more ambitious aim of a 1.5C limit is known to be challenging to reach in this way.

An additional method to further limit warming might be to release sulphur dioxide high in the atmosphere, to produce a thin veil of droplets that would reflect some sunlight back to space and help keep the Earth cool.

However, the new research suggests that this method of geoengineering could also introduce its own new impacts by shifting global rainfall patterns.

The research by scientists at the University of Exeter and the Met Office Hadley Centre, carried out as part of the EU-funded project HELIX, looked at the implications of this for global patterns of wildfire using computer models of the global climate.

Their simulations suggested that, while the cooler global temperatures would overall lead to lower fire risk at 1.5C global warming compared to 2C, some regions would actually see an increase in fire risk because of drier and warmer conditions locally.

Lead author Chantelle Burton, of the University of Exeter, said: “This illustrates the complexity of actively trying to change the climate, rather than simply reducing our influence on it. Interventions meant to reduce impacts could actually increase the impacts for many people.”

In the models, most parts of the world saw 30 fewer days per year with high fire risk.

However, up to 31 more days with high fire risk were seen in parts of North and South America, east and south Asia and southern Africa.

Professor Richard Betts, a co-author of the study and director of the HELIX project, said: “It is clear that overall there are benefits to limiting global warming.

“However, it’s important to look beyond simply global average temperatures and consider the regional details, especially if more radical measures such as geoengineering were ever to be considered.

“If interventions led to losers as well as winners, how would we deal with the ethics of this?”

###

The paper, published in the journal Geophysical Research Letters, is entitled: “Will fire danger be reduced by using solar radiation management to limit global warming to 1.5°C compared to 2.0°C?”

90 percent of pregnant women may have detectable levels of herbicides

Public Release: 22-Mar-2018

Study finds direct evidence of exposure of pregnant women to herbicide ingredient

Indiana University

INDIANAPOLIS — The first birth cohort study of its kind has found more than 90 percent of a group of pregnant women in Central Indiana had detectable levels of glyphosate, the active ingredient in Roundup, the most heavily used herbicide worldwide.

Researchers from Indiana University and University of California San Francisco reported that the glyphosate levels correlated significantly with shortened pregnancy lengths.

“There is growing evidence that even a slight reduction in gestational length can lead to lifelong adverse consequences,” said Shahid Parvez, the principal investigator of this study and an assistant professor in the Department of Environmental Health Science at the IU Richard M. Fairbanks School of Public Health at IUPUI.

The study is the first to examine glyphosate exposure in pregnant women in the United States using urine specimens as a direct measure of exposure.

Parvez said the main finding of the study was that 93 percent of the 71 women in the study had detectable levels of glyphosate in their urine. “We found higher urine glyphosate levels in women who lived in rural areas, and in those who consumed more caffeinated beverages,” he said.
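For scale, the reported detection rate corresponds to roughly 66 of the 71 participants (simple arithmetic on the figures above, not an additional result from the study):

```python
# Simple arithmetic on the cohort figures reported above.
cohort_size = 71
detection_rate = 0.93
print(round(cohort_size * detection_rate))  # 66 of the 71 women
```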

“One thing we cannot deny is that glyphosate exposure in pregnant women is real,” Parvez said. “The good news is that the public drinking water supply may not be the primary source of glyphosate exposure, as we initially anticipated. None of the tested drinking water samples showed glyphosate residues. It is likely that glyphosate is eliminated in the water-treatment process. The bad news is that the dietary intake of genetically modified food items and caffeinated beverages is suspected to be the main source of glyphosate intake.”

Use of glyphosate is heaviest in the Midwest due to corn and soybean production. Its residues are found in the environment, major crops and food items that humans consume daily.

“Although our study cohort was small and regional and had limited racial or ethnic diversity, it provides direct evidence of maternal glyphosate exposure and a significant correlation with shortened pregnancy,” Parvez said.

The magnitude of glyphosate exposure in pregnant women and the correlations with shorter gestation length are concerning and mandate further investigation, he said. “We are planning, contingent upon funding, to conduct a more comprehensive study in a geographically and racially diverse pool of pregnant women to determine if our findings are the same.”

###

The study, “Glyphosate exposure in pregnancy and shortened gestational length: a prospective Indiana birth cohort study,” was recently published in the journal Environmental Health.

Wolovick: Geoengineering polar glaciers to slow sea-level rise

Public Release: 19-Mar-2018

 

A Princeton University researcher suggests a radical solution to prevent catastrophic glacial melting.

Princeton University

Targeted geoengineering to preserve continental ice sheets deserves serious research and investment, argues an international team of researchers in a Comment published March 14 in the journal Nature. Without intervention, by 2100 most large coastal cities will face sea levels that are more than three feet higher than they are currently.

Previous discussions of geoengineering have looked at global projects, like seeding the atmosphere with particles to reflect more sunlight. The narrower, targeted scope is what makes this approach more feasible, says Michael Wolovick, a postdoctoral research associate in Atmospheric and Oceanic Sciences at Princeton University and a co-author on the Comment. (Nature editors commission Comments, short articles by one or more experts that call for action and lay out detailed solutions for current problems.)

“Geoengineering interventions can be targeted at specific negative consequences of climate change, rather than at the entire planet,” Wolovick said.

The ice sheets of Greenland and Antarctica will contribute more to sea-level rise this century than any other source, so stalling the fastest flows of ice into the oceans would buy us a few centuries to deal with climate change and protect coasts, say the authors.

“There is going to be some sea-level rise in the 21st century, but most models say that the ice sheets won’t begin collapsing in earnest until the 22nd or 23rd centuries,” said Wolovick. “I believe that what happens in the 22nd or 23rd centuries matters. I want our species and our civilization to last as long as possible, and that means that we need to make plans for the long term.”

Wolovick started investigating geoengineering approaches when he realized how disproportionate the scale was between the origin of the problem at the poles and its global impact: “For example, many of the most important outlet glaciers in Greenland are about 5 kilometers (3 miles) wide, and there are bridges that are longer than [that]. The important ice streams in Antarctica are wider, tens of kilometers up to 100 kilometers, but their societal consequences are larger as well, because they could potentially trigger a runaway marine ice sheet collapse. The fast-flowing parts of the ice sheets — the outlet glaciers and ice streams — might be the highest-leverage points in the whole climate system.”

The glaciers could be slowed in three ways: warm ocean waters could be prevented from reaching their bases and accelerating melting; the ice shelves where they start to float could be buttressed by building artificial islands in the sea; and the glacier beds could be dried by draining or freezing the thin film of water they slide on.

The engineering costs and scales of these projects are comparable with today’s large civil engineering projects, but with extra challenges due to the remote and harsh polar environment. Engineers have already constructed artificial islands and drained water beneath a glacier in Norway to feed a hydropower plant. Raising a berm in front of the fastest-flowing glacier in Greenland — constructing an underwater wall 3 miles long and 350 feet high in Arctic waters — would be a comparable challenge.

Such a project would easily run into the billions of dollars, but the scientists note that without coastal protection, the global cost of damages could reach $50 trillion a year. In the absence of geoengineering, the sea walls and flood defenses necessary to prevent those damages would cost tens of billions of dollars a year to build and maintain.
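The cost figures above are easier to weigh on a common timescale. A minimal back-of-envelope sketch in Python, taking round illustrative values for the article’s “tens of billions” and “billions” (these specific dollar amounts are assumptions for illustration, not estimates from the Comment):

```python
# Illustrative order-of-magnitude comparison of the costs cited in the article.
# The per-year and one-off values below are assumed round numbers, not
# engineering estimates from the Nature Comment.
YEARS = 100

damages_per_year = 50e12     # $50 trillion/yr in damages without coastal protection
seawalls_per_year = 40e9     # "tens of billions" a year for sea walls, taken as $40B
glacier_project_once = 10e9  # "billions of dollars" for a glacier berm, taken as $10B

cumulative_damages = damages_per_year * YEARS
cumulative_seawalls = seawalls_per_year * YEARS

print(f"Unprotected damages over {YEARS} yr: ${cumulative_damages / 1e12:.0f} trillion")
print(f"Sea walls over {YEARS} yr:           ${cumulative_seawalls / 1e12:.1f} trillion")
print(f"One-off glacier intervention:        ${glacier_project_once / 1e9:.0f} billion")
```

Even with generous uncertainty in the assumed numbers, the cumulative damages dwarf either form of protection by orders of magnitude, which is the authors’ central economic point.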

The researchers note that potential risks, especially to local ecosystems, need careful fieldwork and computer modeling, and the glaciers and their outflow channels need to be more precisely mapped and modeled.

Most importantly, this approach would address a symptom, not the cause. “Glacial geoengineering is not a substitute for emissions reductions,” Wolovick said. His approaches could forestall one of the bigger causes of global sea-level rise, but they will not mitigate global warming from greenhouse gases.

The fate of the ice sheets will depend ultimately on how quickly the world brings down fossil fuel emissions.

“Glacial geoengineering will not be able to save the ice sheets in the long run if the climate continues to warm,” Wolovick said. “In the long run, there are two possible routes that glacial geoengineering could take: on the one hand, it could be a stopgap solution meant to preserve the ice sheets until the climate cools enough that they are once again viable on their own; on the other hand, it could be a managed collapse meant to keep the rate of sea-level rise down while slowly letting the ice sheet waste away. If we emit too much carbon into the atmosphere, then the only viable long-term usage of glacial geoengineering would be to orchestrate a managed collapse.”

Wolovick argues against defeatist attitudes. “Climate change is not an inevitable apocalypse, climate change is a set of solvable problems,” he said. “Climate change is a challenge that our species can and will rise to meet.”

“Geoengineer polar glaciers to slow sea-level rise” by John Moore, Rupert Gladstone, Thomas Zwinger and Michael Wolovick appeared in Nature on March 14. Wolovick’s research was supported by a departmental postdoctoral fellowship that is funded by the National Oceanic and Atmospheric Administration.

Humans flourished through super volcano 74,000 years ago, study finds

Public Release: 14-Mar-2018

 

How a vacation in South Africa, a one-of-its-kind UNLV lab, and pieces of volcanic glass smaller than a grain of salt changed a long-held view of human history

University of Nevada, Las Vegas

Our ancestors not only survived a massive volcanic eruption 74,000 years ago, they flourished through the climate change that followed, a new study by UNLV geoscientist Eugene Smith and colleagues found.

The conclusions reached by Smith and Arizona State University archeologist Curtis Marean counter the previously held belief that the eruption of an Indonesian supervolcano, Mount Toba, triggered a “winter” of ash and smoke that spread thousands of miles, destroying plants, killing animals and nearly wiping out humans.

The study, “Humans thrived in South Africa through the Toba eruption about 74,000 years ago,” was published this week in the journal Nature.

And it all started with a National Geographic vacation and tour to the Republic of South Africa that Smith and his wife took back in 2011.

During the vacation, the couple was touring archeological sites run by Marean, who noticed a skeptical look on Smith’s face as he discussed geology.

The two soon spoke, and when Marean learned Smith was a geologist, he asked him to look at some glassy-looking shards found in the archeological sites. Smith quickly identified the pieces as cryptotephra: ancient microscopic glass shards ejected during a volcanic eruption.

The next step was creating a partnership between the two public research universities and building a lab to study when and where those shards came from.

Smith sent a UNLV graduate student to Oxford University in the United Kingdom to model the lab. Soon after, UNLV developed the Cryptotephra Laboratory for Archaeological and Geological Research, the only U.S. lab specializing in this type of work. The lab analyzed and dated the shards, pinpointing when the ash reached inhabitants at two sites — Pinnacle Point and Vleesbaai in present-day South Africa.

Of course, they first had to find more shards of glass. This is no easy task when you’re talking about something less than one-third the size of a grain of salt.

Smith and a graduate student went to South Africa and collected samples from the Pinnacle Point archaeological site looking for evidence of the Toba super volcano. The shards are both tiny and hard to come by – roughly 1 in every 10,000 grains of sand, he said.

Back in the lab, Smith and UNLV post-doctoral fellow Racheal Johnsen, graduate student Amber Ciravolo, and undergraduate Shelby Fitch studied samples of the glass to trace it to the Toba super volcano.

Armed with a clear timeline, the archeologist Marean was able to show that the human ancestors living at Pinnacle Point and Vleesbaai – located about five miles apart – showed remarkable improvement in their lifestyle during the volcanic winter caused by the Toba eruption.

“Humans in this region thrived through the Toba event and the ensuing full glacial conditions, perhaps as a combined result of the uniquely rich resource base of the region and fully evolved modern human adaptation,” study authors noted.

The Toba eruption – the largest in the past 2 million years — was so immense, it spewed ash and smoke around the world, including South Africa, some 6,200 miles away.

Still, humans were able to survive and adapt, Smith said.

“This was eight years of work by many people, and I’m immensely proud of my colleagues and our lab at UNLV,” Smith said.

Some black holes erase your past

Public Release: 20-Feb-2018

 

Einstein’s equations allow a non-determinist future inside some black holes

University of California – Berkeley

IMAGE

Caption

A spacetime diagram of the gravitational collapse of a charged spherical star to form a charged black hole. An observer traveling across the event horizon will eventually encounter the Cauchy horizon, the boundary of the region of spacetime that can be predicted from the initial data. UC Berkeley’s Peter Hintz and his colleagues found that a region of spacetime, denoted by a question mark, cannot be predicted from the initial data in a universe with accelerating expansion, like our own. This violates the principle of strong cosmic censorship.

Credit: APS/Alan Stonebraker

In the real world, your past uniquely determines your future. If a physicist knows how the universe starts out, she can calculate its future for all time and all space.

But a UC Berkeley mathematician has found some types of black holes in which this law breaks down. If someone were to venture into one of these relatively benign black holes, they could survive, but their past would be obliterated and they could have an infinite number of possible futures.

Such claims have been made in the past, and physicists have invoked “strong cosmic censorship” to explain them away. That is, something catastrophic – typically a horrible death – would prevent observers from actually entering a region of spacetime where their future was not uniquely determined. This principle, first proposed 40 years ago by physicist Roger Penrose, keeps sacrosanct the idea of determinism, which is key to any physical theory: given the past and present, the physical laws of the universe do not allow more than one possible future.

But, says UC Berkeley postdoctoral fellow Peter Hintz, mathematical calculations show that for some specific types of black holes in a universe like ours, which is expanding at an accelerating rate, it is possible to survive the passage from a deterministic world into a non-deterministic black hole.

What life would be like in a space where the future was unpredictable is unclear. But the finding does not mean that Einstein’s equations of general relativity, which so far perfectly describe the evolution of the cosmos, are wrong, said Hintz, a Clay Research Fellow.

“No physicist is going to travel into a black hole and measure it. This is a math question. But from that point of view, this makes Einstein’s equations mathematically more interesting,” he said. “This is a question one can really only study mathematically, but it has physical, almost philosophical implications, which makes it very cool.”

“This … conclusion corresponds to a severe failure of determinism in general relativity that cannot be taken lightly in view of the importance in modern cosmology” of accelerating expansion, said his colleagues at the University of Lisbon in Portugal, Vitor Cardoso, João Costa and Kyriakos Destounis, and at Utrecht University, Aron Jansen.

As quoted by Physics World, Gary Horowitz of UC Santa Barbara, who was not involved in the research, said that the study provides “the best evidence I know for a violation of strong cosmic censorship in a theory of gravity and electromagnetism.”

Hintz and his colleagues published a paper describing these unusual black holes last month in the journal Physical Review Letters.

Beyond the event horizon

Black holes are bizarre objects that get their name from the fact that nothing can escape their gravity, not even light. If you venture too close and cross the so-called event horizon, you’ll never escape.

For small black holes, you’d never survive such a close approach anyway. The tidal forces close to the event horizon are enough to spaghettify anything: that is, stretch it until it’s a string of atoms.

But for large black holes, like the supermassive objects at the cores of galaxies like the Milky Way, which weigh tens of millions if not billions of times the mass of a star, crossing the event horizon would be, well, uneventful.

Because it should be possible to survive the transition from our world to the black hole world, physicists and mathematicians have long wondered what that world would look like, and have turned to Einstein’s equations of general relativity to predict the world inside a black hole. These equations work well until an observer reaches the center or singularity, where in theoretical calculations the curvature of spacetime becomes infinite.

Even before reaching the center, however, a black hole explorer – who would never be able to communicate what she found to the outside world – could encounter some weird and deadly milestones. Hintz studies a specific type of black hole – a standard, non-rotating black hole with an electrical charge – and such an object has a so-called Cauchy horizon within the event horizon.

The Cauchy horizon is the spot where determinism breaks down, where the past no longer determines the future. Physicists, including Penrose, have argued that no observer could ever pass through the Cauchy horizon point because they would be annihilated.

As the argument goes, as an observer approaches the horizon, time slows down, since clocks tick slower in a strong gravitational field. As light, gravitational waves and anything else encountering the black hole fall inevitably toward the Cauchy horizon, an observer also falling inward would eventually see all this energy barreling in at the same time. In effect, all the energy the black hole sees over the lifetime of the universe hits the Cauchy horizon at the same time, blasting into oblivion any observer who gets that far.

You can’t see forever in an expanding universe

Hintz realized, however, that this may not apply in an expanding universe that is accelerating, such as our own. Because spacetime is being increasingly pulled apart, much of the distant universe will not affect the black hole at all, since that energy can’t travel faster than the speed of light.

In fact, the energy available to fall into the black hole is only that contained within the observable horizon: the volume of the universe that the black hole can expect to see over the course of its existence. For us, for example, the observable horizon is bigger than the 13.8 billion light years we can see into the past, because it includes everything that we will see forever into the future. The accelerating expansion of the universe will prevent us from seeing beyond a horizon of about 46.5 billion light years.
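The roughly 46.5-billion-light-year figure follows from standard cosmology. A rough numerical sketch, assuming round flat-ΛCDM parameters (H0 = 67.7 km/s/Mpc, Ωm = 0.31, chosen here for illustration) and ignoring radiation, which shifts the result slightly:

```python
import math

# Sketch: numerically recover the ~46-billion-light-year comoving horizon
# quoted above for a flat Lambda-CDM universe. H0 and Omega_m are assumed
# round values, and radiation is neglected.
H0 = 67.7                  # Hubble constant, km/s/Mpc (assumed round value)
OMEGA_M = 0.31             # matter density parameter (assumed round value)
OMEGA_L = 1.0 - OMEGA_M    # dark-energy density, flat universe

C = 299_792.458            # speed of light, km/s
MPC_IN_GLY = 3.2616e-3     # 1 megaparsec in billions of light years
hubble_dist_gly = (C / H0) * MPC_IN_GLY   # Hubble distance, ~14.4 Gly

# Comoving particle horizon: D = D_H * integral[0..1] da / (a^2 E(a)),
# with E(a) = sqrt(Om/a^3 + OL). Substituting a = u^2 removes the
# integrable singularity at a = 0, giving integral[0..1] 2 du / sqrt(Om + OL u^6).
def f(u):
    return 2.0 / math.sqrt(OMEGA_M + OMEGA_L * u**6)

# Composite Simpson's rule over [0, 1]
n = 1000                   # number of subintervals (even)
h = 1.0 / n
integral = f(0.0) + f(1.0)
for i in range(1, n):
    integral += (4 if i % 2 else 2) * f(i * h)
integral *= h / 3

horizon_gly = hubble_dist_gly * integral
print(f"Comoving horizon ~ {horizon_gly:.1f} billion light years")
```

With these assumed parameters the sketch gives about 47 billion light years, close to the figure quoted above (including radiation and the measured parameters nudges it toward 46.5).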

In that scenario, the expansion of the universe counteracts the amplification caused by time dilation inside the black hole, and for certain situations, cancels it entirely. In those cases – specifically, smooth, non-rotating black holes with a large electrical charge, so-called Reissner-Nordström-de Sitter black holes – an observer could survive passing through the Cauchy horizon and into a non-deterministic world.
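For reference, the Reissner–Nordström–de Sitter geometry named here is the standard exact solution for a charged, non-rotating black hole with a positive cosmological constant. A minimal sketch of its metric (in geometric units, with mass M, charge Q and cosmological constant Λ):

```latex
ds^2 = -f(r)\,dt^2 + \frac{dr^2}{f(r)} + r^2\,d\Omega^2,
\qquad
f(r) = 1 - \frac{2M}{r} + \frac{Q^2}{r^2} - \frac{\Lambda r^2}{3}
```

The positive roots of f(r) = 0 give, from smallest to largest, the Cauchy horizon, the event horizon and the cosmological horizon; the breakdown of determinism discussed above occurs at the innermost of these.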

“There are some exact solutions of Einstein’s equations that are perfectly smooth, with no kinks, no tidal forces going to infinity, where everything is perfectly well behaved up to this Cauchy horizon and beyond,” he said, noting that the passage through the horizon would be painful but brief. “After that, all bets are off; in some cases, such as a Reissner-Nordström-de Sitter black hole, one can avoid the central singularity altogether and live forever in a universe unknown.”

Admittedly, he said, charged black holes are unlikely to exist, since they’d attract oppositely charged matter until they became neutral. However, the mathematical solutions for charged black holes are used as proxies for what would happen inside rotating black holes, which are probably the norm. Hintz argues that smooth, rotating black holes, called Kerr-Newman-de Sitter black holes, would behave the same way.

“That is upsetting, the idea that you could set out with an electrically charged star that undergoes collapse to a black hole, and then Alice travels inside this black hole and if the black hole parameters are sufficiently extremal, it could be that she can just cross the Cauchy horizon, survives that and reaches a region of the universe where knowing the complete initial state of the star, she will not be able to say what is going to happen,” Hintz said. “It is no longer uniquely determined by full knowledge of the initial conditions. That is why it’s very troublesome.”

He discovered these types of black holes by teaming up with Cardoso and his colleagues, who calculated how a black hole rings when struck by gravitational waves, and which of its tones and overtones lasted the longest. In some cases, even the longest surviving frequency decayed fast enough to prevent the amplification from turning the Cauchy horizon into a dead zone.

Hintz’s paper has already sparked other papers, one of which purports to show that most well-behaved black holes will not violate determinism. But Hintz insists that one instance of violation is one too many.

“People had been complacent for some 20 years, since the mid ’90s, that strong cosmological censorship is always verified,” he said. “We challenge that point of view.”

###

Hintz’s work was supported by the Clay Mathematics Institute and the Miller Institute for Basic Research in Science at UC Berkeley.

Medical care for wounded ants

Public Release: 13-Feb-2018

 

University of Würzburg

The African Matabele ants (Megaponera analis) tend to the wounds of their injured comrades. And they do so rather successfully: Without such attendance, 80 percent of the injured ants die; after receiving “medical” treatment, only 10 percent succumb to their injuries.

Erik T. Frank, Marten Wehrhan and Karl Eduard Linsenmair from Julius-Maximilians-Universität Würzburg (JMU) in Bavaria, Germany, made this astonishing discovery. Their results have been published in the journal Proceedings of the Royal Society B. No other insects are known to dress the wounds of their comrades. The JMU biologists even believe that such behaviour is unique in the entire animal kingdom.

Ants go on high-risk raids

Matabele ants have a high risk of getting injured every day: The insects, which are widely distributed in Sub-Saharan Africa, set out to hunt termites two to four times a day. Proceeding in long files of 200 to 600 animals, they raid termites at their foraging sites, killing many workers and hauling the prey back to their nest, where it is ultimately eaten.

However, the ants meet fierce resistance from the well-armoured termite soldiers, which are very adept at using their powerful jaws to fend off the attackers. Injury and mortality among the ants occur during such combat. For example, the ants frequently lose limbs bitten off by termite soldiers. When an ant is injured in a fight, it calls its mates for help by excreting a chemical substance that prompts them to carry their injured comrade back to the nest. Erik T. Frank already described this rescue service in 2017.

But the Würzburg biologists dug deeper: What happens once the injured ants are back in the nest? The ants treat the open wounds of their injured fellows by “licking” them intensively, often for several minutes. “We suppose that they do this to clean the wounds and maybe even apply antimicrobial substances with their saliva to reduce the risk of bacterial or fungal infection,” Frank explains.

Severely injured ants are left behind on the battlefield

The team from the JMU Biocentre uncovered more exciting details about the emergency rescue service of the Matabele ants. Badly injured ants, missing five of their six legs for example, get no help on the battleground. The decision about who is saved and who is left behind is made not by the rescuers but by the injured ants themselves.

Slightly injured ants keep still and even pull in their remaining limbs to facilitate transport. Their badly injured counterparts in contrast struggle and lash out wildly. “They simply don’t cooperate with the helpers and are left behind as a result,” Frank says. So the hopeless cases make sure that no energy is invested in rescuing them.

Slightly injured ants keep still

When Matabele ants are only slightly injured, they move much more slowly than normal once potential helpers are near. This behaviour probably increases their chances of being noticed by the other ants rushing back to the nest in a column. Or it may be that ants can localize the “save-me-substance” more easily in resting ants.

More questions arise

The new insights give rise to new questions: How do ants recognize where exactly a mate was injured? How do they know when to stop dressing the wounds? Is treatment purely preventive or also therapeutic, after an infection has occurred?

Erik T. Frank will continue to tackle these and other questions at the University of Lausanne in Switzerland where he has been doing postdoc research since February 2018. He recently completed his doctoral thesis at JMU.

Herbicides now resulting in catastrophic failures

Public Release: 12-Feb-2018

Weeds out of control

Spraying weeds with chemicals has always been costly. Now it is costly and ineffective: resistance to herbicides is so pervasive that a new strategy is needed to protect crops.

Rothamsted Research

IMAGE

IMAGE: Black-grass, a pervasive weed with increasing resistance to herbicides, is a major threat to cereal crops.

Credit: Rothamsted Research

Herbicides can no longer control the weeds that threaten crop productivity and food security in the UK because the plants have evolved resistance, and future control must depend on management strategies that reduce reliance on chemicals.

A nationwide epidemiological assessment of the factors that are driving the abundance and spread of the major agricultural weed, black-grass, was the focus of collaborative work led by the University of Sheffield, with Rothamsted Research and the Zoological Society of London.

The team mapped the density of black-grass populations across 70 farms in England, collecting seed from 132 fields. They also collected historical management data for all fields to address the question “which management factors are driving black-grass abundance and herbicide resistance?” Their findings are published today in Nature Ecology & Evolution.

“At Rothamsted, we used glasshouse bioassays to determine that 80% of sampled populations were highly resistant to all herbicides that can be used for selective black-grass control in a wheat crop,” says Paul Neve, a weed biologist and leader of Rothamsted’s strategic programme, Smart Crop Protection.

“Field monitoring indicated that the level of resistance to herbicides was correlated with population density, indicating that resistance is a major driver for black-grass population expansion in England,” notes Neve.

He adds: “We found that the extent of herbicide resistance was primarily dictated by the historical intensity of herbicide use, and that no other management factors had been successful in modifying this resistance risk.”

The research team also surveyed farmers about their use of herbicides, and about how much their different approaches cost them. The team found that increased weed densities lead to higher herbicide costs and lower crop yields, resulting in significant losses of profit.

Increasing resistance is linked to the number of herbicide applications, and mixing different chemicals or applying them cyclically did not prevent resistance developing, the team report.

Current industry advice urgently needs to change to reflect this, the researchers conclude. They recommend that farmers switch to weed-management strategies that rely less on herbicides, as it is inevitable that weeds will overcome even new agents.

The research was part-funded by the Agriculture and Horticulture Development Board.

###

NOTES TO EDITORS

Publication:

The factors driving evolved herbicide resistance at a national scale doi:10.1038/s41559-018-0470-1

Rothamsted Research contacts:

Paul Neve, Weed Biologist
Leader of Smart Crop Protection, and Institute Strategic Programme
Tel: +44 (0) 1582 938
E-mail: paul.neve@rothamsted.ac.uk

Susan Watts, Head of Communications
Tel: +44 (0) 1582 938 109
Mob: +44 (0) 7964 832 719
E-mail: susan.watts@rothamsted.ac.uk

About Rothamsted Research

Rothamsted Research is the oldest agricultural research institute in the world. We work from gene to field with a proud history of ground-breaking discoveries. Our founders, in 1843, were the pioneers of modern agriculture, and we are known for our imaginative science and our collaborative influence on fresh thinking and farming practices. Through independent science and innovation, we make significant contributions to improving agri-food systems in the UK and internationally. In terms of its economic contribution, the cumulative impact of our work in the UK exceeds £3000 million a year (Rothamsted Research and the Value of Excellence, by Séan Rickard, 2015). Our strength lies in our systems approach, which combines science and strategic research, interdisciplinary teams and partnerships. Rothamsted is also home to three unique resources. These National Capabilities are open to researchers from all over the world: The Long-Term Experiments, Rothamsted Insect Survey and the North Wyke Farm Platform. We are strategically funded by the Biotechnology and Biological Sciences Research Council (BBSRC), with additional support from other national and international funding streams, and from industry.

For more information, visit https://www.rothamsted.ac.uk/; Twitter @Rothamsted

About BBSRC

BBSRC invests in world-class bioscience research and training on behalf of the UK public. Our aim is to further scientific knowledge, to promote economic growth, wealth and job creation and to improve quality of life in the UK and beyond. Funded by Government, BBSRC invested over £469M in world-class bioscience in 2016-17. We support research and training in universities and strategically funded institutes. BBSRC research and the people we fund are helping society to meet major challenges, including food security, green energy and healthier, longer lives. Our investments underpin important UK economic sectors, such as farming, food, industrial biotechnology and pharmaceuticals.

About the Agriculture and Horticulture Development Board

AHDB is a statutory levy board, funded by farmers, growers and others in the food supply chain. It is managed as an independent organisation, independent of both commercial industry and of government.

12,800 years ago the Earth was on fire

 

Public Release: 1-Feb-2018

New research suggests toward end of Ice Age, human beings witnessed fires larger than dinosaur killers

University of Kansas

LAWRENCE — On a ho-hum day some 12,800 years ago, the Earth had emerged from another ice age. Things were warming up, and the glaciers had retreated.

Out of nowhere, the sky was lit with fireballs. This was followed by shock waves.

Fires rushed across the landscape, and dust clogged the sky, cutting off the sunlight. As the climate rapidly cooled, plants died, food sources were snuffed out, and the glaciers advanced again. Ocean currents shifted, setting the climate into a colder, almost “ice age” state that lasted an additional thousand years.

Finally, the climate began to warm again, and people again emerged into a world with fewer large animals and a human culture in North America that left behind completely different kinds of spear points.

This is the story supported by a massive study of geochemical and isotopic markers just published in the Journal of Geology.

The results are so extensive that the study had to be split into two papers.

“Extraordinary Biomass-Burning Episode and Impact Winter Triggered by the Younger Dryas Cosmic Impact ~12,800 Years Ago” is divided into “Part 1: Ice Cores and Glaciers” and “Part 2: Lake, Marine, and Terrestrial Sediments.”

The paper’s 24 authors include KU Emeritus Professor of Physics & Astronomy Adrian Melott and Professor Brian Thomas, a 2005 doctoral graduate from KU, now at Washburn University.

“The work includes measurements made at more than 170 different sites across the world,” Melott said.

The KU researcher and his colleagues believe the data suggests the disaster was touched off when Earth collided with fragments of a disintegrating comet that was roughly 62 miles in diameter — the remnants of which persist within our solar system to this day.

“The hypothesis is that a large comet fragmented and the chunks impacted the Earth, causing this disaster,” said Melott. “A number of different chemical signatures — carbon dioxide, nitrate, ammonia and others — all seem to indicate that an astonishing 10 percent of the Earth’s land surface, or about 10 million square kilometers, was consumed by fires.”

According to Melott, analysis of pollen suggests pine forests were probably burned off to be replaced by poplar, which is a species that colonizes cleared areas.

Indeed, the authors posit the cosmic impact could have touched off the Younger Dryas cool episode, biomass burning, late Pleistocene extinctions of larger species and “human cultural shifts and population declines.”

“Computations suggest that the impact would have depleted the ozone layer, causing increases in skin cancer and other negative health effects,” Melott said. “The impact hypothesis is still a hypothesis, but this study provides a massive amount of evidence, which we argue can only be all explained by a major cosmic impact.”

An outdoor cat can damage your sustainability cred

Public Release: 30-Jan-2018

 

Cornell University

ITHACA, N.Y. – If you install solar panels on your roof and avoid dousing your lawn with chemicals and pesticides, your online peers may consider you to be environmentally friendly. But this street cred can all be erased if you let your cat roam around outdoors.

A new study shows that bird lovers who allow their pet cats out of the house are judged to be less concerned about the environment by other members of the birder community on social media, even if the property owner is otherwise employing all of the same sustainable practices as those keeping cats indoors.

“We thought this was a very interesting opportunity to study group norm violations,” said Hwanseok Song, a fifth-year doctoral student in communication at Cornell University and the paper’s lead author. “What happens within this community when they see one of their members violate an important group norm? Do people notice cues that a member within their community is letting their cat outdoors? Do these people who notice those cues actually use that information to make judgments on that group-norm violator?”

To explore these questions, Song and his collaborators used Habitat Network (originally named yardmap.org), a socially networked mapping application that allows users to create and share virtual maps of their properties that highlight their sustainability efforts, essentially a show-and-tell for good conservation practices. The researchers created two identical profiles of a pro-environmental property with a small lawn, low chemical usage and solar panels. The only difference between the profiles: One version had an icon and an image indicating an indoor pet cat and the other an outdoor pet cat. Outdoor cats are a divisive issue for many nature lovers because of the threat they pose to wildlife, particularly birds.

Habitat Network users were asked to rate each property owner’s level of sustainability. The researchers found that participants who didn’t own cats negatively judged property owners with an outdoor cat and even considered them significantly less likely to engage in a variety of pro-environmental behaviors, even though the maps made it clear these property owners had invested in solar power and used few chemicals on their lawns.

“Everything else in this map is pretty much signaling that this is a person already quite committed to sustainability causes. It usually takes a strong environmental commitment to install a solar panel. These findings say a lot about how we make judgments of others who are either violating, or complying with, these sometimes-parochial norms,” said Song.

“This study is a reminder of how easy it can be to jump to conclusions about other people’s behaviors on the basis of very little information,” added Poppy McLeod, professor of communication, who co-authored the paper.

The study’s findings have strong implications for online citizen-science projects that unite the sustainability community. While these projects can foster a sense of group solidarity and shared goals, unspoken biases and misperceptions could weaken progress in these projects.

###

The paper, “Group norm violations in an online environmental social network: Effects on impression formation and intergroup judgments,” was published in Group Processes & Intergroup Relations and was funded by grants from the National Science Foundation and the Atkinson Center for a Sustainable Future’s Academic Venture Fund.


New type of virus found in the ocean

“about 10 million viruses are found in every milliliter of water”

Public Release: 24-Jan-2018

The unusual characteristics of these abundant, bacteria-killing viruses could lead to evolutionary insights

Massachusetts Institute of Technology

CAMBRIDGE, Mass. — A type of virus that dominates water samples taken from the world’s oceans has long escaped analysis because it has characteristics that standard tests can’t detect. However, researchers at MIT and the Albert Einstein College of Medicine have now managed to isolate and study representatives of these elusive viruses, which provide a key missing link in virus evolution and play an important role in regulating bacterial populations, as a new study reports.

Viruses are the main predators of bacteria, and the findings suggest that the current view of bacterial virus diversity has a major blind spot. These conclusions have emerged through detailed analysis of marine samples led by MIT postdoc Kathryn Kauffman, professor of civil and environmental engineering Martin Polz, professor Libusha Kelly of Albert Einstein College of Medicine, and nine others. The results are being reported this week in the journal Nature.

The newly identified viruses lack the “tail” found on most catalogued and sequenced bacterial viruses, and they have several other unusual properties that led previous studies to miss them. Fittingly, the researchers named the new group the Autolykiviridae, after a character from Greek mythology renowned for being difficult to catch. And unlike typical viruses that prey on just one or two types of bacteria, these tailless varieties can infect dozens of different types, often of different species, underscoring their ecological relevance.

This research “opens new avenues for furthering our understanding of the roles of viruses in the ocean,” says Jed Fuhrman, the McCulloch-Crosby Chair of Marine Biology at the University of Southern California, who was not involved in this work. “In a practical sense, it also shows how we need to alter some commonly used methods in order to capture these kinds of viruses for various studies,” he says. “I’d say it is an important advance in the field.”

Current environmental models of virus-bacteria interactions are based on the well-studied tailed viruses, Kauffman explains, so they may be missing important aspects of the interactions taking place in nature.

“We already knew that viruses are very important there,” Kauffman says, referring to the surface ocean, where the researchers’ samples were drawn, and where about 10 million viruses are found in every milliliter of water. Polz says that while “most of the viruses studied in labs have tails, most of those in the ocean don’t.” So the team decided to study one subset of tailless viruses, which infects a group of bacteria called Vibrio. After extensive tests, they found “that some of these were infecting unusually large numbers of hosts,” he says.

After sequencing representatives of the Autolykiviridae, the researchers found “their genomes were quite different from other viruses,” Polz says. For one thing, their genomes are very short: about 10,000 bases, compared to the typical 40,000-50,000 for tailed viruses. “When we found that, we were surprised,” he says.

With the new sequence information, the researchers were able to comb through databases and found that such viruses exist in many places. The research also showed that these viruses tend to be underrepresented in databases because of the ways samples are typically handled in labs. The methods the team developed to obtain these viruses from environmental samples could help researchers avoid such losses of information in the future. In addition, Kauffman says, typically the way researchers test for viral activity is by infecting bacteria with the viral sample and then checking the samples a day later to look for signs that patches of the bacteria have been killed off. But these particular nontailed viruses often act more slowly, and the killed-off regions don’t show up until several days have passed — so their presence was never noticed in most studies.

The new group of viruses may be especially widespread. “We don’t think it’s ocean-specific at all,” Polz says. For example, the viruses may even be prevalent in the human biome, and they may play roles in major biogeochemical cycles, he says, such as the cycling of carbon.

Another important aspect of these findings is that the Autolykiviridae were shown to be members of an ancient viral lineage that is defined by specific types of capsids, the protein shell encasing the viral DNA. Though this lineage is known to be very diverse in animals and protists — and includes viruses such as the adenoviruses that infect humans, and the giant viruses that infect algae — very few viruses of this kind have been found to infect bacteria.

###

The work was supported by the National Science Foundation and the Woods Hole Oceanographic Institution’s Ocean Ventures Fund.


Pharmaceuticals and other contaminants force fish to work much harder to survive

Public Release: 16-Jan-2018

 

Contaminants remain after typical water treatment process

McMaster University

IMAGE: This is the site where researchers tested the metabolism of fish downstream of the Dundas Wastewater Treatment Plant in suburban Hamilton, ON.

Credit: McMaster University

HAMILTON, ON, Jan. 16, 2018 – Pharmaceuticals and other man-made contaminants are forcing fish that live downstream from a typical sewage treatment plant to work at least 30 per cent harder just to survive, McMaster researchers have found.

The effort to decontaminate their bodies from pollutants that persist even after water treatment makes fish vulnerable by forcing them to burn energy that would typically go toward other vital functions.

“That’s a difference in metabolic rate that we would have if we started walking several extra hours a day. It’s a pretty big increase in metabolism,” says Graham Scott, senior author of a new paper published today in the journal Environmental Science and Technology. “That’s a lot of resources.”

Metabolism increases with activity, sickness or stress. By using valuable metabolic resources to decontaminate its own body, a fish has less energy for movement, evading predators, catching prey, and reproduction. The effort puts populations at risk, highlighting the unseen impacts of pollution, Scott says.

“A lot of studies of environmental pollution are looking at major impacts that cause animals to die. We were really concerned about those impacts of pollution that are not as obvious, but still very significant,” says Scott, an assistant professor of biology at McMaster University who worked with six colleagues, including animal behavior specialist Sigal Balshine and lead author Sherry Du, a graduate student in biology. “Metabolism is a lens into the inner workings of the animal, and its health and fitness.”

The findings suggest that new water-treatment technology is required to protect populations from modern threats.

The researchers took wild sunfish from an unpolluted local source and submerged cages of them at different points downstream from the Dundas Wastewater Treatment Plant near the McMaster campus for three weeks in the summer of 2016.

Using laboratory analyses, the researchers compared the metabolism of the wastewater-exposed subjects to that of fish they had placed in an unpolluted pond in the headwaters of the same watershed.

The wastewater treatment plant functions well, the researchers say, but it was not built to capture modern pollutants such as birth-control medication, anti-depressants and beta blockers. And while upgrades are planned, the new technology still won’t be able to clear all of these contaminants.

Cootes Paradise, the marsh that forms the extreme west end of Lake Ontario, is a good location for such research, Scott explains, because it is a contained environment. What happens to fish living in treated sewage outflow is likely similar in other locations with similar plants, he says.

Balshine was the senior author of a related study that recently showed the behaviour of fish living in treated wastewater outflow was also measurably different.

Taken together, Scott says, the studies of metabolism and behavior reinforce the idea that invisible pollution from emerging new contaminants is having a serious impact on populations downstream from sewage treatment plants – a problem that should be addressed.

###

Graham Scott is available at 905-525-9140 ext. 26692 and scottg2@mcmaster.ca.

McMaster provides a high definition broadcast studio that can connect with any television broadcaster around the world. To book an interview please contact:

Wade Hemsworth
Manager, Media Relations
McMaster University
905-525-9140, ext. 27988
289-925-8382 (c)
hemswor@mcmaster.ca

Michelle Donovan
Manager, Media Relations
McMaster University
905-525-9140, ext. 22869
905-512-8548 (c)
donovam@mcmaster.ca

Worldwide importance of honey bees for natural habitats captured in new report

Public Release: 10-Jan-2018

 

Global synthesis of data reveals honey bees as world’s key pollinator of non-crop plants

University of California – San Diego

 

IMAGE: A honey bee pollinates a Carpobrotus plant.

Credit: Keng-Lou James Hung/UC San Diego

An unprecedented study integrating data from around the globe has shown that honey bees are the world’s most important single species of pollinator in natural ecosystems and a key contributor to natural ecosystem functions. The first quantitative analysis of its kind, led by biologists at the University of California San Diego, is published Jan. 10 in Proceedings of the Royal Society B.

The report weaves together information from 80 plant-pollinator interaction networks. The results clearly identify the honey bee (Apis mellifera) as the single most frequent visitor to flowers of naturally occurring (non-crop) plants worldwide. Honey bees were recorded in 89 percent of the pollination networks in the honey bee’s native range and in 61 percent in regions where honey bees have been introduced by humans.

One out of eight interactions between a non-agricultural plant and a pollinator is carried out by the honey bee, the study revealed. The honey bee’s global importance is further underscored when considering that it is but one of tens of thousands of pollinating species in the world, including wasps, flies, beetles, butterflies, moths and other bee species.

“Biologists have known for a while that honey bees are widespread and abundant–but with this study, we now see in quantitative terms that they are currently the most successful pollinators in the world,” said Keng-Lou James Hung, who led the study as a graduate student in UC San Diego’s Division of Biological Sciences. He’s now a postdoctoral researcher at the Ohio State University.

Honey bees are native to Africa, the Middle East and Southern Europe and have become naturalized in ecosystems around the world as a result of intentional transport by humans. While feral honey bee populations may be healthy in many parts of the world, the researchers note that the health of managed honey bee colonies is threatened by a host of factors including habitat loss, pesticides, pathogens, parasites and climate change.

“Although they appear to have a disproportionate impact on natural ecosystems, surprisingly we understand very little about the honey bee’s ecological effects in non-agricultural systems,” said study coauthor David Holway, a professor and chair of the Section of Ecology, Behavior and Evolution in Biological Sciences. “Looking to the future this study raises a lot of new questions.”

For instance, in San Diego, where honey bees are not native, they are responsible for 75 percent of pollinator visits to native plants, the highest honey bee dominance in the set of networks examined for any continental site in the introduced range of the honey bee. This is despite the fact that there are more than 650 species of native bees in San Diego County as well as many other native pollinating insects.

“The consequences of this phenomenon, both for native plants that did not evolve with the honey bee and for populations of native insect pollinators, are well worth studying,” said Joshua Kohn, the study’s senior author.

“Our study also nicely confirms something that pollination biologists have known for a long time: even in the presence of a highly abundant species that pollinates many plant species, we still need healthy populations of other pollinators for entire plant communities to receive adequate pollination services,” said Hung.

The reason for this, Hung noted, is that in habitats where honey bees are present, they nevertheless fail to visit nearly half of all animal-pollinated plant species, on average.

“Our take home message is that while it’s important for us to continue to research how we can improve the health of managed honey bee colonies for agricultural success, we need to further understand how this cosmopolitan and highly successful species impacts the ecology and evolutionary dynamics of plant and pollinator species in natural ecosystems,” said Hung.

###

Coauthors of the study include Jennifer Kingston of UC San Diego and Matthias Albrecht of Agroecology and Environment, Agroscope, in Switzerland.

Funding for the study included a National Science Foundation Doctoral Dissertation Improvement Grant (DEB-1501566); a Mildred E. Mathias Graduate Student Research Grant and an Institute for the Study of Ecological and Evolutionary Climate Impacts Graduate Fellowship from the University of California Natural Reserve System; a Frontiers of Innovation Scholar Fellowship, an Academic Senate Grant and a McElroy Fellowship from UC San Diego; a Sea and Sage Audubon Society Bloom-Hays Ecological Research Grant; and a California Native Plants Society Educational Grant.

Hypatia stone contains unique minerals not from Earth nor part of any known types of meteorite or comet

Public Release: 9-Jan-2018

Extraterrestrial Hypatia stone rattles solar system status quo

University of Johannesburg

In 2013, researchers announced that a pebble found in south-west Egypt was definitely not from Earth. By 2015, other research teams had announced that the ‘Hypatia’ stone was not part of any known type of meteorite or comet, based on noble gas and nuclear probe analyses.

(The stone was named Hypatia after Hypatia of Alexandria, the first Western woman mathematician and astronomer).

However, if the pebble was not from Earth, what was its origin and could the minerals in it provide clues on where it came from? Micro-mineral analyses of the pebble by the original research team at the University of Johannesburg have now provided unsettling answers that spiral away from conventional views of the material our solar system was formed from.

Mineral structure

The internal structure of the Hypatia pebble is somewhat like a fruitcake that has fallen off a shelf into some flour and cracked on impact, says Prof Jan Kramers, lead researcher of the study published in Geochimica et Cosmochimica Acta on 28 Dec 2017.

“We can think of the badly mixed dough of a fruit cake representing the bulk of the Hypatia pebble, what we called two mixed ‘matrices’ in geology terms. The glace cherries and nuts in the cake represent the mineral grains found in Hypatia ‘inclusions’. And the flour dusting the cracks of the fallen cake represent the ‘secondary materials’ we found in the fractures in Hypatia, which are from Earth,” he says.

The original extraterrestrial rock that fell to Earth must have been at least several meters in diameter, but disintegrated into small fragments of which the Hypatia stone is one.

Weird matrix

Straight away, the Hypatia mineral matrix (represented by the fruitcake dough) looks nothing like that of any known meteorites, the rocks that fall from space onto Earth every now and then.

“If it were possible to grind up the entire planet Earth to dust in a huge mortar and pestle, we would get dust with on average a similar chemical composition as chondritic meteorites,” says Kramers. “In chondritic meteorites, we expect to see a small amount of carbon (C) and a good amount of silicon (Si). But Hypatia’s matrix has a massive amount of carbon and an unusually small amount of silicon.”

“Even more unusual, the matrix contains a high amount of very specific carbon compounds, called polyaromatic hydrocarbons, or PAH, a major component of interstellar dust, which existed even before our solar system was formed. Interstellar dust is also found in comets and meteorites that have not been heated up for a prolonged period in their history,” adds Kramers.

In another twist, most (but not all) of the PAH in the Hypatia matrix has been transformed into diamonds smaller than one micrometer, which are thought to have been formed in the shock of impact with the Earth’s atmosphere or surface. These diamonds made Hypatia resistant to weathering so that it is preserved for analysis from the time it arrived on Earth.

Weirder grains never found before

When researcher Georgy Belyanin analyzed the mineral grains in the inclusions in Hypatia (represented by the nuts and cherries of a fruitcake), a number of highly surprising chemical elements showed up.

“The aluminum occurs in pure metallic form, on its own, not in a chemical compound with other elements. As a comparison, gold occurs in nuggets, but aluminum never does. This occurrence is extremely rare on Earth and the rest of our solar system, as far as is known in science,” says Belyanin.

“We also found silver iodine phosphide and moissanite (silicon carbide) grains, again in highly unexpected forms. The grains are the first documented to be found in situ (as is) without having to first dissolve the surrounding rock with acid,” adds Belyanin. “There are also grains of a compound consisting of mainly nickel and phosphorus, with very little iron; a mineral composition never observed before on Earth or in meteorites,” he adds.

Dr Marco Andreoli, a Research Fellow at the School of Geosciences at the University of the Witwatersrand, and a member of the Hypatia research team says, “When Hypatia was first found to be extraterrestrial, it was a sensation, but these latest results are opening up even bigger questions about its origins”.

Unique minerals in our solar system

Taken together, the ancient unheated PAH carbon as well as the phosphides, the metallic aluminum, and the moissanite suggest that Hypatia is an assembly of unchanged pre-solar material. That means, matter that existed in space before our Sun, the Earth and the other planets in our solar system were formed.

Supporting the pre-solar concept is the weird composition of the nickel-phosphorus-iron grains found in the Hypatia inclusions. These three chemical elements are interesting because they belong to the subset of chemical elements heavier than carbon and nitrogen which form the bulk of all the rocky planets.

“In the grains within Hypatia the ratios of these three elements to each other are completely different from that calculated for the planet Earth or measured in known types of meteorites. As such these inclusions are unique within our solar system,” adds Belyanin.

“We think the nickel-phosphorus-iron grains formed pre-solar, because they are inside the matrix, and are unlikely to have been modified by shock such as collision with the Earth’s atmosphere or surface, and also because their composition is so alien to our solar system”, he adds.

“Was the bulk of Hypatia, the matrix, also formed before our solar system? Probably not, because you need a dense dust cloud like the solar nebula to coagulate large bodies” he says.

A different kind of dust

Generally, science says that our solar system’s planets ultimately formed from a huge, ancient cloud of interstellar dust (the solar nebula) in space. The first part of that process would be much like dust bunnies coagulating in an unswept room. Science also holds that the solar nebula was homogenous, that is, the same kind of dust everywhere.

But Hypatia’s chemistry tugs at this view. “For starters, there are no silicate minerals in Hypatia’s matrix, in contrast to chondritic meteorites (and planets like the Earth, Mars and Venus), where silicates are dominant. Then there are the exotic mineral inclusions. If Hypatia itself is not presolar, both features indicate that the solar nebula wasn’t the same kind of dust everywhere – which starts tugging at the generally accepted view of the formation of our solar system”, says Kramers.

Into the future

“What we do know is that Hypatia was formed in a cold environment, probably at temperatures below that of liquid nitrogen on Earth (-196 Celsius). In our solar system it would have been way further out than the asteroid belt between Mars and Jupiter, where most meteorites come from. Comets come mainly from the Kuiper Belt, beyond the orbit of Neptune and about 40 times as far away from the sun as we are. Some come from the Oort Cloud, even further out. We know very little about the chemical compositions of space objects out there. So our next question will dig further into where Hypatia came from,” says Kramers.

The little pebble from the Libyan Desert Glass strewn field in south-west Egypt presents a tantalizing piece for an extraterrestrial puzzle that is getting ever more complex.

###

The research was funded by the University of Johannesburg Research Council via the PPM Research Centre.

The researchers would like to thank Aly Barakat, Mario di Martino and Romano Serra for access to the Hypatia sample material; and Michael Wiedenbeck and his co-workers at the Geoforschungszentrum Potsdam, Germany for their collaboration.

US childhood mortality rates have lagged behind other wealthy nations for the past 50 years

 

Leading causes of death are prematurity and injuries

Johns Hopkins Medicine

 

IMAGE: A new study reveals childhood mortality trends from 1961 to 2010 in the United States and 19 economically similar countries.

Credit: Johns Hopkins Medicine

In a new study of childhood mortality rates between 1961 and 2010 in the United States and 19 economically similar countries, researchers report that while there’s been overall improvement among all the countries, the U.S. has been slowest to improve.

Researchers found that childhood mortality in the U.S. has been higher than all other peer nations since the 1980s; over the 50-year study period, the U.S.’s “lagging improvement” has amounted to more than 600,000 excess deaths.

A report of the findings, published Jan. 8 in Health Affairs, highlights when and why the U.S.’s performance started falling behind peer countries, and calls for continued funding of federal, state and local programs that have proven to save children’s lives.

Among the leading causes of death for the most recent decade, the researchers say, were premature birth and Sudden Infant Death Syndrome (SIDS). Children in the U.S. were three times more likely to die from prematurity at birth and more than twice as likely to die from SIDS as children in peer countries.

The two leading causes of death for those 15 to 19 years old in the U.S. during the same time period were motor vehicle accidents and assaults by firearm. Teenagers were twice as likely to die from motor vehicle accidents and 82 times more likely to die from gun homicide in the U.S. than in other wealthy nations.

“Overall child mortality in wealthy countries, including the U.S., is improving, but the progress our country has made is considerably slower than progress elsewhere,” says Ashish Thakrar, M.D., an internal medicine resident at The Johns Hopkins Hospital and a lead author of the study. He adds: “Now is not the time to defund the programs that support our children’s health.”

Thakrar notes that while the U.S. spends more per capita on health care for children than other wealthy nations, it has poorer outcomes than many. In 2013, the United Nations Children’s Fund ranked the U.S. 25th in a list of 29 developed countries for overall child health and safety.

To better understand when and why the U.S. performance in improving child death rates began faltering compared to peer nations, Thakrar and colleagues tracked child mortality rates for the U.S. and 19 nations that are members of the Organization for Economic Cooperation and Development. The members include Canada, Australia, Switzerland, Italy and Germany, among others, which have similar levels of economic development.

While previous studies have also tracked U.S. mortality over time, they’ve only done so for children in specific age groups, Thakrar says, and to his knowledge, the new study is the first to describe the full burden of excess mortality in the U.S. for children and adolescents of all ages.

The researchers analyzed mortality and population data from the Human Mortality Database and mortality and cause of death data from the World Health Organization for all children 0 to 19 years old from 1961 to 2010.

Some 90 percent of these deaths, the researchers say, occurred among infants and adolescents 15 to 19 years old. In the most recent decade studied (2001-2010), infants in the U.S. were 76 percent more likely to die and children 1 to 19 years old were 57 percent more likely to die than their counterparts in peer nations.

“The findings show that in terms of protecting child health, we’re very far behind where we could be,” says Christopher Forrest, M.D., Ph.D., the study’s senior author and a pediatrician at Children’s Hospital of Philadelphia. “We hope that policymakers can use these findings to make strategic public health decisions for all U.S. children to ensure that we don’t fall further behind peer nations.”

The research team called on officials to fully fund the Children’s Health Insurance Program, which provides health insurance to millions of disadvantaged children, and the Supplemental Nutrition Assistance Program (food stamps). Applying public health research and solutions to gun violence and car crashes can also help level the playing field for U.S. children, adds Forrest.

###

Other authors on this paper include Alexandra D. Forrest of the Drexel University College of Medicine and Mitchell Maltenfort of the Children’s Hospital of Philadelphia.

Funding for this study was provided by institutional development funds at the Children’s Hospital of Philadelphia to the Applied Clinical Research Center at CHOP.

What species is most fit for life? All have an equal chance, scientists say

Public Release: 8-Jan-2018

 

Elephants and giant sequoias have no advantage over algae and bacteria

SUNY College of Environmental Science and Forestry

There are more than 8 million species of living things on Earth, but none of them — from 100-foot blue whales to microscopic bacteria — has an advantage over the others in the universal struggle for existence.

In a paper published Jan. 8 in the prestigious journal Nature Ecology & Evolution, a trio of scientists from universities in the United States and the United Kingdom describe the dynamic that began with the origin of life on Earth 4 billion years ago. They report that regardless of vastly different body size, location and life history, most plant, animal and microbial species are equally “fit” in the struggle for existence. This is because each transmits approximately the same amount of energy over its lifetime to produce the next generation of its species.

“This means that each elephant or blue whale contributes no more energy per gram of parent to the next generation than a trout or even a bacterium,” said co-author Charles A.S. Hall, a systems ecologist with the College of Environmental Science and Forestry (ESF) in Syracuse, New York. “We found, rather astonishingly, by examining the production rate and the generation time of thousands of plants, animals and microbes that each would pass on, on average, the same amount of energy to the next generation per gram of parent, regardless of size. A single-celled aquatic alga recreates its own body mass in one day, but lives for only a day. A large female elephant takes years to produce her first baby, and lives much longer than the alga. For all plants and animals of all sizes these two factors – rate of biomass production and generation time – exactly balance each other, so each contributes the same energy per gram of parent to the next generation in their lifetime.”

The bottom line, Hall said, is that all organisms are, on average, equally fit for survival.

Hall’s co-author, James H. Brown, a physiological ecologist at the University of New Mexico, said, “The fact that all organisms are nearly equally fit has profound implications for the evolution and persistence of life on Earth.”

The third author on the paper, which was published online, is mathematical biologist Richard M. Sibly of the University of Reading in the United Kingdom.

The scientists tackled an intriguing question about life on the planet, beginning with some common knowledge. On one hand, they noted, microscopic, unicellular bacteria, algae and protists that weigh only a few micrograms live fast, generate much new biomass per day or even per minute, and die young, often within hours. On the other hand, large mammals such as the 100-foot blue whale can live up to 100 years but generate new biomass, including babies, much more slowly.

The authors ask a sweeping question: How can such enormous variation in reproduction and survival allow persistence and coexistence of so many species? Their answer: Because there is a universal tradeoff in how organisms acquire, transform and expend energy for survival and production within constraints imposed by physics and biology.

In their research, the authors built a model of energy allocation, based on data involving rates of energy investment in growth and reproduction, generation times (commonly considered 22 to 32 years for humans) and body sizes of hundreds of species ranging from microbes to mammals and trees. They found an exactly equal but opposite relationship between growth rate and generation time among all these organisms.

The net result is what the authors call the “equal fitness paradigm.” Species are nearly equally fit for survival because they all devote the same quantity of energy per unit of body weight to produce offspring in the next generation; the higher activity and shorter life of small organisms is exactly compensated for by the slower activity and greater longevity of large organisms.
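The balancing act the authors describe can be put in simple arithmetic: the lifetime energy an organism passes on per gram of parent is roughly its mass-specific production rate multiplied by its generation time. The sketch below illustrates that tradeoff with hypothetical numbers chosen purely for demonstration; they are not values from the paper.

```python
# Illustrative sketch of the "equal fitness paradigm": lifetime output per
# gram of parent = mass-specific production rate x generation time.
# All numbers below are hypothetical, chosen only to show the balance.

def lifetime_output_per_gram(production_rate, generation_time):
    """Lifetime biomass (an energy proxy) passed on per gram of parent.

    production_rate: grams of new biomass per gram of parent per day
    generation_time: days from birth to reproduction
    """
    return production_rate * generation_time

# Fast-living alga: recreates its own body mass in a day, lives one day.
alga = lifetime_output_per_gram(production_rate=1.0, generation_time=1)

# Slow-living large mammal: a tiny daily rate, but ~20 years per generation.
mammal = lifetime_output_per_gram(production_rate=1 / 7300, generation_time=7300)

# The two factors balance: both pass on about the same amount per gram
# of parent over a lifetime, despite a 7300-fold difference in pace.
print(alga, mammal)
```

Under the paradigm, doubling an organism's generation time comes with roughly halving its mass-specific production rate, so the product stays nearly constant across species of vastly different sizes.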

Hall said the tradeoff between rate of living and generation time is one reason for the great diversity of life on Earth: No one size or life form has a built-in advantage over another. The apparent benefits of being larger (for example, bigger males are more likely to win in competition for mates) are compensated for by the fact that larger animals are typically less productive over time.

“There is no single way of living and using energy that is best,” Hall and Brown said. “Given the array of environmental conditions on the planet, one kind of organism might gain a temporary advantage, but such gains will soon be countered by other, competing organisms. The result is what evolutionary biologist Leigh Van Valen called the ‘Red Queen phenomenon,’ based on Lewis Carroll’s Through the Looking Glass: All species must keep running to keep up with others and stay in the evolutionary race.”

Genetic changes help mosquitoes survive pesticide attacks

Public Release: 2-Jan-2018

 

UCR study shows how intensive pesticide use is driving mosquito evolution at the genetic level

University of California – Riverside

IMAGE

IMAGE: A rice field in northern Cameroon. In addition to long-lasting insecticidal nets, urbanization, chemical pollutants, and agriculture play a key role in selecting insecticide-resistant mosquitoes.

Credit: Caroline Fouet, UC Riverside.

RIVERSIDE, Calif. — For decades, chemical pesticides have been the most important way of controlling insects like the Anopheles mosquito species that spreads malaria to humans. Unfortunately, the bugs have fought back, evolving genetic shields to protect themselves and their offspring from future attacks.

The fascinating array of genetic changes that confer pesticide resistance in Anopheles mosquitoes is reviewed in an article published today in Trends in Parasitology. The paper is written by Colince Kamdem, a postdoctoral scholar, and two colleagues from the Department of Entomology at the University of California, Riverside. The findings highlight the interplay between human interventions, mosquito evolution, and disease outcomes, and will help scientists develop new strategies to overcome pesticide resistance.

In 2015, there were roughly 212 million malaria cases and an estimated 429,000 deaths due to malaria, according to the World Health Organization. While increased prevention and control measures have led to a 29 percent reduction in malaria mortality rates globally since 2010, the increase in pesticide resistant insects underscores the need for new strategies. “One of the main obstacles to malaria eradication is the enormous diversity and adaptive flexibility of the Anopheles mosquito species, therefore a better understanding of the genetic, behavioral, and ecological factors underlying its ability to evolve resistance is key to controlling this disease,” Kamdem said.

In sub-Saharan Africa, multiple factors, including the widespread use of long-lasting insecticidal nets, indoor residual spraying, exposure to chemical pollutants, urbanization, and agricultural practices, are contributing to the selection of malaria mosquitoes that are highly resistant to several classes of insecticide.

Kamdem’s article highlights several ways that mosquitoes are adapting to insecticide exposure. Advantageous mutations in the insecticide target site are a major source of resistance, highlighting the direct impact of human interventions on the mosquito genome. Other mutations boost the activity of enzymes that degrade or sequester the insecticide before it reaches its target in the cell. In some cases, mosquitoes change their behaviors to avoid coming into contact with pesticides.

“These changes are occurring at the molecular, physiological and behavioral level, and multiple changes are often happening at the same time. With the accessibility of DNA sequencing we can now pinpoint these evolutionary changes at the genomic level,” Kamdem said.

Kamdem said the high genetic diversity among mosquito species and their ability to swap genes makes it difficult to stop the development of insecticide-resistant groups. Gene drive systems that use genetic approaches to kill mosquitoes, prevent them from breeding, or stop them from transmitting the malaria-causing parasite are under development, but a concern is that mosquitoes could evolve resistance to these techniques, too. “The insights gained from the intensive use of insecticides and its impact on the mosquito genome will be critical for the successful implementation of gene editing systems as a new approach to controlling mosquito-borne diseases,” Kamdem said. “Due to the emergence of mosquito-borne diseases such as Zika, several countries are implementing, or are preparing to deploy, vector control strategies on a large scale. One of the most pressing needs is to design evidence-based monitoring tools to fight back the inevitable resistance of mosquitoes.”

###

The title of the article is: “Human Interventions: Driving Forces of Mosquito Evolution.” In addition to Kamdem, contributors are Caroline Fouet, a postdoctoral researcher in entomology and lead author on the paper, and Peter Atkinson, a professor of entomology, both at UC Riverside.

Viruses can transfer genes across the superkingdoms of life

Public Release: 18-Dec-2017

 

New research shows that viruses can transfer genes to organisms they are not known to infect, and may cast light on the ancient origins of viruses

Frontiers

New research shows that viruses can transfer genes to organisms that they aren’t known to infect – including organisms in different superkingdoms, or domains. The study, published in open-access journal Frontiers in Microbiology, also finds that viruses and cellular organisms share a large group of genes that help cells to function, suggesting that viruses may have an ancient cell-like origin.

Viruses can sometimes infect very different organisms during their lifecycle, such as mosquitoes and humans in the case of Zika virus. Viruses can also jump between different species, such as from birds to humans in the case of avian flu. However, no virus has been discovered that can infect organisms from different superkingdoms – the highest-level divisions of life, also known as domains.

“Normally, we associate viruses with very specific host organisms, and we do not know of any virus that, for example, can infect both bacteria and humans,” explains Arshan Nasir from COMSATS Institute of Information Technology, Pakistan, and University of Illinois, USA, and one of the study’s authors. “Virus-host boundaries make sense since organisms that are separated by large evolutionary distances differ starkly in their cellular biology. This makes it hard for a virus to successfully replicate inside two very diverse environments.”

Nevertheless, Nasir and his colleagues suspected leaps between such distant species could occur, not necessarily involving virus infection. “In addition to infecting and killing cells, viruses can also insert their genes into a cell’s DNA,” says Nasir. “We therefore hypothesized that viruses might interact in non-harmful ways to exchange genes between distantly related organisms.”

To investigate such viral gene exchange, Nasir and colleagues looked at protein structures found in all known viruses and cellular organisms. By looking for protein structures that are specifically associated with viruses or cells, the researchers could detect virus-derived genes in cellular organisms and cell-derived genes in viruses.
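The comparative logic described above can be sketched with simple set operations. This is not the authors' actual pipeline or data; the fold names and genome contents below are hypothetical placeholders.

```python
# Sketch of the detection logic: classify each protein fold as
# virus-specific, cell-specific, or shared, then flag virus-specific
# folds that turn up in a cellular genome (candidate virus-derived
# genes) and vice versa. All fold names here are invented.

viral_folds = {"capsid_fold", "terminase_fold", "shared_polymerase"}
cellular_folds = {"ribosome_fold", "kinase_fold", "shared_polymerase"}

virus_specific = viral_folds - cellular_folds    # hallmarks of viruses
cell_specific = cellular_folds - viral_folds     # hallmarks of cells

# Hypothetical fold repertoire observed in a plant genome:
plant_genome_folds = {"ribosome_fold", "kinase_fold", "capsid_fold"}

# A virus-specific fold inside a plant genome suggests a gene
# transferred from a virus across superkingdom boundaries.
transferred = plant_genome_folds & virus_specific
print(transferred)  # {'capsid_fold'}
```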

Strikingly, viral hallmark genes weren’t just found in the expected host organisms, but in all sorts of species – including those from different superkingdoms. For example, the research team found examples where viruses thought to only infect bacteria had likely transferred genes to complex organisms, such as plants and animals. This suggests that viruses can transfer genes to organisms that are dramatically different from their usual host, and that they can influence and interact with a much wider range of organisms than previously thought.

The team also found evidence that viruses and cellular organisms share a large group of protein structures that help cells to function. This is a little surprising in the case of viruses, as they aren’t cells and have no obvious need for these proteins. One intriguing possibility is that viruses may have originally evolved from primitive cells, and these proteins were once useful during their ancient origins.

Nasir believes the results could change the way we think about virus-host relationships. “The study shows that the concept of a ‘virus host’ is rather blurry, since viruses do not necessarily need to kill a cell in order to interact with it,” he says. “We should consider viruses to be a source of new genes that cellular organisms can acquire, and not necessarily just as a source of disease.”

Kaiser Permanente study links health risks to electromagnetic field exposure

Public Release: 13-Dec-2017

 

Kaiser Permanente

A study of real-world exposure to non-ionizing radiation from magnetic fields in pregnant women found a significantly higher rate of miscarriage, providing new evidence regarding their potential health risks. The Kaiser Permanente study was published today in the journal Scientific Reports (Nature Publishing Group).

Non-ionizing radiation from magnetic fields is produced when electric devices are in use and electricity is flowing. It can be generated by a number of environmental sources, including electric appliances, power lines and transformers, wireless devices and wireless networks. Humans are exposed to magnetic fields via close proximity to these sources while they are in use.

While the health hazards from ionizing radiation are well-established and include radiation sickness, cancer and genetic damage, the evidence of health risks to humans from non-ionizing radiation remains limited, said De-Kun Li, MD, PhD, principal investigator of the study and a reproductive and perinatal epidemiologist at the Kaiser Permanente Division of Research in Oakland, California.

“Few studies have been able to accurately measure exposure to magnetic field non-ionizing radiation,” Dr. Li said. “In addition, due to the current lack of research on this subject, we don’t know the biological threshold beyond which problems may develop, and we also don’t yet understand the possible mechanisms for increased risks.”

In a new study funded by the National Institute of Environmental Health Sciences, researchers asked women over age 18 with confirmed pregnancies to wear a small (a bit larger than a deck of cards) magnetic-field monitoring device for 24 hours. Participants also kept a diary of their activities on that day, and were interviewed in person to better control for possible confounding factors, as well as how typical their activities were on the monitoring day. Researchers controlled for multiple variables known to influence the risk of miscarriage, including nausea/vomiting, past history of miscarriage, alcohol use, caffeine intake, and maternal fever and infections.

Objective magnetic field measurements and pregnancy outcomes were obtained for 913 pregnant women, all members of Kaiser Permanente Northern California. Miscarriage occurred in 10.4 percent of the women with the lowest measured exposure level (1st quartile) of magnetic field non-ionizing radiation on a typical day, and in 24.2 percent of the women with the higher measured exposure level (2nd, 3rd and 4th quartiles), a nearly three times higher relative risk. The rate of miscarriage reported in the general population is between 10 and 15 percent, Dr. Li said.
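The crude ratio of the two percentages quoted above can be checked directly; note that the study's published estimate, which the release summarizes as "nearly three times," additionally controls for confounders, so it differs from this unadjusted figure.

```python
# Back-of-envelope check of the reported miscarriage rates.
low_exposure_rate = 10.4   # % miscarriage, lowest exposure quartile
high_exposure_rate = 24.2  # % miscarriage, 2nd-4th quartiles combined

relative_risk = high_exposure_rate / low_exposure_rate
print(f"crude relative risk: {relative_risk:.2f}")  # ~2.33

# The adjusted analysis in the paper, which accounts for confounders
# such as prior miscarriage and caffeine intake, yields the higher
# "nearly three times" figure quoted in the release.
```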

“This study provides evidence from a human population that magnetic field non-ionizing radiation could have adverse biological impacts on human health,” he said.

Strengths of this study, Dr. Li noted, included that researchers used an objective measuring device and studied a short-term outcome (miscarriage) rather than one that will occur years or decades later, such as cancer or autoimmune diseases. The study’s main limitation is that it was not feasible for researchers to ask participants to carry the measuring device throughout pregnancy.

Dr. Li noted that the potential health risk of magnetic-field non-ionizing radiation needs more research. “We hope that the finding from this study will stimulate much-needed additional studies into the potential environmental hazards to human health, including the health of pregnant women.”

###

In addition to Dr. Li, co-authors of the study were Hong Chen, MPH, Jeannette Ferber, MPH, Roxana Odouli, MSPH, and Charles Quesenberry, PhD, all of the Kaiser Permanente Division of Research.

About the Kaiser Permanente Division of Research

The Kaiser Permanente Division of Research conducts, publishes and disseminates epidemiologic and health services research to improve the health and medical care of Kaiser Permanente members and society at large. It seeks to understand the determinants of illness and well-being, and to improve the quality and cost-effectiveness of health care. Currently, DOR’s 550-plus staff is working on more than 350 epidemiological and health services research projects. For more information, visit divisionofresearch.kaiserpermanente.org or follow us @KPDOR.

About Kaiser Permanente

Kaiser Permanente is committed to helping shape the future of health care. We are recognized as one of America’s leading health care providers and not-for-profit health plans. Founded in 1945, Kaiser Permanente’s mission is to provide high-quality, affordable health care services and to improve the health of our members and the communities we serve. We currently serve 11.7 million members in eight states and the District of Columbia. Care for members and patients is focused on their total health and guided by their personal Permanente Medical Group physicians, specialists and team of caregivers. Our expert and caring medical teams are empowered and supported by industry-leading technology advances and tools for health promotion, disease prevention, state-of-the-art care delivery and world-class chronic disease management. Kaiser Permanente is dedicated to care innovations, clinical research, health education and the support of community health. For more information, go to share.kaiserpermanente.org.

Prehistoric women had stronger arms than today’s elite rowing crews

Public Release: 29-Nov-2017

 

University of Cambridge

 

IMAGE

Caption

Cambridge University Women’s Boat Club openweight crew rowing during the 2017 Boat Race on the river Thames in London. The Cambridge women’s crew beat Oxford in the race. The members of this crew were among those analyzed in the study.

Credit: Alastair Fyfe for the University of Cambridge

A new study comparing the bones of Central European women who lived during the first 6,000 years of farming with those of modern athletes has shown that the average prehistoric agricultural woman had stronger upper arms than living female rowing champions.

Researchers from the University of Cambridge’s Department of Archaeology say this physical prowess was likely obtained through tilling soil and harvesting crops by hand, as well as the grinding of grain for as much as five hours a day to make flour.

Until now, bioarchaeological investigations of past behaviour have interpreted women’s bones solely through direct comparison to those of men. However, male bones respond to strain in a more visibly dramatic way than female bones.

The Cambridge scientists say this has resulted in the systematic underestimation of the nature and scale of the physical demands borne by women in prehistory.

“This is the first study to actually compare prehistoric female bones to those of living women,” said Dr Alison Macintosh, lead author of the study published today in the journal Science Advances.

“By interpreting women’s bones in a female-specific context we can start to see how intensive, variable and laborious their behaviours were, hinting at a hidden history of women’s work over thousands of years.”

The study, part of the European Research Council-funded ADaPt (Adaption, Dispersals and Phenotype) Project, used a small CT scanner in Cambridge’s PAVE laboratory to analyse the arm (humerus) and leg (tibia) bones of living women who engage in a range of physical activity: from runners, rowers and footballers to those with more sedentary lifestyles.

The bone strengths of modern women were compared to those of women from early Neolithic agricultural eras through to farming communities of the Middle Ages.

“It can be easy to forget that bone is a living tissue, one that responds to the rigours we put our bodies through. Physical impact and muscle activity both put strain on bone, called loading. The bone reacts by changing in shape, curvature, thickness and density over time to accommodate repeated strain,” said Macintosh.

“By analysing the bone characteristics of living people whose regular physical exertion is known, and comparing them to the characteristics of ancient bones, we can start to interpret the kinds of labour our ancestors were performing in prehistory.”

Over three weeks during trial season, Macintosh scanned the limb bones of the Open- and Lightweight squads of the Cambridge University Women’s Boat Club, who ended up winning this year’s Boat Race and breaking the course record. These women, most in their early twenties, were training twice a day and rowing an average of 120km a week at the time.

The Neolithic women analysed in the study (from 7400-7000 years ago) had similar leg bone strength to modern rowers, but their arm bones were 11-16% stronger for their size than the rowers, and almost 30% stronger than typical Cambridge students.

The loading of the upper limbs was even more dominant in the study’s Bronze Age women (from 4300-3500 years ago), who had 9-13% stronger arm bones than the rowers but 12% weaker leg bones.

A possible explanation for this fierce arm strength is the grinding of grain. “We can’t say specifically what behaviours were causing the bone loading we found. However, a major activity in early agriculture was converting grain into flour, and this was likely performed by women,” said Macintosh.

“For millennia, grain would have been ground by hand between two large stones called a saddle quern. In the few remaining societies that still use saddle querns, women grind grain for up to five hours a day.

“The repetitive arm action of grinding these stones together for hours may have loaded women’s arm bones in a similar way to the laborious back-and-forth motion of rowing.”

However, Macintosh suspects that women’s labour was hardly likely to have been limited to this one behaviour.

“Prior to the invention of the plough, subsistence farming involved manually planting, tilling and harvesting all crops,” said Macintosh. “Women were also likely to have been fetching food and water for domestic livestock, processing milk and meat, and converting hides and wool into textiles.

“The variation in bone loading found in prehistoric women suggests that a wide range of behaviours were occurring during early agriculture. In fact, we believe it may be the wide variety of women’s work that in part makes it so difficult to identify signatures of any one specific behaviour from their bones.”

Dr Jay Stock, senior study author and head of the ADaPt Project, added: “Our findings suggest that for thousands of years, the rigorous manual labour of women was a crucial driver of early farming economies. The research demonstrates what we can learn about the human past through better understanding of human variation today.”

New theory rewrites opening moments of Chernobyl disaster

Public Release: 17-Nov-2017

 

Taylor & Francis Group

 

A brand-new theory of the opening moments during the Chernobyl disaster, the most severe nuclear accident in history, based on additional analysis is presented for the first time in the journal Nuclear Technology, an official journal of the American Nuclear Society.

The new theory, presented by researchers from the Swedish Defence Research Agency, the Swedish Meteorological and Hydrological Institute, and Stockholm University, suggests that the first of the two explosions reported by eyewitnesses was a nuclear explosion rather than a steam explosion, as is currently widely thought.

They hypothesize that the first explosive event was a jet of debris ejected to very high altitudes by a series of nuclear explosions within the reactor. This was followed, within three seconds, by a steam explosion which ruptured the reactor and sent further debris into the atmosphere at lower altitudes.

The theory is based on new analysis of xenon isotopes detected by scientists from the V.G. Khlopin Radium Institute in Leningrad, four days after the accident, at Cherepovets, a city north of Moscow far from the major track of Chernobyl debris. These isotopes were the product of recent nuclear fission, suggesting they could be the result of a recent nuclear explosion. In contrast, the main Chernobyl debris which tracked northwest to Scandinavia contained equilibrium xenon isotopes from the reactor’s core.

By assessing the weather conditions across the region at the time, the authors also established that the fresh xenon isotopes at Cherepovets were the result of debris injected into far higher altitudes than the debris from the reactor rupture which drifted towards Scandinavia.

Observations of the destroyed reactor tank indicated that the first explosion caused temperatures high enough to melt a two-meter thick bottom plate in part of the core. Such damage is consistent with a nuclear explosion. In the rest of the core, the bottom plate was relatively intact, though it had dropped by nearly four meters. This suggests a steam explosion which did not create temperatures high enough to melt the plate but generated sufficient pressure to push it down.

Lead author and retired nuclear physicist from the Swedish Defence Research Agency, Lars-Erik De Geer commented, “We believe that thermal neutron mediated nuclear explosions at the bottom of a number of fuel channels in the reactor caused a jet of debris to shoot upwards through the refuelling tubes. This jet then rammed the tubes’ 350kg plugs, continued through the roof and travelled into the atmosphere to altitudes of 2.5-3km where the weather conditions provided a route to Cherepovets. The steam explosion which ruptured the reactor vessel occurred some 2.7 seconds later.”

Seismic measurements and an eye-witness report of a blue flash above the reactor a few seconds after the first explosion also support the new hypothesis of a nuclear explosion followed by a steam explosion. This new analysis brings insight into the disaster, and may potentially prove useful in preventing future similar incidents from occurring.

###

NOTE TO JOURNALISTS

When referencing the article: Please include Journal title, author, published by Taylor & Francis and the following statement:

* Read the full article online: http://www.tandfonline.com/doi/full/10.1080/00295450.2017.1384269

PLEASE NOTE: This link will not be live until xxth November at 00:01am, when it will be free to access

For an interview, please contact:

Name: Lars-Erik De Geer
Email: ledg1945@gmail.com

For more information, please contact:

Name: Krystina Sihdu, Press & Media Relations Executive
Email: newsroom@taylorandfrancis.com

Follow us on Twitter: @tandfnewsroom

About Taylor & Francis Group

Taylor & Francis Group partners with researchers, scholarly societies, universities and libraries worldwide to bring knowledge to life. As one of the world’s leading publishers of scholarly journals, books, ebooks and reference works our content spans all areas of Humanities, Social Sciences, Behavioural Sciences, Science, and Technology and Medicine.

From our network of offices in Oxford, New York, Philadelphia, Boca Raton, Boston, Melbourne, Singapore, Beijing, Tokyo, Stockholm, New Delhi and Cape Town, Taylor & Francis staff provide local expertise and support to our editors, societies and authors and tailored, efficient customer service to our library colleagues.

Proposals to reduce the effects of global warming by imitating volcanic eruptions could have a devastating effect on global regions

Public Release: 14-Nov-2017

Artificially cooling planet ‘risky strategy,’ new research shows

University of Exeter

Proposals to reduce the effects of global warming by imitating volcanic eruptions could have a devastating effect on global regions prone to either tumultuous storms or prolonged drought, new research has shown.

Geoengineering – the intentional manipulation of the climate to counter the effect of global warming by injecting aerosols artificially into the atmosphere – has been mooted as a potential way to deal with climate change.

However new research led by climate experts from the University of Exeter suggests that targeting geoengineering in one hemisphere could have a severely detrimental impact for the other.

They suggest that while injections of aerosols in the northern hemisphere would reduce tropical cyclone activity – responsible for recent phenomena such as Hurricane Katrina – they would at the same time increase the likelihood of drought in the Sahel, the area of sub-Saharan Africa just south of the Sahara desert.

In response, the team of researchers have called on policymakers worldwide to strictly regulate any large scale unilateral geoengineering programmes in the future to prevent inducing natural disasters in different parts of the world.

The study is published in leading scientific journal Nature Communications on Tuesday, November 14 2017.

Dr Anthony Jones, a climate science expert from the University of Exeter and lead author on the paper, said: “Our results confirm that regional solar geoengineering is a highly risky strategy which could simultaneously benefit one region to the detriment of another. It is vital that policymakers take solar geoengineering seriously and act swiftly to install effective regulation.”

The research centres on the impact that solar geoengineering methods which inject aerosols into the atmosphere may have on the frequency of tropical cyclones.

The controversial approach, known as stratospheric aerosol injection, is designed to effectively cool the Earth’s surface by reflecting some sunlight before it reaches the surface. The proposals mimic the aftermath of volcanic eruptions, when aerosols are naturally injected into the atmosphere.

In the study, the researchers use sophisticated simulations with a fully coupled atmosphere-ocean model to investigate the effect of hemispheric stratospheric aerosol injection on North Atlantic tropical cyclone frequency.

They find injections of aerosols in the northern hemisphere would decrease North Atlantic tropical cyclone frequency, while injections contained to the southern hemisphere may potentially enhance it.

Crucially, the team warn however that while tropical cyclone activity in the North Atlantic could be suppressed by northern hemisphere injections, this would, at the same time, induce droughts in the Sahel.

These results highlight the uncertain regional effects of solar geoengineering, a proposed approach to counteracting global warming, and should be weighed by policymakers before any deployment.

Professor Jim Haywood, from the Mathematics department at the University of Exeter and co-author of the study added: “This research shows how a global temperature target such as 1.5 or 2C needs to be combined with information on a more regional scale to properly assess the full range of climate impacts.”

The research, Impacts of hemispheric solar geoengineering on tropical cyclone frequency, is published in the journal Nature Communications.

Pesticides may cause bumblebees to lose their buzz, study finds

Public Release: 14-Nov-2017

 

University of Stirling

Pesticides significantly reduce the number of pollen grains a bumblebee is able to collect, a new University of Stirling study has found.

The research, conducted by a team in the Faculty of Natural Sciences, found that field-realistic doses of a neonicotinoid pesticide affect the behaviour of bees – ultimately interfering with the type of vibrations they produce while collecting pollen.

Dr Penelope Whitehorn, the University of Stirling Research Fellow who led the research, said: “Our result is the first to demonstrate quantitative changes in the type of buzzes produced by bees exposed to field-realistic levels of neonicotinoid.

“We also show that buzz pollinating bees exposed to the pesticide also collect fewer pollen grains.”

Dr Whitehorn, working with Associate Professor Mario Vallejo-Marin, looked at a complex type of pollination, called buzz pollination, in which bees use vibrations to remove pollen from flowers. They studied captive colonies of bumblebees visiting buzz-pollinated flowers, monitoring their behaviour and collecting bee buzzes using microphones.

The scientists then analysed the acoustic signal produced during buzz pollination to detect changes in buzzing behaviour through time. They found that chronic exposure to the pesticide, at similar levels to those found in agricultural fields, interfered with the vibrations of the bees as they collected pollen which, in turn, reduced the amount of pollen collected.
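The kind of acoustic analysis described above can be sketched as follows. The 350 Hz "buzz" here is a synthetic sine wave, not a real recording, and the study's actual metrics (amplitude, duration, spectral shape) would be richer than a single peak-frequency estimate.

```python
# Sketch: estimate the dominant frequency of a buzz recording via FFT.
import numpy as np

sample_rate = 44100          # samples per second
duration = 1.0               # seconds
t = np.linspace(0, duration, int(sample_rate * duration), endpoint=False)
buzz = np.sin(2 * np.pi * 350 * t)   # synthetic 350 Hz "buzz"

# Magnitude spectrum of the real-valued signal and matching frequencies.
spectrum = np.abs(np.fft.rfft(buzz))
freqs = np.fft.rfftfreq(len(buzz), d=1 / sample_rate)
peak_freq = freqs[np.argmax(spectrum)]
print(f"dominant frequency: {peak_freq:.0f} Hz")  # ~350 Hz
```

Tracking a statistic like this across recordings over time is one simple way changes in buzzing behaviour could be quantified.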

Dr Whitehorn explained: “We found that control bees, who were not exposed to the pesticide, improved their pollen collection as they gained experience, which we interpreted as an ability to learn to buzz pollinate better.

“However, bees that came into contact with pesticide did not collect more pollen as they gained more experience, and by the end of the experiment collected between 47% and 56% less pollen compared to the control bees.”

Dr Vallejo-Marin said: “Our findings have implications for the effects of pesticides on bee populations as well as the pollination services they provide. They also suggest that pesticide exposure may impair bees’ ability to perform complex behaviours, such as buzz pollination.

“The next step in this research would be to establish the mechanism by which the pesticide is affecting the bees. We think that pesticides may be affecting the memory and cognitive ability of bumblebees, which may be very important when conducting complex behaviours.”

The paper, Neonicotinoid pesticide limits improvement in buzz pollination by bumblebees, is published in Scientific Reports.

###

How toxic air clouds mental health

Public Release: 2-Nov-2017

 

University of Washington

 

IMAGE

IMAGE: This graph shows that as the amount of fine particulate matter in the air increases, so do levels of psychological distress

Credit: Victoria Sass, University of Washington

There is little debate over the link between air pollution and the human respiratory system: Research shows that dirty air can impair breathing and aggravate various lung diseases. Other potential effects are being investigated, too, as scientists examine connections between toxic air and obesity, diabetes and dementia.

Now add to that list psychological distress, which University of Washington researchers have found is also associated with air pollution. The higher the level of particulates in the air, the UW-led study showed, the greater the impact on mental health.

The study, published in the November issue of Health & Place, is believed to be the first to use a nationally representative survey pool, cross-referenced with pollution data at the census block level, to evaluate the connection between toxic air and mental health.

“This is really setting out a new trajectory around the health effects of air pollution,” said Anjum Hajat, an assistant professor of epidemiology in the UW School of Public Health. “The effects of air pollution on cardiovascular health and lung diseases like asthma are well established, but this area of brain health is a newer area of research.”

Where a person lives can make a big difference to health and quality of life. Scientists have identified “social determinants” of physical and mental well-being, such as availability of healthy foods at local grocers, access to nature or neighborhood safety.

Air pollution, too, has been associated with behavior changes – spending less time outside, for instance, or leading a more sedentary lifestyle – that can be related to psychological distress or social isolation.

The UW study looked for a direct connection between toxic air and mental health, relying on some 6,000 respondents from a larger, national, longitudinal study, the Panel Study of Income Dynamics. Researchers then merged an air pollution database with records corresponding to the neighborhoods of each of the 6,000 survey participants. The team zeroed in on measurements of fine particulate matter, a substance produced by car engines, fireplaces and wood stoves, and power plants fueled by coal or natural gas. Fine particulate matter (particles less than 2.5 micrometers in diameter) is easily inhaled, can be absorbed into the bloodstream and is considered of greater risk than larger particles. (To picture just how small fine particulate matter is, consider this: The average human hair is 70 micrometers in diameter.)

The current safety standard for fine particulates, according to the U.S. Environmental Protection Agency, is 12 micrograms per cubic meter. Between 1999 and 2011, the time frame examined in the UW study, survey respondents lived in neighborhoods where fine particulates measured anywhere from 2.16 to 24.23 micrograms per cubic meter, with an average level of 11.34.

The survey questions relevant to the UW study gauged participants’ feelings of sadness, nervousness, hopelessness and the like and were scored with a scale that assesses psychological distress.

The UW study found that the risk of psychological distress increased alongside the amount of fine particulate matter in the air. For example, in areas with high levels of pollution (21 micrograms per cubic meter), psychological distress scores were 17 percent higher than in areas with low levels of pollution (5 micrograms per cubic meter). Another finding: Every increase in pollution of 5 micrograms per cubic meter was associated with the same increase in distress as a loss of 1.5 years of education.
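The percent-difference comparison reported above can be illustrated with a short sketch. The mean distress scores below are purely hypothetical placeholders, not values from the study; only the 17 percent comparison and the two pollution levels come from the text.

```python
# Illustrative only: the mean distress scores here are hypothetical,
# chosen to reproduce the 17 percent difference reported in the study.
low_pollution_score = 3.00   # hypothetical mean score at 5 µg/m³
high_pollution_score = 3.51  # hypothetical mean score at 21 µg/m³

# Percent difference relative to the low-pollution group --
# the form of comparison the study reports.
pct_diff = (high_pollution_score - low_pollution_score) / low_pollution_score * 100
print(f"Distress scores {pct_diff:.0f}% higher in high-pollution areas")
```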

Researchers controlled for other physical, behavioral and socioeconomic factors that can influence mental health, such as chronic health conditions, unemployment and excessive drinking.

But some patterns emerged that warrant more study, explained primary author Victoria Sass, a graduate student in the Department of Sociology.

When the data are broken down by race and gender, black men and white women show the most significant correlation between air pollution and psychological distress: The level of distress among black men in areas of high pollution, for instance, is 34 percent greater than that of white men, and 55 percent greater than that of Latino men. A noticeable trend among white women is the substantial increase in distress — 39 percent — as pollution levels rise from low to high.

Precisely why air pollution impacts mental health, especially among specific populations, was beyond the scope of the study, Sass said. But that’s what makes further research important.

“Our society is segregated and stratified, which places an unnecessary burden on some groups,” Sass said. “Even moderate levels can be detrimental to health.”

Air pollution, however, is something that can be mitigated, Hajat said, and has been declining in the United States. It’s a health problem with a clear, actionable solution. But it requires the political will to continue to regulate air quality, Sass added.

“We shouldn’t think of this as a problem that has been solved,” she said. “There is a lot to be said for having federal guidelines that are rigorously enforced and continually updated. The ability of communities to have clean air will be impacted with more lax regulation.”

###

Other authors on the study were professor Kyle Crowder and graduate student Steven Karceski, both of the UW Department of Sociology; Nicole Kravitz-Wirtz of the University of California, Davis School of Medicine; and David Takeuchi of the Boston College School of Social Work.

The study was funded by the Eunice Kennedy Shriver National Institute of Child Health and Human Development and supported by UW’s Center for Studies in Demography and Ecology.


For more information, contact Sass at vsass@uw.edu.

Grant numbers: R01 HD078501, R24 HD042828, R00 ES023498

Intake of pesticide residue from fruits, vegetables and infertility treatment outcomes

Public Release: 30-Oct-2017

The JAMA Network Journals

Bottom Line: Eating more fruits and vegetables with high pesticide residues was associated with a lower probability of pregnancy and live birth following infertility treatment for women using assisted reproductive technologies.

The Research Question: Is preconception intake of fruits and vegetables with pesticide residues associated with outcomes of assisted reproductive technologies?

Why The Question is Interesting: Animal studies suggest that ingestion of pesticide mixtures in early pregnancy may reduce the number of live-born offspring, leading to concerns that the levels of pesticide residues permitted in food by the U.S. Environmental Protection Agency may still be too high for pregnant women and infants.

Who: 325 women who completed a diet questionnaire and subsequently underwent cycles of assisted reproductive technologies as part of the Environment and Reproductive Health (EARTH) study at a fertility center at a teaching hospital in Boston.

When: Between 2007 and 2016

Study Measures: Researchers categorized fruits and vegetables as having high or low pesticide residues using a method based on surveillance data from the U.S. Department of Agriculture. They counted the number of confirmed pregnancies and live births per cycle of fertility treatment.

Design: This is an observational study. In observational studies, researchers observe exposures and outcomes for patients as they occur naturally in clinical care or real life. Because researchers are not intervening for purposes of the study, they cannot control for natural differences that could explain the findings, so they cannot prove a cause-and-effect relationship.

Authors: Jorge E. Chavarro, M.D., Sc.D., of the Harvard T. H. Chan School of Public Health, Boston, and colleagues

Results: Eating more high-pesticide residue fruits and vegetables (for example, strawberries and raw spinach) was associated with a lower probability of pregnancy and live birth following infertility treatment. Eating more low-pesticide residue fruits and vegetables was not associated with worse pregnancy and live birth outcomes.

Study Limitations: The study estimated exposure to pesticides based on women’s self-reported intake combined with pesticide residue surveillance data rather than through direct measurement. The study also cannot link specific pesticides to adverse effects.

Study Conclusions: “In conclusion, intake of high-residue FVs [fruits and vegetables] was associated with lower probabilities of clinical pregnancy and live birth among women undergoing infertility treatment. Our findings are consistent with animal studies showing that low-dose pesticide ingestion may exert an adverse impact on sustaining pregnancy. Because, to our knowledge, this is the first report of this relationship to humans, confirmation of these findings is warranted.”

Some infant rice cereals contain elevated levels of methylmercury

Public Release: 25-Oct-2017

 

American Chemical Society

Eating large amounts of certain fish can expose consumers to methylmercury, which can potentially cause health problems. But recent research has shown that rice grown in polluted conditions can also carry elevated levels of the toxin. Now, a study appearing in ACS’ Journal of Agricultural and Food Chemistry reports that some types of infant rice cereal contain amounts of methylmercury that could pose a health risk.

For years, elevated methylmercury levels in certain fish such as albacore tuna have led some people, particularly pregnant women, to limit their consumption of these species to reduce their potential exposure. Methylmercury is a form of mercury that, in high enough amounts, can cause neurological and reproductive problems in adults, and developmental issues in infants and young children. Within the past 10 years, rice has emerged as another potential source of mercury exposure. Studies have detected methylmercury in the grain when it is grown in polluted areas, potentially posing a health risk to people who rely on the crop as a daily staple. Given these results, Yong Cai and colleagues wanted to find out whether commercial rice cereals for infants also contain the substance.

The researchers tested 119 infant cereal samples made with a variety of grains. The products were purchased from different regions in the U.S. and China. Rice-based cereals had much higher levels of methylmercury than products with no rice, suggesting that the grain is a likely source of the mercury. Rice cereal samples from the U.S. and China had similar levels, with a mean concentration of 2.28 micrograms of methylmercury per kilogram of product. Based on these results, the researchers estimated that infants who consume these products could ingest between 0.004 and 0.123 micrograms of methylmercury per kilogram of body weight daily. The potential health effects of this amount of mercury are hard to pin down. The U.S. Environmental Protection Agency has set a 0.1 microgram/kg/day reference daily dose (RfD) for methylmercury. However, the standard was calculated using factors that might not be relevant to baby cereal, the researchers say. For example, the RfD is based on a pregnant woman’s intake of mercury and its transfer to the fetus. The researchers conclude that more studies are needed to understand more precisely how mercury in food might affect infants.
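The intake estimate described above follows from a simple dose calculation: concentration times amount consumed, divided by body weight. The sketch below uses the mean concentration and EPA reference dose from the text; the serving size and infant body weight are illustrative assumptions, not values from the study.

```python
# Sketch of the daily-intake estimate described above.
mean_conc_ug_per_kg = 2.28  # methylmercury in rice cereal (study mean)
rfd_ug_per_kg_day = 0.1     # EPA reference dose (from the text)

cereal_kg_per_day = 0.02    # assumed: 20 g of dry cereal per day
body_weight_kg = 7.0        # assumed: typical six-month-old

# Daily dose normalized to body weight, the unit the RfD uses.
daily_dose = mean_conc_ug_per_kg * cereal_kg_per_day / body_weight_kg
print(f"Estimated dose: {daily_dose:.4f} µg/kg/day "
      f"({daily_dose / rfd_ug_per_kg_day:.0%} of the EPA RfD)")
```

With these assumed values the estimate falls near the low end of the 0.004–0.123 µg/kg/day range the researchers report; heavier consumption or lower body weight pushes it toward the upper end.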

###

The authors acknowledge funding from the National Natural Science Foundation of China and the National Basic Research Program of China.

The paper’s abstract will be available on Oct. 25 at 8 a.m. Eastern time here: http://pubs.acs.org/doi/abs/10.1021/acs.jafc.7b03236

The American Chemical Society is a not-for-profit organization chartered by the U.S. Congress. ACS is the world’s largest scientific society and a global leader in providing access to chemistry-related research through its multiple databases, peer-reviewed journals and scientific conferences. ACS does not conduct research, but publishes and publicizes peer-reviewed scientific studies. Its main offices are in Washington, D.C., and Columbus, Ohio.

To automatically receive news releases from the American Chemical Society, contact newsroom@acs.org.